Exploring Search Types in Azure AI Search Service Indexes

I'm exploring replacing the WordPress search engine with an AI-powered search that understands human queries beyond simple keywords, delivering more natural and comprehensible responses. After indexing and embedding my content using text-embedding-ada-002 in the Azure AI Search Service index and performing some queries, I noticed that the query responses varied […]
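As a taste of what querying such an index looks like, here is a minimal sketch of the request body for a pure vector query against the Azure AI Search "Search Documents" REST API. The service endpoint, index name, and the `contentVector` field name are assumptions for illustration; a real text-embedding-ada-002 vector has 1536 dimensions.

```python
# Hypothetical values; replace with your own Azure AI Search service,
# index name, and the name of your embedding field.
SEARCH_ENDPOINT = "https://my-service.search.windows.net"
INDEX_NAME = "wp-posts"
API_VERSION = "2023-11-01"

def build_vector_query(query_embedding, top=5):
    """Build the JSON body for a pure vector search against an
    Azure AI Search index (Search Documents REST API)."""
    return {
        "count": True,
        "select": "title,url",
        "vectorQueries": [
            {
                "kind": "vector",
                "vector": query_embedding,   # embedding of the user's query
                "fields": "contentVector",   # assumed vector field name
                "k": top,                    # number of nearest neighbors
            }
        ],
    }

# The body is POSTed to:
# {SEARCH_ENDPOINT}/indexes/{INDEX_NAME}/docs/search?api-version={API_VERSION}
body = build_vector_query([0.0] * 1536, top=3)
```

Swapping `vectorQueries` for a plain `"search"` keyword term, or combining both, is what produces the different query types (keyword, vector, hybrid) whose responses the post compares.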

Read More
Integrating xAI’s Grok into rotrafic.xyz

AI is the future — this is clear. However, besides features like engaging voices and provocative humor, significant work remains before AI can shoulder the responsibility of shaping humanity’s destiny. Currently, AI can be used to incrementally enhance our lives, with augmentation as the central concept, and this approach will guide its development in the […]

Read More
Integrating an Azure AI Foundry Chatbot into Your Website

In a previous post, I detailed the process of deploying an AI model with Azure AI Foundry. In this article, I'll guide you through integrating it into a website as a chatbot. The idea is to have a PHP file on the server that communicates with the AI endpoint. This file will receive […]
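The point of that server-side file is to relay the visitor's message to the model endpoint so the API key never reaches the browser. The post itself uses PHP; here is a minimal sketch of the same relay idea in Python, where the endpoint URL, deployment name, and key are placeholder assumptions.

```python
import json
import urllib.request

# Assumed values; substitute your own Azure AI Foundry deployment details.
ENDPOINT = ("https://my-resource.openai.azure.com/openai/deployments/"
            "my-deployment/chat/completions?api-version=2024-02-01")
API_KEY = "stored-server-side-never-sent-to-the-browser"

def build_relay_request(user_message: str) -> urllib.request.Request:
    """Build the server-side request that forwards a visitor's chat
    message to the model endpoint; the key stays on the server."""
    payload = {
        "messages": [
            {"role": "system", "content": "You are the site chatbot."},
            {"role": "user", "content": user_message},
        ],
        "max_tokens": 256,
    }
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json", "api-key": API_KEY},
        method="POST",
    )

req = build_relay_request("Hello")
```

The browser-side JavaScript then only ever talks to this relay, never to the AI endpoint directly.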

Read More
AI Unleashed: Deploy AI Models to Your Tenant with Azure AI Foundry

In addition to the public services provided by AI players and their APIs, there are situations where we prefer to deploy a model within our own tenant to ensure better alignment with legislation, data compliance, and data regionalization requirements. Microsoft offers a straightforward solution for this through Azure AI Foundry. The Azure AI Foundry portal […]

Read More
AI companies scraping content are DDoS-ing servers

How should we deal with the exponential increase in scrapers? Will server admins now need to search the logs daily and update robots.txt or the firewall to block every new crawler? Will site owners need to pay more for extra infrastructure resources to handle all these artificial visits? Traffic analytics will have a bad day trying to separate human visits from crawler traffic.

Read More
The Internet and .txt files in the era of AI, ML and data scraping

For non-developers: the web industry uses .txt files at the root of a domain to provide guidance for the different kinds of bots wandering the Internet and indexing/scraping data. Bots can ignore the instructions, declare a different User-Agent, or use other evasion methods. robots.txt - it instructs crawlers on which […]
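The robots.txt convention the excerpt begins to describe can be checked programmatically. A small sketch using Python's standard-library parser; the bot name and rules below are made up for illustration.

```python
from urllib.robotparser import RobotFileParser

# A made-up robots.txt: one AI crawler is blocked entirely,
# everyone else is allowed except under /private/.
rules = """
User-agent: HypotheticalAIBot
Disallow: /

User-agent: *
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A well-behaved crawler calls can_fetch() before requesting a URL.
print(rp.can_fetch("HypotheticalAIBot", "https://example.com/post"))   # False
print(rp.can_fetch("Mozilla/5.0", "https://example.com/post"))         # True
print(rp.can_fetch("Mozilla/5.0", "https://example.com/private/x"))    # False
```

Of course, this is exactly the point of the post: compliance is voluntary, and a bot that ignores the file or lies about its User-Agent sails right past these rules.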

Read More