AI companies scraping content are DDoS-ing servers

How should we deal with the exponential increase in scrapers? Will server admins now need to search the logs daily and update robots.txt or the firewall to block every new crawler? Site owners will have to pay for extra infrastructure to absorb all these artificial visits, and traffic analytics will have a hard time separating human visitors from crawlers.
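
Until there is a better answer, finding the worst offenders usually starts in the access logs. A minimal PowerShell sketch, assuming a combined-format Apache/nginx log (the path and regex here are illustrative, not from the original post), that ranks user agents by request count:

    # Rank user agents by request count in a combined-format access log.
    # The log path is hypothetical; adjust for your server.
    Get-Content 'C:\logs\access.log' |
        ForEach-Object {
            # In combined format the user agent is the last quoted field.
            if ($_ -match '"([^"]*)"\s*$') { $Matches[1] }
        } |
        Group-Object |
        Sort-Object Count -Descending |
        Select-Object -First 20 Count, Name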

Read More
Migrating from Get-Msol to Microsoft Graph Get-Mg (Get-MsolUser, Get-MsolDevice, Get-MgUser, Get-MgDevice, Get-AzureADAuditSignInLogs, Get-MgBetaAuditLogSignIn)

Migrating to the Microsoft Graph PowerShell module to monitor users and devices in Entra: looking up device join type (Hybrid joined, Entra joined, or Entra registered), Entra role assignments, sign-in logs, and M365 provisioning errors.
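
For the device join type, a minimal sketch assuming the Microsoft.Graph module is installed and the account has Device.Read.All consent; the TrustType values are what Graph returns, mapped to the labels the Entra portal shows:

    # Requires: Install-Module Microsoft.Graph (Device.Read.All scope).
    Connect-MgGraph -Scopes 'Device.Read.All'

    # TrustType maps to the join type shown in the Entra portal:
    #   AzureAd -> Entra joined, ServerAd -> Hybrid joined, Workplace -> Entra registered
    Get-MgDevice -All |
        Select-Object DisplayName, @{ Name = 'JoinType'; Expression = {
            switch ($_.TrustType) {
                'AzureAd'   { 'Entra joined' }
                'ServerAd'  { 'Hybrid joined' }
                'Workplace' { 'Entra registered' }
                default     { $_.TrustType }
            }
        } }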

Read More
The Internet and .txt files in the era of AI, ML and data scraping

For non-developers: the web industry uses .txt files at the root of a domain to provide guidance for the different kinds of bots wandering the Internet and indexing/scraping data. Bots can ignore the instructions, declare a different User-Agent, or use other evasion methods. robots.txt - it instructs crawlers on which […]
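
For illustration, a minimal robots.txt sketch that asks two well-known AI crawlers (OpenAI's GPTBot and Common Crawl's CCBot) to stay away while leaving the site open to everyone else; as noted above, compliance is entirely voluntary on the bot's side:

    # Block OpenAI's and Common Crawl's scrapers (if they choose to obey).
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    # Everything else may crawl the whole site.
    User-agent: *
    Allow: /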

Read More