Bypassing the requirement to use an online account during Windows 11 OOBE.
I was invited by Netdata to be part of their User Panel discussion and to share my experiences with Netdata.
I do this at least once a year. As a friendly reminder, back up the data you store with cloud providers. They are not accountable if the data is lost, and your account can be terminated at any time. Have you read the terms and conditions? The big platforms have the option to ask […]
How should we deal with the exponential increase in scrapers? Will server admins now need to search the logs daily and update robots.txt or the firewall to block all the new crawlers? Owners will need to pay more for increased infrastructure resources to handle all these artificial visits. Traffic analytics will have a bad day trying to filter human visits from the crawlers.
Migrating to the Microsoft Graph PowerShell module to monitor users and devices in Entra. Looking for device join type (Hybrid joined, Entra joined, or Entra registered), Entra role assignments, sign-in logs, and M365 provisioning errors.
For non-developers: the web industry uses .txt files at the root of a domain to provide guidance for the different kinds of bots wandering the Internet and indexing/scraping data. Bots can ignore the instructions, declare a different User-Agent, or use other evasion methods. robots.txt - it instructs crawlers on which […]
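To make the idea concrete, here is a minimal robots.txt sketch (the paths and bot name are illustrative examples, not taken from the post):

```
# Served at https://example.com/robots.txt
# All crawlers: stay out of a hypothetical /private/ directory
User-agent: *
Disallow: /private/

# Block one specific crawler entirely (example User-Agent)
User-agent: GPTBot
Disallow: /
```

As the excerpt notes, compliance is voluntary: a well-behaved crawler honors these rules, while an evasive one can simply ignore the file.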
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.