Tell AI crawlers not to scan your site to train their AI models.
You can request that AI web crawlers exclude your site from their scans while leaving it available for visitors to access.
What are AI crawlers?
AI companies use AI crawlers to gather training data for their large language models (LLMs). These models learn from the content of the websites they scan and use that data to improve the accuracy and general capabilities of the companies' AI models.
Exclude your site from AI crawler scans
By default, AI crawlers scan all content published on the internet, whether or not it's on a Squarespace website. If you'd like to exclude your website from these scans, you can tell certain AI crawlers not to crawl your site.
To request that AI crawlers not scan your site:
- Open the Settings panel.
- Click Crawlers.
- Switch the Artificial Intelligence Crawlers toggle off.
Switching the Artificial Intelligence Crawlers toggle off updates your robots.txt file to tell the following bots not to crawl your site:
- Anthropic AI
- Google Extended
- GPTBot and ChatGPT-User
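Squarespace generates and manages the robots.txt file for you, so you don't edit it directly. For reference, a robots.txt that blocks these crawlers typically contains entries like the following; the exact user-agent tokens and formatting Squarespace writes may differ, so treat this as an illustrative sketch based on each crawler's published user-agent token:

```
# Illustrative robots.txt entries blocking common AI training crawlers.
# Token names are the crawlers' published user agents, not necessarily
# the exact output Squarespace generates.

User-agent: anthropic-ai
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /
```

`Disallow: /` asks the named crawler to skip every page on the site; compliance is voluntary, since robots.txt is a request, not an enforcement mechanism.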
Note: It's currently not possible to request that AI crawlers scan only specific pages. Also, switching the Artificial Intelligence Crawlers toggle off doesn't retroactively remove content previously scraped from your site from AI model training data.