Cloudflare launches Content Signals Policy to fight AI crawlers and scrapers
Cloudflare is giving publishers and site owners a way to declare preferences on how AI companies use their content, expanding robots.txt with new policy language.
Cloudflare is now blocking AI crawlers by default, putting control back in the hands of content creators. This new permission-based model stops unauthorized scraping and helps protect the future of original work online.
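In practice, the policy extends robots.txt with machine-readable "content signals" declaring how crawled content may be used. A minimal sketch of what such a file could look like (directive names follow Cloudflare's published Content Signals format; the specific values shown are illustrative, not a recommendation):

```
# Content Signals Policy: declares preferences for use of this site's content.
# search   - building a search index and showing links/snippets
# ai-input - using content as input to an AI answer (e.g. retrieval/grounding)
# ai-train - training or fine-tuning AI models
User-Agent: *
Content-Signal: search=yes, ai-input=no, ai-train=no
Allow: /
```

Because the signals ride alongside ordinary robots.txt rules, a site can keep permitting traditional search crawling while declaring that its content should not be used for AI training.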