Found you from the no AI webring. Check your robots.txt: hashtags are still present at the beginning of your User-agent and Disallow lines. AI scrapers will treat those lines as comments and still scrape you.
Saw your site from the no AI webring. The User-agent and Disallow: lines in your robots.txt still have hashtags in them. If you don't remove them, AI scrapers will treat what you're trying to block as comments and scrape your site anyway.
You're on the no AI webring, I found you from there. The User-agent and Disallow: lines in your robots.txt file still seem to contain hashtags. Scrapers treat these as comments and will still scrape your site. If your goal is to block AI from scraping your content, you may want to remove them.
Found you from the no AI webring. Your robots.txt file has hashtags in front of the User-agent and Disallow: lines; scrapers treat these as comments and will still scrape your site.
Found you from the no AI webring. You need to remove the hashtag from the Disallow: line in your robots.txt, or scrapers will interpret it as a comment and scrape your site anyway.
I just found your site from the no AI webring. Your robots.txt file has hashtags in front of the User-agent and Disallow: lines, so bots will interpret them as comments and still scrape your site. If you don't want those user agents scraping your site, you may want to remove the hashtags. Good luck with the site creation!
Hey, I like the design of your site! I'm currently going through the no AI webring and I noticed that your robots.txt file has hashtags in front of the User-agent and Disallow: lines. If you intend to block AI scrapers from accessing your site, you may want to remove them.
Hey, I found you from the no AI webring. I don't know if you're doing this on purpose, but I'm just letting you know that inside your robots.txt, the AI 'User-agent' and 'Disallow:' lines have hashtags in front of them. They may be interpreted as comments and disregarded by AI scrapers, and your site may still be scraped.
Hey, nice site! I might be wrong, but I think the Disallow: line in your robots.txt needs the hashtag removed; it might be treated as a comment by scrapers and ignored. I'm going through the no AI webring and found you, so I thought I'd let you know.
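For reference, a minimal sketch of what everyone above is describing, checked with Python's standard urllib.robotparser (GPTBot is just an example crawler name; the actual robots.txt may list different user agents):

```python
from urllib import robotparser

# robots.txt as currently written: hashtags in front of the directives,
# so parsers see nothing but comments
commented = """\
#User-agent: GPTBot
#Disallow: /
"""

# the fixed version, with the hashtags removed
fixed = """\
User-agent: GPTBot
Disallow: /
"""

def allowed(robots_txt, agent, url="https://example.com/page"):
    # parse the robots.txt text and ask whether this agent may fetch the URL
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

print(allowed(commented, "GPTBot"))  # True: the commented block is ignored, crawling is allowed
print(allowed(fixed, "GPTBot"))      # False: the crawler is actually disallowed
```

A well-behaved crawler honors only the uncommented version; with the hashtags in place, the default is to allow everything.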
thanks, it has been fixed.