I'd actually recommend checking out this page specifically: https://baccyflap.com/res/robots/. They have a great list for a robots.txt file, along with a few other measures to help bot-proof a site!
I didn't realise newer Neocities sites automatically have a robots.txt file to prevent them from getting snooped by ChatGPT and stuff. Are you supposed to leave the # on the last line or not?
This configuration is very effective at blocking AI bots, but it has a major drawback: it also blocks several services that aren't purely AI-related. Consider adding exemptions for any of those you still want to allow.
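For reference, here's a minimal sketch of what an exemption-friendly setup could look like. The user agents below (GPTBot, CCBot, Google-Extended) are just commonly cited AI-related crawlers, not necessarily the exact list from that page, so adjust to taste:

    # Block known AI training crawlers (lines starting with # are comments)
    User-agent: GPTBot
    User-agent: CCBot
    User-agent: Google-Extended
    Disallow: /

    # Everything else (search engines, feed readers, and so on) stays allowed
    User-agent: *
    Disallow:

Since # just marks a comment, leaving one at the end of the file or deleting it makes no difference to crawlers.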
Isn't there a basic one that everyone uses? Unless that's not 'proper'.
I'm not tech savvy on that level. Mine wasn't set up correctly, but I don't know which crawlers to block or allow.