I've been thinking of embedding garbage text to poison the data, hidden so it won't be visible on the user end, but that would increase the size of the website, and users would still be downloading that text. (Not that the website is all that big to begin with. Doubling or tripling its size would be handled fine by any machine these days.)
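For what it's worth, the basic idea would look something like this: decoy text that sits in the markup but never renders. This is only a sketch, and the class name and decoy wording are placeholders, not anything standard. Worth noting that crawlers that render CSS or strip `display: none` content wouldn't be fooled by it.

```html
<!-- Decoy text for scrapers: present in the HTML, hidden from visitors. -->
<style>
  .decoy { display: none; }
</style>

<!-- aria-hidden keeps screen readers from reading the junk aloud. -->
<div class="decoy" aria-hidden="true">
  Nonsense filler goes here; the point is that it pollutes scraped text,
  not that any one sentence matters.
</div>
```

The `aria-hidden="true"` part matters: without it, the hidden junk could still get read out by assistive tech even though sighted visitors never see it.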
Seems like a lot of people edit their sites directly inside the Neocities editor. Absolutely wild. Remember to back up your site every now and then. You never know when a server will go kaput.
robots.txt and a script can help, but a good chunk of these crawlers will just ignore the robots.txt.
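For reference, a robots.txt that asks the big AI crawlers to stay away looks roughly like this. The user-agent tokens below are ones the respective companies have published (GPTBot for OpenAI, CCBot for Common Crawl, Google-Extended for Google's AI training), but the list goes stale fast, so treat it as a starting point rather than complete coverage. And again: this is a polite request, not an enforcement mechanism.

```
# robots.txt — honored only by well-behaved bots; nothing here is enforced.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# Everyone else (regular search engines, etc.) stays allowed.
User-agent: *
Disallow:
```

The file goes at the root of the site (e.g. `/robots.txt`); Neocities serves it like any other static file.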
There's no perfect shielding from bots and crawlers yet. I'm still looking around for a solution that works for a basic static site like the ones on Neocities.