Bad news: the 1GB bandwidth usage that some of us are seeing is probably AI scraping our sites. Neocities doesn't have any measures to prevent this; Nekoweb does.
Just realized this when I saw that my favorite astrology site has banned unregistered users from making charts because AI bots were overwhelming their servers and eating up a lot of bandwidth with all the scraping.
It can also be crawlers indexing and updating your URLs for search engines, and your pages and assets being retrieved in search results. If you create a new site, your robots.txt comes with a full list of known AI crawlers being blocked (probably the same list Nekoweb uses), but it would be up to you to keep updating it - and if you created your site before Kyle updated the default robots.txt, then you don't have it.
@pirahxcx I didn't know that robots.txt existed, thanks. But I don't think all of the bandwidth usage is just search engines, considering the AI shithole we're living in.
Yeah, Nekoweb's approach is way better, but you can prevent it on Neocities by updating your robots.txt.
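For anyone who wants to try this, here's a minimal sketch of what that robots.txt could look like. The user-agent strings below are a few well-known AI crawlers (OpenAI's GPTBot, Anthropic's ClaudeBot, Common Crawl's CCBot, and Google-Extended for AI training); the full default list is longer and new bots appear all the time, so treat this as a starting point, not a complete blocklist:

```
# Block some well-known AI crawlers (partial list - keep it updated)
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# Everything else (regular search engine indexing) stays allowed
User-agent: *
Allow: /
```

Just save it as robots.txt in the root of your site. Keep in mind robots.txt is only a polite request - well-behaved crawlers respect it, but badly-behaved scrapers can ignore it, which is why server-side blocking like Nekoweb's works better.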