@eam-archives I saved this website in october for the lols but someone keeps saving this site as well as my first one. I saved my side project (glowstick galore) willingly but like.. i dont understand why my site(s) is/are being saved
i thought that the wayback machine like did automatic scans of the web 4 stuff to save, like by themselves? idk i may be wrong, but like maybe its bc the site itself did it yknow?
You can manually submit a site to the Wayback Machine. But once a site is submitted, I'm pretty sure they log it and scan it for changes at least once every two months. I just recently watched a documentary about it. They also crawl the web and archive sites automatically. And the Wayback Machine has a partnership with NeoCities too, I believe, to help back up and archive the web.
if you don't want it archived by bots or humans you can add a tag like <meta name="robots" content="noarchive"> to your page's <head> to discourage this, but most likely these are bots crawling and archiving your site
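For anyone who wants to try that, here's a minimal sketch of where the tag goes (standard HTML; the page title is just a placeholder — swap in your own):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <!-- asks well-behaved crawlers not to keep an archived/cached copy of this page -->
  <meta name="robots" content="noarchive">
  <title>my site</title>
</head>
<body>
  <!-- page content goes here -->
</body>
</html>
```

worth knowing this only applies to crawlers that choose to honor it — the Internet Archive has reportedly not always followed these opt-out signals in recent years, so contacting them directly with a removal request is probably the more reliable route if it really matters to you.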
who knew that under an hour of html research due to wanting to make a website because i found out about neocities nearly a year ago would lead me to where I am now
this made me look and they saved mine in october of last yr lol, it looked so trash
Not me. Satan forbid.
@gundham even if it was you, it's not an extremely big deal. I'm asking out of curiosity mostly
@eam-archives I'm thinking that might be the case
someone has been saving mine since starting and they saved my embarrassing diary entries from high school
@yupthatsme oh no...
@james02 that explains it! thanks! that's pretty interesting to know.
@omfg thanks!