A quick-and-dirty manual count says 45, but probably only 35 of those are actually reachable, linked-to content
one way could be to use a website downloader. it'll fetch every page that a visitor to your site could actually reach.
any suggestions on one to try? I've googled them before but struggled to get one working
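One free, widely available downloader is wget with its mirror mode. A minimal sketch of the idea, assuming your site is publicly reachable (the domain below is a placeholder, and the counting step is shown against a mocked-up mirror directory so you can see what it does):

```shell
# The actual download step would look like this (placeholder domain):
#   wget --mirror --no-parent https://example.neocities.org/
# wget saves the mirror under a directory named after the domain.
# Counting the pages it managed to reach is then just:
mkdir -p example.neocities.org/posts
touch example.neocities.org/index.html example.neocities.org/posts/one.html
touch example.neocities.org/style.css   # non-HTML assets get filtered out below
find example.neocities.org -type f -name '*.html' | wc -l
```

Since wget only follows links it finds, the count reflects reachable pages rather than everything uploaded to the site.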
If you have access to a unix terminal and the neocities CLI... `neocities list -a | grep -c '\.html$'` (without the backquotes) should work. The dot is escaped so grep matches a literal period rather than any character.
Hey, thanks a lot! I've put a lot of time into this (mostly cuz I suck at html/css), so it means a lot to me c: