I don't know if a scraper is involved. However, the facts of the matter are that Kyle tried at least twice to add "coding assistants" to neocities despite understanding that AI/ML A: Contributes significantly to climate change and B: The neocities crowd is against this kind of stuff, we came here to avoid some of the issues that plague the modern web
C: Kyle has stated on reddit that the plan is only on hold; it could very well still happen
So for the sake of being fair: Kyle acted pretty shady and could potentially have been thinking about scraping the data or selling it to AI companies to use, BUT so far there is no proof of that, so that's not what I am saying.
What there is proof of is coding assistants being added to the neocities text editor! These got added under the guise that they help and assist new webdevs, HOWEVER the code produced by "coding assistants" is notorious for being way more trouble than it's worth. What this will realistically result in is more low-quality web pages that don't function, or require more time to fix than code that you-
Anyways no, it was not just autocorrect. When it was functional you would ask it a question, and it would give a response based on that question.
Put the new album in the fresh tunes window! [https://elysiantunes.bandcamp.com/album/bad-future-hi-speed-gabba-party] if people are curious
Yeah, i know how to make registers and an ALU and the control unit etc out of basic logic chips. I can make them. I can make my own assembly language that translates into machine code for that exact abomination. That would be fun! And i know how to do it in good detail! But. WHY exactly would i do it, except to maniacally laugh when it finally works
Fair enough... just thinking if wasting one of my two action slots per day is worth it. BUT i'm pretty sure it is. I'm teaching this shit at a university, and how dare i, if i've never built an actual computer
mark mothersbaugh would be proud of you (the highest compliment i can think of at the moment)
take care of yourself first. take all the time you need, we'll still be here (>^.^)> ~~<3
AUBERY 1 : SCREENSHOTTER 0! Take that ya robo! Now the screenshots will be clean and fullmode, while new visitors will get a splash and returning visitors will keep their calm mode preference
Made a web tutorials index and reformatted the first article. Feedback is much appreciated! I think it reads much better now. Also i decided to keep the styling tame and more grounded than the rest of the site, so papery skeuo it is
It's available at [https://auberylis.moe/webtut/buttons/] - but not from any link within the site itself. Check it out if curious and tell me if i have to majorly change something about how i write. I plan on making more web tutorials, so feedback is important
It's not a foolproof anti-scraping solution, but we can use robots.txt (and/or meta name=robots tags) to tell the major cloud LLMs to stay off our sites: https://coryd.dev/posts/2024/go-ahead-and-block-ai-web-crawlers/
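For anyone who wants a starting point, a minimal robots.txt along these lines does the job. The bot names here are just a few commonly published crawler tokens, not an exhaustive list; check the linked post (or each vendor's crawler docs) for a current one:

```txt
# Advisory only: well-behaved crawlers honor this, bad actors can ignore it
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

The per-page meta name=robots route works similarly, but AI-specific directives like "noai" are nonstandard and honored even less consistently.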
neocities ai will be trained on making the same sadgrl website template over and over
sure, robots.txt is a must, but it doesn't mean a lot. neocities hosts our sites, the data is already there, they don't need to webscrape in order to train. I want a guarantee from Kyle that we're not being used as training data.