And on a similar note, a cross-post from Mastodon:
What I love about HTML and HTTP is that they can degrade rather gracefully on old browsers.
My website isn’t spectacular but I don’t think it looks horrible, either. And it’s still usable just fine all the way down to WfW 3.11:
It’s not perfect, but it’s usable. And that makes me happy. Almost 30 years of compatibility.
The biggest sacrifice is probably that I don’t enforce TLS and that HTTP 1.0 has no Host: header, so no vhosts (or rather, everything must come from the default vhost). (Yes, some old browsers send Host: even though they predate HTTP 1.1. Netscape does, but not IBM WebExplorer, for example.)
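For the curious, here’s a rough sketch of what that looks like on the wire, using Python’s socket module. (example.org is just a stand-in hostname, not my actual setup.) Without a Host: line, a server doing name-based vhosts has nothing to pick a site by, so it has to fall back to its default vhost.

```python
# Rough illustration: what a server sees with and without a Host: header.
# "example.org" is only a stand-in hostname for this sketch.
import socket

def raw_request(request: bytes, host: str = "example.org", port: int = 80) -> bytes:
    """Send a raw HTTP request and return the raw response bytes."""
    with socket.create_connection((host, port), timeout=10) as s:
        s.sendall(request)
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)

# HTTP 1.0: no Host: header, so the server can only serve its default vhost.
http10 = b"GET / HTTP/1.0\r\n\r\n"

# HTTP 1.1: Host: is mandatory, so the server knows which site was meant.
http11 = b"GET / HTTP/1.1\r\nHost: example.org\r\nConnection: close\r\n\r\n"

print(raw_request(http10)[:200])
print(raw_request(http11)[:200])
```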
(On the other hand, it might completely suck on modern mobile devices. Dunno, I barely use those. 🤪)
So, the “AI” bots have reached my website. Looks like they’re just slowly crawling everything at the moment – no DDoS-like attack yet. I wonder if that has something to do with my website being 100% static HTML. There are no GET parameters they can tweak and, at the end of the day, there’s not that much data on my server anyway … And maybe they have no idea what stagit is, so it doesn’t trigger “standard behavior”, like “this is a Gitea instance, let’s crawl this like crazy!”?
@tx@0x1a4.1337.cx #stagit looks really cool, thanks for mentioning it.