Category: Web, HTML, Tech

Diatribe about the modern web and expensive assumptions

Some of us are old enough to remember one downside of the greater web before the core web metastasized - the scourge of dead links. And I think it left our generation a bit traumatized by how ephemeral the web was, so we overcorrected, and now we expect all sites to hold everything forever.

The problem is that the web is a living medium. It (we) generates new content exponentially. The web eats non-stop, but there's no exit at the other end. Eventually, it can only burst.

So now we've got operations that should arguably be very light bloating forever, storage costs soaring ever higher, and cash and resources being wasted on content that's been completely abandoned - all on the off chance that someone, someday, will follow a link to it and be frustrated that it's gone.

Tumblr is going to be running on a skeleton crew, X is losing millions a day, Facebook monetizes itself through abusive marketing strategies, YouTube is in a losing battle with ad blockers, and the core web itself is now threatened by the sheer size of it all.

I think we need to course-correct. I think netizens need to get comfortable with letting go. The web was never supposed to be an infinite repository; it was meant to be dynamic and changing. The alternative is ever more monetization, ever more ads, and none of the places we love ever being sustainable. Letting go is a necessity if any sort of indie web is to survive long-term.

In a more practical sense, I think websites like this should implement regular purges, with notifications so participants can back up anything they care about before a timer runs out. Say, no activity for two years: the content goes into a purge queue with a timer, users are notified and can either back it up or interrupt the countdown, and if there's no further interaction before the timer expires, it's gone.
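
To make that concrete, here's a minimal sketch of such a sweep in Python. It assumes a hypothetical backend where each post carries a last_activity timestamp and a purge_deadline field; the two-year and ninety-day thresholds, the field names, and the notify/delete stubs are all placeholders, not anything an existing site actually does.

    from datetime import datetime, timedelta

    INACTIVITY_LIMIT = timedelta(days=730)  # roughly two years with no views, edits, or comments
    GRACE_PERIOD = timedelta(days=90)       # time to back up or object after being notified

    def notify_owner(post, deadline):
        # stand-in for a real notification (email, on-site message, etc.)
        print(f"notify {post['owner']}: '{post['title']}' will be purged on {deadline:%Y-%m-%d}")

    def delete_post(post):
        # stand-in for actually freeing the storage
        print(f"purged '{post['title']}'")

    def sweep(posts, now=None):
        now = now or datetime.utcnow()
        for post in posts:
            idle = now - post["last_activity"]
            if post["purge_deadline"] is None:
                if idle > INACTIVITY_LIMIT:
                    post["purge_deadline"] = now + GRACE_PERIOD
                    notify_owner(post, post["purge_deadline"])
            elif idle <= INACTIVITY_LIMIT:
                post["purge_deadline"] = None  # new activity cancels the countdown
            elif now >= post["purge_deadline"]:
                delete_post(post)

Run something like that from a daily cron job and the grace period does the rest: nothing disappears without a warning and a window to act.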

"But then we'll be back to the old days of every other Google search result leading to a 404." So? Google's stale results are Google's responsibility. If they can't be bothered to prune their shit, the frustration should lie at their feet. Same with dead links between sites - especially with dead links between sites. It's easier than ever to monitor a list of outbound links for changes or 404s, and link pruning can be automated. Redirects exist for a reason (301 for content that has permanently moved, 302 for temporary moves) and should be used more regularly when content is simply moved and not deleted.

Duplication should be flattened through hashing and references: if two images have the same hash and the same dimensions, for instance, it's probably safe to assume they're the same image. Keep the metadata of both, but where the data of one would be, store a reference to the other instead.

