Also, receiving a virtual gift from LiveJournal to celebrate a 20th anniversary on a site on which one's own history spans fewer than five posts and a similar number of remaining contacts. (Still, can't help remembering xkcd#77 ...)
Been in my playlist this morning, and, though obvious in some ways, it still added a few points and left me pondering that particular issue more than usual:
At the outset, including for the IBM System/360, a computer with a single CPU was not actually multitasking. It was doing the same thing that our brains do: jumping back and forth between tasks really fast to make sure it was doing them both adequately. So when we figured out that there was a problem with processing, that there were limits to it, we very quickly established something called a processing bottleneck. And yes, it's documented: humans do not multitask to begin with, and when we try to multitask, the results are terrible.
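The time-slicing described in that quote can be sketched in a few lines of Python: a single loop playing the role of the one CPU, giving each task a short slice before moving on. This is purely an illustration of the interleaving idea, not of how the System/360 (or any real OS) actually scheduled work:

```python
from collections import deque

def task(name, steps):
    """A task yields after each unit of work, handing control back to the scheduler."""
    for i in range(steps):
        yield f"{name}: step {i + 1}/{steps}"

def run_round_robin(tasks):
    """One 'CPU': repeatedly give each task a short slice until all are finished."""
    queue = deque(tasks)
    trace = []
    while queue:
        current = queue.popleft()
        try:
            trace.append(next(current))  # run one slice of this task
            queue.append(current)        # not finished: back of the queue
        except StopIteration:
            pass                         # task finished, drop it
    return trace

trace = run_round_robin([task("A", 2), task("B", 2)])
print(trace)  # the two tasks interleave: A, B, A, B
```

Neither task ever runs "at the same time" as the other; the appearance of simultaneity comes entirely from the rapid switching, which is the processing bottleneck the quote is getting at.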
From the "what-could-possibly-go-wrong" department: "New Google Release Exposes AI Upgrade For Messages Users".
The issue, of course, is that when your AI chatbot is driven by an advertising giant, you’re risking a limited and far from independent experience—a Google search window without the immediate option to scan beyond the advertiser results.
Well. The only thing really interesting about it is that this very point of view needs explicit emphasizing.
From the "odd ways to fill your spare time" department: Reviving Netscape(R) Composer(R) on the contemporary internet. Which feels like unearthing, respawning something horrible. Also: HoTMetaL, anyone...? An interesting read nevertheless:
As a geek born in the early 1990s, who has been playing with computers from a young age, I think fondly of what tech looked like in the late 1990s and early 2000s.
So, naturally, when I got my hands on an old computer a few months ago, I installed Windows 98 on it as a way to revive software from my childhood and play around with it. Among the gems I wanted to revisit was Netscape Communicator, a software suite from 1997 centered around Netscape Navigator, which was the first web browser I ever used. One of the other applications included in that suite was a WYSIWYG web page editor named Netscape Composer.
https://plbrault.com/blog-posts/i-used-netscape-composer-in-2024-en/
Reading DHH's musings on technology and running services is always a good way to start a day. This one is no different:
https://world.hey.com/dhh/keeping-the-lights-on-while-leaving-the-cloud-be7c2d67
The magic of Basecamp 2’s incredible two-year 100% uptime, as well as all the other applications hitting 99.99%, come in part from picking boring, basic technologies. We run on F5s, Linux, KVM, Docker, MySQL, Redis, Elastic Search, and of course Ruby on Rails. There’s nothing fancy about our stack, and very little complexity either. We don’t need people with PhDs in Kubernetes or specialists in exotic data stores. And neither do you, most likely.
But programmers are attracted to complexity like moths to a flame. The more convoluted the systems diagram, the greater the intellectual masturbation. Our commitment to resisting that is the key ingredient in this uptime success.
Quite strong wording in this conclusion, but generally it's hard to disagree. Just too many people throw in too much external, third-party complexity for its own sake, stuff they don't completely know or understand, because that's "how you do it these days". In the end, though, the reliability, stability, and security of a system in day-to-day operations depend to quite some degree on whether people know how to handle their environment in as much detail as possible.