Coding then and now

Recently I watched a few videos about how programmers in the early days of video gaming, as well as demo coders today developing for old hardware, used every trick in the book (and quite a few tricks that weren't in any books) to make their games and demos as visually appealing and interesting as they possibly could, and to make it appear as if the hardware was doing things it shouldn't technically be able to do.

Here are a few examples of videos to illustrate what I mean:

In order to make these games and demos look the way they do, the coders had to write highly optimised assembly code that often needed to be accurate down to individual CPU clock cycles, because some events have to happen at exactly the right moment for an effect to look good. On CRT displays, which these machines were designed for, the screen is drawn from top to bottom by an electron beam. For some effects to look right, the contents of the video buffer, which holds the image being drawn on the screen, need to be manipulated while the beam is still partway through the frame, and that requires very precise timing.
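To give a rough idea of what "changing the picture while the beam is drawing it" means, here is a minimal sketch, written in C for readability and assuming a Commodore 64 style machine with a VIC-II video chip. The raster line (100) and the colour values are arbitrary example choices, and the demos in those videos would of course do this in hand-tuned, cycle-counted assembly with raster interrupts rather than a polling loop like this:

/* Sketch only: split the border colour partway down the frame by
 * watching the VIC-II raster register and rewriting a colour
 * register mid-frame. Compiles with a C64-targeting compiler such
 * as cc65; not representative of real demo code, which counts
 * cycles in assembly to keep the split rock-steady. */
#include <stdint.h>

#define RASTER (*(volatile uint8_t *)0xD012)  /* current raster line (low 8 bits) */
#define BORDER (*(volatile uint8_t *)0xD020)  /* border colour register */

void split_border_colour(void)
{
    for (;;) {
        while (RASTER != 100) { }  /* busy-wait until the beam reaches line 100 */
        BORDER = 2;                /* from here down, the border is drawn in red */
        while (RASTER != 250) { }  /* wait until near the bottom of the frame */
        BORDER = 6;                /* back to blue before the next frame starts */
    }
}

Even in this toy version you can see the timing problem: the loop only checks the raster register every few microseconds, so the colour change lands a little earlier or later on each frame and the split line appears to jitter. That is exactly why the real thing has to be cycle-exact.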

This got me thinking that optimising code, and writing lean and resource-efficient code in the first place, seems to be a dying art. It's still necessary for programming embedded devices, which typically have very limited program memory, RAM and processing power, but PCs (and Macs) these days have almost unlimited processing power and memory compared with the systems of the past. Terabytes of drive space, gigabytes of memory and clock speeds measured in gigahertz, not to mention multicore CPUs with 8 or more cores on a single chip, would have blown every 80s programmer's mind; they would have had no more idea what to do with it all than I would if somebody landed a spaceship in front of my house and handed me the keys.

But never fear, coders have found ways to use all these resources by writing gigantic applications that require huge amounts of disk space, RAM and processing power and still manage to feel slow even on high-end computers. I get to enjoy the full Microsoft 365 experience at work with Windows 11, Outlook, Teams, SharePoint, the entire Office suite and on and on, and even on a powerful machine all of this just feels way more sluggish than it should. If I installed a copy of Windows 7 with contemporary versions of Outlook, Office and Skype on this computer, it would absolutely fly and I could do everything I'm doing now, but faster. I mean, even just a right-click on the desktop sometimes takes an entire second before the menu appears. That doesn't suggest to me that there's particularly well-written code running behind it. Microsoft's CEO agrees ;)

Ok, I'm starting to rant now. My point is, programmers these days (and I'm including myself in this too) might do well to occasionally take some inspiration from coders of the past or demoscene coders and, after implementing something, take a moment to look over their code and ask themselves "is there maybe a more efficient way to do this?" Because chances are there is, and at the end of the day a lean and well-optimised codebase is something that everybody benefits from. Unfortunately it seems that with the advent of AI assistants and vibe coding we're moving further and further from this idea, but one can hope...
