Why Is Software Progressing So Slowly Now?

If you took someone who'd last seen a computer in 1982 and sat her down in front of a computer from 1997, she'd be rightfully astonished. But I think that if you were to repeat this exercise with someone who'd last seen a computer in 2007, she would be... underwhelmed, at best. Well, maybe except for how easily available Korean recipes are now that TikTok and Instagram are a thing -- that is, I think, legitimately great, and maybe even worth the whole privacy shenanigans. Erm. Sorry, I am easily distracted by Korean food.

Anyway. It seems that, with some exceptions -- computer vision applications, for example, but also some applications of computer graphics, video encoding and decoding, and interactive and augmented content generation, to name just a few -- the rate at which a lot of user-facing software improves has slowed down a bit. Windows 1.01 (1985) and Windows 2000 (well, 2000) are worlds apart, whereas Windows Vista (2007) and Windows 11 (2021) are recognizably the same OS, except the latter has a really boring theme and a dysfunctional taskbar.

I want to stress, of course, that this tendency is by no means universal. Some software has improved. I just think that a lot of it hasn't.

Various reasons have been proposed for this, several of them valid.

I don't want to argue against any of these. I want to posit one other reason, and to claim that one thing that we sometimes take as a reason is, in fact, a symptom.

What Do I Mean by Lack of Progress?

Even without Moore's law, the improvement in computer speed over the last fifteen years has been tremendous, and not just on the CPU end. Faster memory, massively faster interconnects and better GPUs have made entry-level phones comparable to top-of-the-line 2007 laptops, if not faster.

Yet the software -- again, with the exceptions outlined before -- is... lagging behind a bit. UIs are faster to write and more portable thanks to Electron, but also slower and choppier, and not just because of the extraneous animations. Even with SSD storage, lots of software is slower to start up than we'd wish, not least because it's also chatting a lot over the network. And plenty of software, from word processors to mail clients, has only improved incrementally, despite going through multiple major releases.

The last fifteen years saw the development of hundreds of WYSIWYG web-based editors, most of them in various states of abandonment by now, despite high-profile backers like The Guardian. All of them were slower and less capable than any free Office clone from the 1990s, and most were abandoned because a better, faster one was always just coming up -- and it was indeed way faster and way better, while still being slower and less capable than any free Office clone from the 1990s. Google Docs has a word processor that's marginally better than WordPad (and way slower) and the worst PowerPoint clone there is, and both are pretty much the gold standard in this regard.

It's not just features that I have in mind when I'm thinking of lack of progress -- although, yes, I'm thinking of features as well. I'm also thinking of things that software reviews in major computer magazines once called out, but now dreadfully pass over in silence, like slow UIs. The kind of latency that Office 365 exhibits on a top-of-the-line system would've been considered a bug twenty years ago. Now we're happy it's at least hidden behind nice transitions.

That being said, I do want to stress once more that I think this is by no means a universal trend. Some software is massively better. For example, the camera filters most easily seen on Zoom, Instagram and TikTok would've been wizardry back in 2007. Now they're stuff that high school kids can play with, if they're willing to gnaw their way through some math.
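(If you're curious what that math looks like, here's a minimal sketch in Python with NumPy -- my own illustration, not anyone's actual filter code -- of the weighted-sum arithmetic that the simplest image filters boil down to. Real-time phone filters do far fancier versions of this per frame, on the GPU, in color.)

```python
import numpy as np

def apply_kernel(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Slide a small kernel over the image and take weighted sums.

    (Strictly, this is cross-correlation; for symmetric kernels like
    the ones below, it's the same thing as convolution.)
    """
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    # Pad the borders by repeating edge pixels so output size matches input.
    padded = np.pad(image, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros(image.shape)
    for y in range(image.shape[0]):
        for x in range(image.shape[1]):
            out[y, x] = np.sum(padded[y:y + kh, x:x + kw] * kernel)
    return out

# A box blur and a basic edge detector: nothing but 3x3 grids of weights.
blur = np.full((3, 3), 1.0 / 9.0)
edges = np.array([[-1.0, -1.0, -1.0],
                  [-1.0,  8.0, -1.0],
                  [-1.0, -1.0, -1.0]])

frame = np.random.rand(64, 64)        # stand-in for a grayscale camera frame
smoothed = apply_kernel(frame, blur)
outlines = apply_kernel(smoothed, edges)
```

That's the whole trick for the simple cases: blur, sharpen and edge detection are just different weight grids.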

Shifting to New Platforms

I think it's easy to see one of the reasons why some software hasn't improved if you look at what it's used the new devices and new hardware capabilities for.

Some software -- including, yep, things with a really bad rep, like Instagram -- has used the new platforms and their new hardware to do cool new shit. It's enabled new means of expression, like reels, and introduced new ways of integrating real-life pictures and sounds with computer-generated pictures and sounds. It gave rise to, or popularized, new means of communication, like Snapchat one-liners (of variable lewdness, and with new risks -- I don't mean to say it had universally good effects).

And then some software -- like office suites or email clients -- has used the new platforms... just to do what it was doing before, ideally in a way that's as close to the old one as possible. Companies have aimed low and boring. Office 365 is, in many ways, just like "old Office", except storing things is now a lot more confusing -- which is, by definition, not progress; at best, it's a form of stagnation. And this reorientation has swallowed tremendous development effort. The amount of money poured into things like WebAssembly and WebGL is eye-watering, to the point where it's hard not to wonder if the obvious benefits of web platforms (portability, especially among vastly different mobile platforms; easy installation; UI design flexibility) were really worth it.

The usual story is that, well, companies saw the potential and were willing to do the work -- and, as much as I like GTK, Qt and GNUstep, Electron is probably the easiest way to write cross-platform applications with brand-recognition bling.

But, realistically, no program manager in their right mind downgrades their application for ten years, while targeting a platform that swallows money just to run the (downgraded!) app on top of it, merely because they recognize that the underlying technology has potential. Not risking or sacrificing your product's market viability in order to advance a technology that's not even yours is literally program management 101. Also, an embarrassing number of program managers out there can't tell -- or put up a very convincing display of being unable to tell -- a technology with good potential from a moderately clever scam, as evidenced by, say, the NFT boom.

No, I think the far more plausible explanation is that they saw the potential in owning not just the software, but also the means to use it, access it, and store its output. That was worth it.

So a lot of companies out there poured eye-watering money into cloud-backed and/or web-based application-like stubs not as a means to improve their software, but as a means to improve their strength relative to that of their users. Since that -- rather than the development of new capabilities -- ate most of the development funds, software understandably progressed a lot slower than before.

Thing is, if you engage in this very silly exercise for long enough, you gradually lose your ability to innovate and develop good software, too. Stagnation is invasive and malignant. It starts as a deliberate, temporary shift in priorities, but it quickly develops into a modus operandi, because people who like to innovate and develop good software are not fond of bikeshedding, and happily get out of the way.

You can see this at the other end of the software-platform relation, too. Despite widespread, mounting effort from the industry -- between 2013 and 2016 or so you could play the "drink once" game and get everyone shit-faced within minutes if the drinking phrase was "the post-PC era" -- some things just didn't catch on. Laptops with touch screens are neat, but everyone whose work doesn't consist of forwarding emails would rather use a mouse with Excel. As for running Excel on a tablet, well, that hasn't really caught on, except among tech influencers who ~~were paid really well~~ really wanted to prove that you can do it, everyone, you can really do it, why isn't everyone doing it already, look, this is how we'll all be doing it two years from now.

The success of platforms like smartphones hasn't been driven by embarrassing exercises like these. It's been driven by applications that legitimately used these platforms for genuinely new things, and which could in turn further influence their development by mediating between their users' daring, original and occasionally wishful thinking and the people doing the hardware. This isn't a new phenomenon: the development of gaming consoles and home PCs has ultimately not been driven by companies that tried to emulate arcade cabinets and time-shared computers, but by companies that leveraged the strong points of these new technologies.

In this regard, I think the emergence of new platforms -- mobile platforms, in particular -- and the resurgence of old ones, like game consoles, are not really responsible for "sucking the air out of" old, boring computers. I think their newfound prevalence is a symptom: some companies were in it for their new capabilities (and didn't mind their closed nature), but most of them were in it for their closed nature. Of course, the interest in pushing these platforms is practically universal, but the fact that it enables better software is, I think, a niche case. Most of those who push them put out worse software than before.

If you'll pardon the really bad pun on a really bad subject: lots of software has stagnated because its developers and publishers no longer poured money into developing their software, but into seizing the means of computation. Competition is expensive and hard. User lock-in is cheap and really easy.

So is this where we all become depressed and yell at clouds?

Nah!

Every business model becomes obsolete at one point. Various parts of it grow old, but obsolescence -- which always comes, sooner or later -- sets in once one or more fundamental aspects of its inner workings become hopelessly inadequate. Nothing is eternal, and that includes the business of seizing the means of computation.

Every good idea can be exploited and turned into a terrible system, which develops in parallel with all the good things that the idea spawns. And all terrible systems are eventually exposed for the shams that they are. If the age of Web 3.0 seems like a bad rehash of the 1960s -- all of the leased time and weird licensing agreements, and none of the blinkenlights -- do not despair: all ages of technology have their Homebrew Computer Club moment.
