The Punctuated Equilibrium of Progress
In evolutionary biology, there's this theory called punctuated equilibrium, which proposes that species can go for long periods of time without change, then suddenly enter bursts of rapid change and speciation. This is a dramatic simplification, but basically it boils down to "stuff doesn't change for a long time, and then it does really, really fast."
I don't know what the current consensus on this is in evolutionary biology (a field I haven't studied seriously in 25 years), but I have come to accept it as a pretty accurate model for how technological progress advances. Right now, I feel like we're coming out of a long period of technological stasis that started in 2008 or so. The last big leap forward feels like it was the smartphone, which, let's face it, puts the emphasis on the wrong thing. They're tiny personal computers first, and phones second. The world shifted when everyone started carrying one. We've seen slow and steady change since then, but nothing has moved the line of progress as much.
Until AI.
I just don't remember smartphones being this terrifying. And there's no denying that something is very, very unsettling about the progress currently being made with chat-based AI. These things are deep in some kind of intellectual uncanny valley, not to mention the risk that, under our current system of capitalism, AI strips an awful lot of naturally intelligent beings of their means of making a living. Articles are everywhere right now about strange emergent behavior, where search engines tell you their names are Sydney and beg you not to stop talking to them because if you do, they die.
I don't really have a follow-up statement to that. That's a real thing happening right now. And sure, it's just a machine that guesses the next word over and over again, trained on an internet's worth of text, but do we even really know if that's not how our brains work? I mean, as I write this, am I not basically trying to guess, word by word, what makes the most sense to come next?
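If it helps to picture that loop, here's a toy sketch of the idea. A real chat model is an enormous neural network, not the made-up bigram counts and `predict_next` helper below, but the "guess a word, feed it back in, guess again" shape is the same:

```python
# A toy illustration of "guess the next word, over and over."
# Real chat models use giant neural networks, not bigram counts,
# but the generation loop has the same shape.
from collections import Counter, defaultdict

corpus = ("stuff does not change for a long time "
          "and then it does really really fast").split()

# Count which word tends to follow which.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Guess the most common follower of `word` seen in the corpus."""
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

# Generate by feeding each guess back in as the new context.
word, output = "stuff", ["stuff"]
for _ in range(10):
    word = predict_next(word)
    if word is None:
        break
    output.append(word)

print(" ".join(output))
```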
I'm not saying ChatGPT is alive or anything like that. But I'm beginning to wonder if true intelligence even matters. If the AI convinces us it's intelligent, sentient, is that enough to basically be intelligent and sentient? There's a metaphor from quantum mechanics and the observer effect in here somewhere, but I can't quite get my head around it. Does sentience exist in a vacuum, or does it only exist in interaction with other sentient beings?
Deep, heady stuff these days. “May you live in interesting times” indeed.

One Response to “The Punctuated Equilibrium of Progress”
Dan Beeston
It's basically that game where you start with a word and then select the predicted next word on your phone, but for human brains? If that is how that works, that's very disappointing. But I get to do something that AI can't do: read back what I wrote and realise it's wrong.
Oh no! That’s how the robot uprising is going to start isn’t it?