The other day I left a comment on one of the now-many posts holding court about the impending AI apocalypse and why it will/won’t/can’t/must happen and that’s a good thing.
I suppose that made me what the kids call a reply guy, despite my best efforts to reel in my “Uh, akschually” cred these days.
My comment was a bit tongue in cheek, but the point is real: I don’t think anyone is positioned to predict the coming impacts of AI. I’m better informed than most, certainly in the historical and philosophical aspects, and I won’t put money down on the next 12 months, much less the next 12 years.
Since I play by Andy Kaufman rules on the internet — the performance stands as you find it, not packaged up with a clever /s to flag it as such for the unsophisticated regions of the bell curve — it’s little surprise that I got reply-guyed in my turn.
I don’t participate in comments past the first-level reply, precisely to avoid wasting finite time on internet-people. That said, his frustrated attempt to dunk on me did open a handy door into a topic that demands further exploration.
Why is it that we’re so damn sure we can peer down the rivers of time and return to the present with anything more than our own fears and fantasies?
I don’t know what the internet, or what’s left of public education, is teaching the masses these days, but somewhere in the pipeline they’re picking up this idea that the only way to conduct a civilization is to acquire absolute certainty about future events.
Anything less than perfect certainty is cause for hyperventilation, fear, doubt, and, above all, frustration.
Reader, I don’t know or much care where you stand on this question at present, but it’s vitally important that you understand that it is a pernicious, systematic falsehood; the kind of lie that leads lemmings off cliffs and “gullible” types into mass movements.
Gullible applies to all of us in some measure. None of us can ever prevent gullibility. At best we can exercise it with a modicum of freedom, but never with total mastery.
The human mind needs to believe, and in the absence of proper objects of belief, it will conjure gods of its own making.
Here’s the great irony of the faith in certainty: the people most likely to hold their nose at the first whiff of religious faith are the most likely to buy into the Myth of Prediction, brought to you by science and technology.
Hold that thought. Here are three ideas for your entertainment:
1/ Every prophecy is wrong.
There is no certainty in anything outside of maybe geometry and mathematics and possibly certain areas of logic.
Understand that the rules of constructed games are objective and impartial. We agree on what the “+” and “-” signs mean when doing mathematics. If one follows the rules, then the same operations on the same inputs will yield the same results.
So long as everyone follows the conventions, they “just work”.
We’ve taken this model of certainty as the model of all knowledge. This is bad enough in realms where no such laws apply, which, beyond certain pretensions of classical physics, is everywhere. The worlds of life and mind, which we inhabit, contain no absolute laws of the kind discovered by Newton.
You can’t plug in the inputs, run a model, and expect outputs that reflect reality. The last three years have taught us all a harsh lesson about that piece of fantasy.
So naturally, our civilization — those that run it — holds it as self-evident that there are such laws, and that if we concentrate really hard with The Science, we’ll discover them and live happily ever after.
All our problems are really a matter of ignorance. If we only had more certainty about the world, everything would be fine.
But there’s a major problem with that. Certainty can’t see the future.
2/ Certainty looks backward.
Over 20 years ago the late David Sackett wrote of two sins of “expertness”. The first sin is that
adding our prestige to our opinions gives the latter far greater persuasive power than they deserve on scientific grounds alone. Whether through deference, fear, or respect, others tend not to challenge them, and progress towards the truth is impaired in the presence of an expert.
The second sin is
committed on grant applications and manuscripts that challenge the current expert consensus. Reviewers face the unavoidable temptation to accept or reject new evidence and ideas, not on the basis of their scientific merit, but on the extent to which they agree or disagree with the public positions taken by experts on these matters.
The cult(ure) of expertise is not based on better knowledge, skills, experience, or such things of merit, but rather on the more familiar patterns of human deference to authority and consensus.
Experts traffic in the certainty of the status quo ante. Hard-won as that expertise may be, it is based on the accumulation of facts and responses from the past.
Expertness is ill-equipped to address the future, which is populated by shadows, demons, and invisible monsters.
Much like the meetings full of clever MBAs ready to maximize next quarter’s balance sheet at the price of their existing customer base, the cult of expertise prioritizes last year’s news to fight last year’s war.
No signs of long-term vision, much less an awareness of higher-order effects or the Rumsfeldian unknown-unknowns.
3/ Certainty is a Chinese finger trap.
The bona fide genius John von Neumann wrote of a thorny problem back at the dawn of the computer age.
As machines grow more complicated, there comes a point beyond which our models of their behavior become more complex than the machines’ actual behavior.
A simple machine like a thermostat can be described completely by a wiring diagram and some verbal instructions. With vastly more complicated mechanisms, the situation flips around. Any computational model we could build would be both slower and more expensive than simply watching the machine run. The fastest and most efficient way to model the performance of the stock market is to watch the stock ticker.
Von Neumann’s point was that, beyond the threshold of complexity, there’s no point trying to model the system — any model you build would have to be at least as complex, and therefore just as inscrutable. You’d need another model to model the model, and it would need to be as complex as the first two.
These AI widgets everyone’s talking about right now fall just inside this category. Their designers don’t really know how they reach the conclusions they reach. They’re built from a combination of brain-inspired neural networks and complicated probability tools, which work nothing like our mental machinery of ideas and reasoning.
And that’s not even the beginning of it. The LLMs are just one type of AI mechanism. There are more and different types, in various stages of development.
Our ability to build the thinking-tools might be far beyond our ability to understand them and predict their behavior. By “beyond”, to be clear, I don’t mean a straightforward matter of solving an engineering problem with more data. I mean beyond as in trying to build a triangle with five sides — a fundamental limit of human cognition.
When it comes to predicting the predictability of AI, few consider this patch o’ thorns.
The punchline is that, if we insist on an arms race for absolute certainty over the future behaviors of AI, we’re going to need AI to help us. Which means…
The more we crave certainty over the AI, the more we need the AI. We’re tightening the knot around our necks while scrambling for air.
Speaking of prediction, I can foresee the objection:
“You’re saying that there’s no point in trying to predict the future. Should we just give up? That sounds defeatist.”
Yes, the first part is pretty much what I think. The rank pessimism of the second part is not at all what I think.
To handle the first:
You can’t control what AI does. You can’t control what the economy does, the Yellowstone supervolcano, the asteroids crossing Earth’s orbit, or pretty much anything else that happens, anywhere, ever.
You’re pretty pathetic if you think about it.
This need for certainty is a psychological blankie that some of us use to pretend we’re in charge of nature and fate.
I’m all for having an inner locus of control. But you have to balance your sense of autonomy and mastery with an appreciation of your place in a natural order that, at best, is indifferent to your existence.
To handle the second part, on my anti-pessimism:
There’s a time and a place for forecasting. We call this “prudence”. Don’t step in front of a speeding bus if you want to remain alive and ambulatory. The best forecasts are simple, time-tested rules of thumb.
The issue is overconfidence in our abilities to divine the yet-to-be, the misplaced faith in our shiny tools. There’s an excessive focus on achieving outcomes, which you can’t control, instead of placing your attention on your own behaviors, which you can control.
There’s little place for imagination or vision, with all of the technology worship and faith in the growing unreason of science. The speculation of science fiction, which was once part of the imagination, has fossilized into inevitable fate for our decaying culture.
Humans lived for many, many ages without belief in our own all-powerful mastery over nature as a safety blanket.
Paul Atreides got high on ayahuasca or something and had oracular visions of his own future. That didn’t end well for him or anyone else really.
Frank Herbert made a good point in the first Dune trilogy that won’t make much sense if you still live out in normie-land, where science is an honest search for the truth instead of bureaucrats hustling for grant money; where the TV news tells you the truth instead of a manufactured narrative; and where we’re all heading for a luxury communist Star Trek future, instead of a long decline as our easy access to energy sputters away.
Herbert echoed an idea with a pedigree in Nietzsche and Heidegger, that human freedom demands a large degree of ignorance. The closer a mind comes to Absolute Knowledge, the fewer degrees of freedom available to it. We are free in our lacking.
Sophisticated scientists and philosophers of science are not so dense as the cult of “I Effing ❤ Science!” Real science is not a game of establishing certainties, but a process of expanding our sphere of ignorance. Karl Popper gave this its most well-known formulation when he claimed that science is in the business of making bold conjectures and then trying to refute them.
Good theories raise more questions than they answer.
Once Paul Muad’Dib caught a glimpse of his future, the future caught a glimpse of him. The real and the ideal entered a dance which neither of them could escape. Free will evaporated under the cleansing sunlight of the Absolute.
Certainty, she is overrated.
Thanks for reading.
-Matt
p.s. Share it pls ⤵