10/01/2025
The Fragility of Borrowed Intelligence
There is a price to be paid for every increase in consciousness
*This piece was commissioned by Tusk & Quill, a new publication exploring what it means to ‘live well’ in America today.*
—
You can spot AI-influenced writing by its cadence before you even notice its content.
Tweetable aphorisms: Every few paragraphs conclude with a tidy maxim you could drop into X or Instagram without context.
Stock metaphors: Everything becomes an ecosystem, a veneer, a scaffolding, a shadow.
Premise-to-punchline rhythm: Long dependent clause, then dash, then quip.
Parallel rhetorical escalations: “What happens when X? What happens when Y? What happens when Z?”
Canon name-drops: McLuhan, Ship of Theseus, Plato’s cave—appearing like seasoning rather than substance.
Authority sprinkling: An economist here, a cognitive scientist there, but never integrated deeply into the argument.
The result tends to be a hypnotic cadence. Like the hook of an Olivia Rodrigo bop, it smooths complexity into something familiar, comforting, and consumable. We nod along because the rhythm is recognizable. It quiets our insecurity about whether we truly understand.
That, of course, is the point.
The same rhythm that lulls us into agreement also lulls us into dependence. What feels smooth and familiar on the page feels smooth and familiar in life. We begin trusting cadence over content, and convenience over resilience. We’re drawn to rhythms that shelter us from uncertainty. That’s how cadence turns into dependence, and dependence turns into fragility.
It feels like we are sleepwalking into digital feudalism through our dependence on Artificial Intelligence. I see my friends optimizing their workflows around these tools: writing, analysis, decision-making, therapy. It feels obviously good—until you zoom out.
Your productivity becomes tied to AI access. Your company’s competitiveness depends on your productivity. Scale this across the economy and suddenly entire sectors are hostage to whoever controls the algorithms.
What happens when they dial back capabilities? Restrict access? Or the tech just breaks? Millions who rebuilt their work processes around GPT-5 suddenly find themselves stuck with GPT-4. Companies that structured operations around AI assistance hit walls they never learned to climb.
We’re losing institutional knowledge about how to function without these new “tools”. Each integration creates civilization-level single points of failure.
—
Here’s the twist: everything you just read—my warning about AI dependency—was written using the very patterns I identified as AI-like.
The aphorisms.
The ecosystem metaphors.
The premise-punchline cadence.
The McLuhan name-drop.
The sweeping civilization-level claims.
If you found the argument compelling, was it the logic that persuaded you—or the cadence itself?
This is the recursive trap. Either I’m unconsciously infected by these patterns, or I’m consciously manipulating you by using them, or the patterns themselves have become inescapable. Each possibility leads to the same unsettling conclusion: we may already be too deep inside the system to think our way out of it.
And if you didn’t notice until I pointed it out—well, again, that’s exactly the point.
This trap shows us that once the patterns of thought themselves are captured, fragility multiplies. It’s not just that our work depends on the tool, it’s that our way of thinking bends around it. And fragility takes root in two dimensions: the economic and the epistemic.
Economic fragility here looks like whole organizations forgetting how to walk because they’ve been sprinting with bionic legs. Imagine a law firm that no longer trains its paralegals because the AI drafts contracts instantly. The first time the model hallucinates a clause or access is revoked, the firm doesn’t just lose speed — it loses competence. Knowledge that once lived in people and institutions evaporates, replaced by dependency. Multiply that across healthcare, education, research, governance, and the cracks widen into civilizational fault lines. Machines breaking or failing is just an inconvenience. The deeper wound is when we, little by little, forget the shape of our own strength without them.
Epistemic fragility, meanwhile, is quieter but more insidious. It’s what happens when our craving for certainty outpaces our capacity for doubt. If AI becomes the universal answer engine, then not-knowing itself starts to feel intolerable. But it’s in the not-knowing where creativity, resilience, even wisdom live. A mind addicted to instant closure becomes like an immune system that has forgotten how to handle a virus. Every disruption feels catastrophic. Every paradox feels unbearable. We risk producing a generation of thinkers unable to hold tension long enough for new insights to emerge.
These fragilities scale, but what makes them dangerous is their self-reinforcing nature. Economic fragility amplifies epistemic fragility: the less resilient our systems, the more anxious we become, the more we cling to fast fixes. Epistemic fragility amplifies economic fragility: the less we can stomach uncertainty, the faster we rush to outsource decision-making to machines, hollowing out institutions even further.
Fragility begets fragility. And that is why this is about more than our tools or workflows. This is about the operating system of thought itself, and whether we still have the courage to run on it. The courage to stay inside tension instead of fleeing it.
Because this is the heart of epistemic fragility: it erodes our ability to stay with tension. Tension feels like a flaw to be engineered away, but it happens to be the very medium consciousness grows in.
Heraclitus knew this two and a half millennia ago. “The road up and the road down are one and the same,” he wrote (go aphorism!). For him, it was the clash of opposites (strife, flux, contrast) that made reality possible.
Alan Watts, millennia later, made the same point:
“This, then, is the human problem: there is a price to be paid for every increase in consciousness. We cannot be more sensitive to pleasure without being more sensitive to pain. By remembering the past we can plan for the future. But the ability to plan for pleasure is offset by the “ability” to dread pain and to fear the unknown.”
We only perceive because light collides with shadow, because pleasure sharpens against pain, because certainty shimmers against doubt. To lose our capacity for tension is to dull the edge of perception itself.
The same is true of our systems. Economic fragility arises when industries pursue smoothness at the expense of tension. They strip away redundancies, erase the (sometimes necessary) friction of training, and end up cutting the slack that makes resilience possible. It looks “efficient” in the short run, but in reality it’s the slack, the contrast, the “waste” that gives a system its balance.
Both fragilities — economic and epistemic — come from the same root mistake: mistaking smoothness for strength, when in truth it’s tension that carries life. Minds that can’t endure contradiction, and markets that can’t endure disruption, are equally fragile.
The Wisdom of Insecurity
At this moment, it’s tempting to scramble for fixes like redundancy, safeguards, alternative mediums. Build a thicker wall against fragility. But that instinct itself is the problem. The ultimate escape, transcendence, is never horizontal. It’s vertical. We don’t get out by hopping from one hamster wheel to another. We get out by learning to stand in the stillness between them.
Alan Watts called it the wisdom of insecurity: the recognition that our need to eliminate uncertainty is what makes us anxious in the first place. We try to nail down permanence where none exists. We try to solve the problem of fragility with more control, only to create greater fragility.
Psychologically, human beings are not interested in truth. We’re interested in the elimination of uncertainty. We want the safety of closure. AI offers that relief in the form of cadence, completion, answers on demand, all instantly.
But the problem is smarter than we are. It’s telling us something. The wiser move might be to stop telling, and start listening. Listen to what the system is revealing: it’s inflamed for a reason. At the micro scale, that might mean noticing when a sentence lulls us into agreement because of rhythm rather than reason. At the macro scale, it might mean building cultures that metabolize fragility instead of trying to engineer it away.
This is where courage enters. Most of us mistake willpower for courage. But courage here means sitting in the discomfort of uncertainty without rushing to fix it. Staying with the trembling instead of numbing it. Letting fragility be a teacher rather than a curse. You might be surprised to find that transformation usually happens in the space between the question and the answer.
And the key here is humility. A kind of epistemic humility. To admit that we don’t fully know how to live with insecurity, and that maybe we never will. History doesn’t give us a map, but it gives us hints. The scientists of the seventeenth century didn’t eliminate doubt—they built a whole way of life around it, and called it Method. Democracies don’t survive because they erase conflict; they survive, badly and noisily, because they let it in. And the older traditions—Buddhism, Taoism, the mystics—have been saying for centuries that the way through impermanence isn’t control but surrender.
None of that is clean or easy. It’s messy, it hurts, it takes practice. Which is why maybe wisdom here is about befriending your fragility and pain instead of trying to “conquer” it. The medium has already become the message. And maybe the message isn’t even that we’re doomed or that we’ve finally solved it all.
Maybe the message is simply this:
Have the courage to live with the trembling, without reaching so fast for the fix.
—
I send out a newsletter sometimes about what I’m up to, including updates on Analogue.
You can subscribe here: tinyurl.com/aishdoingthings