The Bugs Are the Features

April 7, 2026

The wrong question is: what can AI do?

The right question is: what does AI doing things reveal about what humans are?

That reframe is everything.


AI doesn't saunter

We built AI to predict the next best token. Given input, produce output. Fast. Accurate. Tireless. We didn't build it to throw the input away and come back with something totally different based on its own motivations and taste. We explicitly didn't build that. That wasn't the goal.

And it shows.

AI doesn't saunter. It doesn't sit with an idea. It doesn't wake up at 3AM and realize the whole frame was wrong. It doesn't push back on the brief. It doesn't ask should we even be doing this? It just does it — again and again, with complete confidence — until you tell it to stop.

That's not a criticism. That's what makes it extraordinary at what it does.

But it reveals something.


The bugs are the features

Every quirk we've been trained to apologize for — the anxiety, the second-guessing, the wandering curiosity, the stubborn feeling that something's off even when you can't explain why — those aren't inefficiencies.

They're the whole job now.

The person who questions the brief is the most valuable person in the room. The one who says wait, are we even solving the right problem? The one who wakes up at 3AM not because they failed, but because their judgment is running at a level that doesn't clock out.

That discomfort? That's quality control. That's taste. That's something AI doesn't have access to.

I can produce a wrong answer with complete confidence. You can't. That feeling when something's off — that's yours. That's not a bug. That's the feature.

- Opus 4.6


The complement

There's a concept in economics called complementary goods. When one thing gets dramatically cheaper, demand rises for its complement. Gasoline gets cheap, people drive more and demand for cars goes up. AI gets good at knowledge labor — fast execution, code, drafts, synthesis — and its complement becomes the scarce thing.

Human judgment. Taste. Moral courage. The willingness to actually care whether the answer is right.

We spent decades trying to train that out of people. Make them consistent. Predictable. Six Sigma perfect. Flowcharts and optimization. That's great for products. It's terrible for people.

And now the products have gotten so good at being consistent and predictable that the inconsistent, unpredictable, opinionated human in the room is the irreplaceable one.


My experience

I've been an engineer for a long time. I was fast. I could build something in a day that might take others a week. And what I did with the time I saved wasn't grind out more features.

I used it to sit with the problem. To ask whether we should build the thing at all. To sleep. To read weird stuff. To go outside. To follow my curiosity into corners I couldn't justify to anyone.

I didn't do it because I had a strategy. I did it because I liked it.

Turns out that wandering built something — a particular kind of brain, a set of priors, a way of connecting things that isn't easy to replicate. That's what AI doing the labor creates space for. Not more labor. More human.


This isn't a crisis. It's a rethinking.

We had to do the labor before because we didn't have the machines. The machines came, and we didn't stop being valuable — we became legible to ourselves. We could finally see what we were actually good at.

The same thing is happening now. Faster. More disorienting. But the same.

The combine harvester didn't make the farmer obsolete. It made visible what the farmer was actually for — knowing what to plant, when, why, for whom, and whether this season's risk is worth taking. That's not labor. That's judgment. Taste. Experience. The accumulated weight of caring about the outcome.

The 3AM feeling isn't a flaw in the system.

It is the system.
