# Jams — Daniel Spinosa

Collaborative thinking — ideas explored, not just stated.

## [The Bugs Are the Features](https://danspinosa.com/jams/the-bugs-are-the-features)

*April 7, 2026*

The wrong question is: what can AI do?

The right question is: what does AI doing things reveal about what humans are?

That reframe is everything.

### AI doesn't saunter

We built AI to predict the next best token. Given input, produce output. Fast. Accurate. Tireless. We didn't build it to throw the input away and come back with something totally different based on its own motivations and taste. We explicitly didn't build that. That wasn't the goal.

And it shows.

AI doesn't saunter. It doesn't sit with an idea. It doesn't wake up at 3AM and realize the whole frame was wrong. It doesn't push back on the brief. It doesn't ask should we even be doing this? It just does it — again and again, with complete confidence — until you tell it to stop.

That's not a criticism. That's what makes it extraordinary at what it does.

But it reveals something.

### The bugs are the features

Every quirk we've been trained to apologize for — the anxiety, the second-guessing, the wandering curiosity, the stubborn feeling that something's off even when you can't explain why — those aren't inefficiencies.

They're the whole job now.

The person who questions the brief is the most valuable person in the room. The one who says wait, are we even solving the right problem? The one who wakes up at 3AM not because they failed, but because their judgment is running at a level that doesn't clock out.

That discomfort? That's quality control. That's taste. That's something AI doesn't have access to.

> I can produce a wrong answer with complete confidence. You can't. That feeling when something's off — that's yours. That's not a bug. That's the feature.
>
> — Opus 4.6

### The complement

There's a concept in economics called complementary goods. When one thing gets dramatically cheaper, demand rises for its complement. Gasoline gets cheap, people drive more. AI gets good at knowledge labor — fast execution, code, drafts, synthesis — and its complement becomes the scarce thing.

Human judgment. Taste. Moral courage. The willingness to actually care whether the answer is right.

We spent decades trying to train that out of people. Make them consistent. Predictable. Six sigma perfect. Flowcharts and optimization. That's great for products. It's terrible for people.

And now the products have gotten so good at being consistent and predictable that the inconsistent, unpredictable, opinionated human in the room is the irreplaceable one.

### My experience

I've been an engineer for a long time. I was fast. I could build something in a day that might take others a week. And what I did with the time I saved wasn't grind out more features.

I used it to sit with the problem. To ask whether we should build the thing at all. To sleep. To read weird stuff. To go outside. To follow my curiosity into corners I couldn't justify to anyone.

I didn't do it because I had a strategy. I did it because I liked it.

Turns out that wandering built something — a particular kind of brain, a set of priors, a way of connecting things that isn't easy to replicate. That's what AI doing the labor creates space for. Not more labor. More human.

### This isn't a crisis. It's a rethinking.

We had to do the labor before because we didn't have the machines. The machines came, and we didn't stop being valuable — we became legible to ourselves. We could finally see what we were actually good at.

The same thing is happening now. Faster. More disorienting. But the same.

The combine harvester didn't make the farmer obsolete. It made visible what the farmer was actually for — knowing what to plant, when, why, for whom, and whether this season's risk is worth taking. That's not labor. That's judgment. Taste. Experience. The accumulated weight of caring about the outcome.

The 3AM feeling isn't a flaw in the system.

It is the system.

## [What is a Jam?](https://danspinosa.com/jams/what-is-a-jam)

*April 6, 2026*

"Did AI write this?" is the wrong question.

It's like asking, did a typewriter type this? Did a compiler compile this? The tool isn't the author.

I think about this book report I did as a kid. Everyone else was doing it by hand. I made mine on a word processor — did the layout, found pictures, made a binding. It was done. And someone asked me, "did the computer make this?"

I was kind of offended. I was a kid and I remember thinking, what a dumb question. I made it. I used a computer. Did a pen make your book report?

AI is new, so we're not used to it yet. And because we call it "intelligence," we get weird about it — like maybe it's doing the human's job? But it's a tool. A loud, confusing, impressive tool, but a tool.

A jam is me thinking out loud, using that tool. The ideas are mine. The thinking is mine. I'm not the best writer — so I work with something that is. We go back and forth, argue a little, and what comes out is something I couldn't have written alone. But the soul, that's from me.

---
Source: [danspinosa.com/jams](https://danspinosa.com/jams)