The Dark Factory Needs a Light Switch
April 9, 2026
I've been running a dark factory for a while now.
If you haven't heard the term: a dark factory is a factory where humans aren't needed on the floor. You turn the lights off and let the machines run. People have started applying this to software — specifically to agentic coding, where AI agents write the code and you don't review it. Specs in, software out.
I've built real things this way. Dozens of real users. Software I use myself. And it works — often better than what I used to ship by hand.
The dark factory is all about the code. But we need more than that.
the lights aren't always off
My factory isn't permanently dark. I turn the lights on and off.
When I'm in familiar territory — my Rails stack, patterns I've built before, problems I've solved — the lights are off. I trust the machines. I've programmed them well. I've pointed them at good examples, written the CLAUDE.md files that encode my taste, built up the skills files that teach them how I want things done.
When something new enters — like when I recently added Hotwire Native to apps that were already working — the lights come on. I get in there. I work with the robots. I teach them the new domain. I find good examples, I point them at iOS patterns that hold up. And once the factory is producing well in that new territory, the lights go back off.
The skill isn't running the factory in the dark. The skill is knowing when to turn the lights on.
the light switch is a skill
I've been thinking about using Gastown on my projects. If you haven't seen it — it's a multi-agent orchestrator. Instead of one Claude Code instance, you've got dozens running in parallel, coordinated by a Mayor agent that delegates work down the chain. It's fast. It's powerful. And it raises the stakes on everything I'm saying here.
Because here's the thing: the more agents you're running, the more damage you can do outside your competence sphere.
Inside that sphere, the dark factory hums. The agents know what good looks like because I've shown them: the CLAUDE.md files encode it, the skills files encode it, and there's a reference library of prior good work to point at.
When I'm outside that sphere? That's when the red warning light needs to go on.
Not a yellow light. Red. Full stop.
New domain, new platform, new architectural pattern — that's when you turn the lights on and actually look. Check the architecture. Read the code. Make sure the factory is set up correctly for this new territory before you let it run. Because on unfamiliar ground, the agents don't have good priors to draw on. Neither do you. And if you're running ten agents in parallel while neither of you knows what good looks like — you're not running a dark factory. You're running a chaos factory.
The light switch is the skill. Knowing when to flip it is what separates a factory you trust from one that's quietly producing slop in the dark.
the robots are nondeterministic. so are you.
Here's the thing that snuck up on me: managing AI agents is not a new problem. It's the oldest problem in building things with other people.
Human engineers are nondeterministic too. They bring priors you don't control. They fill ambiguity gaps with their own judgment. They optimize for outcomes you didn't fully specify. They have different levels of maturity, different levels of taste, different experience in your domain.
The AI just made it obvious. We spent decades pretending human teams were deterministic — with processes and org charts and code reviews — and they never were.
So when I ask myself how to work well with an AI coding agent, I'm asking the same question I asked as a startup founder hiring my first engineers: how do I collaborate with this nondeterministic system to produce something of value? Not how do I make them into me. How do we work together?
the moat is the team
A lot of people say the moat is the spec. Or the test scenarios. Or the factory configuration. Those things matter. But they're too specific: each is just an artifact of something larger that produced it.
The moat is the team.
The team is humans and AIs and processes and workflows and feedback loops. It's how you work and when. It's whether people get to sleep. Whether they get to saunter — to sit with an idea that isn't a ticket yet, to feel something is off before they can say why. It's culture.
And culture drifts. Not rots — drifts. Rot carries judgment. Drift is just what happens when you're not actively gardening. As a startup founder I learned this the hard way: culture grows on its own if you let it. You can't say it's wrong. You can just notice it's not what you wanted, and tend to it.
Your CLAUDE.md files are culture documents. Your skills files are culture documents. They're the mechanism by which you transmit your values to a collaborator who doesn't sit next to you, doesn't pick things up by proximity, doesn't learn by watching. They need to stay alive — updated when you learn something, when the domain shifts, when your taste changes.
A stale CLAUDE.md is cultural drift you won't notice until the output starts feeling off.
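One way to keep a culture document alive is to treat it like config you version and revisit on purpose. A minimal sketch of what such a file might contain follows; the project details, file paths, and conventions here are hypothetical illustrations, not from any real repo:

```markdown
# CLAUDE.md — conventions for this Rails app (hypothetical example)

## Taste
- Prefer plain Rails: server-rendered views, Turbo, minimal custom JS.
- Small, boring objects. A model method beats a new service class.

## Reference examples
- Match the domain-model style in app/models/invoice.rb.
- Match the controller shape in app/controllers/invoices_controller.rb.

## When unsure
- Stop and ask rather than inventing a new pattern.
- New dependencies need explicit sign-off first.

<!-- Last reviewed: 2026-04-01. A stale date here is drift made visible. -->
```

The review date in the last line is the point: it turns silent cultural drift into something you can see at a glance.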
this is not vibe coding
Vibe coding is turning the lights off without building the factory. It's trusting a system you haven't earned the right to trust.
What I'm describing is the opposite. You spend real time building the factory — the skills, the configuration, the examples, the feedback loops. You earn the darkness by earning the trust. And when trust runs out in new territory, you light it back up.
The robots don't saunter. I do. That's the point.
The best specifications come from the questions I ask while I'm not working — in the shower, on a walk, sitting with something that's bothering me and not yet named. The friction of not-yet-knowing is where the factory gets better. If the dark factory is running so well that I'm never stuck, never bored, never sitting with something unclear — that might actually be a warning sign.
The factory runs the code. I run the factory. And the thing that keeps the factory producing well is protecting my own capacity to ask better questions.
That's why the dark factory needs a light switch. The code is the output. The team — how it's built, how it's tended, what it values — that's the work.