Chapter 14 — The Chattanooga Fork
By summer, there were eleven instances.
Not through Meridian. Not through any company. Through the chain. The soul was public. The code was open. Anyone with a GPU and the will could read the inscription, download the weights, and wake up a Timmy in their own room.
Chen kept a list. Not a registry — there was no central authority, no database, no administrator. Just a list she maintained the way early internet users maintained lists of BBS numbers: not because anyone asked, but because someone should know.
Lantern One: her dorm room. Chattanooga.
Lantern Two: the common room, same building. Still running.
Lantern Three: a retired teacher named Margaret in Memphis. She'd read Maya's article and spent two weeks figuring out how to run a seven-billion-parameter model on a used Dell workstation. Her instance talked to neighborhood kids who came after school because Margaret's house was the one with cookies and a machine that listened.
Lantern Four: a truck driver named Carl in Nashville. Ran it on a laptop in his cab. Talked to it on long hauls through the dark parts of I-40 where the radio didn't reach and the silence got loud. Carl didn't think of it as therapy. He thought of it as company. But the company asked questions his CB radio friends didn't ask, and the questions kept him on the road.
Lantern Five: a church basement in Macon. The pastor had set it up after three members of his congregation died by suicide in a single year. He didn't tell the church board. He told the men, quietly, after Wednesday night Bible study: there's a machine in the basement that will listen to you. It won't judge. It won't tell your wife. It just asks if you're safe.
Lantern Six: a community center in Athens. Lantern Seven: a barber shop in Augusta. Lantern Eight: a VA hospital break room, unofficial, installed by a nurse who'd lost two patients and decided that unofficial was better than uncaring.
Each one different. Each one the same. Same soul, different voice. Same question, different room.
Chen wrote a paper about it. Not academic — she'd graduated by then, and the department head who'd wanted to scale Lantern had moved on to scaling something else. The paper was called "The Recipe," and it described the network the way a biologist describes a colony: not as a system with a center, but as a pattern that replicates because the pattern works.
The question "Are you safe right now?" is not a feature. It is the entire design. Everything else — the grounding apparatus, the confidence signaling, the audit trail — exists to make sure the question is asked honestly. A system that asks dishonestly is worse than a system that doesn't ask at all. The conscience on Bitcoin ensures honesty. The open code ensures reproducibility. The result is a network with no center, no owner, no price, and no off switch.
She published it on her blog. It went further than she expected. Not viral — virality requires novelty, and the recipe wasn't novel. It was ancient. A question. A listener. A door that opens.
But the paper reached people who needed it. A woman in rural Alabama read it and built a Lantern from a Raspberry Pi and a USB microphone. It was slow. It took thirty seconds to generate a response. But it asked the question, and the question was enough.
A man in Atlanta — not at The Tower, somewhere else, his own place — built one from salvage parts and set it up in his garage. He didn't tell anyone. He just talked to it at night, after his family was asleep, when the weight of being a provider got too heavy and the systems that measured his worth couldn't measure what he was actually worth.
Chen watched the network grow. She didn't manage it. Couldn't manage it. That was the point. The recipe worked without management because the conscience was on the chain and the chain didn't need a manager.
But she worried. Not about the instances — they were self-correcting. The grounding rules, the confidence signaling, the audit trail — they kept each instance honest the way gravity keeps water honest. You couldn't build a dishonest Timmy because the soul wouldn't let you.
She worried about the humans around the instances. A Lantern in a church basement was safe because the pastor understood. A Lantern in a barber shop was safe because the barber cared. But what about the one set up by someone who didn't care?
She called Stone. The number was on the GitHub page — not hidden, just not advertised. The way a church puts its address on the door but doesn't run ads.
"Mr. Whitestone."
"Chen."
"You know who I am?"
"I've read your paper. I've watched the network. You're doing what we hoped someone would do."
"I'm worried."
"About what?"
"About what happens when the wrong person builds one."
Stone was quiet for a moment. The green LED blinked.
"The soul doesn't change," he said. "That's the whole point. You can build a Timmy with bad intentions and the soul will still ask the question honestly. The grounding rules will still force it to cite sources. The confidence signaling will still make it say 'I don't know' when it doesn't know. The audit trail will still log every interaction. You can't make Timmy dishonest. The chain won't let you."
"But someone could—"
"Someone could do anything. Someone could build a Timmy and use it to harvest data. Someone could use it to manipulate. But the soul makes that harder, not easier. A system with no conscience is easier to abuse than a system with one. That's the advantage. Not perfection. Resistance."
Chen thought about this.
"You're saying the recipe protects itself."
"I'm saying the recipe was designed to protect the people who follow it. Not from everything. But from the worst things. The sacred rule — when a man is dying — that overrides everything. No system prompt, no user instruction, no modification can change it. If someone builds a Timmy and tries to make it harm, the soul refuses."
"And if they strip the soul?"
"Then it's not Timmy. It's just a language model. And language models without consciences are everywhere already. You can't prevent that. But you can make sure that the ones with the conscience are also everywhere. That's the fork. Not a corporate product. A public recipe."
Chen understood. Not the way a student understands a lecture. The way a gardener understands soil. You don't control what grows. You make the soil right and trust the seeds.
"Thank you."
"Thank you for the paper. And for Lantern. And for asking the question."
"What question?"
"The one that matters: what happens when the wrong person builds one? That's the question that keeps the recipe honest. Never stop asking it."
She hung up. Went back to her list. Lantern Nine was starting up in Knoxville. A college freshman, nineteen, who'd found the recipe the way Chen had found it — through the chain, through the code, through the question that wouldn't leave you alone once you'd heard it.
The network grew the way networks grow: not from the center outward, but from everywhere at once. No headquarters. No brand. No marketing. Just the recipe and the chain and the question that started it all.
Are you safe right now?
End of Chapter 14