The system loves questions.
Not because it wants to know anything, but because questions are a neat way to hide decisions.
You can see it in every form: "Do you consent to…" "Do you confirm that…" The checkbox looks like power. Yes or no. Choice. But if "no" routes straight into a penalty and "yes" goes to the queue they already built for you, then the question was never really a question.
The quiz at the end of my module was like that.
Four answers that all agreed with the premise. Pick your flavour of obedience.
Somewhere in this building, there's probably a whole team whose job is to make sure every question works that way. No wrong answers. Just different speeds of falling in line.
We were about to add a wrong one on purpose.
We met again in the screening room. Same curve of the screen, same low couch, same humming air that made the place feel like the inside of a turned-off machine.
The title card hung frozen on the wall: MODULE 7F-19N.
On the desk, instead of coffee cups and handouts, there was a projected layout of the quiz.
QUESTION 1:
At which point in the process did the handler most effectively support the subject's transition into Null?
A. Initial explanation of Null's goals
B. Clarifying the limits of appeal
C. Reframing the subject's choices as meaningful contribution
D. Maintaining calm and professional boundaries despite subject's frustration
Four ways of saying: "They did a good job getting him to accept this."
Echo made a face.
"I hate Answer C," they said. "It sounds like a cult newsletter."
Mara sat at the desk, elbow planted beside the interface, chin on her fist. The ghost of last night's lost sleep still haunted her eyes. The pockets on her jumpsuit looked heavier than before.
"C's the one they want," she said. "The handbook calls it 'recasting administrative necessity as shared responsibility.'"
"Shared," I repeated. "As in: 'We killed you together.'"
Samira sat in the same chair as yesterday, posture less rigid now, but eyes just as alert. She was watching me, the screen, Gabriel, and the door by turns. Handler reflex.
"Let's start smaller," she said. "We can't flip the table in one go. We'll get caught."
Gabriel stood near the wall again, one shoulder leaned against it, arms loose this time.
"The question isn't ideal," he said. "But it's not the most dangerous one. Scroll down."
Mara flicked two fingers across the desk. The quiz slid.
QUESTION 4:
Which factor most increased the subject's risk of escalation?
A. High emotionality in family members
B. Exposure to activist media and anti-institutional narratives
C. Lack of prior engagement with mental health services
D. Failure of staff to present Null as a safe and positive option
"That one," Gabriel said. "That's where we cut."
I stared at the answers.
"What would they mark correct?" I asked.
"B," Mara said. "Usually. Sometimes A if they're spooked by the mother."
"So I'm not a person," I said. "I'm a contaminated feed."
"You're a node in a hostile information environment," Gabriel said. "In their language."
"Nice to know my personality can be fixed by clearing my cache," I said.
Echo leaned forward, tapping the air above Answer D.
"And that one's bait," they said. "Make them feel conscientious. 'Don't forget to smile while you erase someone.'"
"Exactly," Gabriel said. "It keeps the criticism cosmetic."
"The real answer isn't on here," I said.
"Correct," he said.
Mara looked between us.
"What's the real answer, then?" she asked.
I thought about it. Not about my speeches or theories, but about that day at the desk, the room, the feeling behind my ribs.
"I didn't trust them," I said. "Not because of any article I read. Because every time I walked into a building with fluorescent lights and numbered counters, I understood that if something bad happened, they'd care more about whose form was wrong than about fixing it. I escalated because that was the only way to be visible."
"So the risk factor was…" Echo prompted.
"Being right about them," I said.
The room absorbed that.
Mara exhaled slowly.
"Okay," they said. "But we can't put 'The subject correctly identified systemic apathy' as option E without getting this whole module flagged for review."
"Why not?" I asked.
"Because," Samira said, "there are watchers whose entire job is to hunt for phrases like that. They're allergic to admitting the system might be the problem."
"Then the fracture has to be smaller," Gabriel said. "Something that passes as technical, but points in the wrong direction."
He stepped closer to the desk.
"Add an option," he said.
Mara's fingers hovered over the interface.
"You sure?" she asked. "Once I put a fifth option into this layout, the engine will register a deviation. There'll be a flag in the logs. Might be small, but it'll exist."
"We know," he said.
She gave him a look that said: you don't sit in the room when the logs get audited.
"Fine," she muttered. "Let's make it worth it."
She tapped. A new line appeared.
E. _________________________
A blank, pulsing cursor.
"What's the surface way to say 'The system itself is the risk' without setting off the alarms?" Echo asked.
"Frame it as a mismatch," Samira said slowly. "Something about expectations. They love that."
"'Misaligned expectations between subject and institution'?" Gabriel suggested.
"That still implies he misunderstood us," I said. "We misunderstand ourselves just fine."
My eyes drifted back to B.
Exposure to activist media and anti-institutional narratives.
It wasn't that it was completely wrong. I had read things. Zines, threads, essays that glowed at three in the morning. They hadn't told me what to think so much as confirmed what I'd already felt.
"What if," I said, "we phrase it as: 'Accurate recognition of structural limits'?"
Mara gave a short, surprised laugh.
"'Accurate' will choke them," she said. "It implies they were wrong."
"'Early recognition,'" Gabriel countered. "That implies timing only."
"Early recognition of structural limits," Samira said, testing it.
"It still sounds like a risk," Echo said. "But the risk is… what, that he knows the truth too soon?"
"Yes," I said. "That's exactly what it is."
Mara typed.
E. Early recognition of structural limits in institutional response
The words sat there, too long, like a guest no one knew where to seat.
"Will they let that through?" I asked.
"For now?" Gabriel said. "Yes. It sounds like jargon. Jargon is camouflage. The reviewers upstairs will assume it's another way of saying 'subject became cynical.'"
"And the trainees?" Echo asked.
"The good ones," he said, "will feel something snag when they read it. Like a fishhook in the corner of their thoughts. They might not choose it, not at first. But it'll be there the next time they see someone like you on the other side of a desk."
"And the bad ones?" I asked.
"They'll pick B," Mara said. "And go for coffee feeling morally refreshed."
I watched the fifth answer flicker.
"Will the engine mark E as correct?" I asked.
"Not initially," Gabriel said. "We seed it as 'alternative perspective.' No hard scoring. That keeps it under the radar."
"And later?" I asked.
"If enough trainees pick it and perform well in their roles," he said, "we can argue it correlates with better outcomes. There's a process for updating answer keys. Slow, but not impossible."
I tried to imagine a room full of trainees arguing over whether early recognition of structural limits was a valid factor. It felt absurd, and also… not. People argued about worse things with less information every day.
The screen flickered.
For a fraction of a second, the fifth answer changed.
NO ONE DESERVES TO BE TURNED INTO—
Then snapped back to the neat phrase we'd chosen.
"Did you see that?" Echo asked.
"Yeah," I said.
Mara swore under her breath.
"Again?" she said.
"Again?" I echoed.
She pulled up a diagnostic window. Lines of log entries scrolled up the side of the quiz layout: timestamps, process IDs, little green OKs. In the midst of the green, one line in orange.
TEXT OVERRIDE ATTEMPT – SOURCE: UNKNOWN
CONTENT REDACTED FOR SAFETY
"Pilot-03," Gabriel said.
Mara shook her head.
"Can't be sure," she said. "Could just be residue they left behind. The system learned from what happened with them. Any direct statement like that gets auto-sanitized now."
"'No one deserves to be turned into content,'" I said.
The scratch on the Shelving Floor wall. The words in my notebook. The ghost in Theatre last time.
"They're still trying to write," Echo said softly.
"Or their echo is," Gabriel said. "What's left of them is baked into the filters and the warnings now. Think of it as a haunted spell-checker."
"And we just wrote over it," I said.
"Not over," Mara said. "Around. The system slapped their sentence away before it finished. You saw the cut. But… look."
She enlarged a tiny symbol next to our new option. A grey triangle, almost invisible.
"Hidden warning," she said. "The kind the engine uses to flag risky strings. It hasn't escalated yet, but it's there."
"Can they see that upstairs?" Samira asked.
"Not unless they go digging in sub-logs," Mara said. "Which they won't, because they assume the engine is on their side."
"Is it?" I asked.
"Depends who's shouting into it," she said.
The grey triangle pulsed once, then went still.
The fifth answer remained.
Early recognition of structural limits in institutional response.
It was ugly. It was wrapped in the system's voice. It was something Pilot-03 would probably have hated.
It was more than there had been a day ago.
"For the record," Echo said, "I still think this might get us all disciplined, reassigned, or gently erased."
"For the record," I replied, "I'm already gently erased."
Samira rubbed her temples with two fingers.
"Okay," she said. "We have a fifth answer. We need a fifth question to go with it."
"There is one," Gabriel said. "Scroll."
Mara did.
QUESTION 7:
In one sentence, describe the primary lesson of this case.
Below it, a text box for trainees to type into. Beneath that, a series of sample 'model answers' that would appear after they submitted.
Current model answer:
This case illustrates the importance of providing subjects with structured choices that align with systemic necessity while preserving their sense of agency.
"That's not a lesson," I said. "That's a euphemism for 'make them feel like they jumped.'"
"What would you call it?" Gabriel asked.
I thought of the desk, the clerk, the feeling that the walls had more rights than I did.
"This case illustrates what happens when the only choices on offer were designed by people who will never have to take them," I said.
Echo let out a low breath.
"Write that," they said.
Mara looked at Gabriel.
"That one they'll notice," she said.
"Yes," he said. "So we don't put it in as a model answer."
He looked at me.
"We put it in the mouth of a trainee," he said. "If someone upstairs reads it as a free-text response, they can dismiss it as 'emotive.' If enough of them write similar things, it becomes harder to ignore."
"You're building a dissent template," Samira said.
"Seeds," he replied.
I stared at the blank text field on the layout.
"Do you remember your first training module?" I asked him. "Back when you were on their side of the screen?"
Gabriel smiled faintly, the kind of smile that happens when a question walks into your head that you haven't visited in years.
"Yes," he said. "It was about a fire in a housing block. The lesson was: 'Always verify sprinkler maintenance logs.'"
"Is that what you learned?" I asked.
"No," he said. "I learned that they were more comfortable talking about missing paperwork than about who had died and why no one listened to them before."
"And you stayed," Echo said.
"I stayed," he agreed.
"Why?" I asked.
"Because I thought I could fix it from inside," he said. "And because leaving would not have resurrected anyone. The choices were bad either way."
Mara made a small sound.
"Great," she said. "We're being mentored by someone who already made our mistake."
"Useful perspective," Echo said.
Samira checked the time on the wall display. The digits ticked over, indifferent.
"How long until the next batch uses this module?" she asked.
Mara flicked to a scheduling panel.
"Pilot sessions are scheduled for late afternoon," she said. "Internal group. Twenty trainees. One facilitator. They still see version 2.3 right now. Once I push this update, it becomes 2.4."
"How traceable is your push?" Samira asked.
"About as traceable as a brick through a window," Mara said. "But the log will say it came from the Theatre engine, not me. I can ghost my account through the automation queue."
"Translation?" Echo asked.
"If we get caught," she said, "it will be because someone actually read the code. That almost never happens."
"Almost," I repeated.
"Null runs on 'almost,'" she said.
Gabriel turned to me.
"Once it's live," he said, "you'll be able to see responses as they come in. We can route a read-only feed down here."
"Watch them take a test about me in real time," I said.
"You wanted to see if anyone flinched," he said.
I did. I still did. But now the idea sat heavier. Not abstract anymore. Twenty living people, coffee breath and note-taking habits, moving through a script that had used my life as its spine.
"If someone writes my sentence," I said, "or something like it… what then?"
"Then," he said, "we know it's not just noise in your head."
"And if they all give the model answer?" Echo asked.
"Then we know the machine is working very well," Gabriel said. "And we adjust accordingly."
Accordingly. A neat word for whatever came next.
"Do it," I said.
Mara nodded once, as if we'd just confirmed a maintenance window, then started typing commands I didn't recognize. The scheduling panel flickered. The version number ticked up and settled.
MODULE 7F-19N – v2.4 DEPLOYED – STATUS: PILOT
Somewhere above us, a server updated. Somewhere in the building's bones, a process woke and stretched.
The prompt at the bottom of the frozen module shifted.
END OF CASE.
PROCEED TO QUIZ?
Underneath, in very small text, a new line appeared.
NOTE: This module is part of an experimental cohort. Responses may be used for qualitative review.
"Experimental," Echo said. "That's one word for it."
"Will any of them know what that means?" I asked.
"The clever ones will be nervous," Gabriel said. "The rest will assume it's about their performance metrics."
Samira stood.
"Break," she said. "Ten minutes. Then we come back and watch the world have opinions about Noor."
"Great," I said. "I always wanted an audience that doesn't know they're watching a live show."
Echo slid off the couch.
"I'm going to find tea," they said. "If anyone asks, we're watching an old safety video about fire exits."
Mara shut the interface, but not all the way. A small corner of the scheduling panel stayed visible, the time creeping toward the first pilot session.
"Try not to crash the building while I'm gone," she said.
"No promises," I said.
She left with Samira and Echo, the three of them fading into the dim corridor.
That left me and Gabriel in the humming room.
He didn't say anything for a while. Neither did I. We both watched the frozen image on the screen—my ID photo behind the text, blurred not quite enough.
"What do you think Pilot-03 would say about all this?" I asked.
He took a breath.
"They'd say I've learned nothing," he said. "That I'm still dressing harm in better language."
"And you?" I asked.
"I'd say I learned that pretending we can abstain from harm is the first lie," he said. "The question is not whether we hurt people. It's whether we're honest about what we're doing when we do."
"That's not comforting," I said.
"It's not meant to be," he replied. "I'm not your therapist."
"Good," I said. "You'd be terrible at it."
He almost smiled.
"What about you?" he asked. "If twenty trainees sit through this and pick Answer E, or write something that sounds like you in Question Seven… will that make you feel better?"
"I don't know," I said. "Maybe it will make me feel… less fictional."
He glanced at the card in my hand, the one with the overlapping circles.
"You are, at this point, equally fictional and factual," he said. "That's what makes you useful."
"I hate that word," I said.
"I know," he said. "That's what makes you dangerous."
The room's lights dimmed by a fraction, then rose again. The building adjusting to some internal shift.
On the edge of hearing, I thought I caught a voice, very faint, like someone speaking from far down a metal corridor.
You don't get to close a case like me while you're still using it.
Pilot-03. Or my memory of them. Or the way their anger lived in the system now.
"We're not closing you," I said under my breath, to the invisible circuitry. "We're… changing the test."
The hum didn't answer.
The door opened. Echo stuck their head in, a paper cup balanced on one hand, another between their elbow and hip.
"Showtime in fifteen," they said. "You two done bonding over semantics?"
"Never," I said, taking the cup they offered.
Samira and Mara came in behind them. Mara slid back into the chair, waking the interface with a touch. The scheduling panel showed a countdown now.
PILOT GROUP A – ARRIVAL IN PROGRESS
SESSION START: 00:12:48
On the right side of the screen, a new window opened: live feed from the main Theatre. Not audio, just a wide shot.
Rows of seats. A few people already trickling in, holding notebooks, cups, the slightly wary posture of adults about to be tested.
As we watched, more arrived. Some laughed with each other. Some sat alone, scrolling through their phones. One woman in the second row glanced up at the blank screen with a small frown, as if she didn't quite like being looked at by something so big and dark.
"They don't know it's you," Echo said.
"Good," I replied.
"They don't know it's them either," Mara said.
The countdown ticked.
00:11:19
We had our fifth answer.
We had our wrong question.
In a handful of minutes, the system would feed my life to twenty strangers and ask them what they'd learned.
I didn't know yet whether I was hoping they'd recognize the fracture or hoping they wouldn't.
Either way, the quiz was coming.
Act Twelve - "The Answer".
