Two days after the meeting with the liaison, the headlines shifted from cautious analysis to alarm.
"Province X Implements Longevity Index for Civil Service Hiring," one front-page blared.
"Government Uses Lifespan Data in Recruitment. Is This the Future?" asked another.
A smaller paper was more blunt: "Not Young Enough? You're Out."
Lin Ze sat at his desk, reading the articles one by one, feeling his chest tighten with each line. The province—one of the poorer ones, with limited resources and a history of corruption—had announced a pilot program for its civil service exams. Candidates would be required to submit health records, educational transcripts, and a lifespan projection. Those with a projected lifespan below 60 would be "carefully considered for lower-intensity roles," while those with projections above 75 would be prioritized for positions requiring "long-term investment."
The policy cited Lin's white paper as inspiration.
"They call it a Longevity Eligibility Adjustment," Zhang Yu said, dropping a thick folder onto Lin's desk. "They claim it improves efficiency. They used your name three times."
Su Yanli paced near the window. "They framed it as modern, data-driven governance. People are split. Some think it's innovative. Others are furious. And the province's governor is on television saying, 'We have limited resources. We must invest wisely.' Sound familiar?"
Lin did not flinch at her tone. He knew the words were a mirror, not an accusation. He had said something similar once. Context mattered. But context was optional for anyone repurposing his words.
"What's the legal basis?" Lin asked Zhang.
"Gray," Zhang replied. "They're using publicly available data. They're not violating privacy laws because applicants are consenting—if they want the job, they have to consent. There's no law banning predictive metrics in hiring. Not yet."
"And ethically?" Lin asked, though he knew.
"It's discrimination," Su said. "You're ranking people's worthiness to work based on how long you think they'll live. It entrenches health inequities. It punishes the poor and those with pre-existing conditions."
"The province argues it's fair," Zhang added. "They say they're matching roles to projected capacity. They talk about preserving public funds."
"And what do we do?" E. Liu asked, standing in the doorway. "Our name is on this. Everyone assumes we approved."
Lin met her eyes. He saw anger there, and guilt, and fear. He felt the same.
"We have to respond publicly," Lin said. "Condemn misuse. Clarify that our model is not designed for employment decisions. We need to be loud and clear."
"But we can't stop them," Zhang said. "Not legally. At least not now."
"We can pressure them," Su said. "We can mobilize donors, academics, ethicists. We can create public outrage. No governor wants to be known as the one who ranks citizens like cattle."
"And if they double down?" E. Liu asked.
"We make it politically painful," Su replied, eyes hard. "We go to the national media. We go international. We show that this is not innovation; it's dystopia."
Lin listened to them strategize. He appreciated their pragmatism. But he also knew this was only the beginning. The model was out there. It would be used in ways he could not control. He had promised transparency, not governance. Now people were using his transparency as a blueprint for discrimination.
"Call Professor Qin," Lin said. "We need her voice. And call the provincial governor. I want to speak to him directly."
Zhang arched an eyebrow. "He may not take your call."
"Then he can explain to the media why he refused," Lin said. "Also, draft a statement. Use strong language. Don't be diplomatic. This is unethical."
Su nodded. "On it."
As they dispersed, his phone buzzed. An unknown number. He answered.
"You see it?" Han's voice asked, no preamble.
"Yes," Lin said.
"My father is pleased," Han continued. "He thinks this proves we should never have released the model. He thinks the chaos benefits him."
"Does it?" Lin asked.
"Short-term," Han said. "But long-term, this is bad for everyone. It normalizes using data to gatekeep lives. And if people accept this, they won't question when corporations do it. Mei Zhao is already writing an op-ed praising the province for 'taking control away from algorithms and giving it to elected officials.'"
Lin closed his eyes briefly. Mei was clever. She would position herself as the protector of human judgment, even if the judgment was just as flawed.
"What do you suggest?" Lin asked.
"Lean into the ethics," Han said. "Become the standard-setter again. Offer them a better way. If they insist on using longevity data, insist on oversight, appeal process, weighting for socio-economic factors. Make it burdensome. Make them abandon it."
"And if they don't?" Lin asked.
"Then we fight them publicly," Han said. "We get other provinces to condemn them. We make it unfashionable. I'll help. Dongyang Shipping doesn't operate there. I have nothing to lose."
Lin smiled despite himself. "You always have something to lose."
"Not on this," Han said. "I need the world not to go that dark. Because if it does, my father will thrive."
Professor Qin arrived at Harbor Tower an hour later. She read the articles, clicked her tongue, then wrote a paragraph that became the centerpiece of their statement:
"The misuse of predictive lifespan models in hiring is a betrayal of the very principles of fairness and transparency these models were intended to serve. The Harbor Private Trust categorically condemns any application of longevity projections in employment decisions. Such practices perpetuate existing inequities, punish the vulnerable, and erode the dignity of work. Our model was built to allocate scholarship resources equitably. It was never designed to rank the worth of human beings as employees. We call on the Province of X to rescind this policy immediately and engage in an open dialogue about ethical uses of data."
They sent the statement to every media outlet. They posted it on their site. Su called journalists she trusted. Zhang drafted letters to regulators. Lin called the governor's office and left a message: "I would like to discuss your new hiring policy. I believe it is a mistake." There was no immediate reply.
That evening, a prominent news anchor invited Lin to debate the issue live on television with a representative from the province. Lin agreed. On air, he spoke about the dangers of applying a model built for scholarships to hiring. He quoted Professor Qin's line about dignity. He admitted that releasing the model had opened the door to misuse. He did not apologize for transparency. He apologized for not anticipating the speed of exploitation.
The representative from the province defended the policy with bureaucratic language. "We are optimizing resource allocation," he said. "We are facing budget constraints. Longevity projections help us plan."
Lin responded calmly. "Optimizing budgets does not mean optimizing humanity. You are using a blunt tool on a delicate system. You risk reducing people to numbers and ignoring their potential beyond a projected number of years."
The debate went viral. Social media exploded with hashtags both supporting and condemning the province. Civil rights organizations issued statements of their own. An open letter signed by a hundred academics called the policy "a discriminatory, pseudo-scientific nightmare."
The next morning, the governor announced a suspension of the program "pending further review." Lin knew the fight was not over. The idea had been seeded. It would sprout again, elsewhere, in different forms.
He sat in his office, head in his hands, and thought of the father in the elevator. "I don't hate you." The words had been simple, but they carried a weight of resigned understanding. People didn't hate him. They hated what the world was becoming. And in their eyes, he was part of that world.
His phone buzzed.
An email from his mother.
Subject: That province
Body: "I saw the news. You're right to be angry. If our government had told your grandfather he couldn't be a teacher because he wouldn't live long enough, you wouldn't be here."
He smiled. Then he cried, softly, quietly, letting the emotion wash through him. He had fought hard to build something transparent and fair. But fairness in theory was not enough. Fairness had to be defended, adapted, contextualized. It had to be wrestled back from those who would turn it into tyranny.
There was a knock at his door. E. Liu stepped in, hesitation on her face.
"There's a group of students outside," she said. "They came to protest the province's policy. They want to speak to you."
He wiped his eyes. "Let them in," he said.
In the lobby, a small group of university students held handmade signs. "We are not our scores." "Data is not destiny." "Solidarity with the vulnerable." They looked nervous when he approached.
"Thank you for coming," he said. "I agree with you."
A young woman stepped forward. "We know you didn't mean for this to happen," she said. "But we need you to fight it with us. You have the platform."
"I will," he said.
"And," she added, biting her lip, "we… we also want to ask about how socio-economic factors are weighted. We read your paper. Some of us think it still punishes us. Can we talk about that?"
He nodded. "Of course. The ethics committee meets next week. We would like you to attend."
Her eyes widened. "Really?"
"Really," he said. "Transparency isn't just open code. It's open conversation."
As the students filed into the building, Lin felt a slight easing of the pressure. Volume Two was clearly not going to be about villains in boardrooms. It was going to be about systems, structures, unintended consequences, and the messy, necessary work of collective ethics.
He returned to his desk. The emails kept coming. Another province. A corporation. An international NGO. Each with their own spin on longevity data. Each with a potential misuse.
As night fell, he walked to the river again. The reflections on the water were blurred by wind. He thought of Mei Zhao's op-ed, published that afternoon, titled "When Algorithms Become Weapons, Humanity Must Defend Itself." She argued for human judgment. She omitted that human judgment had created worse injustices long before data arrived. She was winning hearts.
He thought of Han's father, calculating profit. He thought of Professor Qin, demanding responsibility. He thought of E. Liu, standing firm despite threats. He thought of Chen's sister, starting her studies. He thought of the students chanting. He thought of the governor rescinding. He thought of the unknown liaisons, the international interest, the looming regulations.
This was his fault.
This was his achievement.
This was his burden.
This was his opportunity.
The river kept flowing. The city kept moving. Above it all, the numbers of predicted lives spun beyond anyone's grasp. He could not stop them. He could not control them. But he could try to guide them, to ensure they did more good than harm.
He took a deep breath, squared his shoulders, and walked home, ready to plan for the next misuse, the next conversation, the next battle for a just application of the thing he had unleashed.
Volume Two was underway, and he would not turn away.
