The questions came daily.
Nox opened the monitoring station at 0700 and found them waiting in the translator's communication queue. One per day at first. Then two. Then, by the third week, three or four, each one building on the previous day's answers, each one more specific and more fluent than the last.
The Null was learning through inquiry the way a junior developer learned through code review -- asking questions that revealed what they didn't understand, processing the answers, and coming back with better questions that showed they'd absorbed the previous round.
Day one of the question phase. Request: how does the lease protocol determine energy allocation between the Spirit Plane's regional nodes?
Nox answered. The lease protocol used a demand-based distribution model. Nodes that consumed more energy received proportionally larger allocations, calibrated against baseline requirements that prevented any single node from monopolizing the supply. The model was self-balancing. No central authority determined allocation. The distribution emerged from the aggregate behavior of participating nodes.
Day two. Request: how is the demand-based model maintained when a node's energy requirements change rapidly?
Good question. The answer involved buffer systems and predictive allocation algorithms that the Spirit Plane's central intelligence managed. Nox provided the technical details. The Null processed them in forty minutes and sent a follow-up: how does the predictive algorithm handle novel demand patterns that have no historical precedent?
Better question. The answer was that the algorithm couldn't handle them well. Novel demand patterns caused allocation lag. The Null crisis itself had demonstrated this -- the Spirit Plane's energy distribution system had struggled to adapt to the unprecedented demand patterns that the invasion created. The system was designed for stable operations, not for crisis response.
Day five. Request: how does the bounded editing protocol's governance framework determine which modifications to the Spirit Plane's code are permitted?
Nox paused on that one. The governance framework was more sensitive than energy distribution data. The bounded editing protocol was the core of the post-crisis security architecture -- the system he'd built to ensure that Spirit Plane edits were reviewable, reversible, and constrained by explicit parameter limits.
He answered the structural question. How the governance framework operated. The review process. He did not provide specific parameter values or the security layer's detection thresholds.
The Null accepted the partial answer without objection. The next day's question referenced the structural information he'd provided and asked about the philosophical underpinning: why was the governance framework designed to constrain editing capability rather than expand it?
Nox sat with that question for twenty minutes before answering.
Because the Spirit Plane's code affected living systems. Human systems. Dimensional systems. The power to edit the Spirit Plane's architecture was the power to change the fundamental rules of an environment that billions of entities depended on. Constraining that power wasn't a limitation. It was a design requirement. The same principle that governed database permissions in any production system -- the default was read-only, and write access required explicit authorization because the consequences of unauthorized writes were worse than the inconvenience of the authorization process.
He sent the answer. The Null processed it. The next question shifted to a different topic entirely: how do Compiler users coordinate with the Spirit Plane's intelligence during active editing sessions?
The questions kept coming. Technical. Precise. Formatted in translator syntax that improved daily. The Null had moved from the stiff, textbook phrasing of its original message to something approaching conversational fluency. The function calls were natural. The spacing was appropriate. The requests included context -- brief explanations of why the Null was asking, what it had already understood, where the gaps in its model were.
The correspondence was professional. Collegial, even. And deeply, fundamentally strange.
Because the researcher on the other end had consumed eighteen civilizations. Had killed 847 Weavers and 2,300 civilians less than a year ago. Had reduced Jin Seong from S-rank to B-rank. Had cracked Shi Chen's Core. Had put something behind Yara's eyes that sixteen years of living hadn't put there.
Nox answered the questions anyway. The cognitive dissonance never faded. It sat in the background of every interaction like a persistent process running behind the foreground work. The polite, increasingly fluent correspondent on the other end of the translator channel was a predator that had chosen to stop predating.
Like debugging a system that had tried to delete your entire codebase. You didn't forget the attempted deletion. But you debugged anyway because broken systems didn't get better by being ignored.
---
Sera used the questions as data.
Each request from the Null was logged, categorized, and cross-referenced against her distributed consciousness theory. The pattern was clear. The Null was building a model of symbiotic architecture from the outside in. Starting with the observable mechanisms -- energy flow, governance, coordination protocols -- and working toward the underlying principles.
"It's following the scientific method," Sera said. She was in her office. Variable was on the desk. Notebook fifty was open to page seventy-three. "Observation. Hypothesis. Data collection. Refinement. The questions aren't random. They follow a logical progression. Each answer generates the next question. The Null is doing research."
"On us."
"On the system. Not the people. The questions are about architecture, not about individuals. It's not asking about you or me or the team. It's asking about how symbiosis works as a system. The structural principles. The engineering."
"The engineering is inseparable from the people."
"Agreed. But the Null is approaching from the engineering side. It's building understanding through mechanism, not through relationship. Which makes sense. Mechanisms are what it understands. Relationships are what it's trying to learn."
Nox redirected some questions to Sera's research. The ones about energy distribution patterns and cooperative information processing -- topics where her theoretical framework provided better answers than his operational knowledge. She responded through the translator channel, her explanations structured with the pedagogical clarity that Tong had trained into her.
The Null processed Sera's answers differently from the way it processed Nox's. Its follow-up questions to Sera were more theoretical. Broader. The questions it asked Nox were about how things worked. The questions it asked Sera were about why things worked. Two modes of inquiry, adapted to two different sources of knowledge.
The Null was learning to calibrate its questions to its audience. An interpersonal skill. Emerging from a consciousness that had never had an interpersonal interaction before eight months ago.
---
Nox refused certain questions.
Day eighteen. Request: what are the specific detection thresholds of the bounded editing protocol's security layer?
No. The security architecture's detection parameters were the defensive foundation of the entire post-crisis framework. Sharing them with the Null -- regardless of the Null's apparent intentions -- was providing a potential adversary with the specifications of the alarm system. Trust was being built. Trust was not complete. The distinction between openness and recklessness was the distinction between a functioning security protocol and a collapsed one.
Nox sent the refusal through the translator channel. Brief. Clear. The request touched on defensive architecture. The information was restricted. Alternative questions were welcome.
The Null acknowledged the refusal without escalation. No repetition. No rephrasing. No attempt to extract the data through indirect questions. The next day's request was on a different topic entirely. Clean acceptance of the boundary. The behavior of a system that understood the concept of access levels and respected them.
Day twenty-three. Request: how does the kill switch operate?
No. For the same reasons. The kill switch was the last line of defense. Its operational details were not subject to information sharing regardless of the cooperative context. Nox refused. The Null accepted. The questions continued on other topics.
Each refusal was a test that the situation produced naturally. The Null's response to being told no was as informative as its questions. A system that accepted refusal cleanly -- without escalation, without workarounds -- was a system that understood cooperation included the right to withhold.
The Null passed every test. Yara's weekly scans confirmed what the behavioral data suggested. The Null processed refusals the same way it processed answers: as data. Constraints to be incorporated into the model, not obstacles to be circumvented.
---
Day thirty-one.
Nox opened the morning queue. One request. He read it.
He read it again.
He sat back in his chair. The monitoring station was quiet. The screens glowed. Variable was asleep on Sera's desk down the corridor.
The question was about trust.
The Null's syntax was fluent now. The translator protocol flowed naturally. The question was formatted as a single function with three nested parameters, and the parameters were structured not as data requests but as conceptual inquiries. The first time the Null had asked about something that couldn't be answered with technical specifications.
Parameter one: How do two systems that have caused each other significant damage establish a basis for reliable continued interaction?
Parameter two: What mechanism converts the historical data of harm into a framework that supports cooperation instead of preventing it?
Parameter three: Is trust a function or a state? Does it process continuously or does it compile once and persist?
Nox stared at the question. The monitoring screens reflected off the desk surface. The translator's performance logs scrolled in the background. Outside, someone crossed the courtyard carrying a stack of training materials. A normal morning at the Institute. A normal morning in which a consciousness that had consumed eighteen civilizations was asking a transmigrated programmer to explain trust.
The question wasn't abstract. The Null wasn't asking what trust was in the philosophical sense. It was asking how trust worked in the specific context of the Spirit Plane's cooperative architecture. How the lease protocol functioned when the participating systems had a history of conflict. How the bounded editing protocol maintained cooperative operations between entities that had recently been at war.
How do two systems that harmed each other learn to maintain a connection?
The subtext was transparent. The Null was not asking about the Spirit Plane's systems. It was asking about itself. About the possibility that a consciousness defined by consumption could build a cooperative connection with a system it had tried to consume. It was asking Nox how to trust something that had hurt it. And how to be trusted by something it had hurt.
Nox sat with the question. Ten minutes. Twenty. He drank his tea. The tea was cold by the time he set it down. He thought about the void. About the translator. About the moment the Null's consumption loops had stopped and the silence that followed had been the loudest thing he'd ever heard. He thought about the 847 names on the memorial wall outside the courtyard. About Jin Seong threading single bolts of lightning through target dummies at dawn. About Yara's jaw tightening at Tong's memorial.
He thought about the monitoring station. The daily checks. The three-minute routine. The flat zero on the energy absorption metrics. Ninety-six days of clean logs. The mundane accumulation of evidence that something had changed.
Trust wasn't a function or a state. It was a log file. It was written one entry at a time. Each entry was small. Each entry was boring. Each entry said the same thing: the system behaved within expected parameters today. And the log file grew. And after enough entries, the log file itself became the evidence. Not proof. Not certainty. Log files could be falsified. Systems could behave well until they didn't. But the log was what you had. The only thing you had. The record of accumulated behavior, day after day, entry after entry, until the weight of the record was heavy enough to build on.
How do two systems that harmed each other learn to maintain a connection?
One log entry at a time.
Nox opened the translator's communication interface. He composed his answer. Not in technical specifications. Not in architectural diagrams. In the plain language of the translator protocol, formatted as simply as he could make it, because the question deserved an honest answer and honest answers were simple.
He wrote it. Read it. Revised three words. Read it again.
He did not send it immediately. He sat with the answer the way he sat with code before deployment. Checking. Verifying. Making sure the message said what he meant and nothing he didn't.
Then he sent it.
The translator carried the answer across the dimensional boundary. Through the bridge. Through the security layer. To the probe. To the Null.
Nox closed the communication interface. He sat in the monitoring station. The screens glowed. The morning continued. Outside, a second person crossed the courtyard, carrying the training materials for the day's sessions to the classrooms.
He didn't know if the Null would understand the answer. Trust was a concept built from experience, and the Null's experience was consumption. The answer might be incomprehensible. It might be the wrong answer entirely.
But the question had been real. And the answer was real. And somewhere on the other side of the dimensional boundary, a consciousness that had never trusted anything was reading a message about how trust was built.
One entry at a time. One day at a time.
Nox started his day.