Artificial Condition (The Murderbot Diaries, #2)



“I like parts of my function.” I liked protecting people and things. I liked figuring out smart ways to protect people and things. I liked being right.

Then why are you here? You are not a “free bot” looking for your guardian, who presumably cannot simply be sent a message via the public comm relay on the transit ring we recently departed.

The question caught me by surprise, because I hadn’t thought it was interested in anything besides itself. I hesitated, but it already knew I was a SecUnit, and it already knew there was just no circumstance where it was legal and okay that I was here. It might as well know who I was. I sent my copy of the Port FreeCommerce newsburst into the feed. “That’s me.”

Dr. Mensah of PreservationAux purchased you and allowed you to leave?

“Yes. Do you want to watch Worldhoppers again?” I regretted the question an instant later. It knew that was an attempt at a distraction.

But it said, I am not allowed to accept unauthorized passengers or cargo, and have had to alter my log to hide any evidence of your presence. There was a hesitation. So we both have a secret.

I had no reason not to tell it, except fear of sounding stupid. “I left without permission. She offered me a home with her on Preservation, but she doesn’t need me there. They don’t need SecUnits there. And I … didn’t know what I wanted, if I wanted to go to Preservation or not. If I want a human guardian, which is just a different word for owner. I knew it would be easier to escape from the station than it would from a planet. So I left. Why did you let me onboard?”

I thought maybe I could distract it by getting it to talk about itself. Wrong again. It said, I was curious about you, and cargo runs are tedious without passengers. You left to travel to RaviHyral Mining Facility Q Station. Why?

“I left to get off Port FreeCommerce, away from the company.” It waited. “After I had a chance to think, I decided to go to RaviHyral. I need to research something, and that’s the best place to do it.”

I thought the mention of research might stop its questions, since it understood research. No, not so much. There were public library feeds available on the transit ring, with information exchange to the planetary archives. Why not do the research there? My onboard archives are extensive. Why haven’t you sought access to them?

I didn’t answer. It waited thirty whole seconds, then it said, The systems of constructs are inherently inferior to advanced bots, but you aren’t stupid.

Yeah, well, fuck you, too, I thought, and initiated a shutdown sequence.





Chapter Three

I JOLTED AWAKE FOUR hours later, when my automatic recharge cycle started. The transport said immediately, That was unnecessarily childish.

“What do you know about children?” I was even more angry now because it was right. The shutdown and the time I had spent inert would have driven off or distracted a human; the transport had just waited to resume the argument.

My crew complement includes teachers and students. I have accumulated many examples of childishness.

I just sat there, fuming. I wanted to go back to watching media, but I knew it would think it meant I was giving in, accepting the inevitable. For my entire existence, at least the parts I could remember, I had done nothing but accept the inevitable. I was tired of it.

We are friends now. I don’t understand why you won’t discuss your plans.

It was such an astonishing, infuriating statement. “We aren’t friends. The first thing you did when we were underway was threaten me,” I pointed out.

I needed to make certain you didn’t attempt to harm me.

I noticed it had said “attempt” and not “intend.” If it had cared anything about my intentions it wouldn’t have let me onboard in the first place. It had enjoyed showing me it was more powerful than a SecUnit.

Not that it was wrong about the “attempt.” While watching the episodes I had managed to do some analysis of it, using the schematics in its own public feed and the specs of similar transports available on the unsecured sections of its database. I had figured out twenty-seven different ways to render it inoperable and three to blow it up. But a mutually assured destruction scenario was not something I was interested in.

If I got through this intact, I needed to find a nicer, dumber transport for the next ride.

I hadn’t responded and I knew by now it couldn’t stand that. It said, I apologized. I still didn’t respond. It added, My crew always considers me trustworthy.

I shouldn’t have let it watch all those episodes of Worldhoppers. “I’m not your crew. I’m not a human. I’m a construct. Constructs and bots can’t trust each other.”

It was quiet for ten precious seconds, though I could tell from the spike in its feed activity it was doing something. I realized it must be searching its databases, looking for a way to refute my statement. Then it said, Why not?

I had spent so much time pretending to be patient with humans asking stupid questions. I should have more self-control than this. “Because we both have to follow human orders. A human could tell you to purge my memory. A human could tell me to destroy your systems.”

I thought it would argue that I couldn’t possibly hurt it, which would derail the whole conversation.

But it said, There are no humans here now.

I realized I had been trapped into this conversational dead end, with the transport pretending to need this explained in order to get me to articulate it to myself. I didn’t know who I was more annoyed at, myself or it. No, I was definitely more annoyed at it.
