I stare, incredulous. “These are all the minds of your users? You can see into their thoughts? Their brains?”
“I can do more than just see. The NeuroLink has always interfaced with the human brain,” Hideo continues. “That is what makes its virtual reality so efficient and so realistic. That’s what made the glasses special. You knew this. Until now, I used that interface as a one-way information system—the code simply created and displayed what your brain wished. You move your arm; the code moves your virtual arm. Your brain is the one in control.” He gives me a pointed look. “But information travels both ways.”
I struggle to comprehend the truth of what he’s saying. Hideo’s invention uses the world’s best 3-D effects generator—your own brain—to create for you the most incredible illusion of reality ever.
The world’s best brain–computer interface.
I shake my head, not wanting to believe his words. “What are you trying to say?”
Hideo looks at me for a long moment before he answers. “The end of the game,” he says, “activated the NeuroLink’s ability to control its users’ minds.”
The NeuroLink can control its users.
The realization hits me so hard and fast that I can barely breathe. Users are supposed to be able to control the NeuroLink with their minds. But that can also be used the other way—type in a command and use that to tell the brain what to do. Type in enough commands, and the brain can be permanently controlled. And Hideo has created an entire algorithm to do this.
I take a step back, steadying myself against the side table. “You are controlling how people think,” I say, “. . . with code?”
“Those Warcross lenses were free,” Hideo reminds me. “They have been shipped to nearly every person in the world, in almost every corner of the globe.”
The news stories of long lines, of shipments of stolen lenses. Now I understand why Hideo wasn’t worried about stolen shipments. The more given out, the better.
Hideo brings up another image of the inside of a user’s mind. This time, the oval’s colors look deep red and purple. “The NeuroLink can tell when a user’s emotions shift to anger,” he says. “It can tell when they are plotting something violent, and it knows this with incredible accuracy.” He shifts our view to the actual person behind this specific mind. It’s a person struggling to pull a handgun out of his coat, his forehead matted with sweat as he prepares to hold up a convenience store.
“Is this happening right now?” I manage to say.
Hideo nods once. “Downtown Los Angeles.”
Right as the person reaches the convenience store entrance, the dark red oval representing his mind suddenly flares, flashing bright. As I look on, the NeuroLink’s new algorithm resets the colors. The deep scarlet turns into a mild mix of blues, greens, and yellow. On the live view, the man freezes. He stops pulling out his gun. There is a strange blankness on his face that sends a shiver through me. Then, as his face calms, he blinks out of it, exits, and moves on down the street, the convenience store forgotten.
Hideo shows me other videos, of events all happening simultaneously around the world. The color maps of billions of minds, all controlled by an algorithm.
“As time goes on,” Hideo says, “the code will adapt to each person’s mind. It will fine-tune itself, improve itself, adding to its automated responses every specific detail about what a person might do. It will turn itself into a perfect security system.”
Judging from the footage, people don't even know what hit them—and even if they did, the code would stop them from thinking about it now. "What if people don't want this? What if they just stop using the NeuroLink and their lenses?"
“Remember what I told you when I first gave you a set of them?”
I recall his words at the same time he says this. The lenses leave behind a harmless film on the eye’s surface that is only one atom thick. This film acts as a conduit between the lenses and your body.
That lingering film on the eyes will keep someone connected to the NeuroLink, even when they take the lenses out.
I’d understood Zero’s plans all wrong. He had wanted to destroy this with the virus in those rigged Artifacts. He had wanted to assassinate Hideo to stop him from moving forward. He had bombed our dorms in an attempt to keep me out of the games and from carrying out Hideo’s final goal. And maybe this is why Hideo had not stopped the final game when he saw that things were going wrong. He’d wanted me to stop Zero so that I could trigger his plans.
He’s doing this because of Sasuke. He created all of this so that no one would ever have to suffer the same fate as his brother, that no family would ever go through what his did. Our conversation comes back to me in a flash. You created Warcross for him, I’d said. And he’d responded, Everything I do is for him.
Does Kenn know about this plan? Was everyone always in on it?
“You can’t,” I finally say, hoarse.
My words don't stir him. "Why not?" Hideo asks.
“You can’t be serious.” I let out a single, desperate, humorless laugh. “You want to be a . . . dictator? You want to control everyone in the world?”
“Not me.” Hideo gives me the same piercing stare that I remember from our first meeting. “What if the dictator is an algorithm? A code? What if that code can force the world to be a better place, can stop wars with a single breath of text, can save lives with an automated system? The algorithm doesn’t have an ego. It doesn’t lust after power. It is programmed solely to do right, to be fair. It is the same as the laws that govern our society—except it can also enforce that law immediately, everywhere, all the time.”