Deep is a psychology major, and he’s read everything there is about BVA theory. Most trainees only care about the simulation itself, but Deep enjoyed discovering just how much generalization was actually possible, despite all our claims to uniqueness. He found comfort in knowing that humans are predictable things, that we each come with a lot of the same baggage of innate and learned little quirks.
Some of these quirks are helpful in the values assessment; others are an impediment and must be broken. System justification is the idea that many of our needs can be satisfied by defending and justifying the status quo. It gives stability to our political and economic systems because people are inherently inclined to defend them. It prevents people at a disadvantage from questioning the system that disadvantages them, makes people accept the inevitability of social inequity and ignore or support policies that hurt them. It fosters dependence on government and law enforcement. It discourages vigilantism and makes it more difficult to get someone to actively participate in a virtual-reality simulated terrorist killing. K1 helps establish their involvement as part of a new system the subject will find ways to justify.
System justification is one of many decision-avoidance mechanisms we carry around. When faced with a choice, humans almost invariably seek a no-action, no-change option, even when one of the presented alternatives is quantifiably and logically more advantageous. One person dying is obviously better than two people dying.
Here the aversion to decision-making is reinforced by a phenomenon called reactance: when we feel that someone, or something, is threatening or eliminating our behavioural freedom, even just limiting our options, our innate reaction is to try to re-establish that freedom. It often translates into challenging rules or authority. Tell a child he has to play with toy number one and that he can’t touch toy number two, and you can bet he’ll play, or at least want to play, with toy number two. It doesn’t matter how unattractive that toy is. The grass is always greener. When told they must choose who lives or dies, that they no longer have the right not to choose, subjects instinctively want to reassert that right.
More than anything, the BVA experiment creates a state of cognitive dissonance, the simultaneous belief in two contradictory things. Sending one person to their death is wrong, therefore I should not choose anyone. Not saving one person is wrong, therefore I should choose. Does not compute. Humans use little conundrums such as this one to defeat evil robots or out-of-control AI on television shows, but our own brains are surprisingly ill-equipped to deal with these types of inconsistencies.
The discriminative stimulus, the death of the two hostages, serves to weaken the subject’s decision-avoidance mechanisms and status quo biases. K1 pushes the subject to re-create consistency by reranking his or her contradictory beliefs. Letting two people die is more wrong than choosing who dies.
Long story short, no one chooses on the first kill.
On the large screen, Idir puts his hands over his eyes, as the terrorist fires his pistol twice and the bodies of both men hit the ground. Deep turns his chair back towards the desk and looks at his supervisor.
—Have you ever had a hero?
—Once. My second year. A football player from Tunisia.
—You did? What was it like?
Heroes are mythical creatures in the BVA world, people who physically take on the terrorists. It’s the quickest way to end the test. Only, no one does that. Well, almost no one. It happens once out of every six hundred and sixty-five tests. Despite being so rare, heroes are controversial figures, the topic of many heated debates among BVA high-ups and the politicians in the know. Because they are so rare, statistics about them are unreliable. There isn’t enough data, and data is everything when it comes to the BVA. Every decision, down to the smallest detail—the colour of the floor or the way the chairs are arranged in the waiting room—is based on extensive datasets collected over years of experiments. BVA regulations indicate that heroes automatically pass and receive citizenship because, well, because they’re heroes. One could argue that someone who stops a terror attack is unlikely ever to participate in one. The argument coming from the anti-hero side is that these people are not only endangering their own life, but those of everyone else, by trying to accomplish something any sane person would realize is impossible. At best, that would make them incredibly stupid; at worst, sociopaths with a strong penchant for violence. Either way, not the kind of people you want to roll out the red carpet for. Deep hasn’t quite formed an opinion on the matter but, like all BVA employees, he relishes the chance to see one in action.
—It’s not all it’s cracked up to be. The test part wasn’t that interesting. He was rude as hell, failed everything in politeness and courtesy. The terrorist walked into the test room and the subject tackled him right away, hard. Didn’t take more than a second or two. The door hadn’t even closed yet. The subject grabbed the weapon and ran out. He just . . . ran. He didn’t fire at anyone, just kept on running. It wasn’t long before we were out of programmed scenery. The system kept recycling the main lobby, showing the same room, the same people every time he went through the door. Over and over again. He must have gone through it a dozen times. It didn’t seem to bother him. He just kept going. He was still running when we woke him up.
—What happened then?
—Same as always. We explained to him what he had just experienced, that none of it was real. He took it just fine, better than most, actually. It wouldn’t have been any different if we’d told him he was on Candid Camera.