The Startup Wife
Lynn was an actress who was cast as the only woman in the drama department’s otherwise all-male production of Macbeth. We bonded over our late blooming. Lynn had spent the summer before college at fat camp and emerged nymphlike just weeks before orientation, but the high school scars were still raw, and over kale chips, which she dehydrated in a toaster oven that she kept illegally in her dorm room, we put Band-Aids over all the slights, sneers, and total invisibility we had managed to escape. I told her about Cyrus—possibly the first time I had ever said his name aloud outside of my bedroom walls—but even then I downplayed my attraction to him, noting him as just another piece of flotsam from the shark tank that was high school.
I can’t remember when I came up with the idea of the Empathy Module, only that it had been lurking somewhere in the back of my mind for as long as I could remember. Maybe it was all the apocalyptic sci-fi I was reading that made me want to figure out a way to live without a fear of machines. They were going to be smarter than we were someday, we all knew that. They were going to beat us at chess and cook our meals and drive our cars. Someday they would paint and write operas and sing them back to us in perfect harmony. But what if they also had the one thing that humans possessed only on rare occasions? What if they had an intrinsic, automatic, unflinching, couldn’t-be-switched-off understanding of other people? What if they had empathy? Then they wouldn’t be our rivals, they would just be better versions of us. We wouldn’t have to fear them, and we wouldn’t have to subjugate them. We could just try to be more like them, because they’d be the best of humanity.
I went straight to grad school and started working in Dr. Melanie Stein’s lab. Dr. Stein had pioneered the reverse engineering of the brain. She was one of those formidable women who seemed to flourish in academic departments, her awkwardness hardened into a kind of opaque, terrifying brilliance. She was not mean, she was just never nice, never talked to fill awkward silences, and always made me feel as if I had said the dumbest thing ever. Before I met Cyrus again, I wanted nothing more than to grow up and become her.
* * *
My first encounter with Dr. Stein was not terrible. It was the start of the year, and I had just moved to Cambridge and into my tiny apartment in Ashdown House. She asked to meet me at a bar on Mass Ave, and when I turned up—I couldn’t believe how cold it was, I was already in my Michelin Man jacket—Dr. Stein was sporting a sexy poncho. She said, “I need to know right now that you’re not going to drop out or slow down, because if that’s anywhere near the horizon, you should go and join Dr. Li’s lab, which is full of the well-intentioned but only moderately ambitious.”
“I’m fully ambitious,” I said.
She ordered a vodka martini, extra dirty, and I was so nervous I ordered a Diet Coke even though I hate Diet Coke.
“So tell me about this Empathy Module.”
I shrugged out of my giant coat. “You know far better than I do that the last parts of the brain to be mapped are the ones that control our emotions.”
“I do know better than you do,” she said. The blue of her eyes was so light, I felt like I was looking into a church window. I couldn’t help staring. “I’m a cyborg,” she said, taking off her glasses and inviting me to look deeper.
“What do you mean?”
“My eyes. They’re transplants. I would’ve gone blind without them.”
“Wow.”
“His name was Hans Eikelheimer. His wife sometimes emails me.”
We raised our glasses to Hans, and I thought at that moment that she had decided to make me her friend.
“I don’t think we can get to the ultimate reaches of the brain by mapping,” I said. “I mean, I don’t think that’s the only way. It needs to be paired with other types of modeling, especially when it comes to emotional intelligence.”
“We already know that.”
“But how do we reach empathy? If we want our robots to be like us, we need to get beyond the algorithmic layers of intelligence and ensure that the AI of the future has the ability to imagine what it’s like to be someone else. It’s not just a way to make them more human. We should focus on making them better than us, not like us.”
“Okay, that’s novel. You think that’s how we’re going to survive the Singularity?”
“Yes, by making them greater—not smarter but kinder. More affected by the pain of others.”
“You want to save the world.”
“Why else would I be here?” I said, beaming.
I pranced home in my enormous coat, smug in the conviction that she was, in some tiny way, going to reciprocate my crush.
* * *
But after that evening, Dr. Stein and I did not, in fact, become friends. She avoided eye contact when we bumped into each other, and during our advisory meetings she picked on tiny aspects of the module, telling me it would never work to map the neural pathways the way I was proposing because we didn’t know how emotional information traveled, insisting that, until the entire brain had been reverse-engineered, we wouldn’t know how the limbic system truly worked. I always spent hours rewinding through our brief exchanges and coming up with better arguments, which I would practice later, when it was too late.
Four years into my PhD, at the start of another summer of research in my overly air-conditioned lab, I was informed that my high school English teacher, Mrs. Butterfield, had died. When I got the message—a text from an unknown number—I was reminded of all the times I’d meant to write to her but never had. The message said, Please bring a sentence from a favorite novel to Mrs. Butterfield’s service. An invitation followed.