Redelmeier noticed other problems, too. His medical school professors, for example, took at face value data that should have been inspected more closely. An old man would come into the hospital suffering from pneumonia. They’d check his heart rate and find it to be a reassuringly normal seventy-five beats per minute . . . and just move on. But the reason pneumonia killed so many old people was its power to spread infection. An immune system responding as it should generated fever, coughs, chills, sputum—and a faster than normal heartbeat. A body fighting an infection required blood to be pumped through it at a faster than normal rate. “The heart rate of an old man with pneumonia is not supposed to be normal!” said Redelmeier. “It’s supposed to be ripping along!” An old man with pneumonia whose heart rate appears normal is an old man whose heart may well have a serious problem. But the normal reading on the heart rate monitor created a false sense in doctors’ minds that all was well. And it was precisely when all seemed well that medical experts “failed to check themselves.”
As it happens, a movement was taking shape right then and there in Toronto that came to be called “evidence-based medicine.” The core idea of evidence-based medicine was to test the intuition of medical experts—to check the thinking of doctors against hard data. When subjected to scientific investigation, some of what passed for medical wisdom turned out to be shockingly wrong-headed. When Redelmeier entered medical school in 1980, for instance, the conventional wisdom held that if a heart attack victim suffered from some subsequent arrhythmia, you gave him drugs to suppress it. By the end of Redelmeier’s medical training, seven years later, researchers had shown that heart attack patients whose arrhythmia was suppressed died more often than the ones whose condition went untreated. No one explained why doctors, for years, had opted for a treatment that systematically killed patients—though proponents of evidence-based medicine were beginning to look to the work of Kahneman and Tversky for possible explanations. But it was clear that the intuitive judgments of doctors could be gravely flawed: The evidence of the medical trials now could not be ignored. And Redelmeier was alive to the evidence. “I became very aware of the buried analysis—that a lot of the probabilities were being made up by expert opinion,” said Redelmeier. “I saw error in the way people think that was being transmitted to patients. And people had no recognition of the mistakes that they were making. I had a little unhappiness, a little dissatisfaction, a sense that all was not right in the state of Denmark.”
Toward the end of their article in Science, Daniel Kahneman and Amos Tversky had pointed out that, while statistically sophisticated people might avoid the simple mistakes made by less savvy people, even the most sophisticated minds were prone to error. As they put it, “their intuitive judgments are liable to similar fallacies in more intricate and less transparent problems.” That, the young Redelmeier realized, was a “fantastic rationale why brilliant physicians were not immune to these fallibilities.” He thought back to the errors he had made while trying to solve math problems. “The same problem solving exists in medicine,” he said. “In math you always check your work. In medicine, no. And if we are fallible in algebra, where the answers are clear, how much more fallible must we be in a world where the answers are much less clear?” Error wasn’t necessarily shameful; it was merely human. “They provided a language and a logic for articulating some of the pitfalls people encounter when they think. Now these mistakes could be communicated. It was the recognition of human error. Not its denial. Not its demonization. Just the understanding that they are part of human nature.”
But Redelmeier kept to himself any heretical thoughts he harbored as a young medical student. He had never felt the impulse to question authority or flout convention, and had no talent for either. “I was never shocked and disappointed before in my life,” he said. “I was always very obedient. Law-abiding. I vote in all elections. I show up at every university staff meeting. I’ve never had an altercation with the police.”
In 1985, he was accepted as a medical resident at the Stanford University hospital. At Stanford he began, haltingly, to voice his professional skepticism. One night during his second year, he was manning the intensive care unit and was assigned to keep a young man alive long enough to harvest his organs. (The American euphemism—“harvesting”—sounded strange to his ears. In Canada they called it “organ retrieval.”) His brain-dead patient was a twenty-one-year-old who had wrapped his motorcycle around a tree.
It was the first time Redelmeier had been confronted with the dying body of a person younger than himself, and it bothered him, in a way that the deaths of older people he had witnessed had not. “It was such a loss of so many life years,” he said. “It was such a preventable case. And the guy hadn’t been wearing a helmet.” Redelmeier was newly struck by the inability of human beings to judge risks, even when their misjudgment might kill them. When making judgments, people obviously could use help—say, by requiring all motorcyclists to wear helmets. Later Redelmeier said as much to one of his fellow students, an American. What is it with you freedom-loving Americans? he asked. Live free or die. I don’t get it. I say, “Regulate me gently. I’d rather live.” His fellow student replied, Not only do a lot of Americans not share your view; other physicians don’t share your view. The student told him about Stanford’s famous head of cardiac surgery, Norm Shumway, who had actively lobbied against the creation of a law that would require motorcyclists to wear helmets. “It dropped my jaw,” said Redelmeier. “How could a guy so smart be so stupid about that? We’re definitely capable of errors. And human fallibility should be paid attention to.”