The Undoing Project: A Friendship that Changed the World

The more lopsided the base rate (the known ratio of red to white chips), the faster the odds shift with each new draw. If the first three chips you draw are red, from one of two bags each known to contain 75 percent of one color and 25 percent of the other, the odds are 27:1, a probability of slightly more than 96 percent, that you are holding the bag filled with mostly red chips.
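To make the arithmetic explicit, here is a minimal sketch in Python of the two-bag calculation described above. It assumes what the passage implies but does not spell out: the two bags are equally likely a priori, one holds 75 percent red chips and the other 75 percent white, and the draws do not meaningfully change the ratio inside the bag.

```python
def posterior_odds_red_bag(n_red_drawn: int, p_red_in_red_bag: float = 0.75) -> float:
    """Odds that you hold the mostly-red bag after drawing n_red_drawn red chips."""
    prior_odds = 1.0  # assumed: equally likely to have picked either bag
    # Each red chip multiplies the odds by the likelihood ratio:
    # P(red | mostly-red bag) / P(red | mostly-white bag) = 0.75 / 0.25 = 3
    likelihood_ratio = p_red_in_red_bag / (1.0 - p_red_in_red_bag)
    return prior_odds * likelihood_ratio ** n_red_drawn

odds = posterior_odds_red_bag(3)     # 27.0, i.e. the 27:1 of the passage
probability = odds / (odds + 1)      # ~0.964, "slightly more than 96 percent"
print(f"{odds:.0f}:1 odds, {probability:.1%} probability")
```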

The innocent subjects who pulled the poker chips out of the book bags weren’t expected to know Bayes’s rule. The experiment would have been ruined if they had. Their job was to guess the odds, so that the psychologists could compare those guesses with the correct answer. From their guesses, the psychologists hoped to get a sense of just how closely whatever was going on in people’s minds resembled a statistical calculation when those minds were presented with new information. Were human beings good intuitive statisticians? When they didn’t know the formula, did they still behave as if they did?

At the time, the experiments felt radical and exciting. In the minds of the psychologists, the results spoke to all sorts of real-world problems: How do investors respond to earnings reports, or patients to diagnoses, or political strategists to polls, or coaches to a new score? A woman in her twenties who receives from a single test a diagnosis of breast cancer is many times more likely to have been misdiagnosed than is a woman in her forties who receives the same diagnosis. (The base rates are different: Women in their twenties are far less likely to have breast cancer.) Does she sense her own odds? If so, how clearly? Life is filled with games of chance: How well do people play them? How accurately do they assess new information? How do people leap from evidence to a judgment about the state of the world? How aware are they of base rates? Do they allow what just happened to alter, accurately, their sense of the odds of what will happen next?
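The breast cancer comparison is the same Bayes calculation with a different base rate. The sketch below uses hypothetical round numbers for incidence and test accuracy (the passage gives no specific values) purely to show why an identical positive result means something very different at the two ages.

```python
def prob_cancer_given_positive(base_rate: float,
                               sensitivity: float = 0.9,
                               false_positive_rate: float = 0.1) -> float:
    """P(cancer | positive test) by Bayes' rule; accuracy figures are illustrative."""
    true_positives = base_rate * sensitivity
    false_positives = (1 - base_rate) * false_positive_rate
    return true_positives / (true_positives + false_positives)

# Hypothetical base rates: far lower for a woman in her twenties than in her forties.
for label, base_rate in [("twenties", 0.0004), ("forties", 0.015)]:
    p = prob_cancer_given_positive(base_rate)
    print(f"Woman in her {label}: P(cancer | positive test) ~ {p:.1%}")
```

With these illustrative numbers, the younger woman's positive test is overwhelmingly likely to be a false alarm, while the older woman's is far less so, which is the point the passage makes about base rates.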

The broad answer to that last question coming from the University of Michigan, Amos reported to Danny’s class, was that, yes, more or less, they do. Amos presented research done in Ward Edwards’s lab that showed that when people draw a red chip from the bag, they do indeed judge the bag to be more likely to contain mostly red chips. If the first three chips they withdrew from a bag were red, for instance, they put the odds at 3:1 that the bag contained a majority of red chips. The true, Bayesian odds were 27:1. People shifted the odds in the right direction, in other words; they just didn’t shift them dramatically enough. Ward Edwards had coined a phrase to describe how human beings responded to new information. They were “conservative Bayesians.” That is, they behaved more or less as if they knew Bayes’s rule. Of course, no one actually thought that Bayes’s formula was grinding away in people’s heads.

What Edwards, along with a lot of other social scientists, believed (and seemed to want to believe) was that people behaved as if they had Bayes's formula lodged in their minds. That view dovetailed with the story then winning the day in social science. It had been told best by the economist Milton Friedman. In a 1953 paper, Friedman wrote that a person shooting billiards does not calculate the angles on the table, the force imparted to the cue ball, and the reaction of one ball to another in the way a physicist might. He just shoots the ball in the right direction with roughly the right amount of force, as if he knew the physics. His mind arrives at more or less the right answer. How it gets there doesn't matter. Similarly, when a person estimates the odds of some situation, he does not do advanced statistics. He just behaves as if he does.

When Amos was done talking, Danny was baffled. Was that it? “Amos had described the research in the normal way that people describe research done by respected colleagues,” said Danny. “You assume it is okay, and you trust the people who did it. When we look at a paper that has been published in a refereed journal, we tend to take it at face value—we assume that what the authors say must make sense—otherwise it would not have been published.” And yet, to Danny, the experiment that Amos described sounded just incredibly stupid. After a person has pulled a red chip out of a bag, he is more likely than before to think the bag is the one whose chips are mostly red: well, duh. What else is he going to think? Danny had had no exposure to the new research into the way people thought when they made decisions. “I had never thought much about thinking,” he said. To the extent that Danny thought of thinking, he thought of it as seeing things. But this research into the human mind bore no relationship to what he knew about what people actually did in real life. The eye was often deceived, systematically. So was the ear.

The Gestalt psychologists he loved so much made entire careers out of fooling people with optical illusions: Even people who knew of the illusion remained fooled by it. Danny didn’t see why thinking should be any more trustworthy. To see that people were not intuitive statisticians—that their minds did not naturally gravitate to the “right” answer—you needed only to sit in on any statistics class at Hebrew University. The students did not naturally internalize the importance of base rates, for instance. They were as likely to draw a big conclusion from a small sample as from a big sample. Danny himself—the best teacher of statistics at Hebrew University!—had figured out, long after the fact, that he had failed to replicate whatever it was that he had discovered about Israeli kids from their taste in tent sizes because he had relied on sample sizes that were too small. That is, he had tested too few kids to get an accurate picture of the population. He had assumed, in other words, that a few poker chips revealed the true contents of the book bag as clearly as a few big handfuls, and so he never fully determined what was in the bag.
