That same year, the model had dismissed as unworthy of serious consideration a freshman center at Texas A&M named DeAndre Jordan. Never mind that every other team in the NBA, using more conventional scouting tools, passed him over at least once, or that Jordan wasn’t taken until the 35th pick of the draft, by the Los Angeles Clippers. As quickly as Joey Dorsey established himself as a bust, DeAndre Jordan established himself as a dominant NBA center and the second-best player in the entire draft class after Russell Westbrook.
This sort of thing happened every year to some NBA team, and usually to all of them. Every year there were great players the scouts missed, and every year highly regarded players went bust. Morey didn’t think his model was perfect, but he also couldn’t believe that it could be so drastically wrong. Knowledge was prediction: If you couldn’t predict such a glaringly obvious thing as the failure of Joey Dorsey or the success of DeAndre Jordan, how much did you know? His entire life had been shaped by this single, tantalizing idea: He could use numbers to make better predictions. The plausibility of that idea was now in question. “I’d missed something,” said Morey. “What I missed were the limitations of the model.”
His first mistake, he decided, was to have paid insufficient attention to Joey Dorsey’s age. “He was insanely old,” says Morey. “He was twenty-four years old when we drafted him.” Dorsey’s college career was impressive because he was so much older than the people he played against. He’d been, in effect, beating up on little kids. Raising the weight the model placed on a player’s age flagged Dorsey as a weak NBA prospect; more tellingly, it improved the model’s judgments about nearly all of the players in the database. For that matter, Morey realized, there existed an entire class of college basketball player who played far better against weak opponents than against strong ones. Basketball bullies. The model could account for that, too, by assigning greater weight to games played against strong opponents than against weak ones. That also improved the model.
Morey could see—or thought he could see—how the model had been fooled by Joey Dorsey. Its blindness to the value of DeAndre Jordan was far more troubling. The kid had played a single year of college basketball, not very effectively. It turned out that he had been a sensational high school player, had hated his college coach, and didn’t even want to be in school. How could any model predict the future of a player who had intentionally failed? It was impossible to see Jordan’s future in his college stats, and, at the time, there were no useful high school basketball statistics. So long as it relied almost exclusively on performance statistics, the model would always miss DeAndre Jordan. The only way to see him, it seemed, was with the eyes of an old-fashioned basketball expert. As it happens, Jordan had grown up in Houston under the eyes of Rockets scouts, and one of those scouts had wanted to draft him on the strength of what appeared to him to be undeniable physical talent. One of his scouts had seen what his model had missed!
Morey—being Morey—had actually tested whether there were any patterns in the predictions made by his staff. He’d hired most of them and thought they were great, and yet there was no evidence any one of them was any better than the others, or the market, at predicting who would make it in the NBA and who would not. If there was any such thing as a basketball expert who could identify future NBA talent, he hadn’t found him. He certainly didn’t think that he was one. “Weighting my personal intuition more heavily did not cross my mind,” he said. “I trust my gut very low. I just think there’s a lot of evidence that gut instincts aren’t very good.”
In the end, he decided that the Rockets needed to reduce to data, and subject to analysis, a lot of stuff that had never before been seriously analyzed: physical traits. They needed to know not just how high a player jumped but how quickly he left the earth—how fast his muscles took him into the air. They needed to measure not just the speed of the player but the quickness of his first two steps. That is, they needed to be even more geeky than they already were. “When things go wrong, that’s what people do,” said Morey. “They go back to the habits that succeeded in the past. My thing was: Let’s go back to first principles. If these physical tools are going to matter, let’s test them more rigorously than they’ve ever been tested before. The weights we placed on production in college had to go down, and the weights we placed on raw physical abilities had to go up.”
But once you started to talk about a guy’s body and what it might or might not be able to do on an NBA court, there was a limit to the usefulness of even the objective, measurable information. You needed, or seemed to need, experts to look at the tools in action and judge how well they would function playing a different game, against better competition. You needed scouts to rate a player’s ability to do the various things they knew were most important to do on a basketball court: shooting, finishing, getting to the rim, offensive rebounding, and so on. You needed experts. The limits of any model invited human judgment back into the decision-making process—whether it helped or not.
And thus began a process of Morey trying as hard as he’d ever tried at anything in his life to blend subjective human judgment with his model. The trick wasn’t just to build a better model. It was to listen both to it and to the scouts at the same time. “You have to figure out what the model is good and bad at, and what humans are good and bad at,” said Morey. Humans sometimes had access to information that the model did not, for instance. Models were bad at knowing that DeAndre Jordan sucked his freshman year in college because he wasn’t trying. Humans were bad at . . . well, that was the subject Daryl Morey now needed to study more directly.