Thinking, Fast and Slow
For the last decade or so a band of scholars has been trying to cast off the long-accepted “rational agent” theory of economic behavior, the one that says that people, in their economic lives, behave like calculating robots, making rational decisions when they buy a stock, take out a mortgage, or go to the track. These scholars have offered a trove of evidence that people, far from being the rational agents of textbook lore, are often inconsistent, emotional, and biased. Perhaps tellingly, the pioneers of this field were not economists. Daniel Kahneman and Amos Tversky were Israeli psychologists who noticed that real people often do not make decisions as economists say they do. Tversky died in 1996; six years later, Kahneman won the Nobel Prize for economics.
Thinking, Fast and Slow, Kahneman’s new and most accessible book, contains much that is familiar to those who have followed this debate within the world of economics, but it also has a lot to say about how we think, react, and reach (or rather, jump to) conclusions in all spheres. What most interests Kahneman are the predictable ways that errors of judgment occur. Synthesizing decades of his research, as well as that of colleagues, Kahneman lays out the architecture of human decision-making, a map of the mind that resembles a finely tuned machine with, alas, some notable trapdoors and faulty wiring.
Behaviorists, Kahneman included, have been cataloging people’s systematic mistakes and nonlogical patterns for years. A few of the examples he cites:

1. Framing. Test subjects are more likely to opt for surgery if told that the “survival” rate is 90 percent, rather than that the mortality rate is 10 percent.

2. The sunk-cost fallacy. People seek to avoid feelings of regret; thus, they invest more money and time in a project with dubious results rather than give it up and admit they were wrong.

3. Loss aversion. In experiments, most subjects would prefer to