I was pretty slow about getting around to reading Thinking, Fast and Slow. The career-capping book by Princeton psychologist Daniel Kahneman, one of the founders of behavioral economics, spent months on all the bestseller lists back in 2011. I finally picked up a paperback copy a couple of weeks ago.
The book is mainly about the limits of intuition and the biases—seemingly built into the way the human mind has evolved—that keep us from acting in accord with logic, rationality, and statistics. For example, the “anchoring effect” means we’re highly suggestible when it comes to numbers and prices. No matter how much soup they really want, grocery-store customers buy more when there’s a sign saying “Limit 12 cans per customer.”
It was Kahneman’s studies of such impulses that won him a Nobel Prize in economics in 2002. Together with Amos Tversky, Richard Thaler, and many others, Kahneman overturned economists’ old picture of society as a mathematical utopia in which individuals act rationally to maximize their own utility. Thinking, Fast and Slow is a lengthy, detailed, yet approachable summary of that revolution. You’ve probably seen or read popular behavioral-economics books like Steven Levitt and Stephen Dubner’s Freakonomics or Dan Ariely’s Predictably Irrational; those are like sugary desserts next to Kahneman’s protein-packed tome.
As I absorbed Kahneman’s points about all the ways human judgment breaks down under various stresses and distractions, I couldn’t help looking for lessons that might apply to the high-tech circus we insiders sometimes call, in a self-congratulatory way, the “innovation ecosystem.” By which I mean the researchers and developers who incubate new technologies inside universities, corporate labs, and garages; the entrepreneurs who turn these new ideas into products; the angel and venture investors who place bets on the entrepreneurs; and the eager customers who fuel the whole process.
And such lessons abound. Indeed, entrepreneurs, executives, and investors are prone to so many kinds of errors that they almost seem to be Kahneman’s favorite subspecies of Homo economicus. In the end, I think Kahneman’s observations about bias lead to a big puzzle about the nature of entrepreneurship and technological progress. But before I get into that, I’ll relate a few examples from the book—each one richly supported by the psychological experiments and surveys conducted by Kahneman and his colleagues over the last three or four decades:
1. The illusion of understanding: If we can fit past events into a satisfying story, we think we understand what really happened, and we can’t imagine things turning out any other way. Here Kahneman cites the example of Google, which was started by two Stanford graduate students who lucked into one of the biggest untapped markets in the history of business (i.e., search-based advertising) and came out looking like invincible geniuses. In fact, there were numerous points at which Google’s story could have taken a drastically different turn—such as 1999, when Larry Page and Sergey Brin were willing to sell the company for $1 million but the buyer thought the price was too high. But luck took them in a different direction. “A compelling narrative fosters an illusion of inevitability,” Kahneman observes.
2. Outcome bias: Closely related to the illusion of understanding, this is the tendency to reward or blame decision makers for the performance of their organizations, even though the correlation between leadership quality and corporate performance is generally low. “We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact,” Kahneman writes. “Leaders who have been lucky are never punished for having taken too much risk. Instead, they are believed to have had the flair and foresight to anticipate success, and the sensible people who doubted them are seen in hindsight as mediocre, timid, and weak.”
3. The illusion of pattern: Kahneman thinks we’re too quick to ascribe meaning to events that are the product of pure chance. A basketball player who sinks three or four baskets in a row is seen as having a “hot hand,” and a CEO who oversees several successful product launches or acquisitions acquires a reputation for extraordinary insight or skill when in fact, like Page and Brin, he was probably just fortunate. “We are far too willing to reject the belief that much of what we see in life is random,” Kahneman warns.
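Kahneman’s point about the “hot hand” is easy to check for yourself. Here’s a quick simulation (my own sketch, not from the book): a shooter who makes exactly 50 percent of shots, with no streakiness at all, still produces eye-catching runs surprisingly often.

```python
import random

# Sketch: how often does pure chance produce a "hot hand"?
# Simulate a 50% shooter taking 20 shots, and count how many
# simulated games contain a streak of 4 or more makes in a row.
random.seed(42)

def longest_streak(shots):
    """Length of the longest run of consecutive makes (True values)."""
    best = run = 0
    for made in shots:
        run = run + 1 if made else 0
        best = max(best, run)
    return best

trials = 10_000
hot = sum(
    longest_streak([random.random() < 0.5 for _ in range(20)]) >= 4
    for _ in range(trials)
)
print(f"{hot / trials:.0%} of random 20-shot games include a 4-make streak")
```

Nearly half of these purely random games contain a four-shot streak—exactly the kind of run that spectators (and boards of directors) read as evidence of skill.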
4. Nonregressive explanations: An outstanding performance is likely to be followed by a mediocre performance. This isn’t backsliding: it’s usually just regression to the mean, the tendency of extreme outcomes to be followed by outcomes closer to the average. The concept is well established, but because we’re hard-wired to seek causal rather than statistical explanations, we have a hard time accepting it. One corollary is that we shouldn’t punish a company that fails to follow up a stellar product with an even more stellar one (the iPad and its regrettable sequel, the iPad mini, come to mind). Another is that all extreme predictions are unreliable; we shouldn’t believe any entrepreneur who says his company is the next Google.
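Regression to the mean falls out of simple arithmetic whenever performance mixes skill with luck. In this sketch (my own illustration, not Kahneman’s), each performer’s score is a fixed skill level plus round-to-round noise; the stars of round one drift back toward the pack in round two, with no backsliding involved.

```python
import random

# Sketch: performance = stable skill + round-to-round luck.
# Select the top round-1 performers, then see how the same
# people score in round 2.
random.seed(0)

n = 1000
skill = [random.gauss(100, 10) for _ in range(n)]          # stable ability
round1 = [s + random.gauss(0, 20) for s in skill]           # ability + luck
round2 = [s + random.gauss(0, 20) for s in skill]           # fresh luck

# Pick the 50 best performers from round 1...
top = sorted(range(n), key=lambda i: round1[i], reverse=True)[:50]

# ...and compare their averages across the two rounds.
avg1 = sum(round1[i] for i in top) / len(top)
avg2 = sum(round2[i] for i in top) / len(top)
print(f"round 1 avg of top 50: {avg1:.1f}")   # far above the mean of 100
print(f"round 2 avg of same 50: {avg2:.1f}")  # much closer to 100
```

The top performers were selected partly for skill and partly for a lucky round, and only the skill carries over—so their second-round average sits well below their first, even though nothing about them changed.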
5. The illusion of validity, also known as the illusion of skill: We’re strongly influenced by the world in front of our eyes, and unwilling to admit that there’s much we don’t know—a phenomenon that Kahneman calls WYSIATI, for What You See Is All There Is. As a result, we come to believe—sometimes fiercely—that our own predictions are accurate, even when it wouldn’t take much digging to show that they’re little better than …