Thursday, March 15, 2012

Thinking, Fast and Slow by Daniel Kahneman


Daniel Kahneman, the author of this exceptional book, and Amos Tversky (who died in 1996) made economics and other disciplines a lot more realistic--and tougher--for economists, researchers and students. Prior to their work, economists and others maintained classical theories and explanations that relied on certain seemingly logical assumptions about human behavior. However, people don't always behave the way logic might suggest, for a variety of reasons that Kahneman (and Tversky) explained, starting in the 1970s. Today, the subject of behavioral decision-making is one of the more exciting ones in fields like economics, finance, medicine and even law, thanks to their pioneering work. In recognition of the impact of his work in economics, Kahneman, a cognitive psychologist and professor emeritus at Princeton, won the Nobel Prize in Economics in 2002, specifically for his work on prospect theory.

The title of this book comes from Kahneman's discussion of two simple models of how people think. "System 1" thinking corresponds to fast, intuitive, emotional and almost automatic decisions, though it sometimes leaves us at the mercy of our human biases. "System 2" thinking is slower and requires more intellectual effort. To nobody's surprise, we humans are more likely to rely on System 1 thinking, because it saves us effort, even if it can lead to flawed conclusions. Here is a quick puzzle Kahneman uses to illustrate the two systems. Suppose that a bat and ball together cost $1.10 and that the bat costs $1.00 more than the ball. How much does the ball cost? Many people, relying mainly on System 1 thinking, will quickly say ten cents, but the correct answer is five cents. Think about it.
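For readers who like to see the arithmetic spelled out, here is a quick sketch in Python (the variable names are mine, not Kahneman's). Working in cents keeps the numbers exact:

```python
# A bat and a ball cost $1.10 together; the bat costs $1.00 more than the ball.
# Let ball = b, so bat = b + 1.00, and b + (b + 1.00) = 1.10, giving b = 0.05.
total = 110       # bat + ball, in cents
difference = 100  # the bat costs this many cents more than the ball
ball = (total - difference) // 2  # 5 cents, not 10
bat = ball + difference           # 105 cents
print(f"ball = {ball} cents, bat = {bat} cents")  # ball = 5 cents, bat = 105 cents
```

The intuitive answer of ten cents fails the check: a ten-cent ball plus a $1.10 bat would total $1.20.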

One of the book's main themes is the author's description of how little control we actually have over our own System 1 responses and the degree to which our subconscious intuition and biases affect System 1 choices. It's amazing to me how much of our lives seems to run on System 1 autopilot. Of course, forewarned is forearmed, which is another important theme. Basically, this book provides the reader with an impressive overview of many key concepts in behavioral research, along with lots of illuminating stories from Kahneman's work and experiences. Before you know it, you may find "heuristic" (a rule of thumb) working its way into your conversations.

I will expand on one of the book's chapters ("The Law of Small Numbers") to illustrate some of Kahneman's analysis. Suppose you learn that out of more than 3,000 counties in the United States, the incidence of kidney cancer is lowest in mostly rural, sparsely populated counties in the Midwest, the South, and the West. Before you are tempted to attribute the lower cancer rates to some element of rural living, you should realize that the highest rates of kidney cancer are also found in (other) rural, sparsely populated counties in those same regions. The reason for these seemingly contradictory results is that the small number of residents in sparsely populated counties allows observed cancer rates to vary widely. Put differently, if the Law of Large Numbers says that the average result from a large number of trials should be close to the expected value (of cancer rates, or whatever), then the Law of Small Numbers says that the smaller the sample, the greater the chance of obtaining results far from the overall expected value. The extreme cancer rates in some counties turn out to be artifacts of small samples, not statistically systematic results.
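You can see this effect with a little Python, using made-up numbers of my own choosing (the populations and the disease rate below are hypothetical, not from the book). Assume every county has the same true underlying rate, and ask how likely each county is to *look* like a cancer hot spot, with an observed rate at least double the true one:

```python
from math import comb

def tail_prob(n, p, k_min):
    """Exact P(cases >= k_min) when cases ~ Binomial(n, p)."""
    return 1 - sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_min))

TRUE_RATE = 0.0001  # hypothetical: the same underlying rate everywhere

# Probability that a county's observed rate is at least double the true rate:
small = tail_prob(1_000, TRUE_RATE, 1)     # 1,000 residents: 2x the mean of 0.1 cases
large = tail_prob(100_000, TRUE_RATE, 20)  # 100,000 residents: 2x the mean of 10 cases
print(f"small county: {small:.3f}, large county: {large:.4f}")
```

Under these assumptions, roughly one small county in ten looks like a hot spot purely by chance, while only a fraction of a percent of the large counties do. The same mechanism produces the suspiciously *low* rates: a tiny county with zero cases reports a rate of zero.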

Here is one final example from Kahneman's work of some of the concepts the reader will encounter in this book. Suppose that Linda is 31 years old, single, outspoken, and very bright. In college, she majored in philosophy. As a student, she was deeply concerned with the issues of discrimination and social justice, and she also participated in anti-nuclear demonstrations. Which is more probable?

1. Linda is a bank teller.
2. Linda is a bank teller and is active in the feminist movement.

According to Kahneman, about 85 to 90 percent of undergraduates at several major universities chose the second option, that Linda is a bank teller and active in the feminist movement. However, this is an example of the "conjunction fallacy," since the probability of two events occurring together (in conjunction) can never be greater than the probability of either event occurring alone. Put more simply, the probability that Linda is a bank teller must be at least as great as the probability that she is a bank teller and active in feminist causes. (To be complete, Kahneman points out that there are critics of the Linda experiment who, for example, question whether test subjects reasonably understood the word "probability" to mean something more like "plausibility.")
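The inequality is easy to see with numbers. The probabilities below are made up purely for illustration; only the relationship between them matters:

```python
# Hypothetical probabilities, chosen only to illustrate the conjunction rule.
p_teller = 0.05                 # P(Linda is a bank teller)
p_feminist_given_teller = 0.30  # P(active feminist, given she is a teller)

# P(teller AND feminist) = P(teller) * P(feminist | teller)
p_both = p_teller * p_feminist_given_teller

print(p_both, "<=", p_teller)  # the conjunction is never the more probable option
```

No matter what values you plug in, multiplying a probability by another number between 0 and 1 can only shrink it (or leave it unchanged), which is exactly why option 2 cannot beat option 1.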

By now you hopefully have an idea of some of the ground covered in this book. If behavioral research interests you, it merits your attention. I should also mention that there is blessedly little technical jargon, so even readers new to the field should be able to enjoy it. Indeed, I think most people will get a lot from this book.