NUDGE

Improving Decisions About Health, Wealth, and Happiness
By Richard H. Thaler and Cass R. Sunstein

Yale University Press

Copyright © 2008 Richard H. Thaler and Cass R. Sunstein
All rights reserved.

ISBN: 978-0-300-12223-7


Chapter One

BIASES AND BLUNDERS

Have a look, if you will, at these two tables:

[ILLUSTRATION OMITTED]

Suppose that you are thinking about which one would work better as a coffee table in your living room. What would you say are the dimensions of the two tables? Take a guess at the ratio of the length to the width of each. Just eyeball it.

If you are like most people, you think that the table on the left is much longer and narrower than the one on the right. Typical guesses are that the ratio of the length to the width is 3:1 for the left table and 1.5:1 for the right table. Now take out a ruler and measure each table. You will find that the two table tops are identical. Measure them until you are convinced, because this is a case where seeing is not believing. (When Thaler showed this example to Sunstein at their usual lunch haunt, Sunstein grabbed his chopstick to check.)

What should we conclude from this example? If you see the left table as longer and thinner than the right one, you are certifiably human. There is nothing wrong with you (well, at least not that we can detect from this test). Still, your judgment in this task was biased, and predictably so. No one thinks that the right table is thinner! Not only were you wrong; you were probably confident that you were right. If you like, you can put this visual to good use when you encounter others who are equally human and who are disposed to gamble away their money, say, at a bar.

Now consider Figure 1.2. Do these two shapes look the same or different? Again, if you are human, and have decent vision, you probably see these shapes as being identical, as they are. But these two shapes are just the table tops from Figure 1.1, removed from their legs and reoriented. Both the legs and the orientation facilitate the illusion that the table tops are different in Figure 1.1, so removing these distracters restores the visual system to its usual amazingly accurate state.

These two figures capture the key insight that behavioral economists have borrowed from psychologists. Normally the human mind works remarkably well. We can recognize people we have not seen in years, understand the complexities of our native language, and run down a flight of stairs without falling. Some of us can speak twelve languages, improve the fanciest computers, and/or create the theory of relativity. However, even Einstein would probably be fooled by those tables. That does not mean something is wrong with us as humans, but it does mean that our understanding of human behavior can be improved by appreciating how people systematically go wrong.

To obtain that understanding, we need to explore some aspects of human thinking. Knowing something about the visual system allowed Roger Shepard (1990), a psychologist and artist, to draw those deceptive tables. He knew what to draw to lead our mind astray. Knowing something about the cognitive system has allowed others to discover systematic biases in the way we think.

How We Think: Two Systems

The workings of the human brain are more than a bit befuddling. How can we be so ingenious at some tasks and so clueless at others? Beethoven wrote his incredible Ninth Symphony while he was deaf, but we would not be at all surprised if we learned that he often misplaced his house keys. How can people be simultaneously so smart and so dumb? Many psychologists and neuroscientists have been converging on a description of the brain's functioning that helps us make sense of these seeming contradictions. The approach involves a distinction between two kinds of thinking, one that is intuitive and automatic, and another that is reflective and rational. We will call the first the Automatic System and the second the Reflective System. (In the psychology literature, these two systems are sometimes referred to as System 1 and System 2, respectively.) The key features of each system are shown in Table 1.1.

The Automatic System is rapid and is or feels instinctive, and it does not involve what we usually associate with the word thinking. When you duck because a ball is thrown at you unexpectedly, or get nervous when your airplane hits turbulence, or smile when you see a cute puppy, you are using your Automatic System. Brain scientists are able to say that the activities of the Automatic System are associated with the oldest parts of the brain, the parts we share with lizards (as well as puppies).

The Reflective System is more deliberate and self-conscious. We use the Reflective System when we are asked, "How much is 411 times 37?" Most people are also likely to use the Reflective System when deciding which route to take for a trip and whether to go to law school or business school. When we are writing this book we are (mostly) using our Reflective Systems, but sometimes ideas pop into our heads when we are in the shower or taking a walk and not thinking at all about the book, and these probably are coming from our Automatic Systems. (Voters, by the way, seem to rely primarily on their Automatic System. A candidate who makes a bad first impression, or who tries to win votes by complex arguments and statistical demonstrations, may well run into trouble.)

Most Americans have an Automatic System reaction to a temperature given in Fahrenheit but have to use their Reflective System to process a temperature given in Celsius; for Europeans, the opposite is true. People speak their native languages using their Automatic Systems and tend to struggle to speak another language using their Reflective Systems. Being truly bilingual means that you speak two languages using the Automatic System. Accomplished chess players and professional athletes have pretty fancy intuitions; their Automatic Systems allow them to size up complex situations rapidly and to respond with both amazing accuracy and exceptional speed.

One way to think about all this is that the Automatic System is your gut reaction and the Reflective System is your conscious thought. Gut feelings can be quite accurate, but we often make mistakes because we rely too much on our Automatic System. The Automatic System says, "The airplane is shaking, I'm going to die," while the Reflective System responds, "Planes are very safe!" The Automatic System says, "That big dog is going to hurt me," and the Reflective System replies, "Most pets are quite sweet." (In both cases, the Automatic System is squawking all the time.) The Automatic System starts out with no idea how to play golf or tennis. Note, however, that countless hours of practice enable an accomplished golfer to avoid reflection and to rely on her Automatic System, so much so that good golfers, like other good athletes, know the hazards of "thinking too much" and might well do better to "trust the gut," or "just do it." The Automatic System can be trained with lots of repetition, but such training takes a lot of time and effort. One reason why teenagers are such risky drivers is that their Automatic Systems have not had much practice, and using the Reflective System is much slower.

To see how intuitive thinking works, try the following little test. For each of the three questions, begin by writing down the first answer that comes to your mind. Then pause to reflect.

1. A bat and ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost? _______ cents

2. If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets? _______ minutes

3. In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake? _______ days

What were your initial answers? Most people say 10 cents, 100 minutes, and 24 days. But all these answers are wrong. If you think for a minute, you will see why. If the ball costs 10 cents and the bat costs one dollar more than the ball, meaning $1.10, then together they cost $1.20, not $1.10. No one who bothers to check whether his initial answer of 10 cents could possibly be right would give that as an answer, but research by Shane Frederick (2005) (who calls this series of questions the cognitive reflection test) finds that these are the most popular answers even among bright college students.

The correct answers are 5 cents, 5 minutes, and 47 days, but you knew that, or at least your Reflective System did if you bothered to consult it. Econs never make an important decision without checking with their Reflective Systems (if they have time). But Humans sometimes go with the answer the lizard inside is giving without pausing to think. If you are a television fan, think of Mr. Spock of Star Trek fame as someone whose Reflective System is always in control. (Captain Kirk: "You'd make a splendid computer, Mr. Spock." Mr. Spock: "That is very kind of you, Captain!") In contrast, Homer Simpson seems to have forgotten where he put his Reflective System. (In a commentary on gun control, Homer once replied to a gun store clerk who informed him of a mandatory five-day waiting period before buying a weapon, "Five days? But I'm mad now!")
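For readers who want to verify the arithmetic behind the correct answers, here is a quick check, sketched in Python (the variable names and the sketch itself are ours, for illustration only; nothing here comes from the text):

```python
# Checking the three cognitive reflection test answers.

# 1. Bat and ball: the ball costs b dollars, the bat costs b + 1.00,
#    and together they cost 1.10. So b + (b + 1.00) = 1.10, which
#    gives 2b = 0.10 and b = 0.05 -- five cents, not ten.
ball = (1.10 - 1.00) / 2
print(round(ball, 2))  # 0.05

# 2. Five machines make five widgets in five minutes, so each machine
#    makes one widget per five minutes (0.2 widgets per machine-minute).
#    One hundred machines therefore make one hundred widgets in the
#    same five minutes.
rate_per_machine = 5 / (5 * 5)            # widgets per machine-minute
minutes = 100 / (100 * rate_per_machine)
print(minutes)  # 5.0

# 3. The patch doubles every day and covers the lake on day 48, so it
#    covered half the lake exactly one day earlier.
half_coverage_day = 48 - 1
print(half_coverage_day)  # 47
```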

One of our major goals in this book is to see how the world might be made easier, or safer, for the Homers among us (and the Homer lurking somewhere in each of us). If people can rely on their Automatic Systems without getting into terrible trouble, their lives should be easier, better, and longer.

Rules of Thumb

Most of us are busy, our lives are complicated, and we can't spend all our time thinking and analyzing everything. When we have to make judgments, such as guessing Angelina Jolie's age or the distance between Cleveland and Philadelphia, we use simple rules of thumb to help us. We use rules of thumb because most of the time they are quick and useful.

In fact, there is a great collection edited by Tom Parker titled Rules of Thumb. Parker wrote the book by asking friends to send him good rules of thumb. For example, "One ostrich egg will serve 24 people for brunch." "Ten people will raise the temperature of an average size room by one degree per hour." And one to which we will return: "No more than 25 percent of the guests at a university dinner party can come from the economics department without spoiling the conversation."

Although rules of thumb can be very helpful, their use can also lead to systematic biases. This insight, first developed decades ago by two Israeli psychologists, Amos Tversky and Daniel Kahneman (1974), has changed the way psychologists (and eventually economists) think about thinking. Their original work identified three heuristics, or rules of thumb (anchoring, availability, and representativeness), and the biases that are associated with each. Their research program has come to be known as the "heuristics and biases" approach to the study of human judgment. More recently, psychologists have come to understand that these heuristics and biases emerge from the interplay between the Automatic System and the Reflective System. Let's see how.

Anchoring

Suppose we are asked to guess the population of Milwaukee, a city about two hours north of Chicago, where we live. Neither of us knows much about Milwaukee, but we think that it is the biggest city in Wisconsin. How should we go about guessing? Well, one thing we could do is start with something we do know, which is the population of Chicago, roughly three million. So we might think, Milwaukee is a major city, but clearly not as big as Chicago, so, hmmm, maybe it is one-third the size, say one million. Now consider someone from Green Bay, Wisconsin, who is asked the same question. She also doesn't know the answer, but she does know that Green Bay has about one hundred thousand people and knows that Milwaukee is larger, so guesses, say, three times larger-three hundred thousand.

This process is called "anchoring and adjustment." You start with some anchor, the number you know, and adjust in the direction you think is appropriate. So far, so good. The bias occurs because the adjustments are typically insufficient. Experiments repeatedly show that, in problems similar to our example, people from Chicago are likely to make a high guess (based on their high anchor) while those from Green Bay guess low (based on their low anchor). As it happens, Milwaukee has about 580,000 people.

Even obviously irrelevant anchors creep into the decision-making process. Try this one yourself. Take the last three digits of your phone number and add two hundred. Write the number down. Now, when do you think Attila the Hun sacked Europe? Was it before or after that year? What is your best guess? (We will give you one hint: It was after the birth of Jesus.) Even if you do not know much about European history, you do know enough to know that whenever Attila did whatever he did, the date has nothing to do with your phone number. Still, when we conduct this experiment with our students, we get answers that are more than three hundred years later from students who start with high anchors rather than low ones. (The right answer is 411.)

Anchors can even influence how you think your life is going. In one experiment, college students were asked two questions: (a) How happy are you? (b) How often are you dating? When the two questions were asked in this order the correlation between the two questions was quite low (.11). But when the question order was reversed, so that the dating question was asked first, the correlation jumped to .62. Apparently, when prompted by the dating question, the students use what might be called the "dating heuristic" to answer the question about how happy they are. "Gee, I can't remember when I last had a date! I must be miserable." Similar results can be obtained from married couples if the dating question is replaced by a lovemaking question.

In the language of this book, anchors serve as nudges. We can influence the figure you will choose in a particular situation by ever-so-subtly suggesting a starting point for your thought process. When charities ask you for a donation, they typically offer you a range of options such as $100, $250, $1,000, $5,000, or "other." If the charity's fund-raisers know what they are doing, these values are not picked at random, because the options influence the amount of money people decide to donate. People will give more if the options are $100, $250, $1,000, and $5,000 than if the options are $50, $75, $100, and $150.

In many domains, the evidence shows that, within reason, the more you ask for, the more you tend to get. Lawyers who sue cigarette companies often win astronomical amounts, in part because they have successfully induced juries to anchor on multimillion-dollar figures. Clever negotiators often get amazing deals for their clients by producing an opening offer that makes their adversary thrilled to pay half that very high amount.

Availability

How much should you worry about hurricanes, nuclear power, terrorism, mad cow disease, alligator attacks, or avian flu? And how much care should you take in avoiding risks associated with each? What, exactly, should you do to prevent the kinds of dangers that you face in ordinary life?

In answering questions of this kind, most people use what is called the availability heuristic. They assess the likelihood of risks by asking how readily examples come to mind. If people can easily think of relevant examples, they are far more likely to be frightened and concerned than if they cannot. A risk that is familiar, like that associated with terrorism in the aftermath of 9/11, will be seen as more serious than a risk that is less familiar, like that associated with sunbathing or hotter summers. Homicides are more available than suicides, and so people tend to believe, wrongly, that more people die from homicide.

Accessibility and salience are closely related to availability, and they are important as well. If you have personally experienced a serious earthquake, you're more likely to believe that an earthquake is likely than if you read about it in a weekly magazine. Thus vivid and easily imagined causes of death (for example, tornadoes) often receive inflated estimates of probability, and less-vivid causes (for example, asthma attacks) receive low estimates, even if they occur with a far greater frequency (here a factor of twenty). So, too, recent events have a greater impact on our behavior, and on our fears, than earlier ones. In all these highly available examples, the Automatic System is keenly aware of the risk (perhaps too keenly), without having to resort to any tables of boring statistics.

(Continues...)



Excerpted from NUDGE by Richard H. Thaler and Cass R. Sunstein. Copyright © 2008 by Richard H. Thaler and Cass R. Sunstein. Excerpted by permission.