Make Better Decisions

Know your biases to get the big calls right the first time

The menu board at Yankee Stadium presents patrons with a list of satisfying, salty choices. Chicken. Steak. Hot dogs. Cheese fries. Peanuts. All ballpark staples. And all of them packed with more than 1,000 calories.

The list illustrates the challenges we face in everyday choices, like what to eat at a baseball game. How we make decisions, and how we can learn to make them better, are central to Carey K. Morewedge’s work in studying our cognitive processes.

Morewedge, an associate professor of marketing, points to that menu. In terms of calories, “basically, every food option is worse for you than beer,” he says. Fortunately, the frightening calorie counts are right there on the menu. But imagine a dieter trying to make a choice if the calories weren’t posted, he says. “You’d think, ‘I’m going to stick to my diet and have some peanuts instead of drinking a beer.’ But the peanuts are 1,200 calories, and the beer is 300 calories, or 200 calories. If you don’t have that information, then you can’t incorporate it into your decision.”

Incomplete data isn’t the only problem. The information and experiences we carry with us, the messages we receive, and the biases we apply in interpreting these inputs are all a part of the decision-making stew. And we express them in our behaviors, whether as a partner, a parent, a manager, a team member, or a policy maker.

For Morewedge, an interesting question is what to do about those biases. How can we recognize them and blunt their effects? An answer may emerge from the experiments he and his colleagues have conducted in training people to make better decisions through playing specially designed video games.

In business, even the smallest decisions can turn into the biggest what-ifs. What if Kodak hadn’t decided to mothball its new digital camera technology in 1975? What if Excite.com had bought Google in 1999 for just $750,000? In both cases, hindsight makes those decisions look like stinkers.

Researchers over many years have devised a framework for analyzing decision-making and what can go wrong. In his research, Morewedge has examined how to reduce six types of bias that affect our judgments across a wide range of situations. Bias blind spots convince us that we are less likely to be biased than others making the same evaluation. Confirmation bias prompts us to pick evidence that bolsters our existing ideas. Fundamental attribution error blinds us to the circumstances in which an event occurs—when we focus, for example, on one airline’s flight delay without noticing that the weather has affected all departures that day. Anchoring happens when we put too much emphasis on the first piece of information we receive when making a judgment. Projection is the tendency to believe that others think as we do. Representativeness is the tendency to rely on simple, often misleading, rules when estimating the probability of a certain event.

To influence the decision-making process, researchers have focused on three approaches. One is to offer incentives that encourage (or discourage) certain behaviors, such as taxing cigarettes or soft drinks. Another approach involves nudges that reframe familiar questions or present information in new ways, which can lead people to make better choices. One experiment Morewedge cites, from a 2015 issue of the American Economic Review, showed that improving the design of information about a government benefit (the earned-income tax credit) encouraged more eligible people to participate in the program.

Training represents a third way. Teach people to understand the role of bias, show them examples of bias-driven mistakes in situations familiar to them, and, the thinking goes, they will see this dynamic in action and apply its lessons to their future decisions. There’s just one problem: the approach hasn’t worked very well at developing decision makers who can generalize their skills beyond the textbook scenarios they faced in a classroom.

“What people find is, you can train people to recognize bias and you can reduce that bias in very specific domains. For example, weather forecasters become, believe it or not, actually very accurate at predicting the percentage chance of precipitation,” says Morewedge, 38, who in April 2016 was named by the business school news website Poets & Quants as one of the best 40 under 40 business school professors. “But if you take a weather forecaster and ask them to make a different kind of probability estimate, they show the same kinds of overconfidence as untrained forecasters.”

6 Tips: Ditch Your Bias

Awareness that bias plays a role in our thinking is a factor in mitigating its effects. Carey K. Morewedge offers tips for combating bias.

1. Bias blind spot: Thinking we are less likely to be biased than others.

Morewedge’s advice: Realize this is a human trait we all share. “Biases are easier to detect in decisions than in the thoughts that guided them.… If you only tend to hire men, for example, that may indicate gender bias, even if you think you fairly considered women.”

2. Confirmation bias: Choosing evidence that bolsters our existing ideas.

Morewedge’s advice: “People need to explicitly test if their hypothesis might be wrong.” When evaluating an attractive IPO, for example, “explicitly looking for evidence that the stock might decrease in value...may lead to a less biased outcome or judgment.”

3. Fundamental attribution error: Not considering the circumstances in which an event occurs.

Morewedge’s advice: Consider how anyone might have performed in the same situation. “If the weather is terrible, every airline is going to have flights that are cancelled. You have to think about how likely it is that any airline could take off in the snowstorm.”

4. Anchoring: Putting too much emphasis on the first piece of information we receive.

Morewedge’s advice: Consider other data besides the first idea that comes up. “If you think about when Washington was elected president, most people either start with 1776 [too early] or the War of 1812 [too late]. Let’s say that you come up with 1776 as the first number that you think is relevant. Trying to generate other kinds of numbers may reduce that particular number’s pull on your answer.”

5. Projection: Assuming that other people think the same way we do.

Morewedge’s advice: Consider alternative points of view. “We surround ourselves with like-minded friends and coworkers. Try to think about people with different beliefs or perspectives, which will make those alternatives more likely to be considered.”

6. Representativeness: Relying on some simple, often misleading, rules when considering the likelihood of certain events.

Morewedge’s advice: Learn some basic statistical rules, like base rates. “There may be many compelling reasons to think about why any particular start-up could succeed: its product or its team or particular market. It’s also useful to think about, in general, how many start-ups succeed on average when thinking about your investments.”

What if you could find a way to teach people in a simulation so that they can discover their individual tendencies toward bias—and then learn how to correct them?

This is what Morewedge and his colleagues from higher education and industry sought to answer through tests with two video games, Missing: The Pursuit of Terry Hughes and Missing: The Final Secret, that they designed for intelligence analysts. In the first game, players search for a missing neighbor (Terry of the title); in the second, they prove that she is innocent of a crime. It takes about one hour to complete each game’s three levels, each of which asks players a set of questions that expose different biases. For instance, do players primed to think that foul play is afoot focus on clues that would support that conclusion—and ignore those hinting at a more innocent explanation?

At the end of each level, the game provides players with a review of their responses. This after-action review includes experts speaking on screen to define the biases the players exhibited during decision-making moments. And it offers strategies, from considering alternative explanations to scrutinizing initial answers, to mitigate these biases. The game’s feedback also draws on lessons from academic fields, such as the basics of formal logic and statistical rules, to reinforce best decision-making practices: that big sample sizes, for instance, lead to more accurate conclusions than small ones. The review is backed up by illustrative examples of professionals (like a doctor, lawyer, or intelligence analyst) demonstrating the same kind of bias in their work. Then, the game provides practice examples to further test a player’s ability to mitigate that bias.

Commercial versions of both games are now available (email James Korris at missing@cretecinc.com for information); students in Morewedge’s marketing classes at Questrom have already played them.

In 2015, the researchers published a study of the games’ effects in Policy Insights from the Behavioral and Brain Sciences. They found that “playing the games produced large, persistent reductions in the biases that players exhibited,” says Morewedge. They also discovered that the interactive game was decidedly more effective than a training video used as a control method. (In the control, participants watched videos in which an instructor explains different types of bias, actors stage vignettes showing bias, and the instructor suggests ways to mitigate bias.) Importantly, the game was more effective not only immediately after playing, but also in follow-up testing done three months later.

Morewedge believes the interactivity of the game was a key ingredient. The immersive experience of playing along—“and seeing in real time the biases in your own decisions”—is more powerful than only receiving information through a lecture and watching actors role-play.

Without an education in sciences that involve statistical inference, people are naturally prone to errors of bias. The games showed that it’s possible to teach people to recognize their own tendencies and to substantially correct them. We can learn to make better decisions if we can see where we’re apt to get things wrong.

Take the common problem of confirmation bias. We’re prone to track down information that will back what we already believe to be true, Morewedge says, so even when we evaluate a situation with plenty of information at hand, the same pattern holds: “If we’re testing hypotheses, we tend to look for and think of things that would confirm our hypotheses, and omit evidence that doesn’t then come to mind.”

Another example: in the workplace, managers often make a fundamental attribution error when evaluating an employee’s performance or considering their potential. We tend to give too little weight to the specifics of a person’s experience, such as their previous jobs and education. To mitigate this bias, Morewedge says, a manager should ask, “What were some of the advantages or disadvantages this particular candidate had, relative to other people?” If a person performed poorly, how did everyone else in those circumstances perform? How did the context of that person’s situation influence the results they achieved?

Mitigating bias is about teaching ourselves to double-check our thinking. This is useful at a time when we tend to surround ourselves with like-minded people, Morewedge says. “If we’re trying to think about the percentage of people who believe in the same thing we do, people like us tend to be more accessible in our minds. Thinking about people who hold different beliefs and perspectives can make that inconsistent evidence more accessible and more likely to be used.”