Risk - not what you think

From issue 2671 of New Scientist magazine, 30 August 2008, pages 34-37


YOU'RE at the airport about to take a flight when you suddenly realise you forgot to buy travel insurance. You go to your airline's ticket desk where they offer you a choice: a package that covers death from terrorism, and a cheaper deal covering death by any means. Which do you choose?

It sounds like a no-brainer. The cheaper option covers terrorism and everything else, so is the better deal. Yet when psychologists tested these alternatives in experiments, they found that most people preferred to pay more for terrorism-only insurance instead of the cheaper option covering all causes of death. The mere suggestion of terrorism had such a distorting effect on people's decision-making that it led them to make a very poor choice (Journal of Risk and Uncertainty, vol 7, p 35).
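
The mistake is easy to state in probability terms (the notation here is our sketch, not part of the original study). Death from terrorism is just one of the causes an all-risks policy covers, so for any traveller

\[ P(\text{death from terrorism}) \le P(\text{death from any cause}). \]

The cheaper policy therefore pays out in every scenario the terrorism-only policy does, and in others besides. In the language of decision theory, the dearer policy is strictly dominated: no assignment of probabilities can make paying more for less coverage the better buy. What sways people is the vividness of the named cause, not the numbers.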

The overpowering influence of this kind of fear on human behaviour was illustrated most graphically in the US after the terrorist attacks of 11 September 2001. Throughout the following 12 months, many Americans chose to drive rather than take domestic flights. As a result, the number of people killed in road accidents over that year rose by around 1600, six times the number who died in the hijacked aircraft. In trying to avoid a potentially grisly fate, they leapt "out of the frying pan into the fire", says Gerd Gigerenzer of the Max Planck Institute for Human Development in Berlin, Germany, who collected the figures (Risk Analysis, vol 26, p 347).

Other studies have shown we make bad decisions when it comes to weighing up the risks versus benefits of being screened for cancer or having a vaccination, and judging the risks of everything from nuclear power to climate change.

Even people who know very well the balance of risks can get it wrong when it really matters. George Loewenstein, who studies decision-making at Carnegie Mellon University in Pittsburgh, Pennsylvania, recalls how after 9/11 an economist colleague of his, whom he had always considered "a paragon of rationality", opted to drive rather than fly to distant meetings. "Rather than deliberating about a long-term strategy to counter a risk, people often seem to go into panic mode and take actions that actually exacerbate the problem they are worried about," he says.

But why are we so bad at making good decisions in risky situations? Researchers have documented our poor responses to risk for decades, but only recently have they begun to think about ways to improve them. The key lies in our emotional response. When we are distressed, or otherwise emotionally aroused, we seem incapable of soberly weighing up our options. Instead, we are often led by our feelings. It turns out, however, that it should be possible to change this behaviour, so long as we are primed to recognise it.

To have any hope of being better at decision-making in risky scenarios, we first need to understand why we behave and think the way we do in these situations. Psychologists and neuroscientists have divided the way people respond to uncertainty into two broad categories. One is cognitive and analytical and involves rationally weighing up probabilities and considering outcomes. The other is intuitive, quick, mostly unconscious and based on feeling and emotion. Emotion-driven decision-making probably suited humans best in the environments in which we evolved, and is still our instinctive response, says Paul Slovic of Decision Research, a non-profit organisation based in Eugene, Oregon, that investigates human judgement and risk. Instinct is essential in many kinds of judgement, in particular in deciding whom to trust and interact with, and in situations involving complex or split-second choices (New Scientist, 5 May 2007, p 35). But in some situations - especially those that evoke fear, pain or other strong emotional responses - the intuitive system can steer us away from the best decision.

A classic example is "dread risks" - events with a low probability of happening but with severe consequences if they do. Terrorist attacks are one example. The threat of cancer is another. Just as people used intuitive, fear-driven judgement when deciding how to travel after 9/11, when vivid images of the attacks were prominent in their minds, those who have witnessed a friend or relative suffering or dying from cancer are likely to strongly overestimate their chances of getting the disease. They may even take invasive - and potentially harmful - diagnostic tests as a result.

Ellen Peters, who studies decision-making in health and finance at Decision Research, says the negative feeling associated with cancer can be so strong that presenting people who have drawn such emotional conclusions with a more realistic risk assessment based on probabilities and statistics may have little or no impact on their judgement. That's unless they have a strong understanding of what probabilities mean. "Health is driven by emotions, anxiety and trust in your doctor. Few want to see the evidence, just as most people do not know how much safer flying is than driving and do not make attempts to find out," says Gigerenzer. According to Michael Sivak and Michael Flannagan of the University of Michigan at Ann Arbor, driving the length of a typical domestic flight in the US (estimated at 1157 kilometres) is 65 times riskier than flying. The events of 9/11 would have to occur every month before flying became as risky as driving (American Scientist, vol 91, p 6).
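
A back-of-the-envelope model shows where figures like this come from (the symbols and the per-segment simplification are our illustration, not taken directly from Sivak and Flannagan's paper). Most aviation risk is concentrated in takeoff and landing, so a flight's fatality risk is roughly a fixed amount per segment, while driving risk accumulates with distance. If a flight segment carries risk $s$ and driving carries risk $d$ per kilometre, then driving a trip of length $L$ is the riskier choice whenever

\[ dL > s, \]

and for a typical trip of $L = 1157$ kilometres the ratio $dL/s$ comes out at about 65. For flying to catch up, $s$ would have to rise 65-fold - which, on the researchers' reckoning, is roughly what a 9/11-scale attack every month would amount to.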

Arguably it is perfectly rational to act on your emotions. Why fly, asks Loewenstein, if it makes you totally miserable and you can't allay those unpleasant feelings? Furthermore, people react most strongly to new, unfamiliar risks - a reasonable response, given that the severity of such risks is necessarily unknown.

The influence of fear on everyday decision-making is best illustrated by the dramatic effect of being reminded of the inevitability of your own death. Immediately after being reminded of their death, people go into a kind of cognitive overdrive in which they strive to suppress the thought of it. This takes mental effort and can distort their thinking about other tasks. Psychologist Jamie Arndt at the University of Missouri in Columbia found that people who feel threatened after reading about cancer are less likely to check themselves for signs of breast or testicular cancer (Journal of Personality and Social Psychology, vol 92, p 12). He recommends health authorities avoid making explicit references to death when encouraging people to take part in health screening. "A little bit of fear can be a good thing. We tend to notice things more. But too much fear can provoke this avoidance response. It's a fine line," says Arndt.

The problem with fear, and with strong feelings in general, is that they drive our judgements and our weighing of risks and benefits. They short-circuit other mental processes that might lead us to a more realistic conclusion. But feelings are not the only short cut to faulty decision-making in matters of risk. Memory also plays a strong role, in particular our capacity to recall graphic imagery. Psychologists call this the "availability heuristic": the more easily you can bring to mind or imagine an event, the more likely you judge it to be. It is driven largely by feeling, so memories of emotional or vivid situations are the most easily recalled. A major reason we overestimate the likelihood of being killed in a plane crash, shark attack or terrorist attack is that extensive and graphic media coverage makes such events easy to picture. We similarly underestimate the likelihood of dying from disease because news of such deaths is generally presented as statistics rather than sensational images.

Some argue that the media's focus on shocking or traumatic news stimulates the intuitive, non-thinking side of our decision-making and is at the root of many misjudgements. "We are not rational enough to be exposed to the press," says Nassim Nicholas Taleb, co-director of the Decision Research Laboratory at the London Business School. For an illustration of this, look no further than the vastly different perceptions of the risks from terrorism and lightning strikes, each of which has killed roughly the same number of Americans since records began.

A good example of how graphic media coverage can distort our perceptions of real events is the finding by James Ost at the University of Portsmouth, UK, and colleagues that people who were highly exposed to news reports of the terrorist bombings in London on 7 July 2005 were more likely to recall things about the attacks that they could never have witnessed, such as whether or not the bus that was blown up in Tavistock Square was moving at the time (Memory, vol 16, p 76).

Media coverage of crime appears to have a similar effect. Sixty-five per cent of UK residents believe that crime is rising in the country overall, according to the government's 2006/2007 British Crime Survey. Yet the survey showed that crime rates had been holding steady after falling by 42 per cent over the 10 years from 1995. It's a similar picture in the US. A 2001 study by the youth justice campaign group Building Blocks for Youth found that between 1990 and 1998, recorded crime fell by 20 per cent while TV coverage of crime increased by 83 per cent. Even more tellingly, TV coverage of homicides rose by 473 per cent over that period, while the number of homicides actually fell by 33 per cent.

The disproportionate reporting of dramatic events is particularly effective at distorting decision-making when it leads to an "availability cascade" - a process of belief formation that reinforces a story's plausibility as more and more people accept it as fact and retell it as such. This often leads to public myths that last for years. Cass Sunstein at the University of Chicago Law School and Timur Kuran at Duke University in Durham, North Carolina, who coined the term, say that availability cascades stem from the fact that people have limited knowledge about most things. "On matters ranging from the health consequences of sugar and coffee consumption to the risks of car driving, nuclear power, and global warming, each of us depends for information on what other people seem to know," they wrote in their seminal paper on the subject in 1999 (Stanford Law Review, vol 51, p 683).

Public availability of information is not the only factor governing how people use it to make decisions. Recent work by Dan Kahan and colleagues at the Cultural Cognition Project (CCP) at Yale Law School shows that when people consider public safety or environmental risks such as gun crime, nuclear power, climate change, vaccinations or the safety of new technologies, they are most strongly influenced by the opinions of those experts or public advocates who seem to share their cultural world view and values. So we are predisposed to trust arguments about the safety of nanotechnology, for example, if they are put forward by people of the same social class or similar political leanings to our own, and predisposed to reject arguments put forward by people whose values we reject - regardless of any views we may previously have held on the issue. That bias won't necessarily lead to the best choice, so the idea that simply distributing accurate information is enough to help people make informed decisions is flawed: they will reject the information unless it comes from someone they feel an affinity with. Officials and campaigners must therefore present a plurality of cultural outlooks if they want to reach as many people as possible.

Changing our decision-making process to enable us to make better choices will not be straightforward. Emotion plays a powerful role in the process, so when we're feeling fearful or insecure, statistics wither in the face of millennia of evolutionary adaptation.

This has led Slovic to suggest we need to imbue statistics with more emotional significance so that we take them to heart. "We learn how to deal with numbers from a young age as cold or abstract entities - to read them, add them, multiply them - but we don't learn to think about how they represent reality in a way that conveys feeling and meaning. We need to think how to teach people to step away from their intuitive response, which is insensitive to magnitude, and think more carefully about what numbers represent." The communist leader Joseph Stalin summed this up in his much-quoted remark: "A single death is a tragedy; a million deaths is a statistic."

Ellen Peters showed in a study published in May that highly numerate people are more likely to use numbers and less likely to surrender to emotions when making decisions (Annals of the New York Academy of Sciences, vol 1128, p 1). Gigerenzer points to work published last year showing that the vast majority of people know next to nothing about the risk factors associated with stroke, heart attack and HIV (BMC Medicine, DOI: 10.1186/1741-7015-5-14). "Yet almost everyone, at least those with a Y chromosome, knows statistics about football, cricket or baseball," he says. Gigerenzer is now setting up a centre to help people better understand risk. The Harding Center for Risk Literacy, due to open at the Max Planck Institute for Human Development in April 2009, will research how people deal with risk and train them to manage it better.

Taleb thinks teaching people the facts about risks will not help to change behaviour. He says it would be more productive to teach people to screen out the information that distorts their decision-making than to teach them to use general information better. "If it was possible to teach people to adjust their behaviour to risks, we wouldn't have smokers. But we do. Our intelligence doesn't translate into behaviour the way we think it should."

But surely it's worth a try. Next time you're faced with a decision and your mind is filled with thoughts of death, violence or disease, or politicians tell you that you should be afraid of some threat that they aim to protect you from - or anything else that has your blood racing and muscles tensing - try this: switch off the TV; be grateful for your panicky response (it's got you this far down the evolutionary road, after all) but tell yourself it may not be appropriate since you are no longer living on the savannah; weigh up all the facts and remember, when it comes to risks, feel the numbers.


Suggestions:

Put wax in your ears. People are more afraid of flying than driving because the press does not report car accidents. I never watch the news. Only listen to news you get in a social setting, the things people talk about. Our brains cannot deal with the overload of information. Having a lot of data is not good for anyone trying to make a decision. Nassim Nicholas Taleb, Decision Research Laboratory, London Business School

Try to pause when you come across a statistic or probability. Think: 'What is the human angle here, how do I appreciate this?' Rather than just taking in the number as you perceive it, stop and reflect on it. Try to re-frame the numbers. Paul Slovic, Decision Research, Eugene, Oregon

If as a citizen you would like to form well-considered views on a culturally divisive risk issue - for example, global warming or gun control - find a knowledgeable person who shares your general cultural outlook but who disagrees with you. You are likely to give this person's arguments a sympathetic hearing, which will help offset the natural disposition we all have to dismiss as unreliable and biased the arguments of people whose basic outlooks are different. Dan Kahan, Yale Law School