Facts vs. feelings

Image: Paulinho Fluxuz

By TIM HARFORD*

How to stop our emotions from fooling us

As spring 2020 approached, the importance of accurate, timely and honest statistics suddenly became obvious. A new coronavirus was sweeping the world. Politicians had to make their most consequential decisions in decades, and quickly. Many of those decisions depended on the data-gathering work that epidemiologists, medical statisticians and economists were scrambling to do. Tens of millions of lives were potentially at risk, as were the livelihoods of billions of people.

At the beginning of April, countries around the world had already been in lockdown for weeks, the global death toll had passed 60,000, and how events would unfold was still far from clear. Perhaps the deepest economic depression since the 1930s was under way, alongside an explosive death toll. Perhaps, thanks to human ingenuity or good luck, such apocalyptic fears would fade from memory. Many scenarios were plausible. And that is the problem.

The epidemiologist John Ioannidis wrote in mid-March that Covid-19 "could be a once-in-a-century evidence fiasco". The data detectives are doing their best – but they have to work with flawed, inconsistent and woefully inadequate data to make life-and-death decisions with the confidence we would like them to have.

There is no doubt that the details of this fiasco will be studied for years to come. But some things already seem clear. At the onset of the crisis, politics seems to have impeded the free flow of honest statistics. Although its claim is contested, Taiwan complained that at the end of December 2019 it had offered the World Health Organization important evidence of human-to-human transmission – yet as late as mid-January, the WHO was tweeting reassuringly that China had found no evidence of human-to-human transmission. (Taiwan is not a member of the WHO, because China asserts sovereignty over the territory and demands that it not be treated as an independent state. It is possible that this geopolitical obstacle contributed to the alleged delay.)

Did it matter? Almost certainly; with cases doubling every two or three days, we will never know what might have gone differently with a few more weeks of warning. It is clear enough that many leaders were slow to admit the potential gravity of the threat. President Trump, for example, announced at the end of February: "It's going to disappear. One day, it's like a miracle, it will disappear." Four weeks later, with 1,300 Americans dead and more confirmed cases in the US than in any other country, Trump was still talking hopefully about filling churches at Easter.
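
To see why a few weeks matter so much, it helps to spell out the arithmetic. The sketch below, in Python, is purely illustrative: it assumes unchecked exponential spread at the two-to-three-day doubling times mentioned above, and a hypothetical three-week head start – neither figure is a measurement from the article.

# Illustrative sketch only: how a two-to-three-day doubling time compounds
# over a delay. The three-week figure is an assumption made for the sake of
# the arithmetic, not a number taken from the article.

def growth_factor(delay_days: float, doubling_time_days: float) -> float:
    """Multiplicative change in case numbers over delay_days, assuming
    unchecked exponential spread with a fixed doubling time."""
    return 2 ** (delay_days / doubling_time_days)

for doubling_time in (2, 3):
    factor = growth_factor(delay_days=21, doubling_time_days=doubling_time)
    print(f"Doubling every {doubling_time} days: a 3-week head start "
          f"changes case numbers roughly {factor:,.0f}-fold")

On those assumptions, three weeks is the difference between one case and somewhere in the region of 128 to 1,400 cases – which is why earlier warning could have mattered enormously.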

As I write, the debates are raging. Can rapid testing, isolation and contact tracing contain outbreaks indefinitely, or do they merely delay their spread? Should we worry more about small crowds indoors or large crowds outdoors? Does closing schools prevent the spread of the virus, or does it do more harm by leaving children with their vulnerable grandparents? How much does wearing masks help? These and many other questions can only be answered with good data on who has been infected, and when.

But in the early months of the pandemic, a huge number of infections were not being recorded in official statistics, owing to a lack of testing. And the tests that were being conducted painted an imperfect picture, focusing as they did on healthcare workers, critically ill patients and – let's face it – the rich and famous. It took months to build up a picture of how many mild or asymptomatic cases there were, and therefore how deadly the virus really was. As the death toll grew exponentially in March, doubling every two days in the UK, there was no time to wait and see. Leaders put their economies into an induced coma: more than 3 million Americans filed for unemployment benefits in a single week at the end of March, five times the previous record. The following week was even worse: more than 6.5 million claims were filed. Were the potential health consequences really catastrophic enough to justify taking away so many people's livelihoods in this way? It looked as though they were – but epidemiologists could only make their best guesses with very limited information.

It is hard to imagine a more extraordinary illustration of how often we take accurate, systematically gathered numbers for granted. Statistics on a wide range of important issues that long predate the coronavirus have been painstakingly accumulated over the years by dedicated statisticians, and are often made available to download, free of charge, anywhere in the world. Yet we are spoiled by such luxury, casually dismissing the numbers as "lies, damned lies and statistics". The case of Covid-19 reminds us how desperate the situation can become when the statistics simply aren't there.

When it comes to interpreting the world around us, we need to realize that our feelings can speak louder than our knowledge. This explains why we buy things we don't need, fall for the wrong romantic partner, or vote for politicians who betray our trust. In particular, it explains why we so often accept statistical claims that even the most cursory questioning would undermine. Sometimes we want to be deceived.

The psychologist Ziva Kunda found this effect in the lab when she showed experimental subjects an article presenting evidence that coffee and other sources of caffeine could increase women's risk of developing breast cysts. Most people found the article quite convincing. Women who drank a lot of coffee did not.

We often look for ways to dismiss evidence we don't like. And the opposite is also true: when the evidence seems to support our preconceptions, we are less likely to scrutinize it for flaws. It is not easy to master our emotions while assessing information that matters to us, not least because our emotions can pull us in different directions.

We don't need to become cold numerical information processors – just noticing our emotions and taking them into account can often be enough to improve our judgment. Rather than demanding superhuman control of our emotions, we simply need to develop good habits. Ask yourself: How does this information make me feel?

Do I feel vindicated or smug? Anxious, angry or afraid? Am I in denial, scrambling for a reason to dismiss the claim?

In the early days of the coronavirus epidemic, helpful-sounding misinformation spread as quickly as the virus itself. One viral post – circulating on Facebook and in email groups – convincingly explained how to distinguish between Covid-19 and a common cold, assured people that the virus was destroyed by hot weather, and incorrectly advised that ice water was to be avoided while hot water would kill any virus. The post, sometimes attributed to "my friend's uncle", sometimes to "Stanford Hospital staff" or to some blameless and uninvolved pediatrician, was occasionally accurate but largely speculative and often misleading. Still, people – normally sensible people – shared it again and again. Why? Because they wanted to help others. They felt confused, the advice seemed useful, and they felt compelled to share. The impulse was only human, and well intentioned – but it was not wise.

Before repeating any statistical claim, I first try to take note of how it makes me feel. It is not a foolproof defense against self-deception, but it is a habit that does little harm and sometimes helps a lot. Our emotions are powerful. We cannot make them disappear, nor should we want to. But we can, and should, try to notice when they cloud our judgment.

In 1997, the economists Linda Babcock and George Loewenstein ran an experiment in which participants were given evidence from a real court case about a motorcycle accident. They were then randomly assigned to play the role of attorney for the plaintiff (arguing that the injured motorcyclist should receive $100,000 in damages) or attorney for the defense (arguing that the case should be dismissed or that the damages should be low).

The experimental subjects were given a financial incentive to argue their case persuasively, and to reach an advantageous settlement with the other side. They were also given a separate financial incentive to accurately guess what damages the judge in the real case had actually awarded. Their predictions should have been unrelated to the role they were playing, yet their judgment was strongly influenced by what they hoped would be true.

Psychologists call this "motivated reasoning": thinking through a topic with the aim, conscious or unconscious, of reaching a particular kind of conclusion. In a football match, we see the fouls committed by the other team but overlook the sins of our own side. We tend to notice what we want to notice. Experts are not immune to motivated reasoning. Under some circumstances their expertise can even become a liability. The French satirist Molière once wrote: "A learned fool is more foolish than an ignorant one." Benjamin Franklin remarked: "So convenient a thing it is to be a reasonable creature, since it enables one to find or make a reason for everything one has a mind to do."

Modern social science agrees with Molière and Franklin: people with deeper expertise are better equipped to spot deception, but if they fall into the trap of motivated reasoning, they are able to muster more reasons to believe whatever they really want to believe.

One recent review of the evidence concluded that this tendency to evaluate evidence and test arguments in a way that is biased towards our own preconceptions is not only common, but even more common among intelligent people. Being smart or educated is no defense. In some circumstances, it can even be a weakness.

One illustration of this is a study published in 2006 by two political scientists, Charles Taber and Milton Lodge. They wanted to examine how Americans reasoned about controversial political issues. The two topics they chose were gun control and affirmative action.

Taber and Lodge asked the participants in their experiment to read a number of arguments on each side, and to evaluate the strengths and weaknesses of each argument. One might hope that being asked to weigh up the pros and cons like this would give people a greater shared appreciation of opposing views; instead, the new information pushed them further apart.

This was because people treated the information they were given as a means of reinforcing their existing beliefs. When invited to seek out more information, they looked for data that supported their preconceptions. When asked to assess the strength of an opposing argument, they spent considerable time thinking up ways to knock it down.

This isn't the only study to come to this kind of conclusion, but what's particularly intriguing about Taber and Lodge's experiment is that expertise made matters worse. More sophisticated participants in the experiment found more material to support their preconceptions. Even more surprisingly, they found less material that contradicted them – as if they used their knowledge to actively avoid uncomfortable information. They produced more arguments in favor of their own views, and pointed to more flaws in the other side's arguments. They were significantly better equipped to reach the conclusion they had wanted to reach all along.

Of all the emotional responses we might have, the most politically relevant are those driven by partisanship. People with a strong political affiliation want to be on the right side of things. We see a claim, and our response is immediately shaped by whether we believe it is "what people like me think".

Consider this claim about global warming: "Human activity is causing the Earth's climate to warm, posing serious risks to our way of life." Many of us react emotionally to a claim like that; it is not like a statement about the distance to Mars. Believing it or doubting it is part of our identity; it says something about who we are, who our friends are, and the kind of world we want to live in. If I put a claim about global warming in a newspaper headline, or in a graphic to be shared on social media, it will attract attention and engagement not because it is true or false, but because of the way people feel about it.

If you doubt this, consider the findings of a Gallup poll conducted in 2015. It found an enormous gap between how much Democrats and Republicans in the US worried about climate change. What rational reason could there be for such a gap?

Scientific evidence is scientific evidence. Our beliefs about climate change should not lean left or right, yet they do. And the gap gets wider the more educated people are. Among those without a college education, 45% of Democrats and 23% of Republicans were very concerned about climate change. Among those with a college education, the figures were 50% of Democrats and 8% of Republicans. A similar pattern holds if you measure scientific literacy: Republicans and Democrats who are more scientifically literate are even further apart than those who know very little about science.

If emotion played no part, surely more education and more information would help people converge on the truth – or at least on the best available theory? Yet giving people more information seems actively to polarize them on the question of global warming. That fact alone tells us how important our emotions are. People are straining to reach the conclusion that fits with their other beliefs and values – and the more they know, the more ammunition they have with which to reach the conclusion they hope to reach.

In the case of climate change there is an objective truth, even if we are unable to discern it with perfect certainty. But since you are one individual among nearly 8 billion on the planet, the climate consequences of what you happen to believe are irrelevant. With a handful of exceptions – if you are the president of China, say – climate change will run its course regardless of what you say or do. From a self-centered point of view, the practical cost of being wrong is close to zero. The social consequences of your beliefs, however, are real and immediate.

Imagine you own a barley farm in Montana, and hot, dry summers are ruining your crop with increasing frequency. Climate change matters to you. And yet rural Montana is a conservative place, and the words "climate change" are politically charged. Besides, what can you personally do about it?

Here is how one farmer, Erik Somerfeld, walks that line, as described by the journalist Ari LeVaux: "In the field, looking at his damaged crop, Somerfeld was unequivocal about its cause – 'climate change'. But back at the bar with his friends, his language changed. He dropped those taboo words in favor of 'erratic weather' and 'hotter, drier summers' – a conversational tactic not uncommon in farm country these days."

If Somerfeld lived in Portland, Oregon, or Brighton, East Sussex, he wouldn't need to be so circumspect at his local bar – he would probably have friends who take climate change very seriously indeed. But then those same friends would be quick to push out anyone in their social circle who went around claiming that climate change is a Chinese hoax.

So perhaps it is not so surprising, after all, to find educated Americans at diametrically opposed positions on climate change. Hundreds of thousands of years of human evolution have programmed us to care deeply about fitting in with those around us. This helps explain Taber and Lodge's finding that better-informed people are actually more likely to engage in motivated reasoning about politically partisan topics: the more persuasively we can argue for what our friends already believe, the more those friends will respect us.

It is much easier to lead ourselves astray when the practical consequences of being wrong are small or non-existent, while the social consequences of being "wrong" are severe. It is no coincidence that this describes many of the controversies that split along partisan lines.

It is tempting to assume that motivated reasoning is just something that happens to other people. I have political principles; you are politically biased; he is a fringe conspiracy theorist. But it would be wiser to recognize that we all think with our hearts rather than our heads from time to time.

Kris De Meyer, a neuroscientist at King's College London, shows his students a message describing an environmental activist's problem with climate change denialism:

“To sum up the activities of climate deniers, I think we can say that:

(1) Their efforts have been aggressive while ours have been defensive.

(2) The activities of denialists are somewhat orderly – almost as if they have a plan in place.

I think the denialist forces can be characterized as dedicated opportunists. They are quick to act and seem to have no principles when it comes to the information they use to attack the scientific community. There is no doubt, however, that we have been inept at getting our side of the story, however good it may be, across to the media and the public.”

The students – all committed believers in climate change, incensed by the smokescreen thrown up by cynical, anti-scientific deniers – nod in recognition. Then De Meyer reveals the source of the text. It is not a recent email. It is taken, sometimes word for word, from an infamous memo written by a cigarette marketing executive in 1968. The memo complains not about "climate deniers" but about "anti-tobacco forces". Other than that, few changes were needed.

You can use the same language, the same arguments, and perhaps even have the same conviction that you are right, whether you are arguing (correctly) that climate change is real or (incorrectly) that the link between smoking and cancer is not.

(Here is an example of this tendency that, for personal reasons, I cannot help taking to heart. My eco-conscious friends on the left are justifiably critical of ad hominem attacks on scientists. You know the kind of thing: claims that scientists are fabricating data because of their political leanings, or because they are chasing government grants – in short, smearing the person rather than engaging with the evidence.

Yet these same friends have no problem adopting and amplifying the same kind of tactics when they are used to attack my fellow economists: that we make up data to suit our political leanings, or that we are chasing funding from big business. I tried to point out this parallel to one sympathetic person, and got nowhere. She was completely unable to see what I was saying. I would call this a double standard, but that would be unfair – it would suggest it was deliberate. It is not. It is an unconscious bias, easy to spot in others and very hard to see in ourselves.)

Our emotional reaction to a statistical or scientific claim is not a side issue. Our emotions can, and often do, shape our beliefs more than any logic. We are capable of persuading ourselves to believe strange things, and to doubt solid evidence, in the service of our political position, our desire to keep drinking coffee, our unwillingness to face the reality of our HIV diagnosis, or any other cause that provokes an emotional reaction.

But we should not despair. We can learn to control our emotions – that is part of growing up. The first simple step is to notice those emotions. When you see a statistical claim, pay attention to your own reaction. If you feel indignation, triumph, denial, pause for a moment. Then reflect. You don't have to be an emotionless robot, but you can and should think as well as feel.

Most of us do not actively want to delude ourselves, even when doing so might be socially advantageous. We have motives for reaching certain conclusions, but the facts matter too. Plenty of people would like to be movie stars, billionaires or immune to hangovers, but few actually believe they are. Wishful thinking has its limits. The more we get into the habit of counting to three and noticing our knee-jerk reactions, the closer to the truth we are likely to get.

For example, one survey conducted by a team of academics found that most people were perfectly able to distinguish serious journalism from fake news, and agreed that it was important to amplify the truth, not lies. Yet the same people would happily share headlines such as "More than 500 'Caravan Migrants' Arrested With Suicide Vests", because at the moment they clicked "share", they were not stopping to think. They were not asking themselves, "Is this true?", nor "Do I think the truth matters?"

Instead, they were skimming the internet in that state of constant distraction we all know, carried along by their emotions and their partisanship. The good news is that simply pausing for a moment to reflect was all it took to filter out much of the misinformation. It doesn't take much; we can all do it. All we need is to acquire the habit of stopping to think.

Inflammatory memes and dramatic rants invite us to leap to the wrong conclusion without thinking. That is why we need to stay calm. And it is also why so much persuasion is designed to provoke us – our desire, our sympathy or our anger. When was the last time Donald Trump, or for that matter Greenpeace, tweeted something designed to make you pause in calm reflection? Today's persuaders don't want you to stop and think. They want you to hurry up and feel. Don't be rushed.

*Tim Harford is a journalist and writer, author of, among other books, The Undercover Economist (Record).

Translation: Daniel Pavan

Originally published in The Guardian
