Nieman Foundation at Harvard
Aug. 12, 2024, 9:57 a.m.

Repetition makes climate misinformation feel more true — even for those who back climate science

“As our social media feeds fill up with AI-driven bots, sheer repetition of lies may erode the most essential resource for action on climate change — public support.”

If you consider yourself a climate science supporter, you probably wouldn’t think that mere exposure to a skeptic’s claim could shift your views.

Our new research has produced worrying findings. Climate misinformation may be more effective than we’d like to think because of a phenomenon called the illusory truth effect. In short, we are more likely to believe a lie if we encounter it repeatedly. Worse, the effect works immediately — a lie seems to be more true even after just one repetition.

As our social media feeds fill up with AI-driven bots, sheer repetition of lies may erode the most essential resource for action on climate change — public support. Traditional media has a different problem — in their commitment to presenting both sides, journalists often platform climate skeptics whose untrue claims add to the repetition of misinformation.

There’s no easy answer. But one thing that does work is to come back to the scientific consensus that our activities are the major cause of global warming — and to the overwhelming public support worldwide for stronger action on climate change.

We’ve long known about the illusory truth effect, where sheer repetition makes information sound more true, regardless of whether it’s true or false. The reason this works on us is familiarity — when information becomes familiar, we mentally ascribe a level of truth to it. But does this repetition still shift perceptions of truth when we hold seemingly strong existing beliefs?

To find out, we ran experiments in which 172 people, overwhelmingly endorsers of climate science, viewed claims aligned with solid climate science, climate-skeptic claims, and weather-related claims. Participants saw some claims just once, while others were repeated. We found that a single repetition was enough to make a claim seem more true. This happened for all types of claims, including both climate science and skeptic claims.

What’s more, this happened even to people who regarded themselves as endorsing the scientific consensus on human-caused climate change and who were highly concerned about the issue. The effect held even when these participants later identified the claim as aligned with climate skepticism.

In recent years, many researchers have explored this effect in different areas of knowledge. This evidence base points towards an important finding: Low quality or malicious information can be laundered through repetition and made to seem true and trustworthy.

This poses an interesting challenge to the way the traditional media has long operated. Many journalists pride themselves on their adherence to fair, balanced reporting. The reason for this lies in history — when the mass media first emerged as a major force in the 19th century, highly partisan or sensational “yellow” journalism was common. Balanced journalism emerged as a counter to this, promising to platform several sides of a debate.

But balance can be easily gamed. Giving equal exposure to opposing voices can lead people to think there is less of an expert consensus.

What our research suggests is that comments, articles, and posts containing climate misinformation may be corrosive: the more we’re exposed to them, the more likely we are to come to accept them. You might think intelligence and careful thinking would have a protective effect. But the broader body of research on illusory truth has found that being smarter or more rational is no protection against repetition.

What can we do to protect ourselves? Researchers have found one reliable solution: come back to the scientific consensus. For decades, scientists have researched the question of whether our activities are the main cause of rising global temperatures. Many different lines of evidence, from rates of ice melt to sea temperatures to satellite measurements, have now answered this conclusively. Scientific consensus now stands at 99.9%, a figure which has only grown over time. Drawing on this consensus may protect us from accepting skeptic arguments by reminding us of the very large areas of agreement.

There’s a systemic problem here. Never before in history have we been able to access so much information. But our information environments are not benign. Actors with an agenda are at work in many areas of public life, trying to shape what we do or do not do. We need to learn more about how we can battle the power of lies on repeat.

Yangxueqing Mary Jiang is a Ph.D. student in psychology at the School of Medicine and Psychology at Australian National University. Eryn Newman is an associate professor of psychology at Australian National University. Kate Reynolds is a professor of educational psychology and learning at the University of Melbourne. Norbert Schwarz is a professor of psychology and marketing and co-director of USC’s Dornsife Mind & Society Center. This article is republished from The Conversation under a Creative Commons license.

Photo of increasing Arctic ice melt in 2016 by the U.S. Geological Survey.
