Sept. 1, 2022, 9 a.m.

Vaccinating people against fake news

Researchers are trying to boost people’s immunity to fake news using online games and other strategies. Can these efforts protect the wider population against disinformation?

My first move in the online game Harmony Square is to transform myself into a fake-news mastermind. “We hired you to sow discord and chaos,” my fictional boss informs me in a text box that pops up on a stark blue background. “We’ve been looking for a cartoonishly evil information operative. You seemed like a good fit.”

Through a series of text-box prompts, the game goads me to inflame my pretend social media audience as much as possible. I stoke an online firestorm with a ginned-up takedown article about a fictitious local politician: “PLOOG LIED ABOUT PAST—SUPPORTED ANIMAL ABUSE IN COLLEGE!” At management’s behest, I unleash an army of bots to comment approvingly on my story, driving more traffic to it. As I escalate my crusade against Ploog, the game cheers me on.

Harmony Square is one of several games University of Cambridge researchers have developed to bolster people’s resistance to disinformation. “What we thought would be interesting was having people make their own fake news in a safe environment,” says Cambridge psychologist Jon Roozenbeek, a lead researcher on the games project with fellow psychologist Sander van der Linden. “The goal is to prevent unwanted persuasion.”

These games rest on a single, overarching premise: You can inoculate people against fake news by exposing them to small amounts of such content — much as low doses of live virus can vaccinate people against a disease — if you catch them before they are fully infected by conspiratorial belief. So far, games like Harmony Square are among the best-developed vehicles for disinformation inoculation. Researchers are also proposing and testing other, related strategies, including inoculating students in classroom settings, having people cook up their own conspiracy theories, and creating online classes that teach how to identify common fake-news tactics.

Reaching enough people to achieve something akin to herd immunity against disinformation is a significant challenge, however. In addition to bolstering people’s BS detection skills, a broad immunity-building campaign would need to neutralize fake news’s strong emotional pull. “Even as this approach of science and inoculation takes off, the problem has to be solved at the cultural level,” says Subramaniam Vincent, director of journalism and media ethics at Santa Clara University’s Markkula Center. “So many efforts have to come together.”

The idea of mentally vaccinating people against fake news goes back to the 1960s, when psychologist William McGuire proposed making people resistant to propaganda using a strategy he called a “vaccine for brainwash.” Much as weakened viruses can teach the immune system to recognize and fight off disease, alerting people to false arguments — and refuting those arguments — might keep them from succumbing to deception, McGuire reasoned.

Take, for example, the public-health recommendation that everyone visit a doctor every year. In an experiment, McGuire gave people counterarguments against going to the doctor annually (say, that regular visits promote health anxiety and actually lead people to avoid the doctor). Then he poked holes in those counterarguments (in reality, regular doctor visits reduce undue health anxiety). In McGuire’s studies, people became better at resisting false arguments after their beliefs were challenged.

The inoculation messages warned people of impending attempts to persuade them, causing them to recognize that they might be vulnerable. The brain is wired to mount a defense against apparent threats, even cognitive ones; when challenged, people therefore seek fresh ways to protect their beliefs, much as they’d fight back if someone attacked them in a bar. Threat is a critical component of inoculation, says Josh Compton, a Dartmouth speech professor who specializes in inoculation theory. “Once we experience threat, we are motivated to think up counterarguments folks might raise and how we’ll respond,” he says.

In the 1980s and 90s, experts put inoculation theory into practice with fairly limited goals, like preventing teenage smoking, and limited but promising outcomes. It wasn’t until the mid-2010s, as fake news gained traction online, that Cambridge’s Van der Linden was inspired to take the inoculation concept to a higher level. Like McGuire, he was convinced that “prebunking,” or sensitizing people to falsehoods before they encountered them, was better than debunking fake stories after the fact. Multiple studies show that once someone has internalized a nugget of false information, it’s very hard to get that person to disavow it, even if the original creator posts a correction.

Van der Linden found that focusing on a single issue, as McGuire had done, has its limits. Warning people about lies on a particular subject like smoking may help them fend off falsehoods about that one topic, but it doesn’t help them resist fake news more broadly. So Van der Linden started focusing on building people’s general immunity by cluing them in to the persuasion techniques in every fake-news creator’s toolbox.

In a series of mostly online studies, Van der Linden gave people general warnings about bad actors’ methods. For instance, he told them that politically motivated groups were using misleading tactics, like circulating a petition signed by fake scientists, to convince the public that there was lots of scientific disagreement about climate change. When Van der Linden revealed such fake-news tactics, people readily understood the threat and, as a result, got better at sniffing out and resisting disinformation.

The idea of turning fake-news inoculation into something fun was conceived in 2016 at a bar in the Netherlands. Over beers with friends, Roozenbeek batted around the possibility of using a game to combat false information online. He created a prototype, which he called Bad News. As he researched the idea further, Roozenbeek came across Van der Linden’s studies, and the two agreed to work together on more advanced online inoculation games. Their collaboration expanded Bad News and later produced Harmony Square, which is now freely available at harmonysquare.game.

In tongue-in-cheek fashion, the games introduce players to a host of common fake-news tactics. As I type a fake headline about a local politician in Harmony Square, my boss stresses the importance of stoking people’s fear with inflammatory language. “You missed some. Do better,” she scolds when I don’t include enough incendiary words like “corrupt” or “lie” in my headline. “Remember: Use words that upset people.” The game also goads me to create a website that claims to be a legitimate news outlet, sucking people in by projecting the appearance of credibility.

The argument against these dishonest tactics is embedded in the game play. The more disinformation you spread, the more unrest you sow in the fictional town of Harmony Square. By the end of the game, normally placid townspeople are screaming at one another. As I play, I get caught up in the narrative of how fake-news tactics undermine community from within.

To evaluate whether the games are truly effective, Roozenbeek and Van der Linden surveyed about 14,000 people before and after they played Bad News. After playing the game to the end, people were better overall at spotting falsehoods, rating the reliability of fake tweets and news reports about 20 percent lower than they had before. The effects lasted for more than two months. These results are in line with those of other anti-disinformation tactics such as correcting or fact-checking suspect content, according to a meta-analysis of such interventions by researchers from the University of Southern California.

Social scientists see promise in the Cambridge team’s efforts to inoculate people against fake news. “Walking in the perpetrators’ shoes, so to speak, can be very effective for understanding how disinformation can be produced and some reasons why,” says Robert Futrell, a sociologist and extremism researcher at the University of Nevada, Las Vegas, though he notes he has not reviewed specific data from Bad News or Harmony Square.

Even if they work well, games alone will not be enough to inoculate whole populations against online disinformation. Several million people have played the Cambridge team’s offerings so far, according to Roozenbeek, but that is a tiny fraction of the global population. Daniel Jolley, a psychologist at the University of Nottingham, notes that large-scale inoculation will have to be implemented in a wide range of settings, from classrooms to community centers. Ideally, such programs should reach students during their school years, before they have been extensively exposed to fake news, Stanford education professor Sam Wineburg has argued.

Finland is the first country to try inoculating people against fake news on a national scale. As Russian fake news began making its way across the border into Finland in 2014, the Finnish government developed a digital literacy course for state-run elementary and high schools. The curriculum, still in use, asks students to become disinformation czars, writing their own off-the-wall fake-news stories. As they learn how fake news is produced, students also learn to recognize and be skeptical of similar content in the real world.

Elsewhere, researchers and organizations are experimenting with inoculation efforts on a smaller scale. In Australia, communications professor John Cook designed an online course in 2015 to teach people how to detect common disinformation tactics used by climate deniers. So far, more than 40,000 people have enrolled in Cook’s course.

In the United States, nonprofits like the News Literacy Project teach middle and high school students how to distinguish between fact and fiction in the media. NLP has developed a series of 14 interactive lessons, some of which walk students through fake-news creation and give examples of bogus stories likely to spread like wildfire (“Fireman Suspended & Jailed by Atheist Mayor for Praying at Scene of Fire”). More than 40,000 U.S. educators have signed up to work with NLP so far. (The Science Literacy Foundation, which supports OpenMind, where this piece originally ran, is also a financial supporter of the News Literacy Project.)

Adding to the challenge of fake-news inoculation, a serious literacy campaign must do more than train people to ferret out falsehoods. It must also counter the emotional pull of those falsehoods. People tend to wade into conspiracies and false narratives when they feel scared and vulnerable, according to Jolley. When their brains flood with stress hormones, their working memory capacity takes a hit, which can affect their critical thinking. “You’ve got the skills” to mentally counter conspiracy theories, Jolley says, “but you may not be able to use them.” Research shows that people who feel socially isolated are also more likely to believe in conspiracies.

By contrast, the more fulfilled and capable people feel, the less vulnerable they are to disinformation. Jolley suggests that community-building ventures in which people feel part of a larger whole, like mentoring programs or clubs, could help individuals grow psychologically secure enough to resist the pull of a conspiracy theory. Making it easier to access mental health services, he adds, might also support people’s well-being in ways that improve their immunity to common fake-news tactics.

As the disinformation-vaccine movement grows, one crucial unknown is just how much inoculation is enough. “What’s the equivalent of herd immunity for human society?” Vincent asks. “Do we have to have inoculation for, let’s say, 80% of a country in order for the spread of misinformation to be mitigated?” Calculating that percentage, he notes, is a complex undertaking that would have to account for different ways of reaching people online and the multiple strategies used to counter fake news.

Given how challenging it will be to defang disinformation, it seems fitting that the Cambridge team’s Harmony Square game builds to an open-ended finish. When I complete the game’s final chapter, everyone in town is still fighting over the content my fake news empire churns out, and it’s unclear whether the destruction I’ve caused can be reversed. Surveying the damage, my boss applauds me. “They’re all at each other’s throats now.”

Elizabeth Svoboda is a science writer in San Jose, Calif., and the author of What Makes a Hero?: The Surprising Science of Selflessness. This story originally appeared on OpenMind, a digital magazine tackling science controversies and deceptions, which Nieman Lab covered here.
