The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.
“Relatively short, scalable interventions could be effective in fighting misinformation around the world.” In 2017, Facebook released a set of “Tips to spot false news.” Developed in collaboration with First Draft, the tips were “promoted at the top of users’ news feeds in 14 countries in April 2017 and printed in full-page newspaper advertisements in the United States, the United Kingdom, France, Germany, Mexico, and India,” write the authors of a study published this week in PNAS. “A variant of these tips was later distributed by WhatsApp (a Facebook subsidiary) in advertisements published in Indian and Pakistani newspapers in 2018. These tips are therefore almost surely the most widely disseminated digital media literacy intervention conducted to date.”
We ran longitudinal experiments in both the U.S. and India modeled off of @Facebook's "Tips to Spot False News" shown to millions of people around the world in 2017.
Examples:
– "Be skeptical of headlines."
– "Watch for unusual formatting."
– "Look at other reports."— Andy Guess (@andyguess) June 22, 2020
The researchers tested the effectiveness of these tips on audiences in the U.S. and India — and found that they worked.
Strikingly, our results indicate that exposure to variants of the Facebook media literacy intervention reduces people’s belief in false headlines. These effects are not only an artifact of greater skepticism toward all information — although the perceived accuracy of mainstream news headlines slightly decreased, exposure to the intervention widened the gap in perceived accuracy between mainstream and false news headlines overall. In the United States, the effects of the treatment were particularly strong and remained statistically measurable after a delay of approximately 3 weeks. These findings suggest that efforts to promote digital media literacy can improve people’s ability to distinguish between false and mainstream news content, a result with important implications for both scientific research into why people believe misinformation online and policies designed to address the problem.
In the U.S.:
– On average, people think mainstream news is more accurate than false news;
– The tips intervention reduced perceived accuracy of mainstream news somewhat, but of false news even more;
– This effect persists up to several weeks later!
— Andy Guess (@andyguess) June 22, 2020
In India:
– We ran online surveys with similar findings, but effect doesn't seem to persist;
– We also ran face-to-face surveys with a different population (rural, much lower WhatsApp usage), and no measurable effect of news tips.
— Andy Guess (@andyguess) June 22, 2020
“A brief intervention which could be inexpensively disseminated at scale can be effective at reducing the perceived accuracy of false news stories,” the authors conclude, “helping users more accurately gauge the credibility of news content they encounter on different topics or issues.”
Consumer Reports’ Kaveh Waddell (he’s an investigative reporter at the Consumer Reports Digital Lab, which launched last year and which I’m looking forward to reading more from) points out that Facebook itself could surely shed further light on these research findings: “The company should know how many people clicked on the media literacy list, how long they spent on that page, whether they later changed their reading or sharing habits, and how long any effects lasted.” But it’s not sharing. “These scholars did an amazing job of looking at the scale of the intervention with the tools they had available, but I’m just so disappointed that there isn’t a way for an independent audit of what happened on the platform,” First Draft’s Claire Wardle told Waddell.
On the topic of brief interventions, Facebook is taking a cue from The Guardian and will show a warning if users try to share a story that’s more than 90 days old. (If they still want to share it after that, they can.) Other types of notifications may be coming, too. From Facebook’s John Hegeman, VP of feed and stories:
Over the past several months, our internal research found that the timeliness of an article is an important piece of context that helps people decide what to read, trust and share. News publishers in particular have expressed concerns about older stories being shared on social media as current news, which can misconstrue the state of current events. Some news publishers have already taken steps to address this on their own websites by prominently labeling older articles to prevent outdated news from being used in misleading ways.
Over the next few months, we will also test other uses of notification screens. For posts with links mentioning COVID-19, we are exploring using a similar notification screen that provides information about the source of the link and directs people to the COVID-19 Information Center for authoritative health information. Through providing more context, our goal is to make it easier for people to identify content that’s timely, reliable and most valuable to them.
(OK, now do it for Trump’s posts.)
“A symptom of an information ecosystem poisoned by racial inequality.” The Shorenstein Center has a report on COVID-19 misinformation in Black online communities in the United States — a crucial topic, since Black people are disproportionately affected by the coronavirus, dying of it at a higher rate than White people. Brandi Collins-Dexter identified four main strands of misinformation circulating — some organic, some “targeted directly at the community by outsiders.”
1. Black people could not die from COVID-19
2. The virus was man-made for the purposes of population control
3. The virus could be contained through use of herbal remedies
4. 5G radiation was the root cause of COVID-19
“Our research makes clear that the health misinformation surrounding COVID-19 poses an immediate threat to the health of Black people, and is a symptom of an information ecosystem poisoned by racial inequality,” Collins-Dexter writes.
While there is much to be learned about COVID-19 and how it works, it is clear that misinformation and conspiratorial frames that suggest that Black people are somehow inoculated from the disease are both dangerous and patently untrue. Black lives are consistently put in danger, and it is incumbent upon community actors, media, government, and tech companies alike to do their part to ensure that timely, local, relevant, and redundant public health messages are served to all communities.
“Consuming far-right media and social media content was strongly associated with low concern about the virus at the onset of the pandemic.” The Washington Post’s Christopher Ingraham has a very useful, detailed roundup of three recent studies focused on “conservative media’s role in fostering confusion about the seriousness of the coronavirus. Taken together, they paint a picture of a media ecosystem that amplifies misinformation, entertains conspiracy theories and discourages audiences from taking concrete steps to protect themselves and others.”
Tucker Carlson viewers began social distancing about *one week earlier* than Sean Hannity viewers. That had huge effects: "A one SD increase in relative viewership of Hannity relative to Tucker Carlson Tonight is associated with approximately 32% more COVID-19 cases on March 14."
— Christopher Ingraham (@_cingraham) June 25, 2020