Many Americans share fake news on social media because they’re simply not paying attention to whether the content is accurate — not necessarily because they can’t tell real from made-up news, a new study in Nature suggests.
Lack of attention was the driving factor behind 51.2% of misinformation sharing among social media users who participated in an experiment conducted by researchers from MIT, the University of Regina in Canada, the University of Exeter Business School in the United Kingdom and the Center for Research and Teaching in Economics in Mexico. The results of a second, related experiment indicate that a simple intervention, prompting social media users to think about news accuracy before posting and interacting with content, might help limit the spread of online misinformation.
“It seems that the social media context may distract people from accuracy,” study coauthor Gordon Pennycook, an assistant professor of behavioral science at the University of Regina, said. “People are often capable of distinguishing between true and false news content, but fail to even consider whether content is accurate before they share it on social media.”
Pennycook and his colleagues conducted seven behavioral science and survey experiments as part of their study, “Shifting attention to accuracy can reduce misinformation online,” published last week. Some experiments focused on Facebook and others focused on Twitter.
The researchers recruited participants for most of the experiments through Amazon's Mechanical Turk, an online crowdsourcing marketplace that many academics use. For one experiment, they selected Twitter users who previously had shared links to two well-known, right-leaning websites that professional fact-checkers consistently rate as untrustworthy: Breitbart.com and Infowars.com. Sample sizes ranged from 401 U.S. adults for the smallest experiment to 5,379 for the largest.
For several experiments, researchers asked participants to review the basic elements of news stories — headlines, the first sentences and accompanying images. Half the stories represented actual news coverage while the other half contained fabricated information. Half the content was favorable to Republicans and half was favorable to Democrats. Participants were randomly assigned to either judge the accuracy of headlines or determine whether they would share them online.
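To make that design concrete, here is a minimal Python sketch of the random assignment and balanced headline pool the paragraph describes. The headline labels, pool size and `run_session` helper are invented for illustration; this is not the researchers' actual materials or code.

```python
import random

# Illustrative headline pool: half true and half false, and within each,
# half favorable to Republicans and half to Democrats, mirroring the
# balance described in the study design. Labels are placeholders.
headlines = (
    [{"text": f"true-pro-R-{i}", "is_true": True, "leans": "R"} for i in range(6)]
    + [{"text": f"true-pro-D-{i}", "is_true": True, "leans": "D"} for i in range(6)]
    + [{"text": f"false-pro-R-{i}", "is_true": False, "leans": "R"} for i in range(6)]
    + [{"text": f"false-pro-D-{i}", "is_true": False, "leans": "D"} for i in range(6)]
)

def run_session(participant_id: int) -> list[dict]:
    """Randomly assign one participant to a condition and log one trial per headline."""
    condition = random.choice(["judge_accuracy", "decide_to_share"])
    trials = random.sample(headlines, len(headlines))  # shuffled copy of the pool
    responses = []
    for h in trials:
        # In the real experiment the participant answers the prompt; here we only
        # record which question was asked alongside the headline's ground truth.
        question = ("Is this headline accurate?" if condition == "judge_accuracy"
                    else "Would you share this online?")
        responses.append({"participant": participant_id, "condition": condition,
                          "headline": h["text"], "is_true": h["is_true"],
                          "leans": h["leans"], "question": question})
    return responses
```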
For the final experiment, researchers sent private messages to 5,379 Twitter users who previously had shared content from Breitbart and Infowars. The messages asked those individuals to rate the veracity of one news headline about a topic unrelated to politics. Researchers then monitored the content those participants shared over the next 24 hours.
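The outcome the researchers tracked in that 24-hour window was the quality of what participants went on to share, judged by fact-checker ratings of the linked news domains. A rough sketch of that kind of measurement might look like the code below; the numeric trust scores are invented placeholders, since the article states only that fact-checkers consistently rate Breitbart and Infowars as untrustworthy.

```python
from statistics import mean
from urllib.parse import urlparse

# Placeholder trust ratings (0 = untrustworthy, 1 = trustworthy). The exact
# values here are invented for illustration, not taken from the study.
TRUST_RATINGS = {"breitbart.com": 0.1, "infowars.com": 0.1, "apnews.com": 0.9}

def domain(url: str) -> str:
    """Normalize a shared link to its bare domain."""
    return urlparse(url).netloc.removeprefix("www.")

def sharing_quality(shared_urls: list[str]) -> float | None:
    """Mean trust score of the rated news domains among a user's shared links."""
    scores = [TRUST_RATINGS[d] for d in map(domain, shared_urls) if d in TRUST_RATINGS]
    return mean(scores) if scores else None

# Compare sharing quality before and after the private message:
before = sharing_quality(["https://www.infowars.com/story-a"])
after = sharing_quality(["https://apnews.com/story-b"])
```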
The experiments reveal a host of insights into why people share misinformation on social media.
Pennycook and his colleagues note that the Twitter intervention — sending private messages — seemed particularly effective among people with a larger number of Twitter followers. Pennycook told me that’s likely because Twitter accounts with more followers are more influential within their networks.
“The downstream effect of improving the quality of news sharing increases with the influence of the user who is making better choices,” he explained. “It may be that the effect is as effective (if not more so) for users with more followers because the importance of ‘I better make sure this is true’ is literally greater for those with more followers.”
Pennycook said social media platforms could encourage the sharing of higher-quality content, and reorient users toward truth, by nudging them to pay more attention to accuracy.
Platforms, the authors point out, "could periodically ask users to rate the accuracy of randomly selected headlines, thus reminding them about accuracy in a subtle way that should avoid reactance (and simultaneously generating useful crowd ratings that can help identify misinformation)."
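As a sketch of how such a nudge-plus-crowdsourcing loop could work in code, consider the following; the prompt probability, thresholds and function names are all assumptions for illustration, not anything the authors or the platforms specify.

```python
import random
from collections import defaultdict
from statistics import mean

# Accumulated crowd ratings per headline: 1 = rated accurate, 0 = rated inaccurate.
crowd_ratings: dict[str, list[int]] = defaultdict(list)

PROMPT_PROBABILITY = 0.05  # assumed rate: nudge on roughly 5% of feed loads

def maybe_prompt(headline_pool: list[str]) -> str | None:
    """Occasionally surface a randomly selected headline for an accuracy rating."""
    if random.random() < PROMPT_PROBABILITY:
        return random.choice(headline_pool)
    return None

def record_rating(headline: str, rated_accurate: bool) -> None:
    """Store one user's accuracy judgment for a headline."""
    crowd_ratings[headline].append(int(rated_accurate))

def flag_for_review(min_ratings: int = 50, threshold: float = 0.3) -> list[str]:
    """Headlines whose average crowd rating is low enough to merit fact-checker review."""
    return [h for h, ratings in crowd_ratings.items()
            if len(ratings) >= min_ratings and mean(ratings) < threshold]
```

The same prompt serves both purposes the authors name: it reminds the individual user about accuracy, and the stored ratings accumulate into a crowd signal for identifying misinformation.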
The researchers received funding for their study from the Ethics and Governance of Artificial Intelligence Initiative of the Miami Foundation, the William and Flora Hewlett Foundation, the Omidyar Network, the John Templeton Foundation, the Canadian Institutes of Health Research, and the Social Sciences and Humanities Research Council of Canada.
Denise-Marie Ordway is managing editor of Journalist’s Resource. This article first appeared on Journalist’s Resource and is republished here under a Creative Commons license.