The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.
“Of all the categories of fake news, health news is the worst.” This week, a story popped to the top of my Facebook feed: “Women absorb and retain DNA from every man they have sex with.” It was probably the first time a Snopes-debunked story had appeared that high in my feed, and it got there because it was shared in a local moms’ Facebook group in which I’m an active member, where it generated a whole lot of discussion before two women pointed out that it was junk. It was a jarring example of how fake/misleading/hoax-y science news can spread easily in places where you might not see other types of viral fake news.
“My sense is that of all the categories of fake news, health news is the worst. There’s more bad health news out there than there is in any other category,” Kelly McBride, VP of the Poynter Institute, told The Atlantic’s Julie Beck. “Reliable sources on other topics are [sometimes] really bad on health care news.” And depending on what you like to do on Facebook, you might be a lot more likely to see fake news about health than fake news on other topics.
“Quality and popularity of information are weakly correlated.” Low-quality information is just about as likely to go viral as high-quality information, researchers found in a paper published in Nature Human Behaviour (paywall) this week. They built a simplified model of a social feed and analyzed real data from Tumblr and Twitter. Rachel Becker writes in The Verge:
By tracking 100,000 different posts across 20 different simulations, the researchers learned that generally, higher-quality ideas — more beautiful photos, or truer statements — are better at spreading through the network. But if the social network is constantly deluged by new posts and the users don’t have infinite attention spans (which, we don’t), the group loses its ability to discriminate between good and bad ideas. Basically, for high-quality posts to win the sharing war on social media, the volume of new information flowing into that network has to be pretty low, and the users’ attention spans have to be pretty high.
Of course, the volume of new information flowing into social networks is extremely high, and people’s attention spans waver quickly.
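To see how limited attention can blur the line between good and bad posts, here’s a minimal toy sketch in Python. It is not the researchers’ actual model; the `attention` and `load` parameters and the share rule are illustrative assumptions. But the qualitative effect the paper describes shows up: when the feed is flooded and attention is short, the correlation between a post’s quality and its popularity drops.

```python
# Toy illustration only: a bare-bones feed where agents with limited attention
# reshare the best post they happen to see. Not the Nature Human Behaviour
# paper's model; `attention` (how many recent posts an agent sees) and `load`
# (how many new posts arrive per step) are made-up parameters for this sketch.
import random


def pearson(xs, ys):
    """Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5


def simulate(steps=2000, attention=5, load=10, seed=0):
    rng = random.Random(seed)
    quality = {}   # post id -> intrinsic quality in [0, 1]
    shares = {}    # post id -> number of reshares
    feed = []      # global feed, newest entries last (reshares re-appear)
    next_id = 0

    for _ in range(steps):
        # `load` new posts flood in each step
        for _ in range(load):
            quality[next_id] = rng.random()
            shares[next_id] = 0
            feed.append(next_id)
            next_id += 1

        # an agent only sees the last `attention` posts and reshares
        # the best one in that narrow window
        window = feed[-attention:]
        best = max(window, key=lambda pid: quality[pid])
        shares[best] += 1
        feed.append(best)

    ids = list(quality)
    return pearson([quality[i] for i in ids], [shares[i] for i in ids])


if __name__ == "__main__":
    print("high attention, low load:", round(simulate(attention=50, load=2), 3))
    print("low attention, high load:", round(simulate(attention=3, load=50), 3))
```

Running it typically prints a noticeably higher quality–popularity correlation for the high-attention, low-load setting than for the low-attention, high-load one, which is the basic dynamic the study points to.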
“Fake news is a problem, but the bigger problem is the overabundance of information in people’s lives,” Filippo Menczer, one of the study’s authors and a professor of informatics and computer science at Indiana University, told BuzzFeed News’s Tom Chivers. As for all of the talk about teaching people to spot fake news with various apps and Facebook initiatives: “People are human. You can’t expect them to treat information as a scholarly thing,” Menczer said. “It’s not supposed to be homework. It’s supposed to be part of their daily lives.”
The paper’s authors also think Twitter needs to find a way of dealing with bots. “We’re finding very strong quantitative evidence that social bots are very effective at getting fake news to go viral,” Menczer told NOVA Next.
“Feeling like you can connect different pieces and tell a story.” Vox editor-in-chief Ezra Klein interviewed researcher danah boyd this week about a variety of fake news–adjacent issues. The full episode is here; the media/news conversation starts at around 50 minutes in. One bit from boyd:
The public is very confused about how they’re consuming and making sense of information. They don’t trust many players. They’ve been socialized into an idea that they should go and research it themselves, in ways that actually can be deeply costly. We see the rise of conspiracy theories when we see a moment of feeling like you can connect different pieces and tell a story, and that same practice that allows us to be researchers or investigative journalists is also the manufacturing of conspiracy. In order to actually go back and challenge this, it’s not just going out and saying “Don’t propagate this conspiracy or think about it this way.” It’s, whoa, how did we get here?
“Killing 3 GOP senators prevents ten 9/11s.” Speaking of Vox, last week two Twitter users (identified in the Slate piece as Harris and Cohen) masqueraded as Vox staffers and tweeted a very Vox-looking infographic, “entitled ‘AHCA Quick Facts,’ [that] came to a rather un-Vox-y conclusion: ‘Killing 3 GOP senators prevents ten 9/11s.'” Ian Prasad Philbrick writes at Slate:
Harris and Cohen’s behavior chimed with another faux-news tradition, one that doesn’t just aim to churn out bogus information but, in adopting reputable-looking URLs like BostonTribune.com and ABCNews.com.co or impersonating journalists employed by otherwise-credible outlets, seeks to sow doubt and confusion among readers toward news organizations dedicated to disseminating accurate information as well. Harris and Cohen’s fake credentials undermine journalistic credibility, junk Vox’s reputation, and embolden broad-brush cries of “fake news” aimed at the mainstream media.
But wait, there’s more…