The Daily Wire and black salve show that Facebook’s takedown policies have one thing in common: Inconsistency

Plus: Cognizant is exiting the content moderation business, and fake news–debunking Lithuanian “elves.”

The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

Inauthentic coordinated behavior, in the U.S.? One of modern media’s mysteries of the past few years has been: How does the right-wing website The Daily Wire do so incredibly well on Facebook? It’s regularly one of the world’s top publishers there; it ranked #11 in September, not far behind The New York Times and The Washington Post in engagements, despite having far fewer employees than those major news organizations and publishing far fewer stories.

And the answer isn’t just “conservative content shares really well on Facebook.” The Daily Wire had 20 percent more Facebook engagements than Breitbart in September, even though Breitbart published more than 11 times as many stories and has a web audience 63 percent larger than The Daily Wire’s. The U.K.’s Daily Mail got 72 percent more engagements than The Daily Wire, but it took publishing 50 times as many stories to do so. Run those numbers per story and The Daily Wire out-engaged Breitbart by roughly 13 to 1 and the Daily Mail by nearly 30 to 1.

So: How does The Daily Wire do it? This week, Judd Legum’s Popular Information provided convincing evidence that the answer, in part, is that it’s gaming the system: “Some [and it really is some, not all] of this success is attributable to a clandestine network of 14 large Facebook pages that purport to be independent but exclusively promote content from The Daily Wire in a coordinated fashion. This kind of ‘inauthentic coordinated behavior’ violates Facebook’s rules. Facebook has taken down smaller and less coordinated networks that promoted liberal content. But Facebook told Popular Information that it will continue to allow this network to operate and amplify The Daily Wire’s content.”

“A particularly grotesque form of fake cancer treatment has flourished in private groups on Facebook.” Black salve is a paste that eats through skin (maybe do not click through to that Wikipedia page if you don’t want to see gross pictures; it really does eat through skin). The FDA lists it as a fake cancer cure and cautions against its use. But black salve cancer “cure” groups are flourishing on Facebook, BuzzFeed reports, and Facebook says the groups don’t violate its guidelines. It’s unclear why, since the company this year began cracking down on anti-vax groups (and is now actually reminding people to get their flu shots).

How Lithuania uses software to fight fake news. While Russian disinformation is everywhere — it’s now evolving in Africa, for instance — it’s particularly rife in the Baltic states of Estonia, Latvia, and Lithuania, The Economist reported this week. The Lithuanian media company Delfi, in partnership with Google (as part of a Digital News Initiative grant), has developed software called Demaskuok (“debunk” in Lithuanian) that sifts “through reams of online verbiage in Lithuanian, Russian and English, scoring items for the likelihood that they are disinformation. Then, by tracking back through the online history of reports that look suspicious, it attempts to pin down a disinformation campaign’s point of origin — its patient zero.” Here’s more on how the software works (a rough sketch of this kind of scoring follows the excerpt):

Demaskuok identifies its suspects in many ways. One is to search for wording redolent of themes propagandists commonly exploit. These include poverty, rape, environmental degradation, military shortcomings, war games, societal rifts, viruses and other health scares, political blunders, poor governance, and, ironically, the uncovering of deceit. And because effective disinformation stirs the emotions, the software gauges a text’s ability to do that, too. Items with terms like “current-account deficit” are less likely to be bogus than those that mention children, immigrants, sex, ethnicities, animals, national heroes and injustice. Gossip and scandal are additional tip-offs. Verbiage about sports and the weather is less likely to fire up outrage, so the software scores items about those subjects as less suspicious.

Another clue is that disinformation is crafted to be shared. Demaskuok therefore measures “virality” — the number of times readers share or write about an item. The reputations of websites that host an item or provide a link to it provide additional information. The software even considers the timing of a story’s appearance. Fake news is disproportionately posted on Friday evenings, when many people, debunkers included, are out for drinks.

Disinformers can be careless, too. Demaskuok therefore remembers the names of people quoted in fake news, as they sometimes crop up again. It also runs image searches to find other places a picture has been posted. Some, it turns out, first appeared before the events they supposedly document. Others also appear on websites with a reputation for disinformation, such as RT and Sputnik — both news outlets backed by Russia’s government.

Users of Demaskuok include Lithuanian government departments, news outlets, universities, and other organizations, as well as “more than 4,000 volunteers known as ‘elves’” who select items for fact-checking.
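
Neither Delfi nor The Economist has published Demaskuok’s internals, so here is only a minimal sketch, in Python, of how the signals described in the excerpt — theme and emotion keywords, virality, posting time, source reputation, and earliest sighting — might combine into a suspicion score. Every term list, weight, and threshold below is hypothetical, not taken from the real system.

from dataclasses import dataclass
from datetime import datetime

# Illustrative term lists drawn from the themes The Economist mentions.
# The real system's features, weights, and model are not public.
THEME_TERMS = {"poverty", "rape", "war games", "military", "virus", "deceit"}
EMOTIVE_TERMS = {"children", "immigrants", "injustice", "national hero"}
CALM_TERMS = {"current-account deficit", "weather", "sports"}

@dataclass
class Item:
    text: str
    shares: int               # proxy for the "virality" signal
    posted_at: datetime
    source_reputation: float  # 0.0 = known disinformation host, 1.0 = trusted

def suspicion_score(item: Item) -> float:
    """Score an item from 0 (probably benign) to 1 (likely disinformation)."""
    text = item.text.lower()
    score = 0.0
    # Wording redolent of propagandists' favorite themes raises suspicion;
    # dry, unemotional vocabulary lowers it.
    score += 0.15 * sum(term in text for term in THEME_TERMS)
    score += 0.10 * sum(term in text for term in EMOTIVE_TERMS)
    score -= 0.10 * sum(term in text for term in CALM_TERMS)
    # Disinformation is crafted to be shared: more shares, more suspicion.
    score += 0.30 * min(item.shares / 1000, 1.0)
    # Fake news is disproportionately posted on Friday evenings.
    if item.posted_at.weekday() == 4 and item.posted_at.hour >= 18:
        score += 0.10
    # Items hosted or linked by low-reputation sites score higher.
    score += 0.30 * (1.0 - item.source_reputation)
    return max(0.0, min(score, 1.0))

def patient_zero(sightings: list[tuple[str, datetime]]) -> tuple[str, datetime]:
    """Given every (url, timestamp) sighting of a suspicious story, return
    the earliest one -- the campaign's likely point of origin."""
    return min(sightings, key=lambda s: s[1])

A production system would presumably learn such weights from labeled examples rather than hand-tuning them, and would fold in the image-provenance and quoted-name checks the excerpt describes before handing items to the volunteer fact-checkers.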

After two alarming investigations, Cognizant gets out of the content moderation business. Cognizant — which has provided content moderation services for Twitter, Google, and Facebook, and which was the subject of two Verge investigations by Casey Newton this year over terrible workplace conditions and a lack of support for the contract workers moderating horrifying content at two U.S. sites — is exiting the business by March, Newton reported. The company also had moderators in India, Europe, and Latin America, for a total of 15,000 content moderators working at 20 sites around the world. About 6,000 jobs will be cut, the BBC reported.

Illustration from L.M. Glackens’ The Yellow Press (1910) via The Public Domain Review.

Laura Hazard Owen is the editor of Nieman Lab. You can reach her via email (laura@niemanlab.org) or Bluesky DM.
POSTED     Nov. 1, 2019, 10:33 a.m.
PART OF A SERIES     Real News About Fake News