The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.
Inauthentic coordinated behavior, in the U.S.? One of modern media's mysteries over the past few years has been: How does the right-wing website The Daily Wire do so incredibly well on Facebook? It's regularly one of the world's top publishers there; it was #11 in September, not far behind The New York Times and The Washington Post in terms of engagements, despite having far fewer employees than those major news organizations and publishing far fewer stories.
And the answer isn’t just “conservative content shares really well on Facebook.” The Daily Wire had 20 percent more Facebook engagements than Breitbart did in September. That’s despite Breitbart publishing more than 11 times as many stories as The Daily Wire did and Breitbart’s web audience being 63 percent larger than The Daily Wire’s. The U.K.’s Daily Mail got 72 percent more engagements than The Daily Wire — but it took publishing 50 times as many stories to do so.
So: How does The Daily Wire do it? This week, Judd Legum's Popular Information provided convincing evidence that it's in part because the site is gaming the system: "Some [and it really is some, not all] of this success is attributable to a clandestine network of 14 large Facebook pages that purport to be independent but exclusively promote content from The Daily Wire in a coordinated fashion. This kind of 'inauthentic coordinated behavior' violates Facebook's rules. Facebook has taken down smaller and less coordinated networks that promoted liberal content. But Facebook told Popular Information that it will continue to allow this network to operate and amplify The Daily Wire's content."
Facebook just got back to me with a statement on the network of pages such as “Conservative News” and “Being Conservetive” that appear to all be different, but in fact spend all day posting the exact same links to Ben Shapiro’s Daily Wire.
Here’s the statement, in full: pic.twitter.com/7Ex7UnqYkT
— Will Oremus (@WillOremus) October 28, 2019
3. Facebook is now claiming that these are "real pages" run by "real people." If these pages are real, how is anything on Facebook considered inauthentic?
It's an OBVIOUS ploy to game the Facebook algorithm and boost Daily Wire content https://t.co/Rdhzbt4V4B
— Judd Legum (@JuddLegum) October 28, 2019
Facebook claims these 12 large Facebook pages are "real pages run by real people" and not centrally-controlled inauthentic behavior by The Daily Wire
They all just posted the same link from the Daily Wire with the same text ALL WITHIN FOUR SECONDS https://t.co/Rdhzbt4V4B pic.twitter.com/j2l8foJkRs
— Judd Legum (@JuddLegum) October 29, 2019
Lots of media orgs cross-post stories to multiple pages, that's fine. But some of the pages posting Daily Wire links weren't labeled as being tied to the site. If they are, hard to see how that's kosher, given that FB has punished similar behavior before. https://t.co/jaN0L5jQdj
— Kevin Roose (@kevinroose) October 28, 2019
2/ Daily Wire isn't the only one. The somewhat less-well-known, but also disproportionately Facebook-successful, Web outlet, Western Journal, used precisely the same strategy to grow on Facebook, as @bankonjustin and I showed in our August investigation. https://t.co/e4NoBHu5NZ
— Nick Confessore (@nickconfessore) October 28, 2019
1. Facebook isn’t requiring Daily Wire to rebrand these pages
2. How do you distinguish between the October 2018 takedown of large liberal pages supposedly engaged in “inauthentic coordinated behavior”
— Judd Legum (@JuddLegum) October 28, 2019
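The evidence Legum highlights (identical links with identical captions, posted across supposedly unrelated pages within seconds of one another) is the kind of pattern that is straightforward to surface from posting data. Here's a minimal, purely illustrative sketch of that idea in Python; the field names, time window, and page threshold are assumptions made up for the example, not how Facebook or Popular Information actually detect coordination.

```python
# Hypothetical sketch: flag groups of distinct pages that post the same
# link with the same caption within a few seconds of each other, the
# pattern shown in Legum's screenshots. All field names and thresholds
# are illustrative assumptions.
from collections import defaultdict
from datetime import datetime


def find_coordinated_posts(posts, window_seconds=10, min_pages=3):
    """posts: iterable of dicts with 'page', 'url', 'text', and 'posted_at' (ISO 8601)."""
    # Group posts that share the exact same link and caption.
    groups = defaultdict(list)
    for p in posts:
        groups[(p["url"], p["text"])].append(p)

    flagged = []
    for (url, _text), same_content in groups.items():
        times = sorted(datetime.fromisoformat(p["posted_at"]) for p in same_content)
        pages = {p["page"] for p in same_content}
        spread = (times[-1] - times[0]).total_seconds()
        # Many distinct pages posting identical content nearly simultaneously
        # is the signature of centrally coordinated promotion.
        if len(pages) >= min_pages and spread <= window_seconds:
            flagged.append({"url": url, "pages": sorted(pages), "spread_seconds": spread})
    return flagged
```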
“A particularly grotesque form of fake cancer treatment has flourished in private groups on Facebook.” Black salve is a paste that eats through skin (maybe do not click through to that Wikipedia page if you don’t want to see gross pictures; it really does eat through skin). The FDA lists it as a fake cancer cure and cautions against its use. But black salve cancer “cure” groups are flourishing on Facebook, BuzzFeed reports, and Facebook says the groups don’t violate its guidelines. It’s unclear why, since the company this year began cracking down on anti-vax groups (and is now actually reminding people to get their flu shots).
When I asked YouTube about black salve videos, they immediately took them down. Amazon even took down a book that promoted black salve. Facebook said the groups – where people encourage each other to burn off their skin to "cure cancer" – aren't against the rules.
— Katie Notopoulos (@katienotopoulos) October 31, 2019
There's been a lot of great reporting about health misinfo in Facebook Groups, like @BrandyZadrozny's reports about drinking bleach as a "cure" for autism. But it's totally unclear how exactly Facebook is making decisions about what's in violation https://t.co/NfKoPhgo6u
— Katie Notopoulos (@katienotopoulos) October 31, 2019
How Lithuania uses software to fight fake news. While Russian disinformation is everywhere (it's now evolving in Africa, for instance), it's particularly rife in the Baltic states of Estonia, Latvia, and Lithuania, The Economist reported this week. The Lithuanian media company Delfi, in partnership with Google (as part of a Digital News Initiative grant), has developed software called Demaskuok ("debunk" in Lithuanian) that sifts "through reams of online verbiage in Lithuanian, Russian and English, scoring items for the likelihood that they are disinformation. Then, by tracking back through the online history of reports that look suspicious, it attempts to pin down a disinformation campaign's point of origin — its patient zero." Here's more on how the software works:
Demaskuok identifies its suspects in many ways. One is to search for wording redolent of themes propagandists commonly exploit. These include poverty, rape, environmental degradation, military shortcomings, war games, societal rifts, viruses and other health scares, political blunders, poor governance, and, ironically, the uncovering of deceit. And because effective disinformation stirs the emotions, the software gauges a text’s ability to do that, too. Items with terms like “current-account deficit” are less likely to be bogus than those that mention children, immigrants, sex, ethnicities, animals, national heroes and injustice. Gossip and scandal are additional tip-offs. Verbiage about sports and the weather is less likely to fire up outrage, so the software scores items about those subjects as less suspicious.
Another clue is that disinformation is crafted to be shared. Demaskuok therefore measures “virality”— the number of times readers share or write about an item. The reputations of websites that host an item or provide a link to it provide additional information. The software even considers the timing of a story’s appearance. Fake news is disproportionately posted on Friday evenings when many people, debunkers included, are out for drinks.
Disinformers can be careless, too. Demaskuok therefore remembers the names of people quoted in fake news, as they sometimes crop up again. It also runs image searches to find other places a picture has been posted. Some, it turns out, first appeared before the events they supposedly document. Others also appear on websites with a reputation for disinformation, such as RT and Sputnik — both news outlets backed by Russia's government.
Users of Demaskuok include Lithuanian governmental departments, news outlets, universities, and other organizations, as well as "more than 4,000 volunteers known as 'elves'" who select items for fact-checking.
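For a sense of how this kind of scoring can work in practice, here is a small, purely illustrative sketch in Python that combines the signals The Economist describes (exploitable themes, emotionally charged language, virality, and the hosting site's reputation) into a single suspicion score. The keyword lists, weights, and function signature are assumptions invented for the example; they are not Demaskuok's actual lexicons or model.

```python
# Hypothetical sketch of the scoring approach described above: weight
# theme keywords, emotive terms, virality, and site reputation into one
# rough suspicion score. Lists, weights, and thresholds are illustrative.

# Themes propagandists commonly exploit, per the article.
THEME_TERMS = {"poverty", "rape", "war games", "virus", "immigrants", "injustice"}
# Emotionally charged subjects score higher; dry economic or weather talk lower.
EMOTIVE_TERMS = {"children", "sex", "animals", "national hero"}
CALMING_TERMS = {"current-account deficit", "weather", "sports"}


def suspicion_score(text, shares, site_reputation):
    """Return a rough 0-1 suspicion score for one item.

    shares: number of times the item was shared (virality proxy).
    site_reputation: 0.0 (trusted site) to 1.0 (known disinformation host).
    """
    lowered = text.lower()
    theme_hits = sum(term in lowered for term in THEME_TERMS)
    emotive_hits = sum(term in lowered for term in EMOTIVE_TERMS)
    calming_hits = sum(term in lowered for term in CALMING_TERMS)

    score = 0.15 * theme_hits + 0.15 * emotive_hits - 0.1 * calming_hits
    score += 0.2 * min(shares / 1000, 1.0)   # cap the virality contribution
    score += 0.3 * site_reputation           # hosted or linked by dubious sites
    return max(0.0, min(score, 1.0))
```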
After two alarming investigations, Cognizant gets out of the content moderation business. Cognizant — which has provided content moderation services for Twitter, Google, and Facebook and was the subject of two Verge investigations by Casey Newton this year over its terrible workplace conditions and lack of support for contract workers moderating horrifying content at two U.S. sites — is getting out of the business by March, Newton reported. The company also had workers in India, Europe, and Latin America, for a total of 15,000 content moderators working at 20 sites around the world. About 6,000 jobs will be cut, the BBC reported.
It will also remove from power some very bad managers who ruthlessly exploited poor people, immigrants, and people of color by micro-managing every aspect of their working lives and left them with PTSD and other long-lasting mental health consequences.
— Casey Newton (@CaseyNewton) October 30, 2019
Accenture, Genpac and others
— Casey Newton (@CaseyNewton) October 30, 2019