Nieman Foundation at Harvard
Feb. 16, 2018, 7 a.m.
Audience & Social

Should we consider fake news another form of (not particularly effective) political persuasion — or something more dangerous?

Plus: The lines between “fake news” and psyops, the Russians shared real news too, and “reality apathy.”

The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

“Most forms of political persuasion seem to have little effect at all.” Dartmouth’s Brendan Nyhan writes in The New York Times that it isn’t that easy to change people’s votes in an election, in an Upshot post titled “Fake news and bots may be worrisome, but their political power is overblown.” When we’re trying to evaluate “claims about vast persuasion effects from dubious online content,” Nyhan writes, we should actually be looking at three things: 1) How many people actually saw the material; 2) Whether the people exposed are persuadable/swing voters; and 3) What share of all the news they viewed was bogus.

But: What if “persuadability” isn’t the right metric to look at? That’s the argument from information warfare expert Molly McKew, who specializes in U.S.–Russia relations. Read her whole thread in response to Nyhan’s piece, but here are some excerpts:

“There aren’t good tools to evaluate the impact of shadow campaigns,” she writes.

Nyhan and McKew then continued the discussion on Twitter.

If you’re interested in more by McKew, here’s a recent piece she wrote for Politico, “How Twitter bots and Trump fans made #releasethememo go viral,” on how misinformation leaks into the public consciousness. “Information and psychological operations being conducted on social media — often mischaracterized by the dismissive label ‘fake news’ — are not just about information, but about changing behavior,” she writes. “And they can be surprisingly effective.”

Russia’s disinformation campaign relied on mainstream media sources. This fits in well with the above: New research from Jonathan Albright, the Tow Center for Digital Journalism’s research director, written up by The Washington Post’s Craig Timberg, shows that during the 2016 presidential election, of more than 36,000 tweets sent by Russian accounts, “obscure or foreign news sources played a comparatively minor role, suggesting that the discussion of ‘fake news’ during the campaign has been somewhat miscast.” Instead, the Russian accounts curated mainstream news to achieve their ends:

Some well-chronicled hoaxes reached large audiences. But Russian-controlled Twitter accounts, Albright said, were far more likely to share stories produced by widely read sources of American news and political commentary. The stories themselves were generally factually accurate, but the Russian accounts carefully curated the overall flow to highlight themes and developments that bolstered Republican Donald Trump and undermined his Democratic rival Hillary Clinton.

In a Medium post, Albright documents the sources that the troll tweets linked to.

Sure, Breitbart ranks first, but it’s followed by a long list of what many would argue are credible — if not mainstream — news organizations, as well as a surprising number of local and regional news outlets.

Another result from this analysis is the effect of “regional” troll accounts, aka the fake accounts with a city or region name in the handle (e.g., HoustonTopNews, DailySanFran, OnlineCleveland), which showed a pattern of systematically re-broadcasting local news outlets’ stories…

Trolls are using real news — and in particular local news — to drive reactionary news coverage, set the daily news agenda, and target local journalists and community influencers to follow certain stories.

In separate tweets, Albright noted, “Twitter isn’t the platform where you reach ‘Americans.’ Twitter is the place where you mislead the 1-2% of susceptible journalists, policymakers, techies, and opinion leaders.”

I DM’d Albright to ask him a little more about this. “Twitter has little effect on regular citizens outside of helping to set the daily news agenda. I have most of the data needed to prove this, just need to get to formally presenting it and writing it up,” he told me. “Facebook and especially Instagram, and of course YouTube, reach more people more often, and almost certainly have a greater impact on their political and daily understanding of the world.”

Reality apathy. In BuzzFeed, Charlie Warzel profiles Aviv Ovadya, chief technologist at the Center for Social Media Responsibility at the University of Michigan and a Knight News innovation fellow at the Tow Center for Digital Journalism at Columbia, who describes a terrifying possible future of disinformation that includes, for instance,

“polity simulation,” [a] dystopian combination of political botnets and astroturfing, where political movements are manipulated by fake grassroots campaigns. In Ovadya’s envisioning, increasingly believable AI-powered bots will be able to effectively compete with real humans for legislator and regulator attention because it will be too difficult to tell the difference. Building upon previous iterations, where public discourse is manipulated, it may soon be possible to directly jam congressional switchboards with heartfelt, believable algorithmically-generated pleas. Similarly, Senators’ inboxes could be flooded with messages from constituents that were cobbled together by machine-learning programs working off stitched-together content culled from text, audio, and social media profiles….

[It all] can lead to something Ovadya calls “reality apathy”: Beset by a torrent of constant misinformation, people simply start to give up. Ovadya is quick to remind us that this is common in areas where information is poor and thus assumed to be incorrect. The big difference, Ovadya notes, is the adoption of apathy to a developed society like ours. The outcome, he fears, is not good. “People stop paying attention to news and that fundamental level of informedness required for functional democracy becomes unstable.”

In his newsletter, Warzel followed up with some of the right-wing response he got to the piece. He also writes about the Florida school shootings this week: “It’s hard to glance at the internet during our biggest national tragedies and feel like the current ecosystem is healthy. The anger, the people using both real news and misinformation solely to score political points, the visibility the current system of coverage gives to shooters, the hoaxes, the way the platforms goad people to chime in and report without the facts. It’s all broken.”

Illustration from L.M. Glackens’ The Yellow Press (1910) via The Public Domain Review.

Laura Hazard Owen is the editor of Nieman Lab. You can reach her via email (laura@niemanlab.org) or Bluesky DM.
PART OF A SERIES     Real News About Fake News