The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.
“Residents cycle between verifying information and disengaging from news to relieve stress.” In the International Journal of Communication, Temple University’s Andrea Wenzel looks at how consumers — in 13 focus groups across cities in California, Indiana, Kentucky, and New York — “navigate vast quantities of often conflicting information and misinformation about the state of their country and fellow residents.” (The focus groups, consisting of 58 participants in total, took place in 2017. Support came from Columbia’s Tow Center for Digital Journalism, where Wenzel is a fellow.)
The excerpts from the conversations are fascinating. Here are a few:
As one California participant said regarding legacy media outlets ranging from Fox to the LA Times, “It’s hard to tell if it’s their opinion or news…Even facts are opinions.” For some participants, fake news had more to do with the absence or presence of partisan content. Several who identified as conservative put forward ideas of fake news that echoed President Trump’s referencing of the “fake news media.” When asked what he meant by “fake news,” Bob, a 67-year-old in Indiana, answered:
“Well, the stories that you’re hearing, that the other outlets aren’t providing. And when they hush up things, when you only see them playing certain things. You take Benghazi, you take a lot of these other things that were going on, and you only see two or three media outlets making people aware of it. And then you’ve got Fox News over here, and they’re letting you know what is going on.” For Bob, fake news was more about an absence: “Things that should have been put out there that weren’t put out there.” These stories, he argued, were needed to form an opinion. And outlets, apart from Fox, which he believed gave “both sides,” were withholding key information.
A 42-year-old Democrat in Indiana said:
I have to put my faith in some organization to at least, I can’t question everything that’s, I mean, I could, but I’d be a nutcase. Like, I have to put credibility in let’s say, CBS News and C-SPAN. Those are my credible news sources, but you know, I’m hearing that they’re not credible.
Another Indiana focus group member, “Jason,” responded, picking up on how hard it can feel to trust anything:
He said that while in the past, he had thought “you had to hang your hat on something,” he now was feeling closer to the rather more destabilizing idea that “there’s no such thing as truth.” For Jason, there was a fine line between feeling ambivalent about outlets but continuing to rely on them, and essentially giving up on the legitimacy of any outlet. He, like several others, felt himself sliding into a space of generalized mistrust.
And group members discussed how they check different sources — but it can be tiring and frustrating:
Brian, a 42-year-old in Indiana, explained that when he wanted to get more information about something he had heard, he would tend to directly visit the websites or television channels of “one of the big three ABC, or like CBS, or CNN.” In the same group, Todd said he would do a kind of cross-check: “I’ll check Yahoo. I’ll check Fox News. I’ll check CNN. I guess at that point I kind of do a combo, say is this kind of the same thing from all three?” Several participants referenced seeking out news sources associated with the “other” political side — generally through direct visits to legacy brands such as Fox or CNN. A few spoke of following politicians or campaigns they disagreed with on Twitter or Facebook, or being subscribed to email lists. However, several also spoke of feeling exhausted by this process. Todd, who shared his seemingly time-consuming cross-checking routine, explained, “At the end of the day, it’s like, well, got to go on with your daily life.”
Some participants held the actions taken by social media platforms in low regard. “As a participant who associated tech platforms with the left explained, ‘There’s nothing Facebook can do at this point to make me think that they’re impartial or balanced.’”
Participants also discussed disengaging from the news.
Steve [Kentucky], who identified as right-leaning, explained how this had shifted his goals and his communication ecology: “I still read stuff occasionally but not like I used to. I’ll just go play video games. I don’t even care about news anymore.”
Sally [Kentucky]: “I think, there are times where I think that my sense of well-being and satisfaction with my life is directly proportional to the amount of news that I consume, and the more news I consume is like, oh, this world is just falling apart. I just don’t need to know it all, you know, there’s enough to concentrate on in the community that needs the attention rather than focusing on everything, you know.”
Jason [Indiana] on Twitter, where he primarily follows comedians: “But now all that levity has been replaced by gravity. So it’s just like, ah. You know, I came to you for a lift, and now all I get are these bring me downs all day. So I’ve kind of, I’m finding myself using it less.”
Ali (New York) talked about how he had relied on Facebook for news during the election, but had to take a break:
It was kind of exhausting. I also didn’t particularly like being Muslim and having that come up so much during this election, it was like it was a Muslim problem. It was just shitty content 24/7 to be reading reviews about your religious identity. So immediately after the election I kind of unplugged the world news and have slowly, incrementally gone back on just because I don’t think I can do this for the next four years.
“We don’t even know exactly what we’re up against and we don’t even know what we would do to try to fix it.” Wired’s Paris Martineau spoke with disinformation/extremism researchers Whitney Phillips, Alice Marwick, and Becca Lewis about their growing sense of futility and feelings of malaise about the work that they do. Phillips: “It’s not that one of our systems is broken; it’s not even that all of our systems are broken…It’s that all of our systems are working…toward the spread of polluted information and the undermining of democratic participation.”
i got to chat with @parismartineau about something that's been plaguing me for a long time: in responding to a crisis caused in part by the attention economy, we are also subject to the same exact forces pic.twitter.com/jLJSjRr5So
— Becca Lewis (@beccalew) May 2, 2019
The piece doesn’t go into gender much, though Marwick mentions it briefly: “When you are researching things that have a real emotional impact on you…you have to feel that emotion, because, if you get inert to the racism or the misogyny or the hatred, you’re no longer doing your job. You’re seeing this deeply racist garbage all day long. You can’t pretend it is not happening. And I think that has an effect on you.” Kate Starbird has talked, too, about the disorienting effects of studying disinformation.
Feat @wphillips49 @alicetiara and @beccalew with the awful costs of researching online extremism- this is one case where I wish gendered effects had been directly tackled- but this shit is real and thanks @wired https://t.co/urn8NXy0B9 pic.twitter.com/ghQPoZplaA
— Nikki Usher, Ph.D. (@nikkiusher) May 2, 2019
I do know that I couldn't do any of this shit and my soul would have fallen out long ago if it wasn't for people like @alicetiara + @beccalew + @BostonJoan + @digitalsista + so many others who face the worry and confusion and fucking aporia every day and keep showing up
— Whitney Phillips (@wphillips49) May 2, 2019
Yeah from the very beginning of troll-type/harassment research, women have always been a supermajority in the field, with extremely strong voices (100 hats off to @BiellaColeman and @jlbeyer). Not sure what accounts for those numbers, but it's always been an intriguing question
— Whitney Phillips (@wphillips49) May 2, 2019
Facebook finally bans some far-right figures. On Thursday, Facebook banned far-right extremists including Alex Jones, Milo Yiannopoulos, and Laura Loomer. (The way the announcement was handled was weird: Facebook announced it to some news outlets before actually removing the figures’ pages.) From Casey Newton at The Verge:
Infowars founder [Alex] Jones was suspended from Facebook last year under rules against bullying and hate speech. In February, the company removed another 22 pages associated with him and his businesses. Jones has continuously promoted fringe conspiracy theories, including baseless arguments that the Sandy Hook elementary school massacre never happened. His followers have stalked and harassed families of the victims, requiring them to move frequently and live in hiding.
[Paul Joseph] Watson is an editor at Infowars and associate of Jones. [Louis] Farrakhan is the leader of the Nation of Islam and is known for making inflammatory anti-Semitic and homophobic remarks. [Paul] Nehlen is a white supremacist politician who had previously been banned from Twitter. Yiannopoulos is a far-right provocateur who was banned from Twitter after inspiring a wave of racist abuse. Loomer is a far-right activist who recently called Islam “a cancer on humanity” on her Instagram story. (Instagram removed the post.)
Taylor Lorenz at The Atlantic:
Infowars is subject to the strictest ban. Facebook and Instagram will remove any content containing Infowars videos, radio segments, or articles (unless the post is explicitly condemning the content), and Facebook will also remove any groups set up to share Infowars content and events promoting any of the banned extremist figures, according to a company spokesperson. (Twitter, YouTube, and Apple have also banned Jones and Infowars.)
from my convos with platform employees, this was always where it was going…says something about them that it took years after hemming and hawing and all kids of equivocating (despite the viewpoints/tactics of this crowd not changing) https://t.co/JBgDEsFyMF
— Charlie Warzel (@cwarzel) May 2, 2019
And last year they admitted Milo broke rules by goading people to bomb our office, but didn't ban him until today: https://t.co/CV3YrxznD2
— Kelly Weill (@KELLYWEILL) May 2, 2019
It will be extremely interesting to see if Facebook will apply this policy on “dangerous individuals and organizations” in its largest market (by number of users): India. [I’m not holding my breath.] https://t.co/zABqxQGZDc
— Sadanand Dhume (@dhume) May 2, 2019