Our world is awash in mis- and disinformation. Conspiratorial thinking and misguided beliefs and attitudes towards science are widespread. But combatting this deluge of problematic information must be done with care. Too often, argues Whitney Phillips, we dismiss the people who consume or propagate such information as irrational. In fact, given the experiences they may have been having on the internet for many years — or what they may have been encountering via traditional media for decades — their views, troubling as they may be, may in many cases be perfectly rational.
Phillips, an assistant professor of communication and rhetorical studies at Syracuse University, investigates the intersections between media ecosystems, beliefs, and politics. Her research has taken her to some of the internet’s darkest corners, but she argues that it’s futile to treat those virtual spaces in isolation, because the cultural factors that paved the way for their creation are all around us — and in some cases are almost mundanely mainstream.
Phillips’ latest book is You Are Here: A Field Guide for Navigating Polarized Speech, Conspiracy Theories, and Our Polluted Media Landscape, co-authored with Ryan M. Milner.
Our conversation was conducted over Zoom and by email and has been edited for length and clarity.
Dan Falk: One of the things you describe in your book is how misinformation and disinformation grow and evolve. And you use the metaphor of a hurricane that swallows up smaller storms to create a bigger storm. When thinking of things ranging from anti-vaccine rhetoric to, say, QAnon, what is it that connects them?
Whitney Phillips: Take any conspiracy theory — take QAnon, take Covid stuff, whatever — when you think about what energizes that, so much of what is actually fueling the storm is narratives that have been around for decades. It is not the case that a bunch of people just started going on Facebook and then magically became radicalized because the algorithm started feeding them information; that’s just an overly simplistic view of how this happens.
Falk: Where did those narratives come from?
Phillips: In the United States, you have two roughly parallel media ecosystems: You have the sort of center-left mainstream media ecosystem, and then the right-wing media ecosystem, which was built, essentially, as a reactionary apparatus to mainstream media in the mid-20th century, and which really intensified in the post-Cold War era. And within that media ecosystem, narratives around liberal bias, anti-expertise narratives, and the idea that if something was coming out of an institution, you couldn’t trust it — that has been bread and butter within certain media networks for many people’s entire lives.
And so if you’re steeped in those particular media networks and those narratives — especially about liberal bias, or about how you can’t trust experts, because they’re lying to you, or they’ve got vested economic interests, or they’re just liberals, so you can’t trust them — if that is the oxygen you have breathed your entire life, that’s what you bring to Facebook; that’s not what Facebook gives you. So once you go to Facebook with those narratives fully internalized, you start searching for certain things, or you engage with certain kinds of people and communities, and then the algorithm starts feeding you more of what you’re already bringing to the table.
Falk: You mentioned certain media networks have been around for decades. Just to clarify, are you talking about things like right-wing talk radio?
Phillips: Well, yeah. In the ’50s, radio was the new technology. That was the equivalent of Facebook and its algorithms. Conservative media and the right wing generally, that’s a really big tent. It includes everything from the National Review, which positioned itself as more reasonable and responsible, to things like the John Birch Society — so it’s not fair to say all of right-wing media was exactly the same, that you can collapse all of it. It’s really complicated.
But the one thing that unifies the conservative movement and right-wing media is that it was a reaction against the consensus-based mainstream. It emerged because conservatives, rightly or wrongly, felt that their views were not accurately represented, or didn’t have a home, within mainstream media. So they created an alternative media ecosystem to solve a problem they had identified.
Falk: We’ve been talking about the link between right-wing media and misinformation, but I suppose misinformation can also come from the liberal side? I’m thinking of the links between the anti-vax movement and the wellness community, for example.
Phillips: It sort of depends on what you mean by the liberal side. Liberal people/influencers/certain publications spread misinformation about vaccines and a whole host of other political issues, and that can pollute the landscape just as much as conservative misinformation, and/or can send otherwise liberal people down right-wing rabbit holes when misinformation about vaccines or child protection, for instance, butts up against elements of conspiracy theories like QAnon.
I was less focused on individual liberal people versus individual conservative people, whose beliefs can already vacillate between conservative and liberal positions depending on the issue, and was speaking more about the network structures and wraparound effect of right-wing media — with Fox News the center of that universe in many ways.
From that vantage point, it doesn’t make sense to equate what happens on the left with what happens on the right. For one thing, “liberal media” subsumes all kinds of positions and publications, making it an unwieldy concept to begin with. Just ask someone on the far left how they feel about CNN, or even The New York Times, and they’ll have all kinds of critical things to say.
Falk: So although misinformation can flow from either side, you feel the right is more focused, or more concentrated?
Phillips: There just isn’t a singularity of messaging as you move from center-left to far-left. The right-wing media ecosystem isn’t entirely a monolith, of course, but it’s broadly much more homogeneous than what happens to the left of center. Beyond that, the power, influence, and political feedback loop right-wing media enjoys with Trumpworld doesn’t have an equivalent on the left.
There is certainly false and harmful information, and all kinds of informational problems, emerging from legacy center-left media — I don’t know any media scholar who argues that there aren’t! But the big question is: Do I stay up at night worrying about democracy-eroding falsehoods coming from legacy center-left media in the same way I do about falsehoods on the right? I don’t.
Falk: But interestingly, on the surface, some of these mis- and disinformation issues that have been so problematic — like those that have surfaced in relation to Covid, for example — hardly seem rooted in politics. So I’m wondering, how did this virus become so politicized? Or, to focus things a bit more, how did we end up with Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, receiving death threats?
Phillips: You look at someone like Fauci: He kind of ticks all the boxes of the “villain” template. So everything I was just describing about how you have this alternative media ecosystem, where some of the recurring themes are that you can’t trust liberals, that the establishment is lying to you, and that experts are not actually experts, so you need to mistrust expertise — Fauci is all of that.
And Fauci is part of the government. So if you also have a kind of anti-government impulse within the milieu in which many people are raised, someone like Fauci, as a figurehead, represents everything you would have kind of a knee-jerk resistance to. And then you consider all the ways that Trump and his Republican allies in Congress actively politicized Covid.
Falk: In one of your lectures, you made the point that it’s wrong to characterize QAnon supporters as crazy — and I wonder if you can elaborate on that a little bit. Why do we have to be cautious about dismissing people who hold some of these far-from-the-mainstream beliefs?
Phillips: Well, first of all, describing someone as “crazy” is ableist and propagates the idea that people with mental health issues or mental illness are somehow less than human, which is inherently problematic. But beyond that, what the word “crazy” does, as used in that way, is suggest that these beliefs are somehow inexplicable, incoherent, and totally irrational.
Now, I’m not suggesting that those beliefs are correct; QAnon is not real, full stop. But at the same time, you have the cumulative effect of being told the same kinds of stories throughout your whole life, right? So we might be talking about someone who grew up on right-wing radio; then Fox News comes around, and they’re watching Fox News their whole life; then they go to Facebook, and their affinity networks ensure that Facebook’s algorithms are feeding them information that confirms their existing worldview. And what ends up happening is that there’s a media “wraparound” effect, whereby everywhere a person looks, they’re seeing what feels like evidence for their particular worldview.
Falk: This “wraparound” effect that you’re describing sounds similar to, or at least related to, confirmation bias — the idea that we latch onto information that supports what we already believe, while dismissing information that challenges those beliefs. What’s the connection there?
Phillips: If you are surrounded on all sides by information that seems to confirm this particular belief that you have, and every time you search for something you get information that confirms your beliefs, it would actually be illogical for you to say, “You know what, I reject this.” I mean, people can de-rabbit-hole themselves, or extract themselves from an existing worldview — but that’s really hard to do. So with all this wraparound confirmation, there’s coherence to the belief — which doesn’t make it empirically true, and doesn’t make it less destructive or harmful; but it means that it’s not an illogical conclusion for someone to draw based on all the information that’s constantly being pushed in front of their eyeballs.
Falk: A lot has been written about the problem of extremist content online, and in particular on how the algorithms that the various social media platforms use seem to be making the problem worse — how Facebook’s algorithms appear to steer users toward more extreme content, for example, with similar concerns raised over YouTube. But you’ve often said that the algorithms are a small part of the problem. Can you expand on that?
Phillips: I think of them [the algorithms] like salt in some ways — they intensify the flavor of food; they don’t change the flavor, they just make it more intense. So if you think about that in terms of beliefs, and what people are bringing to their social platforms, algorithms can absolutely enhance those beliefs, both in negative ways and in positive ways, by showing people more of what it is they want to see — but people come to those algorithms asking to be shown certain things.
So we just can’t simplistically say, “it’s the algorithm that’s radicalizing our kids” — you don’t just take someone who’s a run-of-the-mill, everyday person, a centrist or a moderate, and have them watch 10 YouTube videos, and suddenly they’re a Nazi. That’s just not how belief happens; that’s not how identity-formation happens. People who are getting sucked into rabbit holes are by and large people who go to those platforms asking to be fed something.
Falk: As you’ve shown, these issues have deep roots, extending back far earlier than the rise of social media in the 21st century. So how do you see the way forward?
Phillips: I think we’re just not going to be able to come up with a simple regulatory solution or a simple technological solution, because the problems are so much more complex than that. But I do think that one way forward is to think small.
Up until this point, the impulse has been to think big: How do we scale solutions to these challenges? How do we get government to intervene? How do we get Facebook to do whatever to its algorithm? It’s not that we shouldn’t be talking about those things or thinking about those things — but it’s also a question of what’s going on within the information ecosystems of local communities. Where can people turn in their own communities if they have questions? Who are the trusted leaders within communities?
And so for me, part of the solution needs to be “assessing the local” and thinking about the role that local information ecosystems, local news reporters, and local faith leaders can play. How can we harness trust and healthier information exchange within more manageable, smaller chunks of space? Because there’s never going to be a top-down solution; that’s not going to happen.
Photo of a rubber band ball by Alex France, used under a Creative Commons license.