Feb. 15, 2019, 9:01 a.m.
Audience & Social

If Facebook wants to stop the spread of anti-vaxxers, it could start by not taking their ad dollars

“You have nothing to be ashamed of for your parents not vaccinating you. It wasn’t something you researched and decided against, you were just doing the whole ‘being a kid’ thing.”

The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

How much should we freak out about anti-vaxxers? The World Health Organization named vaccine hesitancy one of the top 10 global health threats for 2019. But is the threat from internet crazies overblown? Or are there certain things about the anti-vaccination movement that make it particularly dangerous?

This debate is the health version of an argument we see often these days: that covering far-right figures and extremists too much — even in highly critical articles — gives them the oxygen they need to become more powerful and mainstream. “The mere fact that anti-vaxxer beliefs are treacherous and wrong doesn’t make them worthy of attention on the national scale,” Daniel Engber writes in a Slate article in which he warns against catastrophizing. “Vaccine refuseniks are still well outside the mainstream.” (Or, sometimes, fairly close to the mainstream!)

Engber’s argument:

The anti-vaxxer movement isn’t really on the rise all across America, and measles hasn’t really re-emerged from clinical oblivion or become a fatal threat to everyone’s well-being. The outbreak in Clark County may be disturbing, but it’s a local story: Low vaccination rates in the area, enabled by residents’ adherence to fringe beliefs and suspect statewide policies, have made this outbreak possible. But attempts to link it to a dawning crisis in America are, at best, a waste of readers’ time and trust. At worst, they might provide a boost to the anti-vaxxer movement, by exaggerating its extent and influence; or else they could distract us from other, more important obstacles to vaccination, such as health care inequality.

I’m still forming thoughts on this, but it is at least my personal experience that health-related misinformation trickles into my orbit far more often than political fake news does. It’s pervasive on the mom internet in a way that someone looking at it mainly through the lens of national news reports may not recognize. This week, The Washington Post and The Guardian both looked at Facebook’s struggles/inefficacy in dealing with vaccination misinformation. From The Washington Post’s Taylor Telford:

A working paper published in November by the National Bureau of Economic Research looked at the role of Facebook in spreading false information about vaccines. The paper found that Facebook’s ban on ads linking to fake news stories did lead to a decrease in shares of anti-vaccination content. But in anti-vaccination circles, ads aren’t the primary issue — people are. The researchers found that anti-vaccination groups on Facebook tend to pass around the same misleading links and junk science, then end up spreading the information to the broader public through likes, shares and word of mouth.

“The majority of misinformation about vaccines is spread by individuals — and the majority of that misinformation by a few individuals — sharing the message organically,” Catherine Tucker, a professor at the Massachusetts Institute of Technology and co-author of the NBER paper, said in an email to The Washington Post. “That is a far harder problem to solve, as trying to clamp down on that kind of social sharing has tensions with trying to preserve free speech.”

The Guardian’s Ed Pilkington and Jessica Glenza looked more closely at some of the private groups where vaccination misinformation is shared.

One group, Vitamin C & Orthomolecular Medicine for Optimal Health, tells its users that it is “not an anti-vax group.” Its leader, Katie Gironda, says: “This group needs to remain neutral on the vaccine topic.”

Yet anyone allowed into this closed group of about 49,000 approved members will find ample material questioning the safety of vaccines. They will also find recommendations for alternative remedies that are falsely claimed to protect against disease.

Gironda is listed on LinkedIn as CEO of an online business in Colorado selling high-dose vitamin C. Members of her closed group are encouraged to “shop now” — in one click they are linked directly to her firm, Revitalize Wellness.

The site sells vitamin C powder in bulk, with customers encouraged to give children aged two up to three grams a day, whereas the recommended daily intake is 15mg. Twenty-four-pound bags of the powder cost $432.

Revitalize Wellness carries a disclaimer saying that its products are “not intended to treat, diagnose, cure or prevent disease.” But in conversation with members of her closed Facebook group, Gironda gives the opposite advice.

“Vitamin C has an amazing record of fighting the same diseases vaccines were made for,” she posts.

In another entry she says: “I think the cons outweigh the pros on vaccines…Through greed they became a weapon. Until they become safe and not driven by money I would avoid all vaccines.”

“2019: Teenager sneaks outside home to get measles vaccine” What do you do when your parents are anti-vaxxers and you aren’t? You take to the internet, of course. For Pacific Standard, Emily Moon wrote about how the kids of parents who don’t believe in vaccines are taking control back by going on sites like Reddit to try to learn more about how they can get vaccinated when they’re under 18.

Right now, parents have easy access to anti-vaxxer propaganda, thanks to sites like Facebook. Michael’s mother, for example, blames vaccines for an inflammatory disease that runs in the family, even though “every medical professional I’ve ever talked to has told me that vaccines don’t trigger sarcoidosis,” he says. Alex, the 18-year-old in Washington, said whenever they brought up vaccination, their mom would try to get them to watch the pseudoscience documentary VAXXED, directed by Andrew Wakefield, the discredited former physician who authored the fraudulent study linking the MMR vaccine to autism. “It all just seems ridiculous to me,” Alex says.

With this misinformation still thriving elsewhere on the Internet, forums like r/legaladvice are a rare safe haven. Even some of the harsher feedback has helped; Charly, who posted when she was 17 years old and living in Canada, said she began to question her parents’ stance after seeing jokes and rants against anti-vaxxers on Reddit. “I began to feel bad about my unvaccinated existence,” she says. “It isn’t the best reason, but it’s truly what got to me.” Like Charly’s, many of the plaintive requests for advice contain a lot of shame. One unvaccinated user says they’ve tried not to tell anyone. Another writes in the comments: “You have nothing to be ashamed of for your parents not vaccinating you. It wasn’t something you researched and decided against, you were just doing the whole ‘being a kid’ thing.”

A couple different takes on Facebook’s fact-checking program. Is Facebook’s fact-checking program a disaster or, IDK, largely acceptable? Depends on who you ask. Brooke Binkowski, former managing editor of Snopes (which pulled out of the fact-checking program earlier this month) wrote in BuzzFeed recently that participating in the program was “like playing a doomed game of whack-a-mole.” In her piece, she offers a pretty detailed description of what the work was like.

We were given access to a tool that hooked into our personal Facebook accounts and was accessed that way (strike one, as far as I was concerned), and it spat out a long list of stories that had been flagged for checks. We were free to ignore the list, or mark stories as “true,” “false,” or “mixture.” (Facebook later added a “satire” category after what I like to call “the Babylon Bee incident,” where a satirical piece was incorrectly labeled false.)

It was clear from the start that this list was generated via algorithm. It contained headlines and URLs, and a graph showing their popularity and how long they had been on the site. There were puzzling aspects to it, though. We would often get the same story over and over again from different sites, which is to be expected to a certain degree because many of the most lingering stories have been recycled again and again. This is what Facebook likes to call “engagement.”

But no matter how many times we marked them “false,” stories would keep resurfacing with nothing more than a word or two changed. This happened often enough to make it clear that our efforts weren’t really helping, and that we were being directed toward a certain type of story — and, we presumed, away from others.

What were the algorithmic criteria that generated the lists of articles for us to check? We never knew, and no one ever told us.

There was a pattern to these repeat stories, though: They were almost all “junk” news, not the highly corrosive stuff that should have taken priority. We’d be asked to check if a story about a woman who was arrested for leaving her children in the car for hours while she ate at a buffet was true; meanwhile, a flood of anti-Semitic false George Soros stories never showed up on the list. I could never figure out why, but perhaps it was a feature, not a bug.

Current partners Agence France-Presse and Full Fact, however, told Digiday it’s going at least okay, especially if you like debunking horse stories.

“We want Facebook to be sharing data transparently and more widely. It’s clear Facebook can share more information,” said Will Moy, director at Full Fact. “We’ll be telling them that is what we expect a responsible internet company to do.”

So far, Full Fact has fact-checked just 10 stories on Facebook, including debunking a picture, shared over 25,000 times, of a horse living in a flat in Preston, Lancashire, which turned out to be a picture of a model horse in a window in Illinois. More pernicious claims include a false image stating that illegal immigrants and refugees can claim a much larger yearly benefit than British pensioners, which was shared over 2,000 times on Facebook.

“Some of this activity was linked to employees of the Moldovan government.” This week Facebook removed “168 Facebook accounts, 28 Pages and eight Instagram accounts for engaging in coordinated inauthentic behavior targeting people in Moldova.” Facebook’s post on the takedown was notable for its level of detail and the fact that Facebook links the activity to the Moldovan government:

The Page admins and account owners typically posted about local news and political issues such as required Russian or English language education and reunification with Romania. They also shared manipulated photos, divisive narratives and satire and impersonated a local fact checking organization’s Page that called out other Pages for spreading fake news. Although the people behind this activity attempted to conceal their identities, our manual review found that some of this activity was linked to employees of the Moldovan government.

Illustration from L.M. Glackens’ The Yellow Press (1910) via The Public Domain Review.

Laura Hazard Owen is the editor of Nieman Lab. You can reach her via email (laura@niemanlab.org) or Bluesky DM.