On Monday, Facebook said it would take down false information about vaccines in general, Covid-19, and the Covid-19 vaccine, and would move to support vaccination campaigns around the world.
The company said:
In addition to sharing reliable information, we are expanding our efforts to remove false claims on Facebook and Instagram about Covid-19, Covid-19 vaccines and vaccines in general during the pandemic. Today, following consultations with leading health organizations, including the WHO, we’re expanding the list of false claims we will remove to include additional debunked claims about Covid-19 and vaccines. Learn more about how we’re combating Covid-19 and vaccine misinformation.
Facebook had already taken some steps to remove Covid-19 misinformation from the platform. In December, it said it would remove some false claims about vaccines, and it already prohibits ads that make false claims about vaccines. Under the new policy, the company will remove — not just downrank — unpaid posts on the site, in groups, and on Pages, and will remove misinformation about all types of vaccines.
Facebook’s other promises include:
- Helping people find where and when they can get vaccinated — similar to how we helped people find information about how to vote during elections
- Giving $120 million in ad credits to help health ministries, NGOs and UN agencies reach billions of people around the world with COVID-19 vaccine and preventive health information
- Providing data to inform effective vaccine delivery and educational efforts to build trust in COVID-19 vaccines
In January, Facebook’s Oversight Board overturned Facebook’s decision to remove a post that criticized a public health strategy in France and claimed that there was a cure for the coronavirus; the board ruled that Facebook’s standards for removing such posts were “inappropriately vague.” The move this week appears to be Facebook’s attempt to clarify its standards — and make clear more forcefully that anti-vaxxers aren’t welcome.
On Section 230’s 25th birthday, the Oversight Board did what Congress can’t — or won’t. I wrote about Facebook finally ejecting the anti-vaxxers https://t.co/UliFBiVpIO pic.twitter.com/LFAPRSULhR
— Casey Newton (@CaseyNewton) February 9, 2021
It only took….years: “For years, public health advocates and outside critics took issue with Facebook’s refusal to remove false or misleading claims about vaccines. That led to a surge in false vaccine information.” https://t.co/iI7rt4RNpZ
— Tim O’Brien (@TimOBrien) February 9, 2021
Facebook’s latest move raises questions about what will qualify as misinformation now. For example, early in the pandemic, the United States’ leading infectious disease expert, Dr. Anthony Fauci, advised Americans against wearing masks because there was a shortage of personal protective equipment for healthcare workers coming into direct contact with the virus. Once it was understood that the virus could be spread by asymptomatic people, Fauci advised people to wear masks. Would posts with those initial comments now be deleted?
I’m looking at the list of things that Facebook says it will take down as “misinformation” and is this retroactive? They’d then have to take down most of public health advice, including from the CDC and the WHO and major newspapers, from the first six months of the pandemic. https://t.co/PpDzWPUdY6
— zeynep tufekci (@zeynep) February 8, 2021
Also, despite Facebook’s claims of greater transparency, we still don’t know just how far-reaching misinformation about vaccines and Covid-19 actually is, because Facebook doesn’t release that data.
What content are they taking down?
And what has already been taken down? How many people had it reached before they took it down?
These measures Facebook take are done with no oversight, no transparency or accountability. What trust do we possibly have? https://t.co/JnAMBWj18y
— Chris Cooper (@cooper_cn) February 8, 2021
Most of all, it raises the question: why didn’t Facebook do this sooner?
Facebook says that it is going to remove posts with false claims about vaccines. This kind of misinfo thrives on both Facebook and Instagram, so would be interesting to see how the new rules are implemented https://t.co/EaOsOuQqz9
— Olga Robinson (@O_Rob1nson) February 8, 2021
Social media has played an irrefutable role in the spread of health misinformation, esp in the last decade. Unquantifiable damage, beyond COVID-19, has already been passively inflicted by Facebook & co. It’s promising news, but you can’t unscramble an egg https://t.co/vQwiDgIBYs
— Niamh O’Kane (@okaneniamh) February 9, 2021
What percentage of vaccine information on Facebook will have to be removed as a result of this policy? https://t.co/S2il1jSZN1
— Meena Bewtra (@DrsMeena) February 9, 2021
Like trying to put toothpaste back into the tube with a fucking pineapple. https://t.co/RD1SClxyuX
— Ryan (@OaksieDoakes) February 9, 2021
.@Facebook is changing its fake news rules to delete anti-vaxxer conspiracies. Given we’ve had a year of conspiracies being peddled by everyone from fitness influencers to the former president of the US, it feels like it’s a bit too little too late. https://t.co/IgZndwv6PI
— Eric Johansson (@EricJohanssonLJ) February 9, 2021
On Monday, The Guardian reported that conspiracy theory accounts were still surfacing in searches on Instagram.
In my own search on Instagram, the first non-verified account is one that posts about adverse reactions to the Covid-19 vaccine. It has more than 87,000 followers.
Clicking on the #covidvaccine hashtag led to this prompt:
Selecting “See Posts Anyway” shows photos of people getting their vaccines or photos of vaccination cards.
Read the new policy here.