The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.
Anti-vaxxers are one of the top 10 global health threats. The World Health Organization identified “vaccine hesitancy” — “the reluctance or refusal to vaccinate despite the availability of vaccines” — as one of its top 10 health concerns facing the world in 2019.
A 2018 study found that non-medical vaccine exemptions (NMEs) based on “philosophical belief” have risen in 12 of the 18 states that allow them, and the authors noted:
While NMEs continue to rise in most of the 18 US states that allow them, several European countries, including France and Italy, as well as Australia, have taken measures to either make vaccines compulsory or even fine parents who refuse to vaccinate their children. Romania has experienced serious and large measles outbreaks and may also tighten vaccine legislation. Our concern is that the rising NMEs linked to the antivaccine movement in the US will stimulate other countries to follow a similar path. It would be especially worrisome if the very large low- and middle-income countries — such as Brazil, Russia, India, and China (the BRIC nations), or Bangladesh, Indonesia, Nigeria, and Pakistan — reduce their vaccine coverage. In such a case, we could experience massive epidemics of childhood infections that may threaten achievement of United Nations global goals.
Measles cases in Europe are at a 20-year high, The Guardian reported last month, topping 60,000 in 2018 per WHO — “more than double that of 2017 and the highest this century. There have been 72 deaths, twice as many as in 2017.” New York is facing its most severe measles outbreak in decades, with cases there concentrated almost exclusively in ultra-Orthodox Jewish communities.
This week, Elsevier Atlas highlighted research that aims to help explain anti-vaccine attitudes. The study looked at the Dunning-Kruger effect surrounding vaccines — a form of cognitive bias in which people assume they know more than they actually do about an issue, or people’s “ignorance of their own ignorance.” Matthew Motta, a postdoc at the Annenberg Public Policy Center at the University of Pennsylvania and the lead author of the study, explained:
We gave people a knowledge test about the causes of autism and then we asked people in a national survey: How much do you think you know about the causes of autism? We asked the same question about medical experts like doctors and scientists. We compared people’s perceptions of self to perceptions of experts and looked at that versus how well they scored on the knowledge test. We show there’s a relationship between knowledge and misinformation and what we call overconfidence — the belief that you know more. As we showed, those who are the least knowledgeable and most misinformed were most likely to exhibit overconfidence. Once we did that, we looked at policy implications of overconfidence. We looked at the correlation between attitudes, for example, about whether it should be required to vaccinate kids going to public school. Those who were the most overconfident were less likely to think that was the case.
The researchers, in a survey of 1,300 U.S. adults, found that “more than a third of study participants believe they knew as much as or more than medical doctors and scientists about the causes of autism,” and that while they trusted information from experts, they also “place high levels of trust on information from non-experts (42 percent) and feel that non-experts should play a major policymaking role (38 percent).”
“We need efforts to inform people, but we also need to debunk misinformation. Hitting people over the head with facts probably isn’t going to do that,” Motta said. “What it might look like is the subject of follow-up studies. That’s the key question: how can we combat misinformation about vaccines?”
“More transparent — in some cases literally.” In Science Magazine, author and science journalist Laura Spinney reports on how rumors and hoaxes are making the battle against an Ebola outbreak in the Democratic Republic of Congo more difficult — and how public health workers have launched an unprecedented effort to fight misinformation.
For the first time in an Ebola outbreak, UNICEF and other agencies have joined forces as a single response team, which answers to the DRC’s Ministry of Health and includes dozens of social scientists, who use the airwaves, social media, and meetings with community and religious leaders to fight misinformation. Responders also foster trust by making their work more transparent — in some cases literally. A new biosecure tent, called the Biosecure Emergency Care Unit for Outbreaks (CUBE), allows relatives to visit and see Ebola patients during treatment.
Here’s some of what the social scientists are doing:
Part of their role is to chart the social networks through which the virus spreads, but they also gather information about communities’ perceptions, which is entered within days into an online “dashboard” created by the International Federation of Red Cross and Red Crescent Societies (IFRC) in Geneva. The government has also recruited young people to report misinformation circulating on WhatsApp, a major information channel in the DRC, says Jessica Ilunga, a spokesperson for the DRC’s Ministry of Health in Kinshasa.
As rumors surface, communications experts rebut them with accurate information via WhatsApp or local radio. They take care not to repeat the misinformation; research has shown this is the best way to help the public “forget” false news and reinforce the truth. The vocal support of Ebola survivors has helped as well. Grateful for their care, some have become volunteers at Ebola treatment centers (ETCs).
Our colleague @Ombaggio talked to @ScienceMagazine on how trust and community feedback is key for an effective #Ebola response in #DRC. Gathering community data on rumours, beliefs, questions helps us find the right community engagement approaches https://t.co/yhXwOkH1Yt pic.twitter.com/E2ucJK10NJ
— IFRC Intl. Federation #RedCross #RedCrescent (@Federation) January 15, 2019
And the BBC’s Yvonne MacPherson, director of BBC Media Action USA, wrote in December about her work on BBC Media Action’s Ebola response efforts. There’s a difference, she explains, between “acute and chronic misinformation problems.” An example of an acute misinformation problem was a 2014 rumor that you could avoid Ebola by bathing in salt water.
The salt water Ebola example was a real life case of acute misinformation in West Africa. News reports tracked this rumor to a text message from a student in Nigeria. It spread immediately to social media, with hundreds of tweets repeating the rumor in the following couple of days. In just as many days, the Nigerian Ministry of Health, the World Health Organization and others corrected it across traditional and social media, and the rumor was quashed. Sadly, this misinformation was responsible for at least two deaths and many people were hospitalized due to excessive consumption of salt water.
That’s an acute scenario: misinformation spreads rapidly, is corrected by multiple trusted sources, and goes away.
But chronic misinformation is even trickier:
A chronic misinformation example would be the belief or suspicion that vaccines are harmful. It is chronic because this misinformation persists over years despite available facts to the contrary.
Algorithms, and the market forces underpinning them, are designed to capture attention, and in turn provide a breeding ground for misinformation to spread. Tweaking algorithms to direct people away from non-credible sources or annotating articles with credibility warnings may be part of a solution; however, these efforts do not address the longstanding beliefs people may already have about a health issue.