In October, the Media Manipulation Casebook, published by the Technology and Social Change (TaSC) team at Harvard’s Shorenstein Center on Media, Politics and Public Policy, gave us the tools to understand and analyze the origins and motives of some of the most viral misinformation and disinformation campaigns of the past few years.
With just four (!) weeks left in 2020, Dr. Joan Donovan, the research director of the Shorenstein Center, says that now is the time to think about what we can do to “restore moral and technical order” at a time when so many people — from journalists, public servants, and civil society leaders to public health professionals — are paying the price, and bearing the consequences, of misinformation.
“The question of media manipulation and disinformation for me really is about, do we have a right to the truth? Do we have a right to accurate information?” Donovan said during a webinar Tuesday. “As we’ve watched the last two decades of technology develop, we’ve seen the foreclosure of some of our trusted institutions. We’ve seen the drying up of local journalism, we’ve seen universities come to rely on Google services for most of their infrastructure, we’ve started to see social media take hold of the public imagination, and none of these things are really designed to deal with what we’re going through in this moment, which is profound isolation, coupled with immense unwarranted paranoia about our political system, and economic collapse… What are we doing here with this media ecosystem in the midst of a pandemic, knowing that the cost is very high if people get medical misinformation before they get access to timely, local, relevant, and accurate information?”
Donovan said it’s important to focus on how bad actors use technology, and wield its power, to scale up certain types of misinformation, like Covid-19 conspiracy theories or false claims about election fraud. From that framing, here are some of Donovan’s suggestions to start restoring “moral and technical order”:
Content curation coupled with transparency in content moderation: Donovan said that we need more models to advance a knowledge-based infrastructure, rather than a popularity infrastructure, in our information ecosystem. Librarians could play a huge role in helping to categorize and organize information. “I’ve argued in the past for hiring 10,000 librarians to help Google, Facebook, and Twitter sort out the curation problem, so that when people are looking for accurate information and they’re not looking for opinion, they can find it,” Donovan said. “If you think about Google search results, the things that become popular are the things that are free. Anything that’s behind a paywall is not something that people are going to continue to return to.”
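To make the contrast between a popularity infrastructure and a knowledge-based one concrete, here is a minimal Python sketch. It is purely illustrative: the `authority` score (standing in for librarian curation), the engagement counts, and the example URLs are all hypothetical, not anything Donovan or the platforms have specified.

```python
from dataclasses import dataclass

@dataclass
class Result:
    url: str
    engagement: int   # clicks/shares: what a popularity ranker optimizes for
    authority: float  # hypothetical 0-1 credibility score from human curation

def popularity_rank(results):
    # Popularity infrastructure: raw engagement wins.
    return sorted(results, key=lambda r: r.engagement, reverse=True)

def knowledge_rank(results):
    # Knowledge-based infrastructure: curated authority dominates;
    # engagement only breaks ties among comparably credible sources.
    return sorted(results, key=lambda r: (r.authority, r.engagement), reverse=True)

results = [
    Result("viral-opinion.example", engagement=90_000, authority=0.2),
    Result("local-newsroom.example", engagement=4_000, authority=0.9),
]
print([r.url for r in popularity_rank(results)])  # viral opinion first
print([r.url for r in knowledge_rank(results)])   # credible source first
```

The design point is only that the sort key changes: a human curation layer supplies the authority signal, and raw engagement stops being the sole criterion.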
A distribution plan for the truth that also supports public media: Alerting a news consumer or social media platform user that something is “disputed” or “misleading” isn’t enough anymore. Platforms need to work with more professional news organizations to distribute their factual information. “We’ve seen this happen with pandemic information where there’s lots of these yellow banners that are showing up on websites,” Donovan said. “That’s not a plan. That’s, like, a sticky note. So we need something else.”
Develop a policy on strategic amplification that mirrors the public interest obligations of broadcast companies: The Federal Communications Commission in the United States regulates much of how television and radio stations operate, and something similar could help clean up, or at least mitigate, the misinformation problem on social media platforms. “If something is reaching 50,000 people, when we think about broadcast and radio, we have rules for that,” Donovan said. “So when something is reaching a certain amount of views, or a certain amount of clicks, or a certain amount of people routinely, especially if it’s a certain influencer, there has to be some kind of measure that will help us understand when misinformation or hate speech or incitement to violence is circulating at epic volumes, and what is the protocol that should exist across social media sites.”
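Donovan’s broadcast analogy implies a reach threshold that triggers a review protocol. The sketch below is a rough illustration only: the 50,000 figure comes from her quote, but the function name, the metrics, and the escalation logic are invented for this example.

```python
# Hypothetical reach-threshold check inspired by broadcast-style rules.
# Only the 50,000 figure comes from Donovan's quote; the rest is illustrative.
REVIEW_THRESHOLD = 50_000

def needs_review(views: int, clicks: int, unique_reach: int) -> bool:
    """Escalate content for human review once any reach metric crosses the threshold."""
    return max(views, clicks, unique_reach) >= REVIEW_THRESHOLD

# A post approaching broadcast-scale reach would be flagged for review:
print(needs_review(views=48_200, clicks=12_500, unique_reach=51_003))  # True
```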
Technology companies must fund independent civil rights audits: Donovan said this could be done in the form of some kind of agency, but auditors must be able to access the relevant data to perform investigations, “including a record of decisions to remove, monetize, or amplify content.”
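One way to picture the “record of decisions” auditors would need is a structured log entry per moderation action. The schema below is entirely hypothetical; the field names are illustrative and do not reflect any platform’s actual format.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class ModerationDecision:
    # Hypothetical audit-log entry covering the three decision types
    # Donovan names: remove, monetize, amplify.
    content_id: str
    action: str      # "remove" | "monetize" | "amplify"
    reason: str      # stated policy basis for the decision
    decided_at: str  # ISO 8601 timestamp
    reviewed_by: str # "automated" or a reviewer role, never raw personal data

entry = ModerationDecision(
    content_id="post-123",
    action="remove",
    reason="medical misinformation policy",
    decided_at=datetime.now(timezone.utc).isoformat(),
    reviewed_by="automated",
)
print(asdict(entry))  # the kind of record an independent auditor could inspect in bulk
```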
Find the full webinar here.