There are many proposals to regulate social media. Let’s say you have a high-normal level of concern about platforms making things terrible, and you also begin to tune out when you hear about proposed social media legislation, or any Sections of anything, but you would still like to support some kind of solution.
A panel I watched this week helped me clarify the kinds of solutions it might be useful to look for. The panel, “Dismantling Disinformation,” was hosted by the Harvard T. H. Chan School of Public Health and moderated by Brandy Zadrozny, a senior reporter for NBC News and a fellow at Harvard’s Shorenstein Center. (The panel was all women, shout out to the planners.)
Zadrozny asked Renée DiResta, research manager at the Stanford Internet Observatory, if she thinks it’s possible to come up with regulatory solutions to slow the spread of mis- and disinformation, and whether she’s particularly fond of any bills out there currently.
DiResta:
[Disinformation] does have national security implications … so there is a justification for government to understand that particular facet of the problem…
Very few of the bills I have seen would either have a positive impact or would even be defensible from a constitutional standpoint, because the government should not be regulating the content on social media platforms — in my opinion, anyway. There are some real legal minefields associated with that. But what can it do?
I think the bill that I’ve been most interested in — full disclosure, you know, it was originally drafted by a colleague of mine at Stanford Law — is the Platform Accountability and Transparency Act. It asks for researcher access to data. It says: The work that we’re doing as outside researchers is trying to understand not only the complexities of social media, but also the impact. Is there a there there? What is the actual harm?
It’s very hard for us to answer that question with the access that we have today because it is so piecemeal, and also it is really at the discretion of the platforms. We’re really operating on their — their sort of goodwill. And there’s a lot more goodwill now than there was back in 2017. But that sort of data access, that capacity to answer questions, is foundational. And this is where I think platform accountability and transparency is the kind of regulation that we need.
The questions that people have — “Is my viewpoint being censored?” “Are there unfair, disproportionate takedowns?” “Do recommendations radicalize people?” all of those sorts of questions — in order to answer those questions, in order to address those concerns, we need access. That’s where I feel that bill is foundational.
Nabiha Syed, CEO of The Markup, a fellow at Yale Law School, and a media lawyer, offered her advice on how to analyze legislation that seeks to regulate social media.
Right now we have this very unsatisfactory binary that’s evolving. You have platforms, on the one hand, saying, “Our First Amendment rights mean that no one can tell us what we do on our own platforms. We are exercising editorial rights, just like a newspaper.” They often will make these proposals saying they’re just like a newspaper, and newspapers and media companies go out there and publish their own perspectives, and the First Amendment allows them to do that subject to very few limitations…
That doesn’t feel right. A newspaper puts out its own perspective and is responsible for the consequences of that. Platforms are not doing that. They are a home for other people’s perspectives, and because of legislative immunity through Section 230, they’re not responsible.
So on one hand, you have this [idea that] platforms can do whatever they want. They have unfettered First Amendment rights, like an earlier analog publisher. That feels incomplete and, importantly, gives them totally wide berth to do whatever they want. There’s not a lot of room to inculcate democratic values if you’re looking at it through an extreme view.
On the other hand, you have the Texas social media law, a Florida law, a proposal in Michigan, that sort of take this other view that says because these platforms are so important, they’re like common carriers. They’re like the telephone. They’re just an infrastructure for speech. The government shouldn’t be able to regulate them [because they’re] pass-through conduits for all information, and therefore they can do anything. They should just be like a total free-for-all … and that’s the other extreme.
When you’re reading these proposals, understand, on that spectrum of these two extremes, where you’re trying to land. To me, the most important reality of this moment is that it’s not going to be either one. That doesn’t make sense, right? They’re both chaos in their own ways.
So we do have to craft a new version going forward, a new balance, and that’s where I just want to underscore what [Renée DiResta] mentioned about platform research. There is just so much that we do not know. At The Markup we build a lot of tools to help understand and collect information from large platforms like Facebook. It’s not just that we don’t know. It’s that when you attempt to find out, the companies actually do — not just can, do — go after you legally, saying you are not allowed to come onto our private property and collect this kind of information. They actively weaponize [the law] against any type of research intervention.
The most important recommendations and proposals are actually saying we have to decide where on the spectrum between these two extremes we’re going to land. It’s somewhere new, right? It’s not fitting into the shoes of newspapers or telephones or something else. We’ve got to know what the what is.
Other panelists were Dolores Albarracín, Alexandra Heyman Nash University Professor at the University of Pennsylvania; Vineet Arora, dean for medical education at the University of Chicago Pritzker School of Medicine; and Raven Baxter, director of diversity initiatives at the Office of Diversity, Equity, and Inclusion at the School of Biological Sciences, University of California, Irvine. You can watch the panel here.