Jan. 17, 2019, 11:39 a.m.
Audience & Social

Nine steps for how Facebook should embrace meaningful interac— er, accountability

“There are broad concerns that Facebook continues to engage in deceptive behavior when it comes to user privacy, and that it is biased against certain groups, but outsiders currently have almost no possibilities to verify these claims.”

What would you put on Facebook’s to-do list?

Well, a group of Oxford and Stanford researchers (Timothy Garton Ash, Robert Gorwa, and Danaë Metaxa) started with nine items in a report released Thursday. (No funding for the report came from Facebook, but the company did provide them and other academics with some "under the hood" access.) The focus is on ways Facebook could make itself a "better forum for free speech and democracy," which, you know, the platform has had some struggles with in the past few years.

Part of the report focuses on the amends Facebook has attempted, such as broadening transparency with academics and policymakers and introducing content appeal processes, but it also points to the impact (and issues) that can arise from self-regulatory actions instead of external policies. (Remember, senators, he sells ads!) "A single small change to the News Feed algorithm, or to content policy, can have an impact that is both faster and wider than that of any single piece of national (or even EU-wide) legislation," the authors write.

And these ideas aren't limited to the blue F. Instagram and WhatsApp, two growing platforms owned by the mothership, are still understudied and don't get the same transparency as Facebook: "Transparency and accountability for WhatsApp and Instagram remain rudimentary, with most public (and regulatory) attention in the past two years focused on Facebook. For example, Instagram has not implemented the same moderation overhauls as Facebook (appeals, detailed community guidelines, or content policy enforcement reports)."

The biggest takeaway, the authors argue, is that Facebook could make immense progress by being less power-hungry. Open up and let users have more active control, they write:

Ideally, the user interface and experience on Facebook should be designed to promote active, informed citizenship, and not merely clickbait addiction for the commercial benefit of Facebook, the corporation. That would be change indeed.

Here’s the full nine:

Expand and open up (political) content moderation

Facebook has brought on more content reviewers, released reports on its policies, and started a blog to talk through its decision-making process…but it should (1) establish clearer definitions of hate speech, (2) hire "more and culturally expert content reviewers" (Myanmar, anyone?), (3) share more details, maybe even case studies, on how it makes calls on accounts like Infowars, and (4) expand and improve the appeals process.

Facebook shared with us, on an off-the-record basis, some interim internal statistics about appeals. These indicated that quite a large proportion of takedowns (perhaps as many as one in eight) are actually appealed, with a heavy concentration of appeals in two categories: bullying and hate speech. The highest proportion of successful appeals was on bullying, and one of the lowest on hate speech. These figures are both informative and suggestive. For example, the high proportion of appeals in the area of hate speech indicates the problem of overbreadth discussed in #1. Although we have respected Facebook’s request not to give the exact percentages, since these were explicitly given on an off-the-record basis and are now out of date, we feel very strongly that this is exactly the kind of concrete, detailed information that should be made available to analysts and users on a regular basis.

Pop the News Feed with more trustworthy (political) information

Yes, Facebook put in the context button (that little "i" on links shared in the feed), partnered with fact-checkers, and tried to crank out a system for political advertising…but it can go further with (5) sliders that let users adjust how much political content and news they see, and the level of News Feed curation overall. The researchers do note that "such features would likely be used by a fraction of all of Facebook’s users, but could be powerful customization tools for such power users."

Facebook could also (6) invest in fact-checking partnerships in some of the 76 countries where the platform operates but has no fact-checking partners: “Facebook should take responsibility for both promoting these efforts and funding research into their effects, especially outside of the Global North.”

Go beyond the public relations schtick and be open to meaningful accountability

There are broad concerns that Facebook continues to engage in deceptive behavior when it comes to user privacy, and that it is biased against certain groups, but outsiders currently have almost no possibilities to verify these claims. Facebook remains very difficult to study, meaning that it is very difficult for policymakers to be able to formulate evidence-based policy and truly understand the scope of the relevant problems (from polarization to disinformation, as it stands only Facebook can know the true scope of the issue).

Facebook has submitted to three audits based on civil and human rights, suggested the development of an external moderation council, and established more pathways for academics to work with the company…but the researchers push for (7) regular, meaningful auditing mechanisms, (8) meetings with an external content policy group made up of individuals from "affected communities," and (9) useful interactions with an external appeals group, since so far "all the major questions remain unanswered."

The authors use the word “meaningful” seven times in this report, hinting that it’s time to embrace not just meaningful interactions, but meaningful accountability.
