Feb. 19, 2019, 2 p.m.
Audience & Social

It’s time for a “radical shift in the balance of power between the platforms and the people,” the British parliament says

Facebook acts like “digital gangsters,” “Mark Zuckerberg has shown contempt” toward governments, and the company’s “deliberate” strategy was to send uninformed executives to answer Parliament’s questions.

Companies like Facebook are behaving like “digital gangsters,” the British parliament said in a final report on disinformation and fake news, released Sunday after 18 months of work, and it’s time to rein them in.

“We need a radical shift in the balance of power between the platforms and the people. The age of inadequate self-regulation must come to an end,” Damian Collins, chair of the House of Commons’ Digital, Culture, Media, and Sport Committee, said in a statement. (If you’re interested, Time has a little profile of Collins here.) “The rights of the citizen need to be established in statute, by requiring the tech companies to adhere to a code of conduct written into law by Parliament, and overseen by an independent regulator.”

The final report builds on an interim report that was released last July.

Here are some of the key recommendations and findings from the final report:

— Neither platform nor publisher. The committee recommends a new “formulation” of tech company, “which tightens tech companies’ liabilities, and which is not necessarily either a ‘platform’ or a ‘publisher.’” (The term “platisher” is not used.)

— “Mark Zuckerberg has shown contempt.” Facebook “seems willing neither to be regulated nor scrutinized,” the committee writes, noting that CEO Mark Zuckerberg chose “not to appear before the committee” and sent uninformed company representatives instead (“we are left in no doubt that this strategy was deliberate”). The report adds that “Facebook, in particular, is unwilling to be accountable to regulators around the world.”

— The U.K. should set up an independent regulator with “statutory powers to monitor relevant tech companies.”

The process should establish clear, legal liability for tech companies to act against agreed harmful and illegal content on their platform and such companies should have relevant systems in place to highlight and remove “types of harm” and to ensure that cyber security structures are in place. If tech companies (including technical engineers involved in creating the software for the companies) are found to have failed to meet their obligations under such a Code, and not acted against the distribution of harmful and illegal content, the independent regulator should have the ability to launch legal proceedings against them, with the prospect of large fines being administered as the penalty for non-compliance with the Code.

This regulatory body would also “have statutory powers to obtain any information from social media companies that are relevant to its inquiries” and have access to “tech companies’ security mechanisms and algorithms.” It would be accessible to the public and open to public complaints.

— Current U.K. electoral law doesn’t acknowledge “the role and power of unpaid campaigns and Facebook Groups that influence elections and referendums.”

There needs to be: absolute transparency of online political campaigning, including clear, persistent banners on all paid-for political adverts and videos, indicating the source and the advertiser; a category introduced for digital spending on campaigns; and explicit rules surrounding designated campaigners’ role and responsibilities.

— Maybe Facebook should team up with a company like NewsGuard.

Social media users need online tools to help them distinguish between quality journalism, and stories coming from organizations that have been linked to disinformation or are regarded as being unreliable sources. The social media companies should be required to either develop tools like this for themselves, or work with existing providers, such as NewsGuard, to make such services available for their users. The requirement for social media companies to introduce these measures could form part of a new system of content regulation, based on a statutory code, and overseen by an independent regulator, as we have discussed earlier in this report.

The committee also vaguely recommends that “participating in social media should allow more pause for thought…Techniques for slowing down interaction online should be taught, so that people themselves question both what they write and what they read — and that they pause and think further, before they make a judgment online.” It does not, however, offer any ideas about what such techniques, or “obstacles or ‘friction,’” should be.

It’s worth reading this report in concert with the similar-but-distinct Cairncross Review, released seven days earlier, which examines the future of digital news in the U.K. That review also had prescriptions for Facebook and other platforms, though not particularly harsh ones.

Mark Zuckerberg illustration by Paul Chung used under a Creative Commons license.

Laura Hazard Owen is the editor of Nieman Lab. You can reach her via email (laura_owen@harvard.edu) or Twitter DM (@laurahazardowen).