Companies like Facebook are behaving like “digital gangsters,” a British parliamentary committee said in a final report on disinformation and fake news released on Sunday after 18 months of work, and it’s time to rein them in.
“We need a radical shift in the balance of power between the platforms and the people. The age of inadequate self-regulation must come to an end,” Damian Collins, chair of the House of Commons’ Digital, Culture, Media, and Sport Committee, said in a statement. (If you’re interested, Time has a little profile of Collins here.) “The rights of the citizen need to be established in statute, by requiring the tech companies to adhere to a code of conduct written into law by Parliament, and overseen by an independent regulator.”
The final report builds on an interim report that was released last July.
Here are some of the key recommendations and findings from the final report:
— Neither platform nor publisher. The committee recommends a new “formulation” of tech company, “which tightens tech companies’ liabilities, and which is not necessarily either a ‘platform’ or a ‘publisher.'” (The term “platisher” is not used.)
— “Mark Zuckerberg has shown contempt.” Facebook “seems willing neither to be regulated nor scrutinized,” the committee writes, noting that CEO Zuckerberg chose “not to appear before the committee” and sent uninformed company representatives instead (“we are left in no doubt that this strategy was deliberate”); “Facebook, in particular, is unwilling to be accountable to regulators around the world.”
— The U.K. should set up an independent regulator with “statutory powers to monitor relevant tech companies.”
The process should establish clear, legal liability for tech companies to act against agreed harmful and illegal content on their platform and such companies should have relevant systems in place to highlight and remove “types of harm” and to ensure that cyber security structures are in place. If tech companies (including technical engineers involved in creating the software for the companies) are found to have failed to meet their obligations under such a Code, and not acted against the distribution of harmful and illegal content, the independent regulator should have the ability to launch legal proceedings against them, with the prospect of large fines being administered as the penalty for non-compliance with the Code.
This regulatory body would also “have statutory powers to obtain any information from social media companies that are relevant to its inquiries” and have access to “tech companies’ security mechanisms and algorithms.” It would be accessible to the public and open to public complaints.
— Current U.K. electoral law doesn’t acknowledge “the role and power of unpaid campaigns and Facebook Groups that influence elections and referendums.”
There needs to be: absolute transparency of online political campaigning, including clear, persistent banners on all paid-for political adverts and videos, indicating the source and the advertiser; a category introduced for digital spending on campaigns; and explicit rules surrounding designated campaigners’ role and responsibilities.
— Maybe Facebook should team up with a company like NewsGuard.
Social media users need online tools to help them distinguish between quality journalism, and stories coming from organizations that have been linked to disinformation or are regarded as being unreliable sources. The social media companies should be required to either develop tools like this for themselves, or work with existing providers, such as NewsGuard, to make such services available for their users. The requirement for social media companies to introduce these measures could form part of a new system of content regulation, based on a statutory code, and overseen by an independent regulator, as we have discussed earlier in this report.
The committee also vaguely recommends that “participating in social media should allow more pause for thought…Techniques for slowing down interaction online should be taught, so that people themselves question both what they write and what they read — and that they pause and think further, before they make a judgment online.” It does not, however, offer any ideas about what such techniques, or “obstacles or ‘friction,’” should be.
It’s worth reading this report in concert with the similar-but-distinct Cairncross Review, released seven days ago and examining the future of digital news in the U.K. It also had prescriptions for Facebook and other platforms, though not particularly harsh ones.
We think the UK can be a world leader in content regulation. The Government have two months to respond to our report.
Read our report: https://t.co/Dtf7L3DPHq https://t.co/0LDuxQpYqU #fakenews pic.twitter.com/zyJBqsk3Qz
— Digital, Culture, Media and Sport Committee (@CommonsCMS) February 18, 2019
Pinning this to the top of my timeline (replacing my open questions thread – https://t.co/FpCTtIXns4), this is my run-down on the @CommonsCMS report released tonight. I attempted to surface important Facebook elements that may be too deep for the general media coverage. https://t.co/vvLXmpyfEW
— Jason Kint (@jason_kint) February 18, 2019
The shortsighted greed and callousness by Facebook is baiting a draconian response from European legislators. “Safe harbor” is up for debate, free speech to be heavily sanctioned, and there’ll be few willing to defend Facebook (and thus rest of internet). https://t.co/Nm9Zel96Au
— DHH (@dhh) February 18, 2019
An election called tomorrow would be wide open to abuse—so we welcome the DCMS committee's calls for an overhaul of election and political advertising rules to make them fit for the digital age.
But the govt mustn't be allowed to water them down. https://t.co/uAo29XhiO5 [1/5]
— Full Fact (@FullFact) February 18, 2019