March 20, 2018, 12:53 p.m.
Business Models

The Facebook-Cambridge Analytica fallout continues. Data breach? No. Pretty bad? Yes

Hidden cameras. Leadership disagreements. And, oh yeah, misinformation is still a problem.

We’re on to the fourth day of the Facebook-Cambridge Analytica revelations, and the pendulum of scrutiny is swinging further and further toward Facebook itself.

My colleague Laura Hazard Owen tracked the initial wave over the weekend and the fallout continues. The story seems to have staying power and is spreading (information warfare expert Molly McKew wrote a Cosmo explainer about it).

As a refresher, the news that broke Saturday was that the data analytics firm Cambridge Analytica harvested data from millions of Facebook users to build a system that would target voters with personalized political advertisements. New reporting is building on that story.

Hey Facebook, I think there are a few people who want to have meaningful conversations with you.

This wasn’t a data breach — it was a result of Facebook’s business model, Zeynep Tufekci points out in a New York Times op-ed.

This wasn’t a breach in the technical sense. It is something even more troubling: an all-too-natural consequence of Facebook’s business model, which involves having people go to the site for social interaction, only to be quietly subjected to an enormous level of surveillance. The results of that surveillance are used to fuel a sophisticated and opaque system for narrowly targeting advertisements and other wares to Facebook’s users.

Motherboard also explained to readers why this isn’t a breach.

Saying that “everyone involved” consented seems misleading, given that only around 270,000 out of the 50 million people who got their data harvested reportedly signed up for the app. The others probably had no idea this app even existed. And since Facebook changes its privacy settings so frequently, we also don’t know if the people who agreed to use the app fully understood what kind of data they were giving up. And no one at the time knew the data would later be handed out to a shadowy data analytics firm hired by the Trump campaign….

We’ve been regularly covering data breaches for years. No one hacked into Facebook’s servers exploiting a bug, like hackers did when they stole the personal data of more than 140 million people from Equifax. No one tricked Facebook users into giving away their passwords and then stole their data, like Russian hackers did when they broke into the email accounts of John Podesta and others through phishing emails.

In 2014, when [Aleksandr] Kogan collected the data of 50 million people, he was playing by the rules. At the time, Facebook allowed third party apps to collect not only the data of the people who consented to giving it up, but also their friends’ data. The company later shut down this functionality.
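To make the mechanics concrete, here is a minimal, hypothetical sketch of the kind of request a third-party app could make under the pre-2014 Graph API; the Python code, endpoint version, field names, and token handling are illustrative assumptions, not Kogan’s actual app.

```python
# Hypothetical sketch only: roughly how a third-party app could pull friends'
# data under Facebook's old Graph API v1.0, assuming the app had been granted
# the since-removed "friend permissions" (e.g. friends_likes, friends_location).
# The token, fields, and version string below are illustrative placeholders.
import requests

ACCESS_TOKEN = "USER_ACCESS_TOKEN"  # token from the one user who installed the app
GRAPH = "https://graph.facebook.com/v1.0"

def fetch_friends(fields="name,location,likes"):
    """Fetch the consenting user's friends plus whatever profile fields the
    friend permissions exposed at the time, following pagination links."""
    url = f"{GRAPH}/me/friends"
    params = {"access_token": ACCESS_TOKEN, "fields": fields}
    friends = []
    while url:
        page = requests.get(url, params=params).json()
        friends.extend(page.get("data", []))
        url = page.get("paging", {}).get("next")  # next page of results, if any
        params = {}  # the "next" URL already embeds the query string
    return friends

# One install could surface data on hundreds of friends who never saw a
# consent screen, which is how ~270,000 installs scaled to tens of millions
# of profiles.
print(len(fetch_friends()))
```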

A former Facebook platform operations manager said Facebook looked the other way when he raised the alarm about these issues years ago.

[Sandy] Parakilas said he “always assumed there was something of a black market” for Facebook data that had been passed to external developers. However, he said that when he told other executives the company should proactively “audit developers directly and see what’s going on with the data” he was discouraged from the approach.

He said one Facebook executive advised him against looking too deeply at how the data was being used, warning him: “Do you really want to see what you’ll find?” Parakilas said he interpreted the comment to mean that “Facebook was in a stronger legal position if it didn’t know about the abuse that was happening”.

Where is Facebook’s apology tour? We’ve become accustomed to Facebook executives stepping out and nodding their heads understandingly after revelations about their platform, from acknowledging Russian interference in the 2016 presidential election to missing the memo on how advertisers could take advantage of offensive targeting options. (Hey, remember when they apologized for messing with users’ emotions in a science experiment?) Facebook’s VP of marketing answered a question at a conference yesterday, saying that the company is “outraged and beyond disturbed” at the Cambridge Analytica news and that “if the allegations are true, this is an incredible violation of everything that we stand for.” But there’s no statement from Sheryl Sandberg or Mark Zuckerberg yet. (Facebook is holding a Q&A for its employees on Tuesday, The Verge reported.)

Facebook’s chief security officer, Alex Stamos, tried to offer clarifications on Twitter over the weekend but deleted his tweets “not because they were factually incorrect but because I should have done a better job weighing in.” The New York Times reported that Stamos is planning to leave the company, a reflection of “heightened leadership tension at the top of the social network.” The tension stems from disagreements over how much Facebook should disclose about foreign interference on the platform ahead of the 2018 midterm elections. Stamos was reportedly on the share-more side but went from overseeing a team of 120 people to three, according to the Times.

Facebook updated its blog post about hiring a digital forensics firm to conduct an audit at Cambridge Analytica’s offices, adding that the auditors “stood down” at the request of the United Kingdom’s Information Commissioner’s Office.

Are politicians actually going to do something about this? On Tuesday morning, Bloomberg reported that the U.S. Federal Trade Commission is probing whether Facebook violated the terms of a consent decree in sharing the user data that ended up with Cambridge Analytica. Facebook could be fined more than $40,000 a day per violation. Some Republican and Democratic senators are calling for Zuckerberg (and Google and Twitter execs) to testify before Congress. (A White House spokesperson told Fox News, “If Congress wants to look into the matter or other agencies want to look into the matter, we welcome that.”) European regulators are also seeking more information and testimony, especially with the European Union’s General Data Protection Regulation coming into effect in May.

And as Axios’ Sara Fischer notes, “Facebook shares fell nearly 7 percent at market close on Monday. Its stock hasn’t seen this type of a drop in response to any of the major scandals it’s faced over the past year. Even during the Russia hearings on Capitol Hill, Facebook stock hit record highs.”

Undercover reporters confirmed that Cambridge Analytica was no good guy. Channel 4 hid a camera in a London meeting between reporters posing as prospective clients and Alexander Nix, the data analytics firm’s CEO. Nix is recorded suggesting his company can put politicians in compromising situations with bribes and Ukrainian sex workers, adding, “We’re used to operating through different vehicles, in the shadows, and I look forward to building a very long-term and secretive relationship with you.”

Now, the company has backtracked and said the video was “edited and scripted to grossly misrepresent the nature of those conversations” and Nix was just playing along with “a series of ludicrous hypothetical scenarios” to avoid embarrassing the client.

Meanwhile, misinformation is still rampant on the platform. BuzzFeed’s Craig Silverman, Jane Lytvynenko, and Lam Thuy Vo published a story Monday about the “spammers, hackers, and trolls” taking advantage of Facebook’s new focus on Groups to wreak havoc and spread misinformation. It’s not pretty, and it shows that “meaningful interactions” can still be hijacked for unseemly purposes.

Photo of Cambridge Analytica CEO Alexander Nix by Web Summit used under a Creative Commons license.
