Do you recognize the following websites? (Options: Yes / No)
That’s the first question on a survey Facebook wrote itself and will show to users, the results of which the company will use as a signal affecting news publishers’ ranking in the News Feed, according to BuzzFeed News, which first obtained the full survey.
Here’s the second — and final — question Facebook intends to ask its users: How much do you trust each of these domains? (Options: Entirely / A lot / Somewhat / Barely / Not at all)
Publishers responded with eye rolls and snark, along with the ever-present fear that traffic to news sites will dry up and that “bloodletting” will follow at organizations fixated on absolute traffic numbers. Mark Zuckerberg has said in a Facebook post that these updates would “not change the amount of news [users] see on Facebook,” and that these “tweaks for trust” would only mean a change in the “balance of news you see towards sources that are determined to be trusted by the community.”
The survey is brief; how exactly it will be used is less clear-cut. Adam Mosseri, Facebook’s VP for News Feed (and media Twitter’s favorite Facebook rep), has responded directly, and with some specificity, to questions and comments on Facebook and Twitter.
Second thought: The fact that the survey itself is simplistic doesn't rule out the possibility that Facebook is handling the results with more sophistication than people are assuming. See this thread with @mosseri, head of news feed: https://t.co/dD7vpXawnj
— Will Oremus (@WillOremus) January 24, 2018
How big a ranking signal is this “trust” thing? Casey Newton of The Verge asked, and sort of got an answer:
How trustworthy people feel a publisher is actually a big signal for publishers for which we have data, and not a signal at all for publishers for which we don’t. It’s an important signal with limited coverage.
— Adam Mosseri (@mosseri) January 24, 2018
What’s clear is that Facebook has plenty of data from users already, which it can — will — use in conjunction with its two-question survey.
So, two public questions and answers but lots and lots of answers you have already given based on your behavior?
— David Clinch (@DavidClinchNews) January 24, 2018
Embedded in the trust question are additional user characteristics: Facebook is looking at how trusted a news organization is among users with a range of news consumption habits.
Exactly, the dimension we're looking at is what people read. What we mean by broadly trusted is trusted by people with a wide range of reading habits.
— Adam Mosseri (@mosseri) January 20, 2018
That’s interesting. So the classic issue that results in Fox News being #1 for most trusted News brand and #1 for least trusted News brand works against them vs a News brand trusted by a diverse group? https://t.co/yjxWg7tIwl
— Jason Kint (@jason_kint) January 20, 2018
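Taken together with Mosseri’s earlier point that trust is “an important signal with limited coverage,” that suggests a score that only kicks in when there are enough survey responses, and that rewards publishers trusted across many different kinds of readers. Here is a minimal, purely speculative sketch of what such a “broadly trusted” score could look like; every name, value, and threshold in it is an assumption, not anything Facebook has described.

```python
from collections import defaultdict

# Purely illustrative sketch: a "broad trust" score that rewards publishers
# trusted by readers with a wide range of reading habits, and that is only
# applied when there are enough survey responses ("limited coverage").
# All names and thresholds here are hypothetical.

TRUST_VALUES = {"Entirely": 1.0, "A lot": 0.75, "Somewhat": 0.5,
                "Barely": 0.25, "Not at all": 0.0}

MIN_RESPONSES = 100  # below this, treat the signal as "no coverage"

def broad_trust_score(responses):
    """responses: list of (reading_habit_bucket, trust_answer) tuples,
    where reading_habit_bucket is some coarse label for what a respondent
    usually reads (e.g. derived from the outlets they follow)."""
    if len(responses) < MIN_RESPONSES:
        return None  # "not a signal at all" for publishers without data

    by_bucket = defaultdict(list)
    for bucket, answer in responses:
        by_bucket[bucket].append(TRUST_VALUES[answer])

    # Average trust within each reading-habit bucket first, so one large
    # audience segment can't dominate the score on its own...
    bucket_means = [sum(v) / len(v) for v in by_bucket.values()]

    # ...then average across buckets: a publisher only scores highly if it
    # is trusted across many kinds of readers, which is roughly what
    # "trusted by people with a wide range of reading habits" implies.
    return sum(bucket_means) / len(bucket_means)
```

Under a scheme like that, a publisher with Fox News’ split reputation (highly trusted by one audience, distrusted by others) would land somewhere in the middle, which is roughly Kint’s reading of the exchange.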
Some offered other ideas for measuring trust. Mike Caulfield, head of the Digital Polarization Initiative at the American Democracy Project, laid out this suggestion:
Facebook should ask news sources to provide meta tags as to whether something is news, opinion, or analysis, rate feed quality up or down based on these criteria. Users should also be given a slider control that lets them tweak news/opinion mix in feed.
— Mike Caulfield is tired, so tired. (@holden) January 24, 2018
In the end, rating based on qualities is going to be far more objective than people rating based on whether they trust individual papers: in case you haven't noticed, people aren't that objective, which is why we develop *methods* of reducing bias like the above.
— Mike Caulfield is tired, so tired. (@holden) January 24, 2018
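Caulfield’s two tweets describe a concrete mechanism: publisher-declared content-type labels plus a per-user control over the news/opinion mix. As a rough illustration only, here is what that could look like as a re-ranking step; the labels, the slider semantics, and the weights are all hypothetical, and nothing here reflects any real Facebook or publisher API.

```python
# Hypothetical sketch of Caulfield's proposal: publishers label each item
# as "news", "opinion", or "analysis" (e.g. via a meta tag), and each user
# gets a slider that tweaks how much opinion shows up in their feed.
from dataclasses import dataclass

@dataclass
class Item:
    url: str
    content_type: str   # "news", "opinion", or "analysis", declared by the publisher
    base_score: float   # whatever relevance score the feed already assigns

def rerank(items, opinion_slider):
    """opinion_slider: 0.0 = news only, 1.0 = happy to see lots of opinion.
    Opinion pieces are down-weighted when the slider is low; news is left
    alone; analysis sits in between."""
    weights = {
        "news": 1.0,
        "analysis": 0.5 + 0.5 * opinion_slider,
        "opinion": opinion_slider,
    }
    return sorted(items,
                  key=lambda it: it.base_score * weights.get(it.content_type, 1.0),
                  reverse=True)

# Example: a reader who sets the slider near zero sees opinion pieces
# pushed toward the bottom of the feed, without any trust survey involved.
feed = rerank([Item("https://example.com/a", "news", 0.8),
               Item("https://example.com/b", "opinion", 0.9)],
              opinion_slider=0.2)
```

The appeal of an approach like this, per Caulfield’s second tweet, is that rating content against declared qualities is a method for reducing bias, rather than asking people to rate how much they trust individual outlets.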
From the rollout of the News Feed announcement to Facebook’s follow-up, it can feel as if there’s a disconnect between teams (is this a pure engineering problem, or are the algorithms flawed and in need of a human touch?) and within its leadership (why has Facebook’s head of news partnerships, Campbell Brown, been so quiet?).
Facebook (and I'm totally speculating here) doesn't see the fake news problem the same way journalists or the general public thinks of it. They probably see it purely as a technical problem. How to prevent bad actors from gaming the inherent feedback loops (using ad placement)…
— irwin (@irwin) January 24, 2018
The more pertinent question is how Facebook defines "community" — is it people who live in the same city? The people I'm friends with? The people I interact with most? Think about college towns — how do you draw those community lines?
— irwin (@irwin) January 24, 2018
Some things are certain: Facebook will roll out new products for news. Announcements will use the phrases “meaningful connections” and “friends and family” and “community.” And news organizations will still stay on the platform, only a little bit less visible to users, and a little bit more battered by uncertainty.
Facebook's equation for local news:
-Updates on local businesses
-Syndicated national news
-Events today
-Classifieds

Huh, almost like local news has been doing its job and more, without FB, for decades https://t.co/briTZmgpWi
— Dave Gershgorn (@davegershgorn) January 25, 2018