Facebook today announced a batch of changes to how it organizes your News Feed, and their organizing principle seems to be: Maybe we should ask people what they want to see? Here’s product management director Aastha Gupta:
Our goal with News Feed is to arrange the posts from friends, Groups and Pages you follow to show you what matters most to you at the top of your feed. Our algorithm uses thousands of signals to rank posts for your News Feed with this goal in mind. This spring, we’re expanding on our work to use direct feedback from people who use Facebook to understand the content people find most valuable. And we’ll continue to incorporate this feedback into our News Feed ranking process.
As the kids on Filepile used to say, [this is good]. An overreliance on implicit feedback has historically biased Facebook’s algorithms toward content that drives big emotions, whether good or bad — stuff that inspires you to tap that ❤️ or 😡. Anyone who’s done a consumer survey knows that what people want and what people say they want can be two different things. But giving people more room to say “I like that, more please” and “Um, please, shut my uncle up before he memes again” should improve the signal quality.
One part of that direct feedback is what Facebook calls “worth your time” surveys:
In 2019, we introduced surveys to ask people, “Is this post worth your time?” and we use that feedback to inform how we arrange posts in their News Feed going forward. For example, if people say a post is worth their time, we’ll aim to show posts like that higher in News Feed, and if it isn’t worth their time, we’ll aim to show posts like that closer to the bottom. We also use surveys to better understand how meaningful different friends, Pages, and Groups are to people, and ranking algorithms are updated based on the responses.
While a post’s engagement — or how often people like it, comment on it, or share it — can be a helpful indicator that it’s interesting to people, this survey-driven approach, which largely occurs outside the immediate reaction to a post, gives a more complete picture of the types of posts people find most valuable and what kind of content detracts from their News Feed experience. Now, we’re building on these surveys by asking new questions about the content people find valuable as well as the content people don’t enjoy seeing in their News Feed.
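Facebook doesn’t publish the actual ranking math, but the mechanism described above is easy to sketch: keep the implicit engagement signals the model already uses and blend in an explicit, survey-derived “worth your time” score. Here’s a toy illustration of that idea in Python; every weight, threshold, and field name is invented for the example, not drawn from anything Facebook has disclosed.

```python
# A toy sketch, NOT Facebook's actual ranking system: one way explicit
# survey feedback ("is this post worth your time?") could be blended with
# implicit engagement signals when ordering a feed. All weights and field
# names here are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    likes: int
    comments: int
    shares: int
    # Hypothetical aggregate: fraction of surveyed viewers who said
    # posts like this one are worth their time.
    worth_your_time_rate: float

def rank_score(post: Post, engagement_weight: float = 0.4,
               survey_weight: float = 0.6) -> float:
    """Blend implicit engagement with explicit survey feedback."""
    engagement = post.likes + 2 * post.comments + 3 * post.shares
    engagement_signal = engagement / (engagement + 100)   # squash into [0, 1)
    survey_signal = post.worth_your_time_rate              # already in [0, 1]
    return engagement_weight * engagement_signal + survey_weight * survey_signal

posts = [
    Post("angry-meme", likes=900, comments=400, shares=250, worth_your_time_rate=0.15),
    Post("local-news", likes=40, comments=10, shares=5, worth_your_time_rate=0.70),
]
# Highest-scoring posts go to the top of the feed.
feed = sorted(posts, key=rank_score, reverse=True)
```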
Whenever Facebook tweaks its algorithms, journalists and publishers circle back to the eternal question: Yes, but is it good for the news? Will asking for more direct feedback mean news stories’ tiny share of people’s News Feeds — which Facebook estimates at less than 4% — goes up or down?
Well, that’ll depend on how people respond, of course. If you say you like news stories, you’ll probably get more of them. But it’ll also depend on the questions Facebook is asking. One of those questions made my eyes roll, but that probably says more about the deep rot at the center of my soul than anything wrong with Facebook:
Whether people find a post inspirational: People have told us they want to see more inspiring and uplifting content in News Feed because it motivates them and can be useful to them outside of Facebook. For example, a post featuring a quote about community can inspire someone to spend more time volunteering, or a photo of a national park can inspire someone to spend more time in nature. To this end, we’re running a series of global tests that will survey people to understand which posts they find inspirational. We’ll incorporate their responses as a signal in News Feed ranking, with the goal of showing people more inspirational posts closer to the top of their News Feed.
That should help cross-cultural-hugs-near-murals content, in the vein of old Upworthy, but doesn’t sound like it’ll help most hard news. This one might not either:
Better understanding content people want to see less of: Increasingly, we’re hearing feedback from people that they’re seeing too much content about politics and too many other kinds of posts and comments that detract from their News Feed experience. This is a sensitive area, so over the next few months, we’ll work to better understand what kinds of content are linked with these negative experiences. For example, we’ll look at posts with lots of angry reactions and ask people what kinds of posts they may want to see less of.
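Again, the underlying mechanic is simple enough to sketch, even if Facebook hasn’t spelled it out: flag posts that draw an outsized share of angry reactions, survey viewers about them, and use the “show me less of this” responses to demote similar posts. The toy function below is purely illustrative; the thresholds, penalty, and field names are assumptions, not Facebook’s.

```python
# A toy sketch, under stated assumptions, of using "I want to see less of
# this" survey responses to demote posts that draw lots of angry reactions.
# Thresholds, penalties, and field names are all invented for illustration.
from dataclasses import dataclass

@dataclass
class PostSignals:
    base_score: float        # whatever the ranking model already produced
    angry_reactions: int
    total_reactions: int
    see_less_rate: float     # fraction of surveyed viewers who said "show me less"

def adjusted_score(p: PostSignals,
                   angry_threshold: float = 0.3,
                   max_penalty: float = 0.5) -> float:
    """Demote posts people explicitly say detract from their feed."""
    angry_share = p.angry_reactions / max(p.total_reactions, 1)
    if angry_share < angry_threshold:
        return p.base_score          # not the kind of post being targeted
    # Scale the penalty by how many surveyed viewers wanted less of it.
    penalty = max_penalty * p.see_less_rate
    return p.base_score * (1 - penalty)
```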
It’ll be interesting to see whether this means getting rid of more politics news (i.e., stories in mainstream media about local or national politics) or politics content (i.e., partisan messaging of the kind that historically thrives on Facebook and doesn’t do much to lower the temperature in the room). “Less politics” might hit the former; “fewer posts that make people angry” might hit the latter.
Gupta and Facebook already announced a broad depoliticization of the News Feed in February. (I’d imagine the end of the Trump administration contributed to that depoliticization even without any algorithmic intervention.)
This seeming uptick in humility — acknowledging that Facebook’s algorithms don’t always increase user happiness or quality of experience — pairs well with last month’s change to make it easier for people to see their News Feed in raw reverse chronological order, an unmunged form that ignores the algorithm altogether.
All in all, it’s hard to criticize Facebook for being interested in learning what its users want. If someone doesn’t want any politics in their News Feed, that’s fine! I don’t expect Facebook to force-feed news to people who don’t want it. All I want is for it to avoid using its imperfect system of implicit signals to push people to content they wouldn’t say they wanted if you asked them beforehand.
Not many people would check a box that said: “Yes, I’d like an ever-increasing dose of ever-more-radicalizing political content that makes me more prone to conspiracy theories and hatred of the political ‘other.’” And yet that’s what overfitting an algorithm to flawed inputs can do. Doing more explicit asking — “Do you like this? Do you want more or less of it?” — can only be a step in the right direction.