Facebook is taking small steps toward reducing the amount of political news that some users in some countries see in their News Feeds, the company announced this week in a blog post:
Over the next few months, we’ll work to better understand peoples’ varied preferences for political content and test a number of approaches based on those insights. As a first step, we’ll temporarily reduce the distribution of political content in News Feed for a small percentage of people in Canada, Brazil and Indonesia this week, and the US in the coming weeks. During these initial tests we’ll explore a variety of ways to rank political content in people’s feeds using different signals, and then decide on the approaches we’ll use going forward. Covid-19 information from authoritative health organizations like the CDC and WHO, as well as national and regional health agencies and services from affected countries, will be exempt from these tests. Content from official government agencies and services will also be exempt.
U.S. Facebook users already don’t see much political content in their News Feeds, according to both the company and research we did last year. More than half the people in our survey saw no news at all. I wrote at the time:
141 of the 1,730 posts in my sample could be described as political content … In other words, in this survey, political content made up about 8% of what people saw. That’s pretty close to Facebook’s claimed 6%.
In the new test, Facebook will use “a machine learning model that is trained to look for signals of political content and predict whether a post is related to politics,” the company said in a statement.
It’s not clear how much this (again, temporary) change will affect the feeds of the small number of people in the experiment, how Facebook’s algorithm will define “politics,” or what kinds of posts will be included in the changes: news articles and breaking political news? Memes? Political rants from relatives? Matt Kiser, who runs the politics blog and newsletter “What the Fuck Just Happened Today?,” tweeted that he suspects the changes are underway already:
Pretty sure it’s already in the US… the @WTFJHT page is down 24% in reach over last 7 days and engagement down 34%. Never seen such a dramatic drop.
Thank god I didn’t build my business on Facefuckingbook. https://t.co/mYaDlC2oCF
— Matt Kiser (@Matt_Kiser) February 10, 2021
A lot of political discussion happens within private groups — and Facebook’s own internal analysis, made public by The Wall Street Journal in January, found that “70% of the top 100 most active U.S. Civic Groups are considered non-recommendable for issues such as hate, misinfo, bullying and harassment.” Facebook said last year, and reiterated in January, that it had stopped automatically recommending new political groups to its users — though an investigation by The Markup found it was continuing to do so, and was especially likely to recommend groups to Trump voters. It’s unclear how political content from groups will be treated in the new, experimental News Feed changes.