June 21, 2023, 2:37 p.m.
Audience & Social

People don’t want robots picking their headlines (but they don’t really want editors doing it either)

“People do not usually have contrasting views of human and algorithmic selection. If they are skeptical of one, they’re likely to be skeptical of the other.”

Even though the use of social media as a source of news has seen little growth in recent years, this year’s Digital News Report again highlights the centrality of social media, search engines, news aggregators, and other platforms that use algorithms to select news. Younger groups in particular continue to be exposed to more news chosen by algorithms rather than by journalists or editors via publishers’ websites and apps, but they also seem to be increasingly ambivalent about the results.

Despite early enthusiasm about the democratizing potential of search engines and social media, the last decade has mainly been characterized by concern over possible negative effects — not just over the spread of misinformation, but also the possible emergence of “filter bubbles,” where algorithms combine data on past behavior and inferred preferences to show people more of what they like at the expense of diverse news and information exposure.
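To make that mechanism concrete, here is a minimal, purely illustrative sketch of content-based personalization: scoring candidate stories by their similarity to what a reader has clicked before, so familiar topics rise to the top. Every function, field name, and data structure here is a hypothetical stand-in; real platform ranking systems are far more complex and are not publicly disclosed.

```python
from collections import Counter

def topic_profile(clicked_stories):
    """Build a simple interest profile: how often each topic appears
    in the stories a reader has clicked on in the past."""
    counts = Counter(topic for story in clicked_stories for topic in story["topics"])
    total = sum(counts.values()) or 1
    return {topic: n / total for topic, n in counts.items()}

def rank_candidates(candidates, profile):
    """Score each candidate story by overlap with the reader's profile,
    so stories resembling past consumption float to the top."""
    def score(story):
        return sum(profile.get(topic, 0.0) for topic in story["topics"])
    return sorted(candidates, key=score, reverse=True)

# Hypothetical example: a reader who has mostly clicked sports stories.
past = [{"topics": ["sports"]}, {"topics": ["sports", "business"]}]
pool = [
    {"title": "Cup final preview", "topics": ["sports"]},
    {"title": "Election explainer", "topics": ["politics"]},
]
profile = topic_profile(past)
for story in rank_candidates(pool, profile):
    print(story["title"])  # the sports story ranks first; the politics story is pushed down
```

The worry about filter bubbles is visible even in this toy version: nothing in the scoring rewards stories outside the reader’s existing interests.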

There has been extensive research into whether algorithmically driven platforms really do trap people inside filter bubbles — and at least when it comes to the question of diverse news exposure, this does not appear to be happening. For now, although most people do not have particularly diverse news repertoires, platform use appears to slightly increase the diversity of people’s general news exposure, even as some people use social media to form partisan clusters around specific political issues.

There’s far less research on what people think about algorithmic news selection, and how they feel about news on platforms more broadly. This is a crucial missing piece of the puzzle, as the effects of news exposure at least partly depend on how people receive and process it.

Skepticism about all forms of selection grows

As part of our 2023 Digital News Report survey, designed to research people’s news use, behaviors, and attitudes across 46 markets, we asked a series of questions to probe how people think about algorithmic news selection. We asked respondents whether they agree that “having stories automatically selected for me on the basis of ‘what I have consumed in the past’ or ‘what my friends have consumed’ is a good way to get news.” To help interpret the results, we also asked respondents a similarly worded question about news selected by “editors and journalists.” We asked these same questions in our 2016 survey, allowing us to look at change over time.1

The results reveal that people are skeptical of all these ways of selecting news. Just under one third (30%) think that having stories selected on the basis of their own past behavior is a good way to get the news. Even fewer — just one fifth (19%) — say the same about news selected based on their friends’ past behavior.

Perhaps surprisingly, people are similarly skeptical of news selected by editors and journalists (27%), perhaps because many see these traditional sources as also laden with agendas and biases.

What we find, then, is what we’ve previously called “generalized skepticism,” whereby people are skeptical of all forms of news selection, whether done by humans or by algorithms. What’s more, attitudes about all forms of selection are positively correlated. This means that people do not usually have contrasting views of human and algorithmic selection. If they are skeptical of one, they’re likely to be skeptical of the other.
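As an illustration of what “positively correlated” means here, the short sketch below computes a Pearson correlation between two sets of agreement scores, such as 1–5 Likert responses about editorial and algorithmic selection from the same respondents. The numbers are invented for demonstration and do not reproduce the report’s actual analysis.

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical 1-5 agreement scores from the same eight respondents:
# attitude toward editorial selection vs. algorithmic selection.
editorial   = [4, 2, 5, 1, 3, 2, 4, 1]
algorithmic = [3, 2, 4, 1, 3, 1, 5, 2]

r = correlation(editorial, algorithmic)
print(f"Pearson r = {r:.2f}")  # a positive r: skeptics of one tend to be skeptics of the other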

Proportion that agrees that each is a good way to get news — average of selected countries

Q10D_2016a_1/2/3. Please indicate your level of agreement with the following statements. Having stories selected for me by editors and journalists/automatically selected for me on the basis of what I have consumed in the past/automatically selected for me on the basis of what my friends have consumed is a good way to get news. Base: Total samples in 2016 = 53,330, 2023 = 53,039. Note: Questions asked in USA, UK, Germany, France, Italy, Spain, Portugal, Ireland, Norway, Sweden, Finland, Denmark, Belgium, Netherlands, Switzerland, Austria, Hungary, Czech Republic, Poland, Greece, Turkey, Japan, South Korea, Australia, Canada, Brazil.

The results from 2023 have changed little since we last measured them in 2016. Across all markets, the proportion that think their past consumption is a good basis for news selection has dropped by 6 percentage points, with a smaller 3 percentage point fall in approval of editorial selection and social recommendations. Importantly, disapproval has remained stable, and instead there has been a parallel 4–6 percentage point increase in the middle “neither agree nor disagree” category.

Overall, this means that attitudes have remained broadly stable, with a slight increase in ambivalence over time. But in some countries, such as the U.K., the changes from 2016 are relatively large — especially for news selected by algorithms on the basis of past consumption, and especially among users under the age of 35.

Proportion that agrees that having stories automatically selected for them based on “what I have consumed in the past” is a good way to get news — UK and USA

Q10D_2016a_2. To what extent do you agree with the following statement? Having stories automatically selected for me on the basis of what I have consumed in the past is a good way to get news. Base: U35s/35+ in each year in the UK ≈ 449/1658, USA ≈ 603/1478.

Why are we seeing this shift among younger users? Many young people express low levels of interest in the news, and those who are interested often feel overwhelmed by the negative nature of news in their social media feeds. Young people are also more likely to use a wider range of social networks, each using algorithmic news selection in a different way, which could make it harder for some people to form a clear view. And we continue to see reasonably high levels of concern that “overly personalized” news could lead to missing out on important information (48%) or being exposed to fewer challenging viewpoints (46%).

Proportion that worries about missing out due to personalization — Average of selected markets

Q10D_2016b_1/2. To what extent do you agree with the following statement? I worry that more personalized news may mean that I miss out on important information/challenging viewpoints. Base: Total samples in USA, UK, Germany, France, Italy, Spain, Portugal, Ireland, Norway, Sweden, Finland, Denmark, Belgium, Netherlands, Switzerland, Austria, Hungary, Czech Republic, Poland, Greece, Turkey, Japan, South Korea, Australia, Canada, Brazil = 53,039.

Given these concerns, it is perhaps not surprising that some people have tried to influence story selection by following or unfollowing, muting or blocking, or changing other settings. But for those who do, the key objective is not to make the feed more fun or more interesting, but rather to make it more reliable, less toxic, or with a greater diversity of views. Yet, despite these clearly stated preferences, some social media companies competing for attention and advertising continue to optimize for engagement, with less attention to increasing quality, reliability, or diversity.

Proportion that say they are trying to achieve each when changing what news and information they see on online platforms — selected countries

Q2_Algorithms_2023. You said that you try to change what news and information you see on online platforms. What are you trying to achieve? Please select all that apply. Base: All those that tried to change what news they see via algorithms in UK = 910, USA = 1205, Germany = 996.

Over the next few years, the increased use of artificial intelligence is likely to place even more focus on the role of algorithms in shaping the selection and understanding of news and information. Our research highlights people’s generalized skepticism about how content is selected for them today, so there is clearly room for something better.

While news consumers like the convenience of automatically selected content, they also worry about missing out on important news or challenging perspectives. Many news organizations are now exploring whether it is possible to mix algorithmically selected news with editorial approaches in a way that delivers more relevant, reliable, and valuable content for consumers, but given current levels of skepticism, the journey is unlikely to be straightforward.
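One simple version of such a mix, offered purely as a hypothetical sketch rather than a description of any publisher’s system, is to reserve fixed slots in the feed for editor-selected stories and fill the remaining slots with personalized picks. All names and parameters below are assumptions for illustration.

```python
def blended_feed(editor_picks, personalized, slots=10, editorial_every=3):
    """Interleave editor-chosen stories into a personalized ranking:
    every `editorial_every`-th slot is reserved for an editorial pick."""
    feed, e, p = [], iter(editor_picks), iter(personalized)
    for position in range(slots):
        source = e if position % editorial_every == 0 else p
        story = next(source, None) or next(p, None) or next(e, None)
        if story is None:
            break
        feed.append(story)
    return feed

# Hypothetical usage: positions 0 and 3 come from editors, the rest from the algorithm.
print(blended_feed(["editor A", "editor B"], ["algo 1", "algo 2", "algo 3", "algo 4"], slots=6))
```

Even a blend this crude surfaces the underlying tension the survey points to: how many slots to hand to editors versus the algorithm is itself a judgment that skeptical audiences may not trust either way.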

Richard Fletcher is the director of research at Oxford University’s Reuters Institute for the Study of Journalism. Nic Newman is a senior research associate there.

Photo by Possessed Photography on Unsplash.

  1. To have data that was comparable to 2016, we only asked these questions in markets that were included in the 2016 survey: USA, UK, Germany, France, Italy, Spain, Portugal, Ireland, Norway, Sweden, Finland, Denmark, Belgium, Netherlands, Switzerland, Austria, Hungary, Czech Republic, Poland, Greece, Turkey, Japan, South Korea, Australia, Canada, and Brazil. Here are more details about the methodology for the 2023 Digital News Report.