Prediction
Adjusting to a tech-heavy but code-light world
Name
Daniel Trielli
Excerpt
“In a world where GenAI presents itself as the solution, we will return to the social sciences, Humanities, and even old-style journalism.”
Prediction ID
44616e69656c-25
 

In 2025 — and beyond — we will discover, once again, that the transformative new shiny gizmos we put to work in newsrooms are just the beginning of a long, continuous journey to improving journalism and society.

In the past couple of years, newsrooms have been flooded with offers for new state-of-the-art software solutions that promise to fix various issues, increase productivity, augment reach, and beef up engagement. The advent of generative AI made its impact felt in every type of work — news publishing included — and became one of the most talked-about topics (see last year’s Nieman Lab Predictions). In 2025, as we domesticate AI and learn how to incorporate these tools to fit our purposes and capabilities, we’ll discover that the skill to use tech is just the starting point: Tech is just one of the many tools at our disposal to accomplish our goals of social impact, alongside other strengths like market position, storytelling skills, political power, and relationships with our audience and with the people and entities that we cover.

I’m no Luddite; if I were, I wouldn’t be teaching data journalism and the potential positive impact of AI in newsrooms. And I would have spent less time during my newsroom years telling people we should automate some of the stuff we did, like copying information off websites or cleaning up the same typos or style problems that came from the news wire. Technology can meet a lot of needs: How to be faster. How to have a prettier design. How to expand on what we can show to our readers.

But while technology enables us to do things, it gives little explanation of why we should be doing things. So in 2025, as those shiny new toys start to feel the stress of day-to-day journalism, the old essential questions will continue to bedevil us: How do readers (now with their own lives thoroughly impacted by AI) understand, internalize, and act upon what they see in the news? How will our reporting and publishing change society and the human experience? Are we writing about the things we should be writing about — not to satisfy some algorithmic platform out there, but to make our existence have meaning? How much longer will we be able to report on what we should be reporting, given economic, technological, and political forces? How do these forces converge to impact what we do?

This has happened before. When search and social media emerged as news publishers’ primary sources of web traffic, the hype revolved around all the metrics that platforms could offer. With time, however, the race for raw audience reach lost its luster. Audience editors realized that having a large number was just part of the goal. The quality of that number — how engaged this audience was, how socially impactful the story was, and what readers would go on to do — those were the crucial things.

That’s the deal with technology: The solutions it creates can be great, but they can also be solutions for things that aren’t actually problems — or aren’t the problems we should be caring about. The internet allows us to see how many minutes a user stays on a page, how that user scrolls up or down, or what they click on. What we should focus on, however, is what that reader (as a reader, not a user) is thinking about when they are scrolling through news outlets’ pages.

Then, in 2016, at the peak of data journalism excitement and the hype around predictive polling, reality came crashing down on an industry that had predicted a Hillary Clinton victory. In the post-mortem of that election, some commentators criticized election prediction in particular and data journalism in general. But what 2016 taught us was not that data and computation shouldn’t have a role in the news — it was that they are techniques that have limitations and have to be understood in the social context in which they are deployed.

My old doctoral department at Northwestern University had a great mix of technologists and social scientists. That was the nature of the department, which focused on researching the interaction between technology and how we communicate. In that environment, it was always illuminating to see great scholars of computer science and engineering realize the limitations of their solutions simply due to human behavior. Users often do what we don’t expect, and the implications of our designs are usually only visible by having a broader, more critical view of social mechanisms. A motto emerged: Social sciences are the hard sciences.

In a world where AI presents itself as the solution, we will return to the social sciences, humanities, and even old-style journalism. Why? Because complex answers require complex questions and complex analysis. AI aggregates knowledge, and it does that relatively well. But we still have plenty of knowledge to generate about the human condition.

In recent years, we’ve seen a dramatic upswing in computer science programs, motivated by the tech industry’s encroachment into everything. But with AI helping us code, demand for actual hard coders will subside. Computer science education is adjusting to the new times by increasing the focus on abstraction in its curriculum. Futurists are pointing to the importance of the humanities in a new tech-heavy but code-light world. Even in the boldest dreams about AI and journalism integration, it’s clear that market forces and human behavior will drive success, not technology alone. The new wave in journalism will not be what we can take from technologists to improve our industry; it is what journalism can do to shape these technologies and how we can go beyond them.

Daniel Trielli is an assistant professor of media and democracy at the University of Maryland.
