Nieman Lab.
Predictions for
Journalism, 2024.
It is truly magical to speak to a machine using the same language you’d use to talk to a human. And that initial awe at the technology commonly known as generative AI caught a lot of us in the hype over artificial intelligence this past year.
We also learned that while the large language models (LLMs) that power generative AI are great at stringing words together, they have a tendency to BS their way through it. A few media outlets ran into trouble publishing AI-generated words without paying attention to what the words actually said. A promising year also ended with the very human turmoil at OpenAI between the nonprofit board and CEO Sam Altman, raising concerns about governance and control of these powerful tools.
All that shouldn’t lead us to dismiss how dramatically generative AI and advances in machine learning will change the world. On the product side, much AI work now focuses on exactly these challenges, using techniques like retrieval-augmented generation (RAG) to ground language models in specific, citable sources for their responses. Politicians are also acting quickly to enact guardrails and regulations to manage potential risks, like the Biden administration’s executive order on AI.
We have barely scratched the surface of how this new way to interact with machines using simple language will supercharge human capabilities. AI developments relating to journalism will take a few directions in the coming year:
More AI helping behind the scenes, less handing reins to AI: One common refrain has been that “AI won’t replace humans, but humans who use AI will replace those who don’t.” That’s been true about many technological innovations, from the steam engine to the internet. Jobs change. To that end, we will see more useful AI-fueled products that empower journalists behind the scenes to help with mundane aspects of their jobs. There will be more AI “interns” or “co-pilots” assisting with fact-checking, setting up calls with sources, doing basic research — but the humans will definitely remain in the loop, and in control.
More AI models built in-house: So far, the only major media company to build its own large language model is Bloomberg. But the costs of training LLMs are falling, and other large media companies are surely working on similar projects to train or fine-tune models on their archives. As the technology becomes cheaper and more democratized, media companies of every size will eventually have a custom LLM that reflects their unique style and voice, pulls from their archives, tracks the sources they cite and monitors them for updates, draws on public data to find insights, and much more. Some of these models will also power consumer-facing products for the communities they serve, offering search and personalization that is actually useful.
More open source, less reliance on large platforms: We have a chance to try again, to learn from the tortured history between technology companies and journalism, and to stop tying our fates to companies whose interests aren’t aligned with ours. Open source is progressing quickly, and we can build technologies that are more resilient, not dependent on a single company that acts as a gatekeeper or a single point of failure. Journalism is inherently open source, built on sharing what we learn to better our communities, and we should work with companies that operate in a similar spirit of sharing and transparency.
Burt Herman is cofounder and board chair of Hacks/Hackers.