The giant robot in the corner would like a word, or 500. It’s been standing there quietly for a while, helping out with mundane tasks — sorting and filing, looking up info and the like — generally staying out of the way unless called on. You might not have even noticed it. But you will.
Earlier this year, the text-to-image generation tools Stable Diffusion, Midjourney, and DALL-E 2 made impressive public debuts. You’ve no doubt seen the results in your social media feeds. And just this month, OpenAI released ChatGPT, a chatbot based on its unsettlingly smart GPT-3 large language model — quickly driving home both the promise and peril of generative AI.
Made with DALL-E 2: “Illustration of a robot sitting awkwardly at a conference table in a newsroom alongside several editors, who look concerned.” (A human illustrator would no doubt have done this better.)
It’s hard to overstate the disruptive potential of the machine-creative revolution we are witnessing — though some are clearly trying: “The death of artistry.” “The end of high-school English” (and the college essay, too, evidently). We now have non-developers producing functional code and a children’s book written and illustrated entirely by machine.
Of course, AI tools have been all over our industry for a while now: We’ve used them for transcription, translation, grammar checking, content classification, named entity extraction, image recognition and auto-cropping, content personalization and revenue optimization — among other specific purposes.
But emerging use cases made possible by generative tools — including text and image creation and text summarization — will broaden the scope of AI’s impact on our work.
I don’t imagine we’ll see GPT-3-produced copy in the pages of The New York Times in 2023, but it’s likely we’ll turn to machines for some previously unthinkable creative tasks. As we do, we will hopefully reflect on the risks.
Even the best generative AI tools are only as good as their training, and they are trained with data from today’s messy, inequitable, factually challenged world, so bias and inaccuracy are inevitable. Because their models are black boxes, it is impossible to know how much bad information finds its way into any of them.
But consider this: More than 80% of the weighted total of training data for GPT-3 comes from pages on the open web — including, for example, crawls of outbound links from Reddit posts — where problematic content abounds.
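That 80% figure lines up with the dataset weights reported in the GPT-3 paper (Brown et al., 2020), where the two web-derived corpora alone account for 82% of the training mix. A quick back-of-the-envelope check in Python:

```python
# Training-mix weights (fraction of tokens seen during training)
# as reported in the GPT-3 paper (Brown et al., 2020), Table 2.2.
training_mix = {
    "Common Crawl (filtered)": 0.60,  # broad open-web crawl
    "WebText2": 0.22,                 # crawls of outbound links from Reddit posts
    "Books1": 0.08,
    "Books2": 0.08,
    "Wikipedia": 0.03,
}

open_web = training_mix["Common Crawl (filtered)"] + training_mix["WebText2"]
print(f"Open-web share of training mix: {open_web:.0%}")  # → 82%
```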
Add to that the tools’ disconcerting habit of obscuring sources and presenting wildly incorrect information with the same cheery confidence they apply to accurate answers, and you have high potential for misinformation (to say nothing of the dangers of deliberate misuse).
“ChatGPT is incredibly limited, but good enough at some things to create a misleading impression of greatness. it’s a mistake to be relying on it for anything important right now. it’s a preview of progress; we have lots of work to do on robustness and truthfulness.”
— Sam Altman (@sama), December 11, 2022
Will these tools get better? Undoubtedly. We may be in something of an uncanny-valley stage right now, but who knows how long that will last?
Right now, though, my take is this: For all its promise, generative AI can get more wrong, faster — and with greater apparent certitude and less transparency — than any innovation in recent memory. It will be tempting to deploy these tools liberally, and we know that some black-hat SEOs will be unable to pass up the opportunity to publish thousands of seemingly high-quality articles with zero human oversight. But the possible uses will always be more numerous than the advisable ones.
(And what will happen to model training, fact-checking, and general user experience when more and more of the information on the open web is produced by AIs? Will the web become one big AI echo chamber? OpenAI is already trying to build watermarking into GPT-3 to facilitate the detection of AI-generated text, but some experts believe this is a losing battle.)
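Watermarking schemes of the kind researchers have proposed for large language models typically bias generation toward a pseudo-randomly chosen “green list” of tokens at each step, which a detector can later test for statistically. OpenAI has not published its method; the sketch below is a minimal illustration of the general green-list idea, and the hashing scheme, split fraction, and function names are assumptions, not any vendor’s actual implementation:

```python
import hashlib
import math
import random

GREEN_FRACTION = 0.5  # fraction of the vocabulary on the "green list" at each step

def green_list(prev_token: str, vocab: list[str]) -> set[str]:
    # Seed a PRNG with the previous token so that the generator (which nudges
    # sampling toward green tokens) and the detector derive the same
    # pseudo-random partition of the vocabulary.
    seed = int(hashlib.sha256(prev_token.encode()).hexdigest(), 16)
    rng = random.Random(seed)
    return set(rng.sample(vocab, int(len(vocab) * GREEN_FRACTION)))

def watermark_z_score(tokens: list[str], vocab: list[str]) -> float:
    # Count tokens that land on their predecessor's green list, then compare
    # against the count expected by chance. A large positive z-score suggests
    # watermarked text; unwatermarked text scores near zero.
    n = len(tokens) - 1
    hits = sum(
        tokens[i] in green_list(tokens[i - 1], vocab) for i in range(1, len(tokens))
    )
    expected = GREEN_FRACTION * n
    return (hits - expected) / math.sqrt(n * GREEN_FRACTION * (1 - GREEN_FRACTION))
```

The catch the skeptics point to: paraphrasing, translation, or even light human editing scrambles the token sequence and erodes exactly the statistical signal this detector relies on.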
Applying these powerful tools surgically to narrowly defined use cases, while keeping humans in the loop and providing needed sourcing transparency (and credit!), will enable us to wield them for good.
But if we fail to build the necessary checks on AI’s creations, then the likelihood of students passing off robot-written text as their own will be the least of our worries.
Eric Ulken is a product director at Gannett.