Nieman Lab: Predictions for Journalism, 2025
“Don’t read the comments.”
It’s advice editors often give writers when their first piece goes viral online. Yet it’s always struck me as odd guidance — why publish on the internet, a medium built for interconnection, only to ignore how readers respond to your work? What “don’t read the comments” almost always really means is “our comments section is trash.” And a trash comments section reflects a larger, longer-term mistake: a failure to invest in moderation.
In the best circumstances, the web’s interplay creates something magical: Writers share insights while readers — often writers themselves — deepen the conversation and add valuable context. It’s a meaningful interaction for all involved, and a reliable lever for growth, if done well.
And yet. We’ve all witnessed comment sections deteriorate, whether on a single post or across an entire platform (see: x.com).
One key difference between thriving and flailing online spaces? Well-crafted moderation policies, effectively enforced.
2025 will amplify an existing trend: the growing divide between online spaces that invest in moderation and those that don’t. X’s first transparency report since Elon Musk took over shows the platform received over 66 million user reports of hateful conduct in the first half of 2024, but suspended only 2,361 accounts. Part of the reason is likely the change in direction of its content policy over the last few years, including removing protections for transgender users. And many platforms are doing what TikTok did in October — laying off human content moderation teams and replacing them with AI.
Bluesky, meanwhile, recently announced it was quadrupling the size of its content moderation team.
Readers want some level of protection from hateful content. They want a robust, understandable set of rules about what conduct is and isn’t acceptable, and the knowledge those rules will be enforced fairly and consistently.
There are good comments sections online, but they’re never an accident. It takes intention, focus, and the will to set rules and stand behind them to create a shared online space that works. This year we’ll see that tension — the laissez-faire approach vs. robust moderation — play out across the web in ways big and small. Engaged readers will be watching.
Scott Lamb is the VP of content at Medium.