Ask any journalist today — especially a woman, a person of color, or anyone else from a marginalized community — how it feels to be a journalist on the internet, and the answer will probably be along the lines of exhausting, unpleasant, and scary.
For most, if not all, journalists, having a public profile online is expected or required as part of the job. And while that can be helpful to build trust, get tips, and create community, it also makes a person more susceptible to online and potentially offline abuse.
And if journalists need to put themselves out there for work, what responsibility do their employers have to protect them?
Reach PLC, the United Kingdom’s largest news publisher — with 3,000 journalists across nine national newspapers, 110 regional papers, and 80 online-only sites — has decided to shoulder some of that responsibility. In November, Dr. Rebecca Whittington started as its first-ever online safety editor. She’s in charge of making the internet a safer place for Reach’s journalists and readers.
That’s hardly an easy or straightforward job. I recently caught up with Whittington to talk about what it’s like to be the first online safety editor in the U.K., how Reach currently addresses online abuse, and how she plans to help journalists feel safer while they work. The interview is lightly edited for length and clarity.
[I found] that journalists recognized the opportunities offered by online tools — but also that there was a lack of understanding, from news organizations and from individual journalists, about how to manage the changes these tools bring to news production.
For example, if you have social media and you’re using it to promote your work and to interact with communities online, that’s great. It also means that when you’re chilling out at night after a day in the office, you get messages coming through. They can be invasive or abusive, and that makes you more vulnerable in some ways, even as it strengthens your position as a journalist in the community in others.
I don’t think I’d be doing this role if it weren’t for the research skills that doing my PhD gave me. I’ll be applying quite a lot of those in terms of gathering evidence at Reach and hopefully contributing to some of the amazing research that’s already been done in this area.
It’s very early days, but I am starting to find out already from journalists I’m working with what their personal issues have been, what the issues are that their colleagues have faced, and what Reach [has already] done.
[Online abuse] is an unfortunate fact of life. The opportunities opened up by the internet have also brought clear disadvantages for people working as public figures or as journalists in the online sphere. Research from UNESCO has demonstrated the rise in online hate against women journalists. Really interesting research conducted at Cardiff University’s Hate Lab demonstrates that there is a correlation between online abuse and violence in physical spaces.
In the U.K., the Online Safety Bill is being considered. A member of Parliament who had been receiving threatening messages online was recently stabbed to death. So it’s not just journalism, and it’s not just Reach; it goes into our public spheres and into society as a whole.
How do we regulate online spaces that have so far been difficult to regulate? I’m not going to be able to solve this issue by myself. Nobody is going to be able to solve this issue alone. We need to work collaboratively with online platforms, we need to look at accountability within our own networks, and we need to get other people to do the same. I think collaboration is going to be absolutely key to this, because collaborative voices can be heard more strongly.
Not all of it is that extreme in terms of volume or number. But we are also seeing journalists, on a daily basis, being told they’re rubbish and why don’t they go and get another job somewhere else. That kind of thing: being undermined and demoralized, day in and day out. It goes from one extreme to the other.
We have seen hate crimes take place, racism, threats against life. In those circumstances, we always encourage the victim to report it to the police and we support them with that reporting.
In a recent case, very indirect threats were being made on a social media platform, and it was really difficult to know who they were coming from. We ended up getting our security people involved so that they could support that journalist.
It’s a real variety of things that we’re seeing. And I think that’s why it’s such a challenging issue in some ways — we don’t expect anyone to take any kind of abuse or to accept any kind of abuse, but there is a variety of abuse. Not only that, but there are lots of different reasons that people engage in abuse and so we’re not going to find a one-size-fits-all solution. We can find a series of solutions that hopefully work together.
As part of that, we need to work out good systems so that people can report easily and so that we can then help them be heard within the organization. Reach has got some really good policies and procedures in place but part of my role is going to be making sure that they are up to scratch.
[Reach did] a health survey in 2021 that asked journalists about online safety issues. Of the editorial staff who responded, half said they’d had some experience of online abuse. And of those, 85% said it was in response to something they’d published, promoted, or shared online. That’s a big challenge, because we need to help our workforce be confident working in these online spaces.
The other key thing, for me, is going to be working with the audience that’s there for the right reasons: because they’re engaged, because they’re interested in what we’re producing, because they’ve got a connection to our products. I think it’s really important to help them feel safe online, too. We’ve got some people coming into those spaces and making [others] feel unsafe. It’s not just our journalists who are affected by that.