While the research described below focuses on Australia, 5G-related conspiracy theories are definitely not native to that country. In a survey last month, Pew found that 71% of Americans had heard at least “a little” about the conspiracy theory that the coronavirus was intentionally planned by powerful people; and, of those who’d heard something about it, “36% say it is ‘definitely’ or ‘probably’ true — equivalent to a quarter of all U.S. adults.” — Ed.
“Fake news” is not just a problem of misleading or false claims on fringe websites; it is increasingly filtering into the mainstream and has the potential to be deeply destructive.
My recent analysis of more than 500 public submissions to a parliamentary committee on the launch of 5G in Australia shows just how pervasive misinformation campaigns have become at the highest levels of government. A significant number of the submissions peddled inaccurate claims about the health effects of 5G.
These falsehoods were prominent enough that the committee felt compelled to address the issue in its final report. The report noted that “community confidence in 5G has been shaken by extensive misinformation preying on the fears of the public spread via the internet, and presented as facts, particularly through social media.”
This is a remarkable situation for Australian public policy — it is not common for a parliamentary inquiry to have to rebut the dodgy scientific claims it receives in the form of public submissions.
While many Australians might dismiss these claims as fringe conspiracy theories, the reach of this misinformation matters. If enough people act on the basis of these claims, it can cause harm to the wider public.
In late May, for example, protests against 5G, vaccines, and Covid-19 restrictions were held in Sydney, Melbourne, and Brisbane. Some protesters claimed 5G was causing Covid-19 and that the pandemic was a hoax — a “plandemic” — perpetrated to enslave and subjugate the people to the state.
Misinformation can also lead to violence. Last year, the FBI for the first time identified conspiracy theory-driven extremists as a terrorism threat.
Conspiracy theories that 5G causes autism, cancer, and Covid-19 have also led to widespread arson attacks in the UK and Canada, along with verbal and physical attacks on employees of telecommunication companies.
To better understand the nature and origins of the misinformation campaigns against 5G in Australia, I examined the 530 submissions posted online to the parliament’s standing committee on communications and the arts.
The majority of submissions were from private citizens. A sizable number, however, made claims about the health effects of 5G, parroting language from well-known conspiracy theory websites.
A perceived lack of “consent” to the planned 5G roll-out featured prominently in these submissions. One person argued she did not agree to allow 5G to be “delivered directly into” the home and “radiate” her family.
To connect sentiments like this to conspiracy groups, I looked at two well-known conspiracy sites that have been identified as promoting narratives consistent with Russian misinformation operations — the Center for Research on Globalization (CRG) and Zero Hedge.
CRG is an organization founded and directed by Michel Chossudovsky, a former professor at the University of Ottawa and opinion writer for Russia Today.
CRG has been flagged by NATO intelligence as part of wider efforts to undermine trust in “government and public institutions” in North America and Europe.
Zero Hedge, which is registered in Bulgaria, attracts millions of readers every month and ranks among the top 500 sites visited in the U.S. Most stories are geared toward an American audience.
Researchers at RAND have connected Zero Hedge with online influencers and other media sites known for advancing pro-Kremlin narratives, such as the claim that Ukraine, and not Russia, is to blame for the downing of Malaysia Airlines flight MH17.
For my research, I scoured the top posts circulated by these sites on Facebook for false claims about the health threats posed by 5G. Some stories I found had headlines like “13 reasons 5G wireless technology will be a catastrophe for humanity” and “Hundreds of respected scientists sound alarm about health effects as 5G networks go global.”
I then tracked the diffusion of these stories on Facebook and identified 10 public groups where they were posted. Two of the groups specifically targeted Australians — Australians for Safe Technology, a group with 48,000 members, and Australia Uncensored. Many others, such as the popular right-wing conspiracy group QAnon, also contained posts about the 5G debate in Australia.
To determine the similarities in phrasing between the articles posted in these Facebook groups and the submissions to the Australian parliamentary committee, I used a text-similarity technique commonly employed to detect plagiarism in student papers.
The analysis rates similarities in documents on a scale of 0 (entirely dissimilar) to 1 (exactly alike). There were 38 submissions with at least a 0.5 similarity to posts in the Facebook group 5G Network, Microwave Radiation Dangers and other Health Problems and 35 with a 0.5 similarity to the Australians for Safe Technology group.
This is significant because it means that for these 73 submissions, at least 50% of the language was, word for word, exactly the same as the posts from extreme conspiracy groups on Facebook.
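The article does not name the specific tool behind this analysis, but the scoring it describes is what most plagiarism detectors do: represent each document as a weighted word-frequency vector and measure how closely the vectors align, on a scale from 0 to 1. The sketch below, assuming a TF-IDF and cosine-similarity approach in Python with scikit-learn, shows the general idea; the example texts and the 0.5 flagging threshold are purely illustrative.

```python
# A minimal sketch of document-similarity scoring of the kind described above,
# assuming TF-IDF vectors compared with cosine similarity (an assumption — the
# article does not specify the method). Scores run from 0 (entirely dissimilar)
# to 1 (exactly alike).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def similarity_scores(submissions, facebook_posts):
    """Return a matrix of scores: one row per submission, one column per post."""
    vectorizer = TfidfVectorizer(stop_words="english")
    # Fit a single vocabulary over both corpora so the vectors are comparable.
    matrix = vectorizer.fit_transform(submissions + facebook_posts)
    submission_vectors = matrix[: len(submissions)]
    post_vectors = matrix[len(submissions):]
    return cosine_similarity(submission_vectors, post_vectors)

# Hypothetical usage: flag any submission whose closest-matching Facebook post
# scores at or above the 0.5 threshold used in the article.
if __name__ == "__main__":
    submissions = ["I do not consent to 5G being delivered directly into my home."]
    posts = ["Hundreds of scientists sound alarm about 5G health effects."]
    scores = similarity_scores(submissions, posts)
    flagged = [i for i, row in enumerate(scores) if row.max() >= 0.5]
    print(flagged)
```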
The process for soliciting submissions to a parliamentary inquiry is an important part of our democracy. In theory, it provides ordinary citizens and organizations with a voice in forming policy.
My findings suggest Facebook conspiracy groups and potentially other conspiracy sites are attempting to co-opt this process to directly influence the way Australians think about 5G.
In the pre-internet age, misinformation campaigns often had limited reach and took a significant amount of time to spread. They typically required the production of falsified documents and a sympathetic media outlet. Mainstream news would usually ignore such stories and few people would ever read them.
Today, however, one only needs to create a false social media account and a meme. Misinformation can spread quickly if it is amplified through online trolls and bots.
It can also spread quickly on Facebook, with its algorithm designed to drive ordinary users to extremist groups and pages by exploiting their attraction to divisive content.
And once this manipulative content has been widely disseminated, countering it is like trying to put toothpaste back in the tube.
Misinformation has the potential to undermine faith in governments and institutions and make it more challenging for authorities to make demonstrable improvements in public life. This is why governments need to be more proactive in effectively communicating technical and scientific information, like details about 5G, to the public.
Just as nature abhors a vacuum, a public sphere without trusted voices quickly becomes filled with misinformation.
Michael Jensen is a senior research fellow at the University of Canberra’s Institute for Governance and Policy Analysis. This article is republished from The Conversation under a Creative Commons license.