"The vast majority of respondents believe the COVID-19 vaccines are safe, that human activity is responsible for climate change, and that 5G technology is not harmful," the report said.
The report suggested that the impact of so-called echo chambers may be similarly exaggerated, and that there is little evidence to support the "filter bubble" hypothesis (essentially, algorithm-fuelled extremist rabbit holes).
The researchers also highlighted that many debates about what constitutes misinformation are rooted in disputes within the scientific community and that the anti-vax movement is far broader than any one set of beliefs or motivations.
It said that governments and social media companies should not rely on "constant removal" of misleading content, arguing that this is not a "solution to online scientific misinformation."
It warned that if conspiracy theorists are driven off platforms like Facebook, they could retreat into corners of the web where they are unreachable.
The report makes a distinction between removing scientific misinformation and other content like hate speech or illegal media, where removals may be more effective.
"Whilst this approach may be effective and essential for illegal content (eg hate speech, terrorist content, child sexual abuse material) there is little evidence to support the effectiveness of this approach for scientific misinformation, and approaches to addressing the amplification of misinformation may be more effective. In addition, demonstrating a causal link between online misinformation and offline harm is difficult to achieve, and there is a risk that content removal may cause more harm than good by driving misinformation content (and people who may act upon it) towards harder-to-address corners of the internet."
Instead of removal, the Royal Society researchers advocate building what they call "collective resilience." Pushing back against scientific misinformation may be more effective through other tactics, such as demonetization, systems that prevent the amplification of such content, and fact-checking labels.
The report encourages the UK government to continue fighting back against scientific misinformation, but to emphasize the society-wide harms that may arise from issues like climate change rather than the potential risk to individuals who take the bait.
Other strategies the Royal Society suggests are continuing the development of independent, well-financed fact-checking organizations; fighting misinformation "beyond high-risk, high-reach social media platforms"; and promoting transparency and collaboration between platforms and scientists. Finally, the report mentions that regulating recommendation algorithms may be effective.