August 17, 2021 / Ed Rutkowski

New Research on Scientific Misinformation

Since the COVID-19 pandemic began, rapid advances in knowledge related to the disease have contributed to contradictory messaging from public health authorities and policymakers. The public has had to grapple not only with the understandable and expected disagreements among scientists about an evolving crisis but also with a disheartening amount of misinformation (incorrect information perceived as valid) and disinformation (false information intended to mislead). To the extent that misinformation and disinformation contribute to vaccine hesitancy, they are prolonging the pandemic. What, if anything, can be done about it?

A deeper understanding of the problem may be found in what the National Academies of Sciences, Engineering, and Medicine refers to as “the science of science communication.” Since 2012, the National Academies has held four colloquia on this relatively young field. Papers from the most recent colloquium appear in the April 2021 issue of the Proceedings of the National Academy of Sciences (PNAS). These papers, all of which are freely available on the PNAS website, address a wide range of issues related to the effects of misinformation and disinformation. In this post, I’ll summarize the three April PNAS papers with the most potential value for occupational and environmental health and safety (OEHS) professionals: both practitioners, who have an opportunity to address misinformation and disinformation within the workforces they protect, and academics.

Misinformation and Public Opinion of Science and Health: Approaches, Findings, and Future Directions

By Michael A. Cacciatore

Cacciatore’s paper explains the “continued influence effect,” or CIE, which is “the tendency for information that is initially presented as true, but later revealed to be false, to continue to affect memory and reasoning.” According to Cacciatore, there are three main explanations for this phenomenon:

  • People build mental models of events in real time and are unwilling to alter them. For example, if new information calls into question the assumed cause of an event but an alternative explanation isn’t available, people may reject the new information. In effect, people prefer inconsistent models over incomplete ones.
  • A “retrieval failure” causes people to pull incorrect information from their memory even when that information has been discredited.
  • The very act of retracting incorrect information leads to its persistence in people’s memories. To the extent that a retraction repeats the incorrect claim, it may inadvertently reinforce that claim.

Other explanations for the CIE include motivated reasoning, echo chambers (where people who think alike reinforce each other’s understanding), and filter bubbles (communities where algorithms constrain members’ choices).

One way to counter the CIE is through narratives. “An alternative narrative fills the gap in a recipient’s mind when a key piece of evidence is retracted,” Cacciatore writes. “To maximize effectiveness, the alternative narrative should be plausible, should account for the information that was removed by the retraction, and should explain why the misinformation was believed to be correct.”

Misinformation in and about Science

By Jevin D. West and Carl T. Bergstrom

West and Bergstrom discuss several practices in modern scientific research that, the authors argue, contribute to misinformation. Scientific publishers, like news organizations, are competing for readers, and many use tactics similar to those employed by journalism sites, such as clickbait titles. “Scientific communication has fallen victim to the ill effects of an attention economy,” the authors observe, and “[t]he unvarnished truth is not always enough to capture our attention.” Scientists themselves are pressured to hype their research, given that tenure and promotions often hinge on publication in prestigious journals.

Another consequence of the competition for readers and the outsize influence of publication on academic careers is that journals tend to publish studies with statistically significant, positive results. Such studies get more attention and are easier for university press offices to promote than negative studies. In addition, researchers who obtain negative results may not bother to write them up, since doing so is unlikely to further their careers. These trends lead to publication bias: a published record in which the preponderance of positive studies does not accurately reflect the underlying science. This distortion is compounded by citation bias (papers that support a claim are more likely to be cited than papers that don’t) and quotation errors (quotations used to justify claims that the source doesn’t actually support).

To address these issues, West and Bergstrom advocate for changes to the practices of hiring committees, promotion committees, and funding agencies that would encourage researchers to produce a smaller number of higher-quality papers. They also recommend that more journals adopt a new form of research paper known as a “registered report,” in which a journal commits, after peer review of the study design but before the results are known, to publishing the findings whether they are positive or negative.

The Narrative Truth about Scientific Misinformation

By Michael F. Dahlstrom

Dahlstrom examines the role of storytelling in both perpetuating and combating scientific misinformation. Storytelling—a mode of communication that employs characterization, temporality, and causality—has a bad rap among some scientists, who distrust the emotional appeals and particularization of experience that give stories their power. Dahlstrom explains how these features can benefit science; for example, studies show that narratives help increase comprehension and recall. Stories can also help people make connections between events.

In the end, Dahlstrom writes, both stories and science have a common aim: to help people understand the world “and find our place in it.”

For Further Reading

These and other papers from the fourth National Academies colloquium on advancing science communication are available from the PNAS website. Related SynergistNOW posts include “Talking about Science: Goodbye—and Good Riddance—to the Deficit Model” and “Science and Storytelling.”

Ed Rutkowski

Ed Rutkowski is the editor-in-chief of The Synergist.

Comments

Misinformation In and About Science

The problem of journals publishing articles outside their immediate expertise, with inadequate peer review, is something we all need to call out when it happens. Recently, a completely invalid article in JAMA Pediatrics argued that face masks harm children due to "high CO2 levels." The "science" was bad, the paper was not reviewed adequately, and it made its way into social media, where it was quoted as authoritative. I and others contacted JAMA, and the paper was retracted. We must all be vigilant and call out these false references as required.

By Brian Berke on August 17, 2021 4:43pm
