A recent study from the University of Michigan suggests that when media reports on a scientific study omit the numerical size of its effect, readers are more likely to misjudge the research. Without that numerical information, people tend to perceive a study’s findings as more significant and impactful than they actually are.
“People often make everyday decisions based on science findings they read about in the media,” the researchers explain. “However, people might assume that scientific findings are more impactful than they truly are.”
The true magnitude
The study proposes that when numerical information indicating the magnitude of a discovery is absent, people tend to assume the finding is meaningfully large, what scientists call “practically significant.”
This assumption could lead individuals to adopt ineffective health, dietary, and lifestyle interventions based on incomplete information, according to the study. By contrast, openly reporting the magnitude of findings enables people to make more informed decisions in their daily lives. The study examined how 800 adults responded to descriptions of interventions whose effects varied in size.
Participants who read about the benefit of a costly intervention (e.g., improved student math performance from a new, expensive math curriculum) with no magnitude reported (“Group A improved more than Group B”) were more likely to support the intervention than those who read about a trivially small benefit (e.g., “Group A improved 2% more than Group B”). Notably, their likelihood of endorsing the intervention was similar to that of participants who read about a substantial benefit (e.g., “Group A improved 10% more than Group B”).
Understanding the findings
“Laypeople tended to assume that scientific findings had meaningfully large effects or were of high practical significance,” the authors say. “Failing to report the magnitude of science findings is thus potentially misleading for the general public.”
In short, individuals who read about a substantial benefit were more inclined to support the intervention than those who read about a trivially small one. The researchers argue that this underscores the importance of reporting the magnitudes of scientific findings in media coverage so that people can make well-informed decisions.
Interestingly, participants with lower numeracy skills were more prone than those with higher numeracy skills to endorse interventions with trivially small benefits. This suggests that individuals with lower numeracy may need extra help judging whether the magnitude of a scientific finding is practically meaningful.