Can More Knowledge Be A Bad Thing?

Can more knowledge ever be a bad thing? Researchers at Cornell University say it can—especially when people use it for their own benefit rather than for the greater good.

In groups of rational individuals, more knowledge can sometimes backfire. For example, a better understanding of the costs and benefits of wearing a face mask to prevent disease might reduce cooperation among those driven by self-interest.

“We usually think that scientific breakthroughs, which help us understand the world better, are always good,” the researchers say. “But our study shows that in reality, where people often act in their own interest, this isn’t always true. Science isn’t always the cure-all we think it is.”

Drawing on history

The researchers, Basu and Weibull, draw on history to make their point. From Robert Oppenheimer’s fears after the first atomic bomb test to Galileo’s telescopic studies of the Moon, the Prisoner’s Dilemma, and Elizabeth Newton’s 1990 experiment at Stanford, they show that the “knowledge curse” can arise even when only a few people know more.

In their Base Game, each player chooses between two actions, creating four possible outcomes with different payoffs, and each player tries to maximize their own payoff. Expanding the game with an additional pair of options changes the strategic picture: a new option that leaves the other player with nothing becomes individually tempting, even though both players would be better off sticking with the mutually beneficial choice. This mirrors the Prisoner’s Dilemma, where two “prisoners” can either cooperate for mutual benefit or betray each other for personal gain. In this way, more “knowledge” of available options can lead to worse outcomes for the group.
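The logic of the expanded game can be sketched with a small best-response check. The payoff numbers below are the classic illustrative Prisoner’s Dilemma values, not the paper’s actual figures; the action names “C” (cooperate) and “D” (defect) are likewise assumptions for illustration.

```python
from itertools import product

# Illustrative payoffs (assumed, not taken from Basu and Weibull's paper):
# each entry maps an action profile to (player 1 payoff, player 2 payoff).
PAYOFFS = {
    ("C", "C"): (3, 3),   # mutual cooperation
    ("C", "D"): (0, 5),   # cooperator gets nothing, defector gains
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),   # mutual defection: worse for both than (C, C)
}

def pure_nash_equilibria(actions):
    """Return all pure-strategy Nash equilibria for the given action set."""
    equilibria = []
    for a1, a2 in product(actions, repeat=2):
        u1, u2 = PAYOFFS[(a1, a2)]
        # A profile is an equilibrium if neither player can gain
        # by unilaterally switching to another available action.
        stable1 = all(PAYOFFS[(d, a2)][0] <= u1 for d in actions)
        stable2 = all(PAYOFFS[(a1, d)][1] <= u2 for d in actions)
        if stable1 and stable2:
            equilibria.append((a1, a2))
    return equilibria

# Restricted game: only cooperation is known, so (C, C) is the outcome.
print(pure_nash_equilibria(["C"]))        # [('C', 'C')] -> payoffs (3, 3)
# Expanded game: the newly available Defect option makes defection dominant.
print(pure_nash_equilibria(["C", "D"]))   # [('D', 'D')] -> payoffs (1, 1)
```

Running the check shows the paradox in miniature: enlarging the players’ option set moves the equilibrium from the mutually beneficial outcome to a worse one for both.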

The paper goes further, showing that even without new options, simply a deeper understanding of payoffs can make players worse off.
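This second effect can also be sketched. Here the option set is fixed; only the players’ beliefs about the payoffs change. All numbers and the belief structure are assumptions chosen to illustrate the mechanism, not values from the paper.

```python
from itertools import product

# True payoffs: a Prisoner's Dilemma (illustrative numbers only).
TRUE = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
        ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

# Misperceived payoffs: players wrongly believe defection is penalized,
# so cooperating looks best no matter what the other player does.
BELIEVED = {("C", "C"): (3, 3), ("C", "D"): (2, 1),
            ("D", "C"): (1, 2), ("D", "D"): (1, 1)}

def equilibria(payoffs, actions=("C", "D")):
    """Pure-strategy Nash equilibria under a given payoff perception."""
    found = []
    for a1, a2 in product(actions, repeat=2):
        u1, u2 = payoffs[(a1, a2)]
        if (all(payoffs[(d, a2)][0] <= u1 for d in actions) and
                all(payoffs[(a1, d)][1] <= u2 for d in actions)):
            found.append((a1, a2))
    return found

# With a hazy understanding of the payoffs, players settle on cooperation...
print(equilibria(BELIEVED))   # [('C', 'C')]
# ...but once the true payoffs are fully understood, defection takes over,
# and realized payoffs fall from (3, 3) to (1, 1).
print(equilibria(TRUE))       # [('D', 'D')]
```

The options never change; only the accuracy of the players’ knowledge does, and sharper knowledge leaves both players worse off.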

The authors apply their theory to real-world issues, like making policy without knowing all the details of a problem. For instance, writing a nation’s constitution requires anticipating future problems under uncertain conditions. “Such preemptive laws have given large benefits to humankind,” the authors say.

They argue that just because a bad outcome is caused by human choices doesn’t mean any individual acting alone can prevent it.

“Game theory reminds us that individual responsibility doesn’t always translate to group responsibility,” they explain. “We need to create rules to guard against collective problems.”

In the end, the researchers offer a practical takeaway. “By pointing out this paradox,” they conclude, “we urge policymakers and the public to consider preemptive actions and moral commitments to prevent disasters that future scientific advances might cause. Science can bring huge benefits, but we need safeguards.”
