Research Reveals Why We So Often Think The World Is Getting Worse

By most measures, the world is getting better. Whether it's health, wealth, education, or equality, most parts of the world show clear signs of improvement, yet in poll after poll people say things are getting worse. A recent Harvard study suggests the reason for this is something known as 'prevalence-induced concept change.'

Across a number of experiments, the researchers show how we tend to redefine a problem as its prevalence falls. In other words, as a problem becomes rarer, we expand our definition of it, so it seems just as large as before.

“Our studies show that people judge each new instance of a concept in the context of the previous instances,” the researchers say. “So as we reduce the prevalence of a problem, such as discrimination for example, we judge each new behavior in the improved context that we have created.”

“Another way to say this is that solving problems causes us to expand our definitions of them,” they continue. “When problems become rare, we count more things as problems. Our studies suggest that when the world gets better, we become harsher critics of it, and this can cause us to mistakenly conclude that it hasn’t actually gotten better at all. Progress, it seems, tends to mask itself.”

All change is at stake

Nor is it only large, societal-level problems that suffer from this phenomenon. The effect emerged even in the relatively minor problem-solving tasks the participants were asked to complete, and it persisted when people were specifically warned to look out for it, and even when they were incentivized to avoid it.

A good example came from one experiment that asked participants to pick out threatening faces from a series. When threatening faces were made rarer, people began to identify neutral faces as threatening instead.
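The mechanism can be illustrated with a toy simulation: a judge who labels each face relative to the average of the faces it has seen recently, rather than against a fixed standard. The "threat scores," window size, and prevalence numbers below are invented for illustration and are not the researchers' actual paradigm; the point is only that when genuinely threatening faces become rare, the context-relative judge keeps flagging roughly as many faces as before.

```python
import random

def relative_judgments(stimuli, window=20):
    """Label each stimulus 'threatening' when it scores above the mean
    of the most recently seen stimuli (a context-relative judge)."""
    labels, history = [], []
    for s in stimuli:
        context = history[-window:] or [s]  # first stimulus is its own context
        labels.append(s > sum(context) / len(context))
        history.append(s)
    return labels

random.seed(1)

def faces(n, p_threat):
    # Hypothetical scores: threatening faces score high (0.6-1.0),
    # neutral faces score low (0.0-0.4).
    return [random.uniform(0.6, 1.0) if random.random() < p_threat
            else random.uniform(0.0, 0.4) for _ in range(n)]

phase1 = faces(500, 0.5)   # threatening faces are common
phase2 = faces(500, 0.05)  # threatening faces become rare

labels = relative_judgments(phase1 + phase2)
rate1 = sum(labels[:500]) / 500
rate2 = sum(labels[500:]) / 500
print(f"judged threatening: {rate1:.0%} -> {rate2:.0%}")
```

In phase two the true prevalence of threatening faces has fallen from 50% to 5%, but because the judge's threshold drifts down with the context, the fraction of faces it labels threatening stays far higher than 5%: the rarer the real problem, the more neutral faces get swept into the category.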

The phenomenon is not all bad of course, and the researchers give an example of an emergency room doctor who can use it productively when triaging patients.

“If the ER is full of gunshot victims and someone comes in with a broken arm, the doctor will tell that person to wait,” they explain. “But imagine one Sunday where there are no gunshot victims. Should that doctor hold her definition of ‘needing immediate attention’ constant and tell the guy with the broken arm to wait anyway? Of course not! She should change her definition based on this new context.”

In many other instances, however, such a mindset is problematic. It would be troubling, for instance, for a radiologist to expand their definition of what counts as a tumor and keep finding tumors even after the real ones have gone. That is a case where you should accept your work is done, and the research suggests this is not as easy as we might think.

Fixing our thinking

As with so many studies, the authors raise more questions than they answer, but they do point to the danger of expanding our definitions in circumstances where they should really be held constant.

While not providing solutions per se, they do believe that organizations may benefit from institutional mechanisms that help guard against prevalence-induced concept change.

“Anyone whose job involves reducing the prevalence of something should know that it isn’t always easy to tell when their work is done,” they conclude. “On the other hand, our studies suggest that simply being aware of this problem is not sufficient to prevent it. What can prevent it? No one yet knows. That’s what the phrase ‘more research is needed’ was invented for.”