Overconfidence in one’s knowledge has long been recognized as a problem in judgment and decision making. Research from the Laboratório de Instrumentação e Física Experimental de Partículas, in Lisbon, highlights the scale of the problem.
“Overconfidence occurs when individuals subjectively assess their aptitude to be higher than their objective accuracy [and] has long been recognized as a critical problem in judgment and decision making,” the researchers explain. “Past research has shown that miscalibrations in the internal representation of accuracy can have severe consequences but how to gauge these miscalibrations is far from trivial.”
Excess confidence
In the world of science, being too sure of yourself can be a problem. Not realizing you don’t know something can lead to bad decisions, affect public policies, and even harm your health.
The study examined data from four large surveys conducted in Europe and the U.S. over a 30-year period. The goal was to develop a new way to measure how confident people are in their knowledge, and how that confidence relates to what they actually know.
The researchers used surveys whose questions could be answered “True,” “False,” or “Don’t know,” and looked at the ratio of incorrect answers to “Don’t know” responses. When people give a wrong answer rather than admitting they don’t know, it signals overconfidence in what they think they know. This measure is easy to compute and doesn’t require respondents to compare themselves to others or to rate how sure they are.
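To make that metric concrete, here is a minimal sketch in Python of how such an overconfidence ratio could be computed from respondent-level answers. The sample data, column names, and edge-case handling are illustrative assumptions, not details taken from the study.

```python
import pandas as pd

# Hypothetical respondent-level data: one row per question answered.
# The column names and answer coding are illustrative assumptions,
# not taken from the study itself.
answers = pd.DataFrame({
    "respondent": [1, 1, 1, 1, 2, 2, 2, 2],
    "answer":  ["True", "False", "Dont know", "True",
                "False", "Dont know", "Dont know", "True"],
    "correct": ["True", "True", "True", "False",
                "False", "True", "False", "True"],
})

def overconfidence_ratio(group: pd.DataFrame) -> float:
    """Ratio of incorrect answers to "Don't know" responses.

    Higher values mean the respondent guessed wrong more often than
    they admitted uncertainty, i.e. greater overconfidence.
    """
    dont_know = (group["answer"] == "Dont know").sum()
    attempted = group[group["answer"] != "Dont know"]
    incorrect = (attempted["answer"] != attempted["correct"]).sum()
    if dont_know == 0:  # never admitted uncertainty
        return float("inf") if incorrect else 0.0
    return incorrect / dont_know

for rid, group in answers.groupby("respondent"):
    print(f"respondent {rid}: ratio = {overconfidence_ratio(group)}")
```

Under this coding, respondent 1 (ratio 2.0) gave wrong answers twice as often as they admitted uncertainty, while respondent 2 (ratio 0.0) only answered when they were right.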
Bias toward overconfidence
The study produced two key findings. First, people’s confidence tends to outstrip their knowledge, especially when they have some knowledge but not a lot. Second, people who combined moderate knowledge with high confidence were the least positive about science.
“This combination of overconfidence and negative attitudes towards science is dangerous, as it can lead to the dissemination of false information and conspiracy theories, in both cases with great confidence,” the authors explain.
The researchers followed this up with a quantitative survey that measured several metrics of trust. The results confirmed that trust tends to increase much faster than knowledge does. They believe their findings are especially important for areas such as science communication.
Effective communication
“Science communication and outreach often prioritize simplifying scientific information for broader audiences,” they explain. “While presenting simplified information might offer a basic level of knowledge, it could also lead to increased overconfidence gaps among those with some (albeit little) knowledge. There is a common sense idea that ‘a little knowledge is a dangerous thing’ and, at least in the case of scientific knowledge, that might very well be the case.”
The study therefore suggests that simply making people more knowledgeable isn’t enough; we also need to help them recognize how much they still don’t know, because raising knowledge without calibrating confidence can backfire. It also suggests focusing outreach on people with moderate knowledge of a subject, since they tend to hold the least positive views and make up most of the population.
The researchers nonetheless urge caution. Their measure of confidence may not work for all subjects, and because the study is correlational it cannot establish cause and effect. They also found that reactions can vary across individuals and cultures.
In summary, this study suggests that we should look for better ways to measure what people know and how confident they are about it, taking into account possible differences between individuals and cultures.