The Dunning-Kruger effect famously describes a phenomenon whereby those with low ability in a domain also lack the skill to assess their own capabilities, and therefore usually hold a much higher opinion of their talents than is warranted.
New research from UCL finds a similar phenomenon among those with radical political beliefs: people at the extremes of the political spectrum are less able to recognise when they’re wrong, even on matters unrelated to politics.
The findings emerged from an experiment in which participants were asked to complete a perceptual task. The researchers observed little difference between participants in terms of performance, but found that those with radical beliefs tended to be far more certain of answers that turned out to be incorrect.
“We were trying to clarify whether people who hold radical political beliefs are generally overconfident in their stated beliefs, or if it boils down to differences in metacognition, which is the ability we have to recognise when we might be wrong,” the authors explain. “We found that people who hold radical political beliefs have worse metacognition than those with more moderate views. They often have a misplaced certainty when they’re actually wrong about something, and are resistant to changing their beliefs in the face of evidence that proves them wrong.”
Knowing our limits
Participants first completed a survey designed to gauge their political beliefs; respondents on both the far left and the far right typically held the most radical views, along with a general intolerance for opposing points of view.
The volunteers then completed the perceptual task, which involved comparing two images and judging which contained more dots. After making their choice, they were asked to rate how confident they were in it.
Despite performing equally well on the task itself, those with a more moderate political disposition appeared better equipped to gauge their own knowledge. The authors hypothesised that this was because moderates were better able to absorb new information, even when it conflicted with their existing position.
This was tested in a second experiment in which, after choosing the image with more dots, participants were given extra evidence about the correct answer that they could use to update their original choice. In theory, if that original choice was wrong, the extra evidence should have weakened their confidence in it. However, it only appeared to do so for the moderates in the group.
“The differences in metacognition between radicals and moderates were robust and replicated across two data sets, but this self-knowledge ability only explained a limited amount of the variance in radicalism. We suspect that this is because the task is completely unrelated to politics – people may be even more unwilling to admit to being wrong when politics comes into play,” the authors explain.