It is well known that human decision making can be irrational and distorted by a wide range of cognitive biases, and ordinarily we think those biases stick with us through thick and thin, even when evidence suggests we should discard them. A recent study from Columbia University suggests that's not always the case, however, and that the brain is quite capable of applying logic to such situations.
“As we interact with the world every day, our brains constantly form opinions and beliefs about our surroundings,” the researchers say. “Sometimes knowledge is gained through education, or through feedback we receive. But in many cases we learn, not from a teacher, but from the accumulation of our own experiences. This study showed us how our brains help us to do that.”
The researchers wanted to understand to what extent prior knowledge could be modified when new or conflicting evidence emerged. To find out, they presented participants with a computer simulation in which a group of dots moved across the screen, and asked them to judge whether the dots were moving to the left or to the right. Whilst this sounds easy, the movement patterns of the dots weren't immediately clear, so the task was actually quite challenging.
After several rounds of the task, the participants were given a second task to perform, one that required them to judge whether the computer program behind the simulation had any underlying bias. Unbeknownst to the participants, the researchers had indeed introduced a bias into the program so that the movement of the dots was not evenly distributed.
“The bias varied randomly from one short block of trials to the next,” the team explain. “By altering the strength and direction of the bias across different blocks of trials, we could study how people gradually learned the direction of the bias and then incorporated that knowledge into the decision-making process.”
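To make that design concrete, here is a minimal Python sketch of what such a block structure might look like. The block lengths, the candidate bias values and the `make_blocks` helper are hypothetical illustrations, not details taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_blocks(n_blocks=5, trials_per_block=20):
    """Generate hypothetical blocks of trials, each with its own motion bias."""
    blocks = []
    for _ in range(n_blocks):
        # Bias for this block: probability that the dots drift rightwards.
        p_right = rng.choice([0.2, 0.35, 0.5, 0.65, 0.8])
        # Sample a motion direction for each trial (+1 = right, -1 = left).
        directions = rng.choice([1, -1], size=trials_per_block,
                                p=[p_right, 1 - p_right])
        blocks.append((p_right, directions))
    return blocks

for p_right, dirs in make_blocks():
    print(f"block bias P(right)={p_right:.2f}, "
          f"observed rightward fraction={(dirs == 1).mean():.2f}")
```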
Overcoming bias
The team took two distinct approaches to evaluate how we learn biases in the first place. The first monitored how the bias influenced participants' decision making, and especially their confidence in those decisions. The second explicitly asked people to report the most likely direction of the dots during the trials. Together, the two approaches highlighted how the participants used sensory evidence to update, and if necessary override, their directional bias. What's more, they did so without being told that their original decisions were wrong.
“Originally, we thought that people were going to show a confirmation bias, and interpret ambiguous evidence as favoring their preexisting beliefs,” the authors say. “But instead we found the opposite: People were able to update their beliefs about the bias in a statistically optimal manner.”
The team believe this occurred because the brain was weighing up two possibilities at once: that a bias existed, and that it didn't.
“Even though their brains were gradually learning the existence of a legitimate bias, that bias would be set aside so as not to influence the person’s assessment of what was in front of their eyes when updating their belief about the bias,” they explain. “In other words, the brain performed counterfactual reasoning by asking ‘What would my choice and confidence have been if there were no bias in the motion direction?’ Only after doing this did the brain update its estimate of the bias.”
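As a rough illustration of the idea, the sketch below updates a belief over a few candidate bias values using the counterfactual, bias-free confidence on each trial. The candidate values, the `update_belief` helper and the example trials are all assumptions made for the sake of illustration, not the model reported in the paper.

```python
import numpy as np

# Candidate biases the observer entertains, expressed as P(dots move right):
# leftward bias, no bias, rightward bias.
bias_hypotheses = np.array([0.2, 0.5, 0.8])
belief = np.array([1 / 3, 1 / 3, 1 / 3])   # start with no preference

def update_belief(belief, choice_was_right, confidence_unbiased):
    """Update the belief over the bias using the counterfactual confidence,
    i.e. how sure the observer would have been if no bias were assumed."""
    # Probability the dots really moved right, judged from the evidence alone.
    p_right = confidence_unbiased if choice_was_right else 1 - confidence_unbiased
    # Likelihood of that outcome under each candidate bias.
    likelihood = bias_hypotheses * p_right + (1 - bias_hypotheses) * (1 - p_right)
    posterior = belief * likelihood
    return posterior / posterior.sum()

# Example: three noisy trials in which the evidence mostly points rightwards.
for choice_right, conf in [(True, 0.7), (True, 0.6), (False, 0.55)]:
    belief = update_belief(belief, choice_right, conf)
    print(belief.round(3))
```

The key point the sketch tries to capture is that the confidence fed into the update is the one the observer would have had with no bias assumed, so the growing belief about the bias never contaminates the evidence used to refine it.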
Suffice it to say, it's a surprising outcome, not least to the researchers themselves, who found that the brain is capable of performing in an unexpectedly rational way, despite the plethora of ways in which we ourselves behave irrationally.
Whilst they didn’t explore why the brain does this, the team hypothesize that it may revolve around the stories we tell ourselves and the influence they have on the decision-making process.
“We tend to navigate through particularly complex scenarios by telling stories, and perhaps this storytelling — when layered on top of the brain’s underlying rationality — plays a role in some of our more irrational decisions; whether that be what to eat for dinner, where to invest (or not invest) your money or which candidate to choose,” they conclude.