A new study looks at participants’ brain activity as they compare their own opinions to others’ to find out why it can be so very difficult to change someone’s mind.
Whether or not we like to admit it, every one of us is liable to exhibit confirmation bias. That is, we are more likely to seek out people and information that appear to agree with our own beliefs.
In part, this explains why debates can be so stressful and often unrewarding: individuals are usually more inclined to stick to their own ideas, sometimes even when faced with solid evidence against them.
A team of researchers from City, University of London and University College London — both in the United Kingdom — Virginia Tech Carilion Research Institute in Roanoke, VA, and the Museum of Science and Industry in Chicago, IL, questioned what, exactly, happens in the brain that makes people unlikely to change their opinions.
In their study paper — which now features in Nature Neuroscience — the investigators explain that, as previous research shows, “[p]eople are more influenced when others express judgments with high confidence than low confidence.”
The researchers illustrate this point with a couple of hypothetical examples: “All else being equal, if an eye witness is confident she observed Jim stabbing George, the jury would treat such testimony as strong evidence that Jim is guilty and would be more likely to convict Jim than if the eye witness was unsure it was Jim they observed. If a doctor is confident in her diagnosis, the patient is more likely to follow the recommended treatment.”
However, they go on to add, in many cases, people refuse to believe the ideas that others put forth, regardless of who expresses them and how strong — and evidence-based — those ideas are.
“For instance,” the researchers note, “over the last decade climate scientists have expressed greater confidence that climate change is man-made. Yet, the percentage of the population that believes this notion to be true has dropped over the same period of time.”
Confirmation bias at work
To understand why this disconnect exists, and what sometimes makes it virtually impossible to change other people's minds, the researchers recruited 42 participants who agreed to take part in an experiment that also involved undergoing functional MRI scans.
The researchers first split the participants randomly into pairs and showed them images of properties listed on a real estate website. They asked each person to judge whether the asking price of each of these houses was more or less than an amount set by the investigators.
Every participant then had to decide how much they would be willing to invest in each one of those properties.
Finally, the researchers asked the participant pairs to undergo functional MRI scans. Paired participants lay in twinned scanners that faced each other, with a glass screen dividing them.
On their side of the screen, each participant in a pair could see images of the properties, along with their own asking price estimates and the amounts they had said they would be willing to invest.
After these reminders, the screens showed what their partners had said: the partner's estimates of the house values and the sums the partner was willing to invest in those properties.
The researchers found that when a partner agreed with a participant's evaluation of a property's value, the participant was likely to say they would invest more in that house, especially if the partner had committed to investing a larger sum.
Yet when paired participants disagreed about a property's value, their opinions failed to influence each other's final decisions about how much to invest in that house. This was the case even when the disagreeing partner said they would pay a higher sum for the property — a sign of high confidence in their evaluation of the house.
‘Brains fail to encode’ opposing views
When they studied participants’ brain activity, as revealed by the functional MRI scans, the researchers zeroed in on the brain area that appeared to be involved in evaluating and absorbing someone else’s ideas: the posterior medial prefrontal cortex.
The team saw that brain activity in the posterior medial prefrontal cortex fluctuated, depending on the strength of a partner’s conviction, as suggested by the value of the investment they were willing to make.
However, this was only the case when paired participants agreed about the value of the house. When they were in disagreement, there was no change in brain activity in the posterior medial prefrontal cortex.
“We found that when people disagree, their brains fail to encode the quality of the other person’s opinion, giving them less reason to change their mind.”
Senior author Prof. Tali Sharot
This makes sense, the researchers note, considering that neuroscientists already know that this brain region plays an important role in decision-making processes.
The fact that our brains ignore the strength or urgency of ideas that contradict our own may explain why so many people persist in mistaken beliefs, deepening the gap between themselves and individuals with different ideas and belief systems.
“Our findings could help make sense of some puzzling observations in domains including science and politics,” says first author Andreas Kappes, Ph.D.
“Opinions of others are especially susceptible to the confirmation bias, perhaps because they are relatively easy to dismiss as subjective,” senior author Prof. Tali Sharot also notes.
“Because humans make the vast majority of decisions — including professional, personal, political and purchase decisions — based on information received from others, the identified bias in using the strength of others’ opinions is likely to have a profound effect on human behavior,” she points out.