As this is an hour-long podcast, I took notes.
People do not update their views automatically when new facts come along (as we know, they're not Bayes nets). They cling to their beliefs in the face of strong contrary evidence. Indeed, such evidence can produce a "backfire effect", causing them not just to dig in their heels and defend the belief more vigorously, but to actually believe it more strongly.
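(As an aside, here's a toy sketch of what that idealised Bayesian updating would look like. This is entirely my own illustration, not anything from the podcast, and all the numbers are invented: an agent who starts out 90% confident in a claim, and then meets evidence four times likelier if the claim is false, should weaken that belief rather than strengthen it.)

    # Toy illustration (my own, not from the podcast): how an ideal
    # Bayesian reasoner would revise a belief given contrary evidence.
    # All probabilities below are invented for demonstration.

    def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
        """Return P(claim | evidence) via Bayes' rule."""
        numerator = p_evidence_if_true * prior
        marginal = numerator + p_evidence_if_false * (1.0 - prior)
        return numerator / marginal

    belief = 0.90  # strong prior confidence in some claim
    # Contrary evidence: four times likelier if the claim is false.
    belief = bayes_update(belief, p_evidence_if_true=0.1, p_evidence_if_false=0.4)
    print(f"Belief after contrary evidence: {belief:.2f}")  # ~0.69: weakened, not reinforced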
There's a flip side to confirmation bias: not only are people more trusting and less skeptical of things they already believe, but they're also less trusting and more skeptical of contrary positions.
Scientists don't suffer from this as badly as other groups, in part because of the system in which they operate, which rewards presenting data and drawing conclusions from it. This is not always the case in other spheres, and scientists being more objective than other people is not (at least not wholly) due to their being innately more rational than everyone else.
It seems that the backfire effect may have a breaking point: give people enough information and they do change their minds. Researchers conducted a study in which participants were given information about a fake presidential primary election. Beforehand, they filled in a questionnaire about their pre-existing political beliefs. The researchers then gave them news stories about the various candidates they could vote for. Participants were divided into groups which received different fractions of neutral and negative news, from entirely neutral up to 80% negative. The negative news was tailored, based on the individual questionnaires, to paint the candidates in ways the participants were likely to find personally and viscerally objectionable.
They found that levels of around 10-15% negative news produced the backfire effect. At around 15-30%, the backfire effect was still present, but participants also began to question their preferred candidate more critically. Beyond 30%, the backfire effect dropped away and they began to change their opinions about their candidates.
This is a laboratory test, of course, where the researchers had total control over what the participants were reading (though the participants could decide for themselves how much of it they read). In the real world, people have far more control, selecting trusted news sources for themselves, so they can be much harder to reach: they simply never see the negative facts or alternative opinions. Actually getting them to hear this much negative information is therefore extremely difficult. It's not that people don't know, it's that they refuse to know. The more optimistic takeaway is that people do change their minds if you can reach them.
Being aware of this effect can help, but it doesn't stop it entirely. The researchers conclude that to win people over, you have to keep trying: keep bringing facts and entering discussions. Those are important, but they are not the whole story.
It's true that people can't learn if you don't give them anything to learn from, so you do have to keep debating and bringing evidence. But there are other factors at work. If someone becomes especially convinced of an idea, it becomes part of their identity, and the brain does not like having its identity threatened, so it rejects the attack. An alternative view can seem terrifying because it threatens the believer's whole worldview and sense of self. That's why people have a motivation to disbelieve alternative ideas even when it seems they should have no good reason to do so.
The most deeply convinced hardcore supporters are not impossible to reach, but it will be a long struggle. Even the more persuadable have to be approached with more than just data. One approach is not to explicitly try to debunk an idea, but to engage a believer as though you were solving a mystery together (also, even just mentioning whatever myth they believe can reinforce it in their minds). Perhaps the overt effort to debunk is unconsciously read as a threat: "you must be stupid to believe that".
Another approach is not to try to change people's beliefs in order to change their behaviour, but to change their behaviour in order to change their beliefs. For example, get employees to wash their hands by stamping them with something that takes a few washes to remove, thus forming a habit which then becomes a belief in the importance of cleanliness.
While it's fine to pursue multiple lines of inquiry in an open debate, where the stated goal is to reach a conclusion and change minds, this doesn't necessarily work in other situations. If people think you're going for overkill, they may sense that you're just trying to change their mind rather than establish the truth: if one reason is enough, why are you giving me ten? Hence politicians go for repetitive, limited sound bites with strong messages.
Finally, people like narrative, causal explanations. Never leave a gap: if your message undermines their existing explanation of why something happens, you must simultaneously replace it with another one. People prefer an incorrect model to an incomplete one, so if you simply take something away without offering a replacement, you will do more harm than good.
https://boingboing.net/2017/02/13/how-to-fight-back-against-the.html