I'm only online intermittently at the moment but this was quite interesting. Not enthralling, just quite interesting.
It's no use simply telling people they have their facts wrong. To be more effective at correcting misinformation in news accounts and intentionally misleading "fake news," you need to provide a detailed counter-message with new information—and get your audience to help develop a new narrative. Those are some takeaways from an extensive new meta-analysis of laboratory debunking studies published in the journal Psychological Science. The analysis, the first conducted with this collection of debunking data, finds that a detailed counter-message is better at persuading people to change their minds than merely labeling misinformation as wrong.
"The effect of misinformation is very strong," said co-author Dolores Albarracin, professor of psychology at the University of Illinois at Urbana-Champaign. "When you present it, people buy it. But we also asked whether we are able to correct for misinformation. Generally, some degree of correction is possible but it's very difficult to completely correct. Simply stating that something is false or providing a brief explanation is largely ineffective."
That last statement seems to be somewhat at odds with earlier findings that the backfire effect is eliminated most effectively by presenting just the facts.
The study found that "the more detailed the debunking message, the higher the debunking effect. But misinformation can't easily be undone by debunking. The formula that undercuts the persistence of misinformation seems to be in the audience. A detailed debunking message correlated positively with the debunking effect. Surprisingly, however, a detailed debunking message also correlated positively with the misinformation-persistence effect."
I wonder if that's just a selection effect. People who are willing to read longer articles tend to be the most interested, so they tend to be either the most willing to debunk the debunking or the most willing to support it. But perhaps they've accounted for that in their statistical analysis; I don't know.
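As a toy illustration of that worry (not anything from the paper itself - the variable names, coefficients and noise levels below are all made up), here's a quick simulation of how a single "engagement" factor could make message detail correlate positively with both the debunking effect and the persistence effect, even if detail on its own slightly reduces persistence:

```python
# Hypothetical selection-effect sketch, NOT the study's actual analysis.
# Idea: more-engaged readers both (a) read and finish detailed debunking
# messages and (b) hold their positions more strongly, so "detail" inherits
# engagement's influence on both outcomes.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

engagement = rng.normal(size=n)                   # unobserved reader engagement
detail = engagement + rng.normal(size=n)          # engaged readers get the detailed message
debunk_effect = 0.5 * detail + 0.5 * engagement + rng.normal(size=n)
persistence = -0.2 * detail + 0.8 * engagement + rng.normal(size=n)  # detail alone reduces persistence

print("corr(detail, debunking effect):", round(np.corrcoef(detail, debunk_effect)[0, 1], 2))
print("corr(detail, persistence):     ", round(np.corrcoef(detail, persistence)[0, 1], 2))
# Both correlations come out positive, mirroring the "surprising" pattern quoted
# above, even though detail's direct effect on persistence is negative here.
```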
However, Albarracin said the analysis also showed that debunking is more effective - and misinformation is less persistent - when an audience develops an explanation for the corrected information. "What is successful is eliciting ways for the audience to counterargue and think of reasons why the initial information was incorrect," she said. For news outlets, involving an audience in correcting information could mean encouraging commentary, asking questions, or offering moderated reader chats - in short, mechanisms to promote thoughtful participation.
https://phys.org/news/2017-09-debunking-ways-counter-misinformation-fake.html
In a trust vacuum, it doesn't matter who says what. Confirmation bias puts its fat thumb on the scales: people only believe what they want to hear. Thus we can cancel out terms and reduce credibility to whichever of the various tale tellers seems more believable. Advertising has always understood this: they'll dress up some actor in a lab coat and put a stethoscope around his neck - hey presto, instant gravitas, instant medical authority.
It's not enough to furnish the facts, explain how the problem is more complex than some breathless account on a blog, or - sadly - in some mainstream science writer's account of a Huge Scientific Breakthrough, replete with misused statistics and out-of-context quotes.
The article alludes to the solution: Snopes, FactCheck, that sort of site - they must wear the armour of trust. Trust is earned; but distrust, once cultivated in the general population, is harder to undo. Overcoming confirmation bias is a problem even among well-meaning, scientifically-minded people.