The team prepared three documents. First, they wrote a ‘truth brief’ explaining that 97% of climate scientists agree that humans are responsible for climate change. They also prepared a ‘counter-brief’ revealing the flaws in the Oregon Petition – for instance, that among the Petition’s 31,000 names are people like the deceased Charles Darwin and the Spice Girls, and that fewer than 1% of the signatories are climate scientists.
When participants were first asked about the scientific consensus on climate change, they estimated it at around 72% on average. But they then changed their estimates based on what they read. When the scientists provided a group with the ‘truth brief’, the average rose to 90%. For those who only read the Oregon Petition, the average sank to 63%. When a third group read them both – first the ‘truth brief’ and then the petition – the average remained unchanged from participants’ original instincts: 72%.
Enter inoculation. When a group of participants read the ‘truth brief’ and were also told that politically motivated groups could try to mislead the public on topics like climate change – the ‘vaccine’ – their estimated average rose to almost 80%. Strikingly, this held even after they went on to read the Oregon Petition.
There is one great weakness of this approach: it takes a lot of time and effort to go case by case, inoculating people... if you receive the counterarguments to climate denial, you might still be vulnerable to fake news on other topics.
Fake news epistemic bubbles already use a different but related sort of inoculation: they sow distrust of any source that goes against their narrative.
Before believing a piece of new information, most people scrutinise it in at least five ways, they found. We usually want to know whether other people believe it, whether there is evidence supporting the new claim, whether it fits with our previous knowledge on the matter (hence the grey-haired man, who might fit your idea of a senior citizen), whether the internal argument makes sense and whether the source is credible enough.
So there's no reason to suppose that fake news can't use this same inoculation technique. Indeed, by discrediting personal motivations, it already does. The second proposal in the article (which I've not quoted), using a game to teach the general methods of manipulation, is better, but it still suffers from the difficulty of reaching large numbers of people.
The best way to immunise people is through better educational systems, as that's the only common theatre of knowledge. If people are sufficiently aware of the problem, clickbait headlines lose their appeal and therefore their advertising revenue. The second-best way, given the existence of people who already believe ridiculous stuff and aren't in school any more, is regulation (either by direct action against the information itself or through more subtle financial or personal controls). I believe it is possible to do this without inducing any kind of backfire or Streisand effect and without impacting freedom of thought. And if I ever get the chance to finish writing it up, I shall tell you how.
http://www.bbc.com/future/story/20181114-could-this-game-be-a-vaccine-against-fake-news