Sister blog of Physicists of the Caribbean in which I babble about non-astronomy stuff, because everyone needs a hobby

Monday, 3 December 2018

Controlling the spread of information : part I

First of a three-part trilogy examining the backfire effect, persuasion, and the spread of information in general. What's the best way to persuade someone and what will cause them to believe the exact opposite of what you wanted ? Is it possible to prevent unwanted ideas from spreading or is this hopeless ? Can we do so without killing free and critical inquiry or is it all just so much unbearable authoritarian censorship ? I don't know, but here's my best guess based on other people's philosophy and sociological studies.

This first post is a sort of introductory overview to the whole shebang. Part 2 is about persuasion between individuals and part 3 looks at why this can sometimes be useless in a network context. I begin by looking at some extreme examples, to try and define some upper limits from which we might try and understand the much more interesting and murky middle ground.


The backfire effect happens largely when you already dislike something : either that specific piece of information, or, perhaps more interestingly, you have other reasons to be predisposed to disliking a new idea. There's also a very important distinction between disliking the source and disliking what it says. For example, I loathe Gizmodo's sanctimonious reviews of apolitical products, e.g. movies, but they tend to make me hate Gizmodo itself more than they ever make me hate whatever it is they're reviewing.

Note the key difference between changing a strength of belief and changing the stance. Techniques to cause both effects are not necessarily the same : insulting members of different groups isn't going to cause them to like you, but it may strengthen the bonds within your own group. Most political memes, on that basis, seem to make a massive error : they seek to motivate the existing troops rather than gain new recruits.

So while hating the source doesn't automatically lead to disbelieving an argument, it can eventually have an even stronger effect. One possibility is, as mentioned, that we don't trust anything at all, becoming pretty much entirely irrational about everything. Another is that we fall victim to extreme tribalism, entering what some people call an echo chamber, an epistemic bubble, or what I like to call a bias spiral.

When you despise a source strongly enough, you may be capable of hearing their arguments but not really listening to them. Every vile word that comes out of their whore mouth is processed not as legitimate information, but wholly as evidence of their partisan bias. If they give you evidence - even really good, objective evidence - that counters your belief, you may see it only as a further indication of their own ideological ensnarement. This is both a manifestation of and route to absolutist thinking, where you already know the facts so any contradictory statements are simply evidence that the other side is lying, stupid, or themselves trapped in their own cult-like filter bubble.

And that's where I think people tend to get confused about the effect restricting information has. It doesn't necessarily mean the Streisand effect will cause the idea to propagate further. It won't necessarily mean the idea gains a single new devotee. But what it can very well mean is that in certain circumstances existing believers become even more convinced, if not of the idea itself then certainly of the bias of the other side. The effect on non-believers can be more subtle, but more on this in part three.

And we've also seen that there's an important distinction between your opinion of the source and your opinion of the idea. The two are interrelated, but in a complex way. Arguments you dislike cause you to distrust a source, and vice-versa. It really should come as absolutely no surprise to anyone that if you get people to listen to opposing political opinions, i.e. people they don't like saying things they don't like, they become more polarised, not less. It's nothing to do with people just not listening to the other side; that's not it at all. Although people do sometimes attack straw men, in my experience this claim is massively over-used. More typically, it's precisely because they've already listened to the other side that they decided they don't like them, so further contact isn't going to be of any help.

But this polarisation isn't always inevitable. More on that next time.

