
Thursday 15 July 2021

Wibbly-wobbly mindy-windy

You might remember back in January there was an interesting study showing how political party policy shapes the opinions of party members. The strength and speed of this effect, with no backfire reported at all, rather surprised me. My (purely anecdotal) experience is that people select a political party to support based on alignment with its existing policies, with hardly anyone supporting a party in absolutely everything it says and does. Parties seem to be driven by the preferences of their members more than the other way around : witness the continuous in-fighting in Labour about which direction to pursue, or the total collapse of the Liberal Democrats after they abandoned their pledge to abolish tuition fees.

I suggested that a possible reason for this might be that people support some policies rather casually and emotively : if you like and trust a party overall, you'll probably go along with them on the things you're not all that interested in. So if they change stance, you'll change stance. You're not actually thinking very deeply about the issues at all, you're deferring to a perceived source of expertise instead. You're using the party as a means of extended cognition.

This might sometimes be the case. But looking back, I don't think it fits the particular policies of the original study all that well, which were prominent issues and not minor technicalities. Another study has come to my attention which suggests it's a lot more subtle than that, and in a sense is even due to the exact opposite effect.

The press release linked above is decent enough, but I wanted more details so I read the original paper as well. The central concept is choice blindness. As they describe it in the press release :

Choice blindness was discovered in 2005 by a team of Swedish researchers. They presented participants with two photos of faces and asked participants to choose the photo they thought was more attractive, and then handed participants that photo. Using a clever trick inspired by stage magic, when participants received the photo it had been switched to the person not chosen by the participant — the less attractive photo. Remarkably, most participants accepted this card as their own choice and then proceeded to give arguments for why they had chosen that face in the first place. This revealed a striking mismatch between our choices and our ability to rationalize outcomes. This same finding has since been replicated in various domains including taste for jam, financial decisions, and eye-witness testimony.

So what they did here was to survey people on their responses to various political issues, getting them to respond with a classic "how strongly do you agree with blah..." scale as in innumerable internet tests. Shortly afterwards, they brought the participants back to review their results. In one group the answers were not altered at all, while in another the answers were inverted. Respondents were asked either simply to confirm that this was their response, or to elaborate as to why they had responded as they did. They did the same thing a week later as well.
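
Purely to make the design concrete (the study itself obviously wasn't run in code), here's a minimal sketch of the manipulation in Python. Everything here is my own illustration : the function names and the 1-9 agreement scale are assumptions, not taken from the paper.

```python
# Hypothetical sketch of the study's manipulation design. The variable
# names and the 1-9 agreement scale are my assumptions, not the paper's.
import random

SCALE_MAX = 9  # e.g. 1 = strongly disagree ... 9 = strongly agree

def invert(rating: int) -> int:
    """Mirror an agreement rating to the opposite end of the scale."""
    return SCALE_MAX + 1 - rating

def review_session(responses: dict, manipulated: bool) -> dict:
    """Return the answers shown back to a participant at review.

    The control group sees their genuine answers; the manipulated
    group sees some of them inverted, as in the trick trials."""
    if not manipulated:
        return dict(responses)
    return {statement: invert(rating) if random.random() < 0.5 else rating
            for statement, rating in responses.items()}

original = {"The Swedish elementary school should be re-nationalized": 8}
print(review_session(original, manipulated=True))  # may show 2 instead of 8
```

The key point of the design is that the two groups differ only in what is shown back to them at review, not in anything they originally did.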

When the answers were unchanged, they found high consistency in the participants across time, both in terms of the actual answers and their confidence. So ordinarily these people really do seem to hold these opinions as they go about their daily lives. As they should, since the researchers chose statements on "salient political topics" that participants ought to have clear opinions about, e.g. "The Swedish elementary school should be re-nationalized". Each statement also included a brief explanation. Participants were left to interpret any ambiguities themselves, without guidance from the researchers.

When the answers were manipulated, overall about half were accepted by the participants as being their original answer. That is, people could sometimes be very easily tricked into thinking they held the opposite opinion to one they were usually very consistent on. They were more likely to notice the differences if they were asked to justify their answers (as opposed to just acknowledging them), if their response was extreme and/or their confidence was high, and if they scored highly on the Cognitive Reflection Test. So the more deeply they thought about their answer, the more likely they were to spot the manipulation*.

*A caveat is that a high CRT score may simply reflect a better memory rather than greater analytic ability. Also, there was no correlation with political involvement.
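
Purely as a toy illustration of that pattern - and nothing more, since the coefficients below are invented rather than fitted to the paper's data - the reported predictors of detection behave like a simple logistic model :

```python
# Toy logistic model of the reported detection pattern. The coefficients
# are invented for illustration only, not fitted to the paper's data.
import math

def p_detect(extremity: float, confidence: float, crt: float,
             justify: bool) -> float:
    """Assumed probability of spotting the manipulation: rises with
    response extremity, confidence, CRT score, and being asked to
    justify (all inputs scaled 0-1 except the boolean)."""
    z = (-1.0                 # baseline: roughly half of swaps go unnoticed
         + 1.2 * extremity    # distance of the response from the midpoint
         + 0.8 * confidence   # self-rated confidence in the answer
         + 0.6 * crt          # Cognitive Reflection Test score
         + 0.4 * justify)     # asked to explain, not just acknowledge
    return 1.0 / (1.0 + math.exp(-z))

print(round(p_detect(0.9, 0.8, 1.0, True), 2))   # reflective responder: ~0.85
print(round(p_detect(0.1, 0.2, 0.0, False), 2))  # casual responder: ~0.33
```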

I found the paper a bit of a slog and it's tough to extract numbers from. They also don't present all the statements they showed to participants. But the gist of it is clear enough. You can, under certain circumstances, very easily persuade people to believe the exact opposite of what they profess, and this holds for at least a week*. You can't fool everyone about everything, but you can still fool 'em plenty. It would have been nice to know the highest degree of agreement/confidence that they succeeded in reversing, but as I said, the numbers are a bit tricky to extract.

*This prolonged consistency suggests to me that they're not just saying things to conform, that they really have changed their minds. But people also remain embedded in their social networks for very much longer than the experiment lasts, so I'd be surprised if this change of stance was all that long-lived - weeks, perhaps, but not months.

What seems to be happening is that people engage in "confabulation" : they rationalise their response rather than critically analysing it. This strongly reminds me of research on inducing rich false memories, where people were asked to continuously recall something that never happened. The principle is similar, though unlike the memory experiments, here the change occurred much more rapidly and easily.

A caveat is that when asked to justify their responses, the rate of corrections (i.e. spotting the manipulation) increased - but not by very much. It could be that even when no justification was explicitly asked for, the participants did in fact justify their answers internally, just without thinking about them as carefully as when they had to explain them out loud. And when asked to justify, their attitude shift was greater than when only asked to acknowledge, suggesting that rationalisation is a key factor. The difference wasn't large, though, so this will hardly be the last word on the subject.

Even so, this suggests a possibly more compelling reason why people shift their stance according to party policy. Rather than simply "going with the flow", they are actively thinking about the policy in question - but only in a very biased way. They are, in effect, being asked to provide reasons why this new stance is actually a good one, thus coming up with actual concrete reasons to support it rather than merely the metadata that everyone else they know believes it. And by doing so themselves, they inherently keep this within their own world views and ideologies, rather than fighting against political opponents whom they probably don't perceive as trustworthy. Once they accept the manipulated answer as their own, they can hardly disagree with themselves, or construct reasons they wouldn't believe.

This is not mutually exclusive with my earlier idea. Even while deferring to the perceived expertise of, and trust in, their political leaders, people still have to alter their own views accordingly. Policy change therefore induces a rationalisation rather than anything as crude as "because I said so" reasoning. And importantly, there are still people within parties who disagree on certain issues - they just disagree a bit less when policy changes, rather than becoming full-throated enthusiasts.

The big question, as I see it, is the limits to which this applies. Again, some policy shifts are just too radical - people do switch allegiances between parties, so parties are driven by opinions as well as the other way around. The general conditions determining what sort of policy change a party affiliate can accept, and what will drive them away, would make for a very interesting study indeed. And it may be that some of these issues are ones people have never really thought deeply about before (where a change in stance would be easiest to rationalise), whereas others they hold a much deeper, independent, core level of belief in (which cannot be so easily altered).

Another question is to what extent changing stance changes underlying moral ideology. The authors raise the fascinating suggestion that perhaps we evaluate our own beliefs much as we evaluate those of others : we see people behaving in a certain way and conclude they believe something; we see our own behaviour and conclude what it is we actually believe. Interesting, but I'm not sure about that one. My guess is that a lot of issues are ideologically fuzzy, with relatively few policies flowing directly from ideology. For example, a higher-taxation policy could be justified on quite different ideological grounds : restricting the power of the wealthiest, improving civil infrastructure, or investing for long-term economic growth. Culling animals could be justified on grounds of disease control or economic necessity. Universal Basic Income could be justified on grounds of welfare or of reducing bureaucracy. So changing stance on one issue doesn't necessarily alter basic ideology or morality one bit, and it would have been fascinating to hear some of the justifications participants came up with.

The final question this raises for me is : how do we get people to evaluate issues more objectively ? People are apparently very, very good at rationalising, which itself requires a high degree of analytic intelligence. Might there be a hidden capacity for greater critical thinking as well ? We don't need to get everyone to do this for everything, but if we could at least manage it for the major policy issues, that'd be nice.

All in all, people remain unpredictable and almost paradoxically weird. They'll fight tooth and nail against a suggestion from outside their group, but defend the same idea to the death if it comes from their own clan. Their combination of absurd stubbornness and ludicrous flexibility is utterly baffling.
