Sister blog of Physicists of the Caribbean in which I babble about non-astronomy stuff, because everyone needs a hobby

Sunday 28 January 2024

Incoherency

Feel free to skip ahead if you don't need any background and just want a look at what is meant by "incoherence" and the problem of whether we can really hold contradictory beliefs.


Decoherently

I called this blog Decoherency for a simple reason. I wanted to make it sound scientific(ish) but also imply that posts wouldn't necessarily have any connection to each other. I would deliberately permit myself to write posts that would suffer from flagrant contradiction and not have to worry too much about what I'd written elsewhere. I think of this as a public notepad : a safe space for my immediate thoughts, with no real attempt to reconcile any paradoxes, generalise anything or establish robust principles.

What I jot down should hopefully be interesting enough to be worth bothering with, and at least have something to offer in its immediate context – but nothing whatever beyond that is guaranteed. Each piece should be self-consistent but that's it.

Incidentally, I do from time to time re-read this, and you know what ? I'm pretty happy with it, actually. Most posts seem to contain at least something of moderate interest, even if there are plenty of times when I think, "what the hell was I on about?". Sometimes I'm consciously aware of problems when I set something down but I just let it go because overall the content still holds and it makes a rhetorical point more clearly. Other times I genuinely don't realise until long after the fact, and only the benefit of hindsight reveals that something is now "obviously" just wrong, or at least lacking context. 

And that's okay. That's what a notepad is for. It's a sort of conversation with myself, fleshing something out to the point where the fundamental essence of a thing is preserved, but not bringing it to a full flowering, as it were. It's a notepad after all, not a book.

I also think all would-be commentators should give this a go. I find that when I try and really articulate precisely what it is I want to say, be that from something that's just popped into my head spontaneously or (more often) in response to something I've read, the process takes me in directions I rarely anticipate. Often I find that my initial ideas were just wrong-headed, and the final piece, even in this notepad-level form, isn't much like what I initially set out to write. The process of writing stuff down, re-reading the original text, trying to paraphrase it, all that generates ideas I simply never would have had otherwise. 

Of course discussion with others is often even more important. But the writing process for me is something special. It has the powerful advantage of setting everything out in a more permanent way, something I can point back to and reference in a self-contained unit : here are my ideas at this particular moment, without too much of the meandering journey that some of the mega-long threads on social media can become. And the journey is one that I direct, where I get to decide what to focus on, where I can concentrate on what I'm interested in with a freedom to fail all on my own terms.

These days it's almost a compulsion. I don't feel that I've completed the reading experience until I've blogged it up. I do this mainly for my own benefit and if anyone else gets anything out of it, well that's good for them, but that's a bonus, not a goal.


Incoherently

I take decoherency to mean, in this context, this process of working things out, of getting words on a page that attempt to be self-consistent only in their immediate environment. But this is a made-up word for my own purposes. What about the truly incoherent ? What do we do when we encounter ideas which are mutually incompatible ?

This Aeon piece is quite a nice (albeit overly-lengthy) examination of the whole notion. Incoherency is essentially just a synonym of incompatibility : two things which cannot both be true. So in the title the author asks a very valid question : is it really possible to believe things which are mutually exclusive ? The answer, I think, is very much yes, but we'll get to that.

Philosophers call the kind of incoherence that’s involved in these states means-end incoherence – I intend an end (getting new shoes), believe that a means (going to the mall) is necessary for that end, but do not intend the means. There are many other kinds of incoherence. For example, it’s incoherent to have ‘cyclical’ preferences – say, to prefer chocolate ice cream to vanilla, prefer vanilla to strawberry, but prefer strawberry to chocolate. And it’s incoherent to have beliefs that are straightforwardly logically inconsistent – say, to believe that great cooks never overcook eggs, believe that you are a great cook, but also believe that you have overcooked the eggs.

It seems obvious that if you believe you need to go shopping because you need new shoes but you also believe you don't, you've hit a very hard kind of incoherence. Likewise if you believe great cooks both do and do not overcook eggs, you're in a bind. But I must object to the cyclic example here because this is obviously wrong. For example, the Welsh rugby team frequently beats England, and England occasionally beat New Zealand, but Wales haven't beaten New Zealand basically ever. This can't be incoherent because it's simply a fact.

So it's perfectly possible to have cyclical preferences. The style of play of rugby teams is qualitatively, not just quantitatively different, and what works well against one team can be useless against another. And in Robot Wars (a "sport" I followed far more closely than rugby) it became clear that there was no perfect design, that Robot A could beat Robot B which could in turn beat Robot C, but Robot C was perfectly capable of trouncing Robot A. These things happen all the time. 

Ice cream flavour preferences are if anything an even better example of this, because preferences are so utterly subjective : the difference between the flavours is qualitative, not quantitative, and they can't really be ordered in a linear scheme like this at all. As with which movies or books you "should" like according to the critics, none of this changes how you actually do emotionally respond. Empirical data cannot itself be incoherent, only the interpretation allows for that... if you believe people must have a linear sequence of flavour preferences, you haven't understood people very well.

This doesn't invalidate the notion of incoherence as incompatibility, however. Not at all. The author continues :

It helps to contrast being incoherent with merely being unreasonable. Consider someone – call him Derek – who believes that the 2020 US presidential election was stolen for Joe Biden, and that in reality Donald Trump received far more votes. His beliefs certainly could be logically consistent. Moreover, Derek might think that his beliefs are well supported by the available evidence, thinking that the information provided on QAnon message boards, by One America News, and by Trump himself is extremely weighty evidence, and that information provided by the mainstream media is entirely unreliable. Like many conspiracy theorists, Derek might dismiss the evidence against his views by saying that it has been fabricated by malicious actors.

What’s enticing about charges of incoherence, by contrast, is that they seem to skirt these kinds of disputes. If I can show that Derek’s worldview doesn’t make sense from the inside – that it doesn’t even hang together coherently – then, the thought is, I can show that he’s being irrational without having to settle which sources of information are reliable, or what counts as good evidence for what. This, I think, is part of what makes us inclined to reach for charges of incoherence (or inconsistency) in political debate. When we reveal incoherence in someone’s political beliefs, we’re tempted to think, then we’ve really got ’em. Or, at least, then we’ve really shown that they are being irrational.

Pointing out that a vote against a bill would knowingly frustrate the politician’s own goals is both an easier way to show the irrationality of his intentions, and more likely (though far from certain) to be effective in changing his mind.

This is all well and good. The author goes on to note that incoherence could be taken as the hallmark of irrationality, with unreasonableness not really being irrational at all – but then, thankfully, rejects this. He suggests instead that this points to different levels of irrationality. To believe in the Flat Earth is irrational, not merely unreasonable, but to hold in your head entire systems of mutually incompatible thoughts ("structural" irrationality) is surely worse still than merely believing in one system which has been refuted. And here too I agree with the point.

He goes on to say that inconsistency largely happens simply because we haven't noticed it, that we hold two thoughts fully independently, acting on them without realising the incompatibility :

My contention is that the cases where people most clearly have incoherent mental states are those in which their mental states are not perfectly transparent to them. It’s not particularly hard to make sense of incoherence in these cases; what’s harder to make sense of is incoherence that persists even when the incoherent states in question are brought to the attention of the person who has them... we hold incoherent beliefs, but never think about them together, and that’s how we manage to sustain the incoherence.

It fits with the fact that reporting one’s own incoherent states aloud in speech seems a lot stranger than merely being incoherent: this is because reporting the state aloud in speech requires bringing all the states to one’s conscious attention, making them transparent. And it explains why, when our incoherence is brought to our attention, we scramble to revise or reinterpret our mental states to make them coherent: ‘When I said “all”, I didn’t really mean all’; ‘I’ll do anything to help small businesses within reason’; and so on.

Again I not only agree but make an active conscious effort to search out and resolve inconsistencies. When I realise that I'd said something which is inconsistent with my other assertions, I try and generalise to keep everything consistent. Sometimes this means examining the full implications of what I said and finding that actually everything is fine. Sometimes I have to abandon one or more statements and admit I was mistaken. Sometimes I realise I was missing data which helps the whole thing hang together, or necessarily changes my interpretation of what's going on.

A simple example that's stuck with me : there was a meme explaining why women are distrustful of men in certain situations because they might be dangerous, comparing them to some fraction of snakes being venomous. I disliked this because I also dislike the notion that we should be distrustful of certain ethnicities or religions because of terrorism. The resolution in this case was a simple one, that the quantitative difference in the dangerous fraction is so high as to point to a qualitative discrepancy. If 0.0002% of your population is dangerous, that can't be taken as evidence that they're a bad lot. If it's 20%, well, there the claim has an awful lot more substance to it.

But... often I'm unable to reconcile the propositions. When confronted with the "transparency" of the inconsistency, as the Aeon piece describes it, I'm sometimes left with a nagging doubt. Like being presented with a brilliant, coherent, well-constructed argument about why a particular movie was terrible, it doesn't actually stop me from enjoying it, or vice-versa : you can't really persuade people to like or dislike something, these are things we simply do. And I've found myself more than once unable to refute an argument but knowing, or at least doggedly believing, that the argument must have been deeply flawed, despite being unable to express why.

Consequently I go away in a state of confusion, still believing what I originally believed, unable to refute the counter-argument but unable to accept it either. 

This can happen to varying degrees. I might intellectually accept the argument but just not emotionally subscribe to it, or I might be partially persuaded (thinking perhaps, "yes this is true in these particular conditions"), or I might end up in a state of utter bewilderment. Or I might start veering back and forth between the two claims. And sometimes my eyes simply glaze over either with total incomprehension or utter boredom when someone else tries to convince me of something.

Perhaps intelligence also plays a role here, where even if you notice the two disparate propositions, you're simply unable to understand how they're inconsistent – thus you haven't really spotted the incoherency at all.

But at a deeper level I think the key to this is bullshit. There is a tendency for the very rational to assume that this is how everyone thinks, that everyone else must be fundamentally rational, logical, careful, and therefore unable to accept inconsistencies. I disagree. I think people can have wildly irrational beliefs, that are not just inconsistent with empirical data, but are even internally inconsistent. If they ever do notice, then they just don't care (the essence of bullshit). And if you don't care about consistency then you're free to believe the most outlandish, incoherent, self-contradictory nonsense. It's not that people have simply subscribed to a different set of trustworthy sources (something I've noted before at length in response to other Aeon essays), it's that their whole world view just is not rational. 


Ignorantly, Malevolently

To understand irrational ideas by rational means is, sadly, the height of folly. It can't be done. You cannot reason people out of positions they haven't been reasoned into. Sometimes reason just plays no role in belief whatsoever. It's like people who openly admit that their favourite politician is a liar but then believe them anyway : I don't get it, but nevertheless it definitely happens. This is about as far as we can get by rational means, to simply acknowledge that it happens.

This idea of people having different beliefs, even irrational ones, as a result of an information deficit is a dangerously compelling one. It would mean putting everyone on the same intellectual podium : not necessarily at equal heights, but at least thinking in similar ways. Differences could be reconciled simply by providing more, or more comprehensible, information (look, this source claims something in contradiction to the evidence so you can't trust them; look, here's a simpler explanation of how internal combustion engines work). The problem is that if we insist this is true in the face of evidence, if we continue to insist that people are believing in fascism and despotism out of some perverted but fundamentally rational viewpoint, any efforts to thwart them will fail.

If I don't stop here this post will spiral out of control, so I just want to close with a few points. First, things are complicated. Sometimes we can and do behave rationally, acting much like a Bayesian net. Second, we consider metadata as well as direct data : that is, we consider who-believes-what to be a form of evidence in itself. But finally, that the human brain is also capable of being totally irrational. It can, under some circumstances, look two mutually incompatible ideas squarely in the eye and say, "yep, both of those are true".

It doesn't matter if these last two points are fallacies. Giant lists of the types of errors people make are of no help whatever, any more than yelling, "be more rational you dumb twat !" is likely to actually make them calm down and think more carefully. Rather we have to, if we want any chance of overcoming the lunatics, first begin by realising that we won't succeed with rational arguments. We cannot simply convince unreasonable people that they should be more reasonable; we cannot go around pointing out the logical flaws in illogical arguments and expect much in the way of successful persuasion. 

It would be nice to believe that people are basically rational. Conversely, the cynics seem to draw a weird comfort in the idea that people are irrational angry baboons who cannot ever be reasoned with. I think neither and both of these positions are true. People, I suggest, aren't fundamentally rational or irrational. Rather both aspects, like it or not, are fundamental to being human.
