
Thursday, 12 December 2019

Review : You Are Not So Smart


I picked up this very nice book from a budget bookshop for the insanely low price of 99 CZK (about £3), and for that price I feel guilty about writing anything negative about it at all. The bottom line is it's a great little read but not a patch on The Idiot Brain, and it's got enough major flaws that I give it a respectable but imperfect 7/10.

The book is, as you might have guessed, all about common fallacies and misconceptions. Overall it does a nice job of describing and explaining everyday delusions. It's lively, engaging, and funny, and it manages to explain in some detail just how incredibly stupid we all are without ever making the reader feel stupid. The author often steps in to say, "but here's how you can avoid this particular bias".

Best of all, I really liked how he mostly pointed the finger of blame at YOU, the reader - rather than saying, "everyone else does this stupid thing", he gets the reader to look at themselves first. He very deftly manages to do this without blaming the reader for their errors. And full marks to him for that. All too often, arguments on the internet degenerate into a shouting match as to who's committed what kind of fallacy or which one is worse* - so this focus on getting the reader to check themselves, rather than go around shouting at everyone else for being thick, is very welcome. Likewise, although the book is dedicated to mistakes, I never felt overwhelmed by the possible sources of error which are apparently plaguing me at every waking moment.

*"Playing at contradiction for sport", Plato called it, although it often feels a lot nastier than a game.

Whenever I read about particular biases and common mistakes on the internet, I often want to say, "yeah, but...". Especially the fallacy referee memes : they're nice little summaries, but most fallacies come with serious caveats that shouldn't be overlooked. Alas, McRaney's book is a bit of a mixed bag when it comes to covering the "yeah, but" stuff. To be fair, most chapters are well done. For example, he's careful to point out that the "argument from authority" fallacy doesn't mean you shouldn't listen to experts, and kudos to him for that. But not all of his examples are carefully thought-through or explored in as much detail as they should be. Take this one :
"Should you listen to a highly trained scuba diver's advice before plunging into the depths of the ocean ? Yes. Should you believe that person when the diver talks about seeing a mermaid making love to a dolphin ? No."
I would have to say, "no, I wouldn't believe them, but I'd give their argument a lot more credence than if an airline pilot had said the same thing". Similarly, I was unconvinced by parts of the chapter on ad hominem attacks - I would say that the past criminal history of a defendant is extremely relevant, at least insofar as I might need to evaluate trustworthiness. Ad hominem is not a fallacy if personal character is directly relevant, which it often can be.

The problem is that the book covers a wide range of logical errors but has no underlying theory for establishing what's true. Sure, this is a big ask, but it would be nice to set out at least a rough framework for how we know what's correct - otherwise how can anyone justify what's a fallacy and what isn't ?

For instance, he describes well how statistical thinking isn't natural. I couldn't agree more. But by way of illustration, he cites something that would cause anyone even casually acquainted with Bayes' theorem to vomit with rage. He gives a description of a well-to-do man who drives an expensive car, saying he was chosen from a study in which they interviewed seventy engineers and thirty lawyers :
It is more likely, statistically, that he is an engineer, no matter how well the description matches your heuristic model for lawyers.
That is patently wrong. Sure, if I'm given no description at all and told that an individual was selected at random, then it's always more likely that an engineer was selected. But it would not be at all sensible to ignore the extra information - accounting for it is absolutely the rational thing to do. If a single individual in the UK were selected at random, the chance that they're an MP is almost zero, but it's massively higher if I'm told that person visits the House of Commons regularly. Ignoring this extra information is profoundly stupid : weird events often demand weird explanations, though of course we should be aware that our stereotypes aren't always accurate.
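To put some numbers on this : the 70/30 split comes from the book's example, but the likelihoods below are entirely made up, chosen just to show the mechanics of Bayes' theorem. A minimal sketch in Python :

```python
# Bayes' theorem sketch with illustrative, made-up likelihoods.
# Base rates from the book's example: 70 engineers, 30 lawyers.
p_engineer = 0.7
p_lawyer = 0.3

# Hypothetical: suppose the "well-to-do, expensive car" description
# fits 1 in 10 engineers but 1 in 2 lawyers.
p_desc_given_engineer = 0.1
p_desc_given_lawyer = 0.5

# P(lawyer | description) via Bayes' theorem.
evidence = (p_desc_given_engineer * p_engineer
            + p_desc_given_lawyer * p_lawyer)
p_lawyer_given_desc = p_desc_given_lawyer * p_lawyer / evidence

print(f"P(lawyer | description) = {p_lawyer_given_desc:.2f}")  # ~0.68
```

With those invented likelihoods, the description flips the odds : the man is more likely a lawyer despite the 70/30 base rate. The base rate matters, but so does the evidence - which is the whole point of the theorem.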

Another example : he gives a description of fictitious studies showing that old people either learn more slowly or more quickly than the young, attempting to frame both findings as "common knowledge". The problem is that the idea that old people learn more quickly is anything but common knowledge - it immediately stands out as weird and unconvincing. It doesn't undermine his main point, but it would have been easy to come up with a better example.

The example which irritated me most of all was his treatment of normalcy bias, the tendency to think that everything is normal when in fact it's not. His overall description was very good, but as a particular example he chose the survivors of the horrific Tenerife airport disaster. He mentions that there are cases where people are simply stunned into inaction, but suggests that the aircraft survivors instead fell victim to thinking that everything was fine, which is why they didn't get off the plane. I found that idea ridiculous - they'd just been through an extreme shock to the system; it's easy enough to see how they could be too shocked to think rationally, but not at all plausible to suggest they thought everything was fine. I don't doubt that normalcy bias is a thing, but in a horrific aircraft disaster ? Come on.

While the book is excellently concise and doesn't usually skip anything too crucial, there are a few cases where McRaney sacrifices too much. One of these is the chapter on confirmation bias, an enormously important aspect of filter bubbles that has a large role in the polarisation that's dominating politics right now. Another was the chapter on the "third person effect", wherein we tend to assume that other people will be affected by certain messages differently from ourselves. It is right and proper to point out that we all believe ourselves to be rational and objective. But it's a heck of a mistake to use this to dismiss any and all censorship - a position which completely ignores the whole sorry history of propaganda. It is demonstrably true that people are affected differently, and that we can reach rational decisions - otherwise the book could not even exist.

Perhaps worst of all for a book about fallacies, I found the whole thing rather uncritical and frequently lax in its statistical standards. Often McRaney will say something like, "52% of people believed one thing, whereas a lot more believed something else" - confounding quantitative and qualitative descriptions. Once, he even described 44% as a majority rather than a plurality. Sample numbers in the various studies quoted are usually absent, making it very hard to get an idea of the significance of the results. A tenth of a second difference in running speed is held up as evidence of the power of belief - well, maybe, but that seems weak to me. And graphs and illustrations do not grace the pages of this book.
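To illustrate why the missing sample sizes matter, here's a rough sketch with invented numbers (the 0.5 s spread between runners is an assumption purely for illustration) : the same tenth-of-a-second difference is pure noise with ten runners per group, but quite convincing with a thousand.

```python
import math

# How the same 0.1 s mean difference reads at different sample sizes.
diff = 0.1   # observed difference in mean times (seconds)
sd = 0.5     # assumed standard deviation per group (made up)

for n in (10, 100, 1000):                # runners per group
    se = sd * math.sqrt(2.0 / n)         # standard error of the difference
    z = diff / se                        # rough z-score
    print(f"n = {n:4d}: z = {z:.2f}")
# n=10 -> z = 0.45 (noise); n=1000 -> z = 4.47 (strong evidence).
```

Without the sample size, a quoted difference like that is simply uninterpretable.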

Two chapters stood out for me as an indication to take the book with a pinch of skepticism. The opening chapter deals with priming, where we can be influenced by unconscious cues. Some of this felt bloomin' obvious : of course advertisers are gonna show you pictures of happy people in nice houses when they want to sell you things. They're not exactly going to sell their new and improved moisturising hand cream with bleak images of dystopian wastelands covered in dying kittens, now are they ? Conversely, having just read When The Earth Was Flat, which dismisses subliminal messaging as little more than a hoax, I found many of the claims in the chapter a bit strong.

The final chapter is on how our behaviour can be more a product of our situation than our innate personality. It's pretty good, but at the end it has a lengthy description of the infamous Stanford Prison Experiment. The problem is that there are very serious allegations that the whole thing was a sham, which McRaney appears to be unaware of. Likewise, when he describes the bystander effect, he takes it as gospel, whereas that too is in doubt. And in the famous invisible gorilla experiment, he ignores that what is supposedly obvious - the person in the gorilla suit - is anything but : what's obvious is highly subjective. Meaning and interpretation happen in our heads and nowhere else - to pronounce arbitrary judgement on what everyone ought to notice is foolish indeed.


Overall, it's a jolly good read, but weakened by some slapdash statistics and little or no effort to look at opposing viewpoints : he presents the evidence in favour of the importance of each fallacy, but rarely if ever examines the counter-arguments. That's a serious flaw for a book that tries to encourage skepticism and critical thinking.

The Idiot Brain does an excellent job of explaining similar biases, their limitations, and the limits of current understanding. In contrast, YANSS never mentions any of this. Again, to be fair, he encourages us to remember these biases so we can guard against them, but I think the book would have been strengthened quite a lot had he reminded us that under normal circumstances our memory is not totally wrong : it isn't based entirely on wishful thinking, and we don't always make crappy arguments that have nothing to do with the data.

I, for one, would love to hear more about the conditions that are best for promoting rational thinking. Being on guard against fallacies and misconceptions is all very well, but I want more than this. You can find a billion websites explaining biases and errors, but precious few looking at what we need to do to get the most rational, objective viewpoint. Interpreting data is often very, very, very hard, and frankly it's a feckin' miracle we're ever able to do it at all. Maybe explaining that wouldn't be as amusing as describing all our silly human failings, but it might be a damn sight more useful.

