Ignore the headline. But it really would be interesting to see a study attempting to measure whether taking philosophy classes actually does cause people to become less biased. It would have to involve people who don't want to take a philosophy class; otherwise it runs the risk of a strong selection effect.
The study found that the computer-mediated counterargument had no effect on people with little or no confirmation bias. For those with a high degree of confirmation bias, on the other hand, the counterargument was very effective in reducing their confidence in their beliefs.
There are a couple of takeaways here. The first is that some people may just be less biased than others (as Yagoda himself implies when, after correctly completing the Wason Selection Task, he says he may just be an “unbiased guy”). The second is that expert counterargument was effective in reducing confirmation bias in some people.
Nisbett’s advice—that is, trying to prove oneself wrong—may not lead someone to expert counterargument. One of my Facebook friends, for example, recently posted about her attempt to follow the advice of her friends and listen to the other side. (She’s a political progressive.) How did she do this? She talked to some redneck who was spouting conservative views at a coffee shop. According to her post, she came away from the encounter believing she had genuinely tried to prove herself wrong, and the weak arguments of the conservative coffee shop patron left her even more certain of her own worldview.
I suppose my concern there is what to do when you reach an impasse with well-intentioned, rational, intelligent, informed individuals (https://plus.google.com/u/0/+RhysTaylorRhysy/posts/iKcQTYKfim4; I still haven't read all the links in the comments). I'm not sure anyone has a definitive answer to this. I would hazard, though, that such impasses are rare, and that most of these situations arise because one side isn't as rational/intelligent/well-intentioned/informed as they claim to be.
Kahneman adheres to a dual process model of mind, according to which we think in two different ways: fast and slow. Fast thinking refers to immediate processing of information, such as noticing that two structures are basically the same height. Slow thinking refers to more deliberative processing, like counting the number of raised hands in a senate meeting to determine a vote. Kahneman calls fast thinking System 1 and slow thinking System 2.
Kahneman says: "My position is that none of these things [bias-reduction techniques] have any effect on System 1. You can’t improve intuition. Perhaps, with very long-term training, lots of talk, and exposure to behavioral economics, what you can do is cue reasoning, so you can engage System 2 to follow rules. Unfortunately, the world doesn’t provide cues. And, for most people, in the heat of argument the rules go out the window."
https://areomagazine.com/2018/09/18/how-philosophy-can-reduce-your-confirmation-bias/
Whether taking philosophy classes actually does cause people to become less biased is a great question yet to be answered; that it makes people more biased toward needlessly sesquipedalian communication, however, is settled.
Exhibit "A" ^^
I had to look up 'sesquipedalian'....