Archive for June 2012
That’s something of a foundational belief for me, so I was interested to read Jonah Lehrer’s (@jonahlehrer) article in the New Yorker, “Why Smart People Are Stupid,” which cites a new study from James Madison University and the University of Toronto saying that when it comes to cognitive biases, we’re pretty much all toast.
Cognitive bias, of course, is a problem because it means that you’re … wrong. The column has plenty of examples of the sort of seemingly simple questions that can short-circuit clear thinking and lead to biased (wrong) answers, like the following:
A bat and ball cost a dollar and ten cents. The bat costs a dollar more than the ball. How much does the ball cost?
At first glance, a lot of people will say 10¢. On second glance, that’s obviously wrong: a 10¢ ball would make the bat $1.10 and the total $1.20. (The correct answer is 5¢ for the ball and $1.05 for the bat.) While mental shortcuts can save us a lot of time, they can also lead to mistakes both small and large.
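For the skeptics, the arithmetic is just a two-line algebra problem: if the ball costs x, the bat costs x + 1.00, and together they total 1.10, so 2x + 1.00 = 1.10. A minimal sketch in Python, just to make the shortcut-proof answer concrete:

```python
# Bat-and-ball puzzle: ball = x, bat = x + 1.00, ball + bat = 1.10.
# Solving 2x + 1.00 = 1.10 gives x = 0.05.
total = 1.10
difference = 1.00

ball = (total - difference) / 2  # 0.05
bat = ball + difference          # 1.05

print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05

# The intuitive answer (a 10¢ ball) fails the total-price constraint:
intuitive_total = 0.10 + (0.10 + difference)  # bat would be $1.10, total $1.20
assert abs(intuitive_total - total) > 1e-9
```

Nothing deep here, but it illustrates the point of the puzzle: the fast, intuitive answer satisfies the “dollar more” constraint in spirit while quietly violating the total.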
The study suggests that we are all prone to dumb mistakes like that. More worrisome, intelligence, and even explicit knowledge of your own cognitive biases, is no help; it may actually be a “subtle curse” that makes you more likely to make mistakes. Lehrer writes:
The results were quite disturbing. For one thing, self-awareness was not particularly useful: as the scientists note, “people who were aware of their own biases were not better able to overcome them.” This finding wouldn’t surprise Kahneman, who admits in “Thinking, Fast and Slow” that his decades of groundbreaking research have failed to significantly improve his own mental performance.
Disturbing, indeed. These findings seem particularly poignant coming just after I finished reading Chris Mooney’s (@ChrisMooney_) The Republican Brain (review coming soon, I swear), which, perhaps surprisingly given the name, chronicles common biases on both sides of the left-right political spectrum. I read that book holding the belief that increased familiarity with the cognitive biases associated with my political values would help me overcome them and get a less distorted view of the political world. But Lehrer’s column suggests that maybe I was only making things worse?
Lehrer gives a hint of explanation — a phenomenon he calls the “bias blind spot.” It’s another version of the fundamental attribution error, the tendency to explain away your own behavior based on the situation while seeing others’ actions as reflecting their core personality. It’s why, for example, if you snap at someone blocking the escalator you might blame it on being stressed out and in a rush. But if that other guy does it? Well, he’s probably a jerk.
Interestingly, the study’s authors wrote that “more cognitively sophisticated participants [at least as measured by S.A.T. scores] showed larger bias blind spots.” They don’t speculate much on why this correlation might exist. A few thoughts:
- Where would humility fit in?
I doubt I’m the only one who tends to think that intelligence often goes hand in hand with arrogance or cockiness. If trusting our first, biased instincts is the issue, would a level of “cognitive humility” that makes us distrust our first impressions help? I’d love to see a similar study that measures the correlation with humility as well as intelligence.
- The solution might lie in groups
Even if it is effectively impossible to overcome your own biases, it’s usually easy to pick them out in others. For this to be a strong defense against irrationality, though, there have to be group norms that promote calling out other people’s biases. That’s why the scientific method works so well: prove that someone else made an error, and you can get published in a journal.
- Would anticipation help?
Sure, we might always have the same cognitive tendencies and fall victim to them time after time, but what about going into a situation knowing which biases are likely to arise? For example, when you go to the supermarket, you can know that you’re likely to suffer from anchoring bias when you look at things that are on sale. The higher original price makes the sale price look better by comparison than it would on its own. Perhaps we just need to enter the grocery store distrusting our tendency to jump at a product just because it’s half off. Or maybe that’s just me.
Psychology, especially as it pertains to politics, is increasingly becoming an interest of mine, so expect more posts in the future.