Climate, politics, campaign finance, media criticism

Archive for the ‘political psychology’ Category


with 4 comments

Quick post, but something that has been bothering me:

Fact checkers like PolitiFact have been coming under fire from all corners recently, some of it very well deserved. But many conservatives have jumped on the fact that more statements made by Republicans or conservatives are rated “false” or “pants-on-fire” as opposed to statements by Democrats or liberals. Look!, they say, clear evidence that the fact checkers are biased!

Here are a few examples:

“In July you printed a chart with two years of PolitiFact Ohio results. It showed Democrats with 42 ratings of Mostly False, False or Pants on Fire, while the Republicans had a total of 88 in those categories. Doesn’t that prove you guys are biased?”

Cleveland Plain Dealer reader

And, in a slightly more sophisticated form:

“A data-driven analysis of PolitiFact Florida’s 554 rulings on statements made by individuals appears to show a clear bias against Republicans and in favor of Democrats. As the truthfulness of a statement increases, so does the percentage of Democratic claims included in PolitiFact Florida’s rating.

… This dynamic appears to be a textbook example of what statisticians call ‘selection bias.’”

(This is followed by two cherry-picked examples of how and whether PolitiFact chose to review certain statements.)

— Sean Davis, Red State

Here’s the thing these two posts neglect to mention: a deviation from an exact 50/50 split is evidence of bias only if the true split of false statements is actually 50/50.
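To make that concrete, here’s a rough back-of-the-envelope sketch in Python (standard library only). The 42/88 counts come from the reader’s letter above; the 65% baseline rate is purely a hypothetical assumption for illustration:

```python
from math import comb

def binom_pvalue(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): how surprising is a count
    of k or more false ratings out of n, given a true rate p?"""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# PolitiFact Ohio counts from the reader's letter: 42 Democratic vs. 88
# Republican statements rated Mostly False, False, or Pants on Fire.
d, r = 42, 88
n = d + r

# If the true share of false statements were 50/50, seeing 88+ of 130
# attributed to Republicans would be very unlikely...
print(binom_pvalue(r, n, 0.50))

# ...but under a hypothetical true rate of, say, 65%, the same count is
# entirely unremarkable. The ratings alone can't distinguish "biased
# fact checkers" from "one side simply makes more false claims."
print(binom_pvalue(r, n, 0.65))
```

The point isn’t which baseline is right; it’s that the lopsided count is consistent with either story until you know the real-world rate.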

Written by rethoughtblog

September 9, 2012 at 10:52 pm

Are smart people stupid? Or just cocky?

leave a comment »

Know your tendencies. And correct for the bad ones.

That’s something of a foundational belief for me, so I was interested to read Jonah Lehrer’s (@jonahlehrer) article in the New Yorker, “Why smart people are stupid,” which cites a new study from James Madison University and the University of Toronto saying that when it comes to cognitive biases, we’re pretty much all toast.

Cognitive bias, of course, is a problem because it means that you’re … wrong. The column has plenty of examples of the sort of seemingly simple problems that can short-circuit clear thinking and lead to biased (wrong) answers, like the following:

A bat and ball cost a dollar and ten cents. The bat costs a dollar more than the ball. How much does the ball cost?

At first glance, a lot of people will say 10¢. On second glance, that’s obviously wrong: the ball costs 5¢ and the bat $1.05. While mental shortcuts can save us a lot of time, they can also lead to mistakes both small and large.
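For anyone who wants to check the arithmetic, a minimal sketch:

```python
# Let x be the ball's price. The bat costs x + 1.00, and together they
# cost 1.10:  x + (x + 1.00) = 1.10  =>  2x = 0.10  =>  x = 0.05
ball = 0.05
bat = ball + 1.00
assert abs((ball + bat) - 1.10) < 1e-9   # total really is $1.10
assert abs((bat - ball) - 1.00) < 1e-9   # bat really costs $1.00 more

# The intuitive-but-wrong answer passes the first check but not the
# second: a 10-cent ball and a $1.00 bat differ by only 90 cents.
wrong_ball, wrong_bat = 0.10, 1.00
assert abs((wrong_bat - wrong_ball) - 1.00) > 1e-9
```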

The study suggests that we are all prone to dumb mistakes like that. More worrisome, intelligence and even explicit knowledge of your own cognitive biases are no help; intelligence may actually be a “subtle curse,” making you more likely to make mistakes. Lehrer writes:

The results were quite disturbing. For one thing, self-awareness was not particularly useful: as the scientists note, “people who were aware of their own biases were not better able to overcome them.” This finding wouldn’t surprise Kahneman, who admits in “Thinking, Fast and Slow” that his decades of groundbreaking research have failed to significantly improve his own mental performance.

Disturbing, indeed. These findings seem particularly pointed coming just after I finished reading Chris Mooney’s (@ChrisMooney_) The Republican Brain (review coming soon, I swear), which, perhaps surprisingly given the name, chronicles common biases on both sides of the left-right political spectrum. I read that book believing that greater familiarity with the cognitive biases associated with my political values would help me overcome them and get a less distorted view of the political world. But Lehrer’s column suggests maybe I was only making things worse.

Lehrer gives a hint of explanation — a phenomenon he calls the “bias blind spot.” It’s another version of the fundamental attribution error, the tendency to explain away your own behavior based on the situation while seeing others’ actions as reflecting their core personality. It’s why, for example, if you snap at someone blocking the escalator you might blame it on being stressed out and in a rush. But if that other guy does it? Well, he’s probably a jerk.

Interestingly, the study’s authors wrote that “more cognitively sophisticated participants [at least as measured by S.A.T. scores] showed larger bias blind spots.” They don’t speculate much on why this correlation might exist. A few thoughts:

  1. Where would humility fit in?
    I doubt I’m the only one who tends to think that intelligence often goes hand in hand with arrogance or cockiness. If trusting our first, biased instincts is the issue, would a level of “cognitive humility” that makes us distrust our first impressions help? I’d love to see a similar study that measures the correlation for humility as well as intelligence.
  2. The solution might lie in groups
    Even if it is effectively impossible to overcome your own biases, it’s usually easy to pick them out in others. For this to be a strong defense against irrationality, though, there have to be group norms that promote calling out other people’s biases. That’s why the scientific method works so well: prove that someone else made an error, and you get published in a journal.
  3. Would anticipation help?
    Sure, we might always have the same cognitive tendencies and fall victim to them time after time, but what about going into a situation knowing what biases are likely to arise? For example, when you go to the supermarket, you can know that you’re likely to suffer from anchoring bias when you look at things that are on sale. The higher original price makes the sale price look even better by comparison than it would on its own. Perhaps we just need to enter the grocery store distrusting our tendency to jump at a product just because it’s half off. Or maybe that’s just me.

Psychology, especially as it pertains to politics, is increasingly becoming an interest of mine, so expect more posts in the future.

Written by rethoughtblog

June 19, 2012 at 10:33 pm

Unexplained issues for Mooney’s “Republican Brain” thesis

with 3 comments

Journalist Chris Mooney recently came out with a new book called The Republican Brain: The Science of Why They Deny Science—and Reality, which he discussed in this weekend’s Washington Post in an article titled “Liberals and conservatives don’t just vote differently. They think differently.” (I also just got an invite to the book release party, but that’s neither here nor there…)

Liberals and conservatives have access to the same information, yet they hold wildly incompatible views on issues ranging from global warming to whether the president was born in the United States to whether his stimulus package created any jobs. But it’s not just that: Partisanship creates stunning intellectual contortions and inconsistencies. Republicans today can denounce a health-care reform plan that’s pretty similar to one passed in Massachusetts by a Republican — and the only apparent reason is that this one came from a Democrat.

None of these things make sense — unless you view them through the lens of political psychology. There’s now a large body of evidence showing that those who opt for the political left and those who opt for the political right tend to process information in divergent ways and to differ on any number of psychological traits.

I’m excited to read the book soon, but wanted to raise a few points I felt the Post article left unanswered.

His argument is fairly simple: genetically influenced psychological differences in how people process the world largely determine their eventual political ideology. Those on the left are characterized by a greater openness to the new, including, very significantly, new ideas. Conservatives, on the other hand, prioritize structure and order, and have a need for certainty and “cognitive closure”.

Someone with a high need for closure tends to seize on a piece of information that dispels doubt or ambiguity, and then freeze, refusing to consider new information. Those who have this trait can also be expected to spend less time processing information than those who are driven by different motivations, such as achieving accuracy.

Mooney’s hypothesis is intriguing – all the more so because of the taboo against linking ideology to any immutable psychological characteristics (especially anything resembling intelligence!). A taboo like that typically signals to me that there may be a hidden uncomfortable truth. However, while I recognize he might address these issues fully in his book, his column raised several unanswered questions that I’d be eager to see explored, including about some of the central phenomena he says he is aiming to explain.


“… at a time of unprecedented polarization in America, we need a more convincing explanation for the staggering irrationality of our politics. Especially since we’re now split not just over what we ought to do politically but also over what we consider to be true.”

Here Mooney is saying that America’s polarization is “unprecedented” and that the disagreement over what is true is also new. But the immutable psychological characteristics he says are so critical seem like the exact sort of thing that would fail to explain sudden shifts in political discourse. Evolution, after all, is a slow process. The task for Mooney seems to be explaining what new factors are interacting with unchanging psychology to produce our unprecedented level of polarization.

I would suspect the following two factors. The first I know Mooney has alluded to; I don’t know if he has addressed the second.

  1. Closed information ecosystems and information choice: It’s not just Fox. Mooney and others have talked about how Roger Ailes and others have embarked on a long campaign to discredit the very notion of “unbiased” news sources. Repeated attacks on an imagined “liberal media” have been a very conscious attempt to discredit centrist institutions and place all sources of information into an us/them (“fair and balanced” v. “liberal bias”) split.
    I would suggest technological change as at least an equal factor. No longer are people limited to reading their local daily paper supplemented by nightly news on one of three major networks. In their place is a wide variety of media that one can choose to access, either via cable or the internet. This certainly may interact with a desire for “cognitive closure” or a more basic desire for psychological comfort.
    No matter what one’s set of beliefs, one can find validation for it somewhere. In the case of Republicans, there is a set of news providers that at the very least resembles the “authoritative” institutions of the past, creating an entire information ecosystem, separate and almost wholly independent of other media. While some of this may reflect a concerted effort by right-wing operatives or a conservative need for cognitive closure, I think much of the polarization is due to the proliferation of niche media where one is unlikely to come across seriously challenging (not just contrarian Slate pieces) writing.
  2. Increased ideological homogenization of communities: Diana Mutz, in her fantastic book Hearing the Other Side: Deliberative versus Participatory Democracy, argues among other things that those who have the most choice over what community they live in are least likely to be exposed to “cross-cutting” political discourse (the same phenomenon has been described as “the Big Sort”). In other words, they are unlikely to have their ideas challenged. Those with little geographic mobility, primarily the poor, nonwhite, and uneducated, are most likely to be “stuck” in communities where other, uncomfortable ideas are on full display.
    In addition to joining virtual communities online, Americans have shown quickly rising levels of geographic mobility and decreasing regional and community ties (although the aftermath of the financial crisis has temporarily dampened this trend). More choice over the community one lives in has contributed to the Red State/Blue State divide and the further development of places like Portland, OR into liberal enclaves.
    Politicians have only reinforced this trend by gerrymandering districts into safe Republican or Democratic seats (homogeneous political communities), with less chance of a political race prompting a true debate and competition of ideas.


If so much of ideology is due to innate traits, it’s curious that we would see such polarization. Most physical traits in humans, like height, follow something like a bell curve, where most people lie near the center.

It’s possible that cognitive tendencies don’t follow this pattern, but if they, too, cluster toward the center, that seems like a challenge to using innate properties to explain polarization.
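A quick simulation sketch of that point. The two-hump “polarized” distribution below is purely an illustrative assumption, not data; it just shows how different a bell-curved trait looks from a polarized electorate:

```python
import random

random.seed(0)

# A normally distributed trait (like height): most people land
# within one standard deviation of the mean (roughly 68%).
trait = [random.gauss(0, 1) for _ in range(100_000)]
central = sum(abs(t) < 1 for t in trait) / len(trait)
print(f"bell curve, within 1 sd of center: {central:.0%}")

# A hypothetical polarized electorate looks more like a mixture of
# two humps centered at -2 and +2, with far fewer people in the middle.
polarized = [random.gauss(-2 if random.random() < 0.5 else 2, 1)
             for _ in range(100_000)]
central_p = sum(abs(t) < 1 for t in polarized) / len(polarized)
print(f"two humps, within 1 sd of center: {central_p:.0%}")
```

If innate psychology really drove ideology, the question is how a bell-shaped input produces a two-humped output.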


I don’t see Mooney’s thesis as untenable. I’m just eager to see how he would address (or how he does in his book – we’ll see soon!) these challenges to using innate psychology to explain the increasing divide in American politics. I’d be interested to hear other people’s thoughts as well.

There are other unsatisfactory aspects to Mooney’s argument in the column that I wonder if his book will address. The left/right divide is rather facile and doesn’t seem to map onto a reality where American parties are tenuously cobbled-together coalitions rather than monolithic entities reflecting two fundamental sets of psychological tendencies. If we consider the Political Compass, with an authoritarian v. libertarian axis as well as a left-right axis, how would Mooney account for Authoritarian Leftists or Libertarian Rightists? They seem to shatter his binary left/right plotting of ideology.


Written by rethoughtblog

April 18, 2012 at 12:02 am