What To Do About the Science of Political Bias

2013-09-13 Irrational Politics

It’s not new news, but here’s another article about another study confirming that people lose their ability to think rationally (including doing fairly basic math) when politics are at stake.

Everyone likes to share these stories, but most folks don’t really think about them very much. Here are two thoughts I’ve had.

1. Question the Experts

The article in question makes this crucial observation:

What’s more, it turns out that highly numerate liberals and conservatives were even more – not less — susceptible to letting politics skew their reasoning than were those with less mathematical ability.

But then it also makes the fundamental assumption that experts are right because, you know, they are experts.

For study author Kahan, these results are a fairly strong refutation of what is called the “deficit model” in the field of science and technology studies — the idea that if people just had more knowledge, or more reasoning ability, then they would be better able to come to consensus with scientists and experts on issues like climate change, evolution, the safety of vaccines, and pretty much anything else involving science or data (for instance, whether concealed weapons bans work).

Anyone else see a problem here? If the results are really true (the smarter you are, the more susceptible you are to politics hijacking your conclusions), what does that do to our faith in the consensus of experts? In short: are we so sure that we want everyone to come to “consensus” with scientists and experts?

Obviously we can’t just assume that the smartest people are wrong any more than that they are right. And there are additional mechanisms (like peer review) that are intended to find the kinds of mistakes that political bias might create. But there are also serious problems with these mechanisms, and my general conclusion is that we ought to be more skeptical and tentative about scientific consensus.

2. The Way Out

As best I understand it, the fundamental problem here is that people want to have beliefs that comport with their view of the world, and politics is just one example of this. But what can we do about this?

I think there’s a solution, or at least a partial one, that involves leveraging the phenomenon at the heart of this problem. Your brain is conditioned to supply thoughts that meet certain criteria, right? Well, change the criteria.

What I mean is simply this: imagine that instead of being happy when your team won a political debate, you tried to shift your sense of happiness to be based on concepts like integrity of the data or rigor of the analysis. The same process that can instantly detect and respond to politically uncomfortable situations could, in principle, detect and respond to empirically uncomfortable situations.

I’m not saying this is a perfect solution, mostly because it’s hard to do in practice. But my thought is simply this: if something is going to be pulling my strings, I’d like to try to decide what that something is. I have tried to enact this in my life by doing things like:

1. Go to graduate school to study economics to learn if my political views on the topic held water. (A lot of the time: they didn’t.)

2. Try to refrain from celebrating partisan political victories.

3. Maintain friendly relationships with people who think very, very differently politically, and cultivate respect for their opinions. (Note: you’ve got to find smart political nemeses, because if you pick a dumb one you’ll just end up feeling more superior. If you can’t find a smart one, I suggest the problem is with you.)

I don’t think these strategies make the problem disappear. Nothing will ever do that, I suspect, and you’ll just have to get used to looking over your political shoulder the rest of your life. Deal with it. But hey: at least this is something we can do to proactively address a real and present danger, right? That’s my response, anyway.

EDIT: I like Bryan Caplan’s take as well:

The Enlightenment claim was never, “Everyone is already reasonable.” Voltaire himself lamented that, “Common sense is not so common.” The Enlightenment claim, rather, was “Most people aren’t reasonable – and the evils of the world are, in part, the predictable consequence of their irrationality.” The evidence on motivated numeracy doesn’t justify fatalism. It should instead inspire commitment to epistemic Puritanism – an ethic of intellectual self-control, dispassion, and disdain for groupthink.