Just think twice

If you talk about politics fairly often, you've probably had the experience many times: witnessing an intelligent, educated person you normally respect spouting off pure lunacy. Usually this is someone whose views you disagree with, but it might even be someone whose politics you generally agree with, albeit in a less extreme direction. You may have wondered how this could happen, and, if you were really paying attention, you may even have asked whether you yourself could ever be similarly irrational.

Economists have long modeled humans as rational decision-makers. This is at best a useful approximation. In finance, failures of rationality lead to inefficient allocation of capital: Companies that will waste capital get it, while companies that need it don't. But at least there's a strong corrective force here. Anyone who can identify other investors' irrationality can profit by buying undervalued securities and selling overvalued ones, driving the prices toward the efficient level.

If it's important to make correct decisions in economics, the stakes are even higher in the political sphere. But unsurprisingly, the evidence is that humans are no better here than they are in their economic decisions. Bryan Caplan, for instance, identifies four major biases in his book, "The Myth of the Rational Voter," which he calls "make-work bias," "anti-foreign bias," "anti-market bias" and "pessimistic bias."

An even more overarching source of irrationality is confirmation bias. This is the human tendency to ignore contradictory evidence and blindly accept supporting evidence. Admittedly, this makes a certain kind of sense: Nobody likes to admit that they've made a mistake, even to themselves. And anyone who's publicly espoused a certain point of view will find it even harder to recant. So, the next best thing to do is to fool yourself. Of course, you'd never do it yourself, would you?

It's important to emphasize that this doesn't take place at the conscious level. We're talking about innate tendencies that you have to be consciously on guard against.

One simple way to see this tendency at work is to watch a sporting event with a mixed group of, say, Duke and Carolina fans. Wait for a questionable call: Was it a blocking foul or a charge? Duke fans will argue that it should have gone their way, and Carolina fans will argue the opposite.

This is a trivial example, but our political allegiances are sometimes less different from our sports fandom than we might hope. On Facebook, the same algorithm lets you "become a fan" of Manchester United and "become a supporter" of Barack Obama. And political conventions could easily be mistaken for pep rallies, complete with musical acts and balloons in the team colors of red, white and blue.

Shankar Vedantam of The Washington Post has pointed to additional evidence of this partisan polarization. Diehard supporters of both parties see huge gaps between presidential candidates and can't imagine how anyone could vote for the other candidate, while moderates see few substantial differences. More tellingly, those who agree with a given party on one issue are likely to agree on almost all issues, even when there is no obvious philosophical guiding principle. There's a good chance that those who favor tax cuts will also be social conservatives.

It's easy enough to see how these correlated political views might arise. Those who listen only to Rush Limbaugh will naturally tend to adopt similar views. Social groups can have a similar effect: If everyone else at a dinner party is happily bashing Republicans, no one's going to object if you join in the fun, but you might get an odd look or two if you speak up in contradiction. We're not that far out of middle school. The opinion of the cool kids is still important.

You may object that it's acceptable to show a little bias toward your party because the other party is so bad that they need to be kept out of power by any means necessary. The problem is that those on the other side of the aisle are thinking the exact same thing about your party.

But let's assume you agree that bias is a bad thing and want to make sure it doesn't happen to you. How do you prevent it? The first principle is just the basic rule of critical thinking: Consider the source. Realize that certain people are paid political operatives, and that even those you agree with may be biased.

At the same time, give those you disagree with the benefit of the doubt. True, they may also be biased, but shouldn't that serve as a warning flag that humans are prone to bias on that topic? Perhaps you should be wary of your own bias then.

Most importantly, unbiasedness requires complete willingness to admit that your beliefs might actually be wrong in the face of new evidence. It also requires the willingness to face that evidence, even if that means seeking out real, intelligent arguments from those you disagree with, not just straw men you can easily demolish.

But finally, let's be clear on what unbiasedness is not. It doesn't mean adopting pure neutrality and eliminating all opinions. In fact, the goal of seeking unbiasedness is to form the correct opinion. Sometimes you will carefully examine the evidence with an open mind and find that your initial reactions were right all along.

Just think twice if it happens every time.

Matt Atwood, Trinity '03, was editor of Towerview's fourth volume. This is the third in a series of columns celebrating the magazine's 10th anniversary year.
