Monday, September 12, 2011

Political Misperceptions and the Resistance to New Information

One of the things political scientists have understood for quite a long time is that it is difficult to change people's minds about things they are passionate about.  Take, for example, the avid sports fan who believes his team is the best in the league.  When the team fails to live up to that billing on the field or court, the passionate fan shifts the blame away from his team and onto the referees, injuries, the manager, a player he doesn't like, or a multitude of other factors, anything other than admitting that the team just isn't very good.  Some might call this faith in one's team a form of worldview, and acknowledging that part of that view is wrong could create cracks in the rest of the foundation.  So our passionate fan lives in denial, insisting that 'but for ...' the team would have won it all.  A similar thing happens in politics with political presuppositions.  Before long, facts don't matter; what becomes important is how strongly we hold to our worldview (and reject anything that contradicts it, even if true). 

Political scientist John Zaller argues that in order for people to change their views about anything (right or wrong), three things must occur.  First, they must be exposed to the new information.  This is a large problem when it comes to political viewpoints because, while most people have one, they aren't very attentive to politics.  It has also become a problem due to self-selection.  In today's world there is a wealth of (dis)information available to anyone who seeks it, and sometimes to those who don't, and many people simply gravitate toward sources that confirm their preexisting worldview.  Conservatives flock to Fox News, the Wall Street Journal, and Rush Limbaugh; liberals head for MSNBC, the Daily Kos, and NPR.  Thus, partisans are rarely exposed to new or disagreeable information.  The second step, Zaller argues, is reception.  Not only must an individual be exposed to new information, she must also receive it: reception indicates a willingness (and ability) to hear and think about new or conflicting information.  Once the information has been received, the individual moves to the third stage and either accepts or rejects it.  Many people never make it past step one, let alone reach step three.  All three steps are necessary to effect an attitude change, and the fact that such a large percentage of the population never completes them leads political scientists to doubt the claims many partisans make about media effects on attitudes. 

Let's go back to our passionate sports fan to see how this works.  Our fan loves the Boston Red Sox and thinks they are the best team in baseball.  He either listens to all the games on WEEI or watches them on NESN and hears the home-team announcers reinforcing his views again and again.  Suppose one day both stations are off the air and our fan has to listen to the game feed from the opposing team, the Phillies.  Their announcers rave about how great the Phillies' pitching, offense, and defense are, and how the Phils will win it all.  Our fan has now been exposed to conflicting information.  He must either receive that new information or dismiss it as the ravings of the opposing team's fans.  If he thinks about it and starts to study the stats, he has received the information.  Finally, our fan must either accept what the other announcers are saying and conclude that he was wrong about the Red Sox, or reject it on the basis of his investigation.  Like most political partisans, our fan probably never goes beyond the first step. 

Getting back to the point, some recent research in political science has asked why it is so difficult to dislodge people from errant beliefs.  Brendan Nyhan has done some excellent work in this area.  The following is the abstract of his latest manuscript:

"People often resist information that contradicts their preexisting beliefs. This disconfirmation bias is a particular problem in the context of political misperceptions, which are widespread and frequently difficult to correct. In this paper, we examine two different hypotheses about the prevalence of misinformation. First, people tend to resist unwelcome information because it is threatening to their worldview or self-concept. Drawing from social psychology research, we test whether affirming individuals' self-worth and thereby buttressing them against this threat can make them more willing to acknowledge uncomfortable facts. Second, corrective information is often presented in an ineffective manner. We therefore also examine whether graphical corrections may be more effective than text at reducing counter-arguing by individuals inclined to resist counter-attitudinal information. Results from three experiments show that self-affirmation substantially reduces reported misperceptions among those most likely to hold them, suggesting that people cling to false beliefs in part because giving them up would threaten their sense of self. Graphical corrections are also found to successfully reduce incorrect beliefs among potentially resistant subjects and to perform better than an equivalent textual correction. However, contrary to previous research, affirmed subjects rarely differ from unaffirmed subjects in their willingness to accept new counter-attitudinal information."
If you're interested, check out his blog and download the paper.  Next up: President Obama's jobs speech and the CNN/Tea Party debate in Tampa.
