Monday, July 19, 2010

How facts backfire

Here is another story about how people refuse to accept facts that contradict their beliefs:
New research, published in the journal Political Behavior last month, suggests that once those facts — or “facts” — are internalized, they are very difficult to budge. In 2005, amid the strident calls for better media fact-checking in the wake of the Iraq war, Michigan’s Nyhan and a colleague devised an experiment in which participants were given mock news stories, each of which contained a provably false, though nonetheless widespread, claim made by a political figure: that there were WMDs found in Iraq (there weren’t), that the Bush tax cuts increased government revenues (revenues actually fell), and that the Bush administration imposed a total ban on stem cell research (only certain federal funding was restricted). Nyhan inserted a clear, direct correction after each piece of misinformation, and then measured the study participants to see if the correction took.

For the most part, it didn't.

The paper is here (and also here), but I don't find it entirely convincing. The tax-cut item seems the shakiest of the three: the revenue decline may have been caused by an economic downturn, in which case the tax cuts could still have produced more revenue than there would have been without them. If, say, revenue fell 3 percent after the cuts but would have fallen 6 percent in the same downturn without them, then "revenues fell" and "the cuts increased revenues" are both true, just measured against different baselines.

The correction itself is not very forceful, either. If the US President says one thing and some reporter says something different, why should I believe the reporter over the President? In the experiment, the reporter never explicitly said that the President was lying, so a reader might reasonably infer that the President is correct.
