Tuesday, September 16, 2008

Why It’s Better To Fight Lies With Different Lies

I’m being a bit flip, but the point is that research continues to show me that if you can’t fight lies with “the truth,” then it’s better to start telling new lies. Or at least change the subject.

Shankar Vedantam’s Human Behavior column points to another fascinating study about what people think when they are told something isn’t true. Most of the time it doesn’t matter; the effect has already happened.
In experiments conducted by political scientist John Bullock at Yale University, volunteers were given various items of political misinformation from real life. One group of volunteers was shown a transcript of an ad created by NARAL Pro-Choice America that accused John G. Roberts Jr., President Bush's nominee to the Supreme Court at the time, of "supporting violent fringe groups and a convicted clinic bomber."

Bullock then showed volunteers a refutation of the ad by abortion-rights supporters. He also told the volunteers that the advocacy group had withdrawn the ad. Although 56 percent of Democrats had originally disapproved of Roberts before hearing the misinformation, 80 percent of Democrats disapproved of the Supreme Court nominee afterward. Upon hearing the refutation, Democratic disapproval of Roberts dropped only to 72 percent.
Basically, if you were already primed to dislike John Roberts, the misinformation had the most effect on you, even after you were told it wasn’t true. If you weren’t primed to dislike him, it had less effect. Vedantam doesn’t say what happened with the people who weren’t primed either way, but I would bet it still had some effect, perhaps even a lot, though less than on those who already disliked him.

I can’t find the original study, but I can speculate about a few reasons why it would work that way. If I'm already in an anti-John Roberts frame of mind, I think hearing “John Roberts supported a convicted clinic bomber” has the effect of reminding me why I don't like him (his extreme positions on women’s rights), even when I find out later that this specific fact isn't true. I remain in a slightly elevated state of John Roberts-hating despite the fact that the new cause of the hate is wrong. (Just to be clear, I'm using pretty broad terms to discuss what are really more subtle emotions and thoughts. But being in a "John Roberts-slightly-elevated state of increased dislike" just doesn't roll off the tongue.)

Another aspect of the study I would like to know more about is how the corrections were presented to the test subjects. The article says the subjects were shown an "ad by abortion-rights supporters." I’m not sure I would trust a pro-life group to tell me the sky is blue. It's possible that in this particular study the source of the refutation is the problem, and hence why hearing it didn't change the Democrats' feelings about Roberts. If the refutation had been presented as coming from a more neutral source, say factcheck.org or the Washington Post, they might have found it more trustworthy, and it might have had a bigger impact. Republicans, however, might not have reacted the same way to those sources.

Which leads to the second study Vedantam quotes.
Political scientists Brendan Nyhan and Jason Reifler provided two groups of volunteers with the Bush administration's prewar claims that Iraq had weapons of mass destruction. One group was given a refutation -- the comprehensive 2004 Duelfer report that concluded that Iraq did not have weapons of mass destruction before the United States invaded in 2003. Thirty-four percent of conservatives told only about the Bush administration's claims thought Iraq had hidden or destroyed its weapons before the U.S. invasion, but 64 percent of conservatives who heard both claim and refutation thought that Iraq really did have the weapons. The refutation, in other words, made the misinformation worse.

In a paper approaching publication, Nyhan, a PhD student at Duke University, and Reifler, at Georgia State University, suggest that Republicans might be especially prone to the backfire effect because conservatives may have more rigid views than liberals: Upon hearing a refutation, conservatives might "argue back" against the refutation in their minds, thereby strengthening their belief in the misinformation. Nyhan and Reifler did not see the same "backfire effect" when liberals were given misinformation and a refutation about the Bush administration's stance on stem cell research.
Again, I wonder whether the source of the refutation matters. Republicans are more likely to distrust the so-called mainstream media outlets: your Washington Post, New York Times, NBC, CBS, ABC, 60 Minutes, Newsweek, Time, etc., etc. But if they heard that The National Journal had refuted the Bush administration's prewar claims that Iraq had weapons of mass destruction, would that change the results? Possibly not:
A similar "backfire effect" also influenced conservatives told about Bush administration assertions that tax cuts increase federal revenue. One group was offered a refutation by prominent economists that included current and former Bush administration officials. About 35 percent of conservatives told about the Bush claim believed it; 67 percent of those provided with both assertion and refutation believed that tax cuts increase revenue.
This is why, whenever I read about people hearing that Sarah Palin is telling lies, I know it won't faze Republican voters. They think it’s the media who are the liars.

But the other part of charge-countercharge that these studies can’t duplicate is that even when we hear a refutation, we can often find a contradictory opinion, especially if it supports something we already want to believe. Don’t like factcheck.org? Don’t worry, someone on Newsmax has already explained why “the media” is just spinning lies.

I would rather live in a world where untruths can be countered by facts. But that doesn’t seem to be the world we live in. So rather than fighting fire with sand, it’s probably better to fight fire with fire. Because it doesn’t matter how much sand you throw on some lies, it never puts them out.

Cross-posted at Feminist Underground

2 comments:

  1. Oh no, the study is spreading more lies! It really is true that observation always affects that which is being observed!

  2. Wow - great post Newscat. Discouraging, though. I really don't want to play dirty, I want people to think critically. It looks like that means asking us to overcome our natures, though.
