The Debunking Handbook Part 1: The first myth about debunking

The Debunking Handbook is a guide to debunking myths, by John Cook and Stephan Lewandowsky. Although there is a great deal of psychological research on misinformation, there is unfortunately no summary of the literature that offers practical guidelines on the most effective ways of reducing its influence. This Handbook boils down the research into a short, simple summary, intended as a guide for communicators in all areas (not just climate) who encounter misinformation. The Handbook will be available as a free, downloadable PDF at the end of this 6-part blog series.

This has been cross-posted at Shaping Tomorrow's World.

Introduction

Debunking myths is problematic. Unless great care is taken, any effort to debunk misinformation can inadvertently reinforce the very myths one seeks to correct. To avoid these “backfire effects”, an effective debunking requires three major elements. First, the refutation must focus on core facts rather than the myth to avoid the misinformation becoming more familiar. Second, any mention of a myth should be preceded by explicit warnings to notify the reader that the upcoming information is false. Finally, the refutation should include an alternative explanation that accounts for important qualities in the original misinformation.

Debunking the first myth about debunking

It’s self-evident that democratic societies should base their decisions on accurate information. On many issues, however, misinformation can become entrenched in parts of the community, particularly when vested interests are involved.1,2 Reducing the influence of misinformation is a difficult and complex challenge.

A common misconception about myths is the notion that removing their influence is as simple as packing more information into people’s heads. This approach assumes that public misperceptions are due to a lack of knowledge and that the solution is more information - in science communication, it’s known as the “information deficit model”. But that model is wrong: people don’t process information as simply as a hard drive downloading data.

Refuting misinformation involves dealing with complex cognitive processes. To successfully impart knowledge, communicators need to understand how people process information, how they modify their existing knowledge and how worldviews affect their ability to think rationally. It’s not just what people think that matters, but how they think.

First, let’s be clear about what we mean by the label “misinformation” - we use it to refer to any information that people have acquired that turns out to be incorrect, irrespective of why and how that information was acquired in the first place. We are concerned with the cognitive processes that govern how people process corrections to information they have already acquired - if you find out that something you believe is wrong, how do you update your knowledge and memory?

Once people receive misinformation, it’s quite difficult to remove its influence. This was demonstrated in a 1994 experiment where people were exposed to misinformation about a fictitious warehouse fire, then given a correction clarifying the parts of the story that were incorrect.3 Despite remembering and accepting the correction, people still showed a lingering effect, referring to the misinformation when answering questions about the story. 

Is it possible to completely eliminate the influence of misinformation? The evidence indicates that no matter how vigorously we correct the misinformation - for example, by repeating the correction over and over again - the influence remains detectable.4 The old saying got it right - mud sticks.

There is also an added complication. Not only is misinformation difficult to remove, debunking a myth can actually strengthen it in people’s minds. Several different “backfire effects” have been observed, arising from making myths more familiar,5,6 from providing too many arguments,7 or from providing evidence that threatens one’s worldview.8

The last thing you want to do when debunking misinformation is blunder in and make matters worse. So this handbook has a specific focus - providing practical tips to effectively debunk misinformation and avoid the various backfire effects. To achieve this, an understanding of the relevant cognitive processes is necessary. We explain some of the interesting psychological research in this area and finish with an example of an effective rebuttal of a common myth.

References

  1. Jacques, P. J., & Dunlap, R. E. (2008). The organisation of denial: Conservative think tanks and environmental scepticism. Environmental Politics, 17, 349-385.
  2. Oreskes, N., & Conway, E. M. (2010). Merchants of doubt. Bloomsbury Publishing. 
  3. Johnson, H. M., & Seifert, C. M. (1994). Sources of the continued influence effect: When discredited information in memory affects later inferences. Journal of Experimental Psychology: Learning, Memory, and Cognition, 20 (6), 1420-1436.
  4. Ecker, U. K., Lewandowsky, S., Swire, B., & Chang, D. (2011). Correcting false information in memory: Manipulating the strength of misinformation encoding and its retraction. Psychonomic Bulletin & Review, 18, 570-578.
  5. Skurnik, I., Yoon, C., Park, D., & Schwarz, N. (2005). How warnings about false claims become recommendations. Journal of Consumer Research, 31, 713-724.
  6. Weaver, K., Garcia, S. M., Schwarz, N., & Miller, D. T. (2007). Inferring the popularity of an opinion from its familiarity: A repetitive voice sounds like a chorus. Journal of Personality and Social Psychology, 92, 821-833. 
  7. Schwarz, N., Sanna, L., Skurnik, I., & Yoon, C. (2007). Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns. Advances in Experimental Social Psychology, 39, 127-161.
  8. Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32, 303-330.


Posted by John Cook on Wednesday, 16 November, 2011


The Skeptical Science website by Skeptical Science is licensed under a Creative Commons Attribution 3.0 Unported License.