
The Debunking Handbook Part 1: The first myth about debunking

Posted on 16 November 2011 by John Cook

The Debunking Handbook is a guide to debunking myths, by John Cook and Stephan Lewandowsky. Although there is a great deal of psychological research on misinformation, unfortunately there is no summary of the literature that offers practical guidelines on the most effective ways of reducing the influence of misinformation. This Handbook boils down the research into a short, simple summary, intended as a guide for communicators in all areas (not just climate) who encounter misinformation. The Handbook will be available as a free, downloadable PDF at the end of this 6-part blog series.

This has been cross-posted at Shaping Tomorrow's World.

Introduction

Debunking myths is problematic. Unless great care is taken, any effort to debunk misinformation can inadvertently reinforce the very myths one seeks to correct. To avoid these “backfire effects”, an effective debunking requires three major elements. First, the refutation must focus on core facts rather than the myth to avoid the misinformation becoming more familiar. Second, any mention of a myth should be preceded by explicit warnings to notify the reader that the upcoming information is false. Finally, the refutation should include an alternative explanation that accounts for important qualities in the original misinformation.

Debunking the first myth about debunking

It’s self-evident that democratic societies should base their decisions on accurate information. On many issues, however, misinformation can become entrenched in parts of the community, particularly when vested interests are involved [1,2]. Reducing the influence of misinformation is a difficult and complex challenge.

A common misconception about myths is the notion that removing their influence is as simple as packing more information into people’s heads. This approach assumes that public misperceptions are due to a lack of knowledge and that the solution is more information - in science communication, it’s known as the “information deficit model”. But that model is wrong: people don’t process information as simply as a hard drive downloading data.

Refuting misinformation involves dealing with complex cognitive processes. To successfully impart knowledge, communicators need to understand how people process information, how they modify their existing knowledge and how worldviews affect their ability to think rationally. It’s not just what people think that matters, but how they think.

First, let’s be clear about what we mean by the label “misinformation” - we use it to refer to any information that people have acquired that turns out to be incorrect, irrespective of why and how that information was acquired in the first place. We are concerned with the cognitive processes that govern how people process corrections to information they have already acquired - if you find out that something you believe is wrong, how do you update your knowledge and memory?

Once people receive misinformation, it’s quite difficult to remove its influence. This was demonstrated in a 1994 experiment where people were exposed to misinformation about a fictitious warehouse fire, then given a correction clarifying the parts of the story that were incorrect [3]. Despite remembering and accepting the correction, people still showed a lingering effect, referring to the misinformation when answering questions about the story.

Is it possible to completely eliminate the influence of misinformation? The evidence indicates that no matter how vigorously and repeatedly we correct the misinformation, for example by repeating the correction over and over again, the influence remains detectable [4]. The old saying got it right - mud sticks.

There is also an added complication. Not only is misinformation difficult to remove; debunking a myth can actually strengthen it in people’s minds. Several different “backfire effects” have been observed, arising from making myths more familiar [5,6], from providing too many arguments [7], or from providing evidence that threatens one’s worldview [8].

The last thing you want to do when debunking misinformation is blunder in and make matters worse. So this handbook has a specific focus - providing practical tips to effectively debunk misinformation and avoid the various backfire effects. To achieve this, an understanding of the relevant cognitive processes is necessary. We explain some of the interesting psychological research in this area and finish with an example of an effective rebuttal of a common myth.

References

  1. Jacques, P. J., & Dunlap, R. E. (2008). The organisation of denial: Conservative think tanks and environmental skepticism. Environmental Politics, 17, 349-385.
  2. Oreskes, N., & Conway, E. M. (2010). Merchants of doubt. Bloomsbury Publishing.
  3. Johnson, H. M., & Seifert, C. M. (1994). Sources of the continued influence effect: When discredited information in memory affects later inferences. Journal of Experimental Psychology: Learning, Memory, and Cognition, 20(6), 1420-1436.
  4. Ecker, U. K., Lewandowsky, S., Swire, B., & Chang, D. (2011). Correcting false information in memory: Manipulating the strength of misinformation encoding and its retraction. Psychonomic Bulletin & Review, 18, 570-578.
  5. Skurnik, I., Yoon, C., Park, D., & Schwarz, N. (2005). How warnings about false claims become recommendations. Journal of Consumer Research, 31, 713-724.
  6. Weaver, K., Garcia, S. M., Schwarz, N., & Miller, D. T. (2007). Inferring the popularity of an opinion from its familiarity: A repetitive voice sounds like a chorus. Journal of Personality and Social Psychology, 92, 821-833.
  7. Schwarz, N., Sanna, L., Skurnik, I., & Yoon, C. (2007). Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns. Advances in Experimental Social Psychology, 39, 127-161.
  8. Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32, 303-330.

 


Comments

Comments 1 to 15:

  1. Looking forward to your "Mythbuster" manual!
  2. Great effort - this site just continues to break new ground and provide excellent links to real science to help us understand the implications of AGW. It's a pity that so much time has to be spent establishing the scientific reality of AGW to deal with the denialism that is so rife, as it robs time and energy from the very necessary debates about what are the least-cost/most-effective actions to take. Perhaps the most contentious of those issues is the nuclear question - but that's a different forum. In the meantime I look forward to more in this series. Great work by John and Stephan.
  3. This could be a very important handbook, particularly for people like me who find themselves engaged with the misinformed from time to time. I already see myself in the statements above about providing more evidence, better theories, etc. while rubbishing their now probably strongly held beliefs. It is very frustrating to try to explain that people's beliefs are in error, only to come away feeling that you have gotten nowhere, or that they are even more entrenched. I hope the advice here will help change that.
  4. And so Skeptical Science rockets to the stratospheric top of must-read sites providing reliable and objective scientific analysis in the quagmire that is scientific denialism. Kudos, and kudos again.
  5. I had no idea about this; it will be a fantastic resource. Thanks to John Cook and Stephan Lewandowsky. As others have noted above, SkS continues its tradition of developing innovative tools for communicating climate science and for refuting myths parroted by "skeptics" and those in denial about AGW.
  6. This might be the most important thing you've ever published. Bravo.
  7. This will undoubtedly prove useful in other fields of work. I think I'll end up sharing this with co-workers. Thanks.
  8. Nice summary; I look forward to more detail in the posts to follow. To my mind, what's particularly tricky is the type of effect in the Skurnik et al. paper (your reference #5 above). In that research, people understand that information is false when it's first presented. The problem arises when the "core proposition," so to speak, stays familiar over time, but the details of the warning/debunking fade from memory. That's when people show a strong bias to think that the familiar information is true, even when they accepted that it was false when they first learned it. This "backfire" develops especially quickly for older adults, people under time pressure, etc., and it's not clear what to do to alleviate the problem. Best advice is probably to rely on written back-ups to one's memory, but that's not always practicable...
  9. Thanks John, I need this urgently for my future lectures :)
  10. Can't wait to read this - thanks to both of you for writing it.
  11. I am guilty of trying to get others to see my point of view by stuffing 'facts' down their throats. If I can learn from this how to actually effect change in my audience, I will be delighted. No doubt, I will need to start by checking my own world view against this book: no good getting people to see things my way if I am wrong! Thank you John and Stephan.
  12. As alan_marshall #1 said!
  13. John... This looks to be a great resource that I'd like to hand out to a few people. A suggestion: please consider adding permission to the PDF for it to be duplicated, as many commercial copy shops will not make copies without it.
    Response: [JC] Good idea, thanks for the suggestion.
  14. 'Refuting misinformation involves dealing with complex cognitive processes. To successfully impart knowledge, communicators need to understand how people process information, how they modify their existing knowledge and how worldviews affect their ability to think rationally. It’s not just what people think that matters, but how they think.' That sounds like a brainwashing technique to me. Oh, wait... it is! It gets worse. 'First, let’s be clear about what we mean by the label “misinformation” - we use it to refer to any information that people have acquired that turns out to be incorrect, irrespective of why and how that information was acquired in the first place.' Any information that turns out to be incorrect. I shouldn't have to point this out, for it should be obvious to everyone with a functioning mind, but what if it turns out that the one trying to modify a person's beliefs (i.e. the brainwasher) is the one who is actually mistaken, but doesn't realise it? For example, I've noticed on the left a column titled 'Most Used Climate Myths' and the list of ten 'myths' which are actually... well, TRUE! 1. 'Climate has changed before' - yes it has, and it will continue to do so as long as we have a climate to speak of. And so it goes.
  15. Peter, you sound like you're trolling. Let's hope not. Climate myths, as used here, are those claims that are used as evidence against the dominant theory. If you had gone to the page for that myth, this would be obvious. SkS is very much obviously not making the claim that climate hasn't changed before. As for "brainwashing," do a little research. Brainwashing is a systematic process that requires isolating the subject and submitting the subject to a careful reconditioning. What do you find heinous in the quoted passage? Understanding how people think is critical to delivering effective and efficient communication. Would you rather everyone floundered about? As for incorrect information, what is your method for generating "knowledge"? Is it science? Upon what basis do you choose one claim over another?
