The Debunking Handbook Part 2: The Familiarity Backfire Effect
Posted on 18 November 2011 by John Cook, Stephan Lewandowsky
Please scroll down for an update posted on June 22, 2017 regarding The Familiarity Backfire Effect!
The Debunking Handbook is a freely available guide to debunking myths, by John Cook and Stephan Lewandowsky. Although there is a great deal of psychological research on misinformation, unfortunately there is no summary of the literature that offers practical guidelines on the most effective ways of reducing the influence of misinformation. This Handbook boils down the research into a short, simple summary, intended as a guide for communicators in all areas (not just climate) who encounter misinformation. The Handbook will be available as a free, downloadable PDF at the end of this 6-part blog series.
This post has been cross-posted at Shaping Tomorrow's World.
To debunk a myth, you often have to mention it - otherwise, how will people know what you’re talking about? However, this makes people more familiar with the myth and hence more likely to accept it as true. Does this mean debunking a myth might actually reinforce it in people’s minds?
To test for this backfire effect, people were shown a flyer that debunked common myths about flu vaccines.1 Afterwards, they were asked to separate the myths from the facts. When asked immediately after reading the flyer, people successfully identified the myths. However, when queried 30 minutes later, some people actually scored worse. The debunking had reinforced the myths.
Hence the backfire effect is real. The driving force is the fact that familiarity increases the chances of accepting information as true. Immediately after reading the flyer, people remembered the details that debunked the myth and successfully identified the myths. As time passed, however, the memory of the details faded and all people remembered was the myth without the “tag” that identified it as false. This effect is particularly strong in older adults because their memories are more vulnerable to forgetting of details.

How does one avoid causing the Familiarity Backfire Effect? Ideally, avoid mentioning the myth altogether while correcting it. When seeking to counter misinformation, the best approach is to focus on the facts you wish to communicate.

Not mentioning the myth is sometimes not a practical option. In this case, the emphasis of the debunking should be on the facts. The often-seen technique of headlining your debunking with the myth in big, bold letters is the last thing you want to do. Instead, communicate your core fact in the headline. Your debunking should begin with emphasis on the facts, not the myth. Your goal is to increase people’s familiarity with the facts.
Update June 2017: Some news about The Familiarity Backfire Effect
Stephan Lewandowsky published the post Claiming that Listerine alleviates cold symptoms is false: To repeat or not to repeat the myth during debunking? on June 22, which contains an annotated screenshot from the handbook.
You can read more about this latest research in a series of three blog posts on Shaping Tomorrow's World:
- Can Repeating False Information Help People Remember True Information? - published on June 19, 2017 by Tania Lombrozo
- Qualifying the Familiarity Backfire Effect - published on June 20, 2017 by Briony Swire
- Familiarity-based processing in the continued influence of misinformation - published on June 21, 2017 by Stephan Lewandowsky
References
- Skurnik, I., Yoon, C., Park, D., & Schwarz, N. (2005). How warnings about false claims become recommendations. Journal of Consumer Research, 31, 713-724.
Arguments
I actually took a survey of my friends and acquaintances on this point when I was very young (late teens, updated on into my 30's). In fact, different people have very different inner representations when they think. I, for example, typically think in words. Others, presumably including jmorpuss, think in pictures. My elder sister and my father think in "concepts"; they are quite emphatic that it is neither pictures nor words. One friend of mine had no inner representation associated with thought at all, and I knew one philosopher who had no inner representation even for sight, although they were quite literate and otherwise functional. That would be a very high-functioning form of blindsight.
Just as there are a variety of forms of inner representation, there are a variety of ways of learning, and different people learn best in different ways. Some learn best by reading, others by listening, and some by doing. I am sure there are other modes as well.
The key point is that we are all different. People who generalize from their own experience (including some noted psychologists) merely demonstrate their lack of imagination. That certainly applies to jmorpuss' five step process to thought.
If I state that that is simplistic, you won't question my statement?
And if I state that I disagree with what you have written, but that I accept your right to believe it, am I negating myself somehow?
jmorpuss : "Try and explain the word myth or fact to someone that has no schooling"
Myth - A story
Fact - The truth
Instead of "Former Vice-President Al Gore believes in climate change", how does "Fat, balding Al Gore (64 years old) believes in climate change" make you think? Constant repetition of a negative association, even if false, undermines the message that a highly intelligent former US VP has studied climate change and accepts the science.
I am not advocating the attachment of demeaning adjectives to people ... rather to things. Leave the personal stuff to Marc Morano.
Nearly everyone has heard of "Climategate", but the permanent attachment of an adjective like "the faux-scandal Climategate" has a better chance of sinking into the consciousness of the reader, particularly with constant repetition.
My own opinion was that the climate science folks were too defensive about the faux-scandal Climategate, and that the amount of blogging-inches devoted to it probably backfired. However, this pseudo-scandal has probably died out in public consciousness, so that if it comes up (and there are whole sites devoted to it), mention it in no uncertain terms as the farrago of fabrication and exaggeration it really was.
Associating the word "Climategate" constantly with words like "faux", "pseudo", "farrago" helps get the message across. We learn most things by repetition, so continuous word-association will boost replacing myth with fact.
[DB] "When educated people try to explain facts with a story to show others how inteligent they are the"
Umm, no. Educated people try to explain facts with a story to show others because they are trying to help others learn. You are projecting your perception of things onto others here.
"A bit like what you said makes no sence just noise to me"
If you do not understand the explanation, ask for a different one instead of pointing fingers.
"Science shouldn't be a foreign language When communicating to the public."
That is the entirety of why we donate our time here: to try and help communicate climate science to the public in clear, understandable words.
The notion that words have no meaning without pictures assumes incorrectly that there is a straightforward way in which pictures have meaning. There is not. Finding meaning in pictures is as much a matter of convention as is language, and filtering language through pictures to find the meaning of language just adds to the explanatory burden. It leaves you with more to explain, not less.
Finally, I would not dream of trying to explain the meaning of "myth" or "fact" to some-one with no schooling, anymore than I would explain "quark" without first giving them enough education to understand basic principles of physics. On the other hand, seeing you think it is so easy, perhaps you can describe what the picture that means "myth" looks like.
Therefore this series is a valuable stepping stone along that path.
I think Bertrand Russell hit the nail on the head when he said "If a man is offered a fact which goes against his instincts, he will scrutinize it closely, and unless the evidence is overwhelming, he will refuse to believe it. If, on the other hand, he is offered something which affords a reason for acting in accordance to his instincts, he will accept it even on the slightest evidence. The origin of myths is explained in this way."
One of the problems we face, I suggest, is the problem of many having the "instinct" of not wanting to accept that we ourselves are the problem. Illustrated quite well on SKS here I think :)
Perhaps the series will deal with this.
Regardless I really look forward to more installments
When that is clear then recognize the bottom line of denialism: Delay. That is, delay the time when other energy sources are used and the flow of profits to Big Carbon slows way down. Learn to spot that bottom line - delay.
When you get into complications, you have rough going in part because of this sort of thing:
Understanding Public Complacency About Climate Change: Adults' mental models of climate change violate conservation of matter
Why Don't Well-Educated Adults Understand Accumulation? A Challenge to Researchers, Educators, and Citizens
and other items here:
http://jsterman.scripts.mit.edu/On-Line_Publications.html
so just start with Stop Burning Carbon and Leave it in the Ground. Some people have alternate realities firmly entrenched in their heads. There isn't time enough to deal with them. Reach others.
So how do people who have been blind since birth communicate? How do they think or dream if, as you insist, these activities require reference to visual imagery?
In short, there seems to be direct evidence that you are incorrect.
As Tom describes in #10 above, I use different mental constructs for different tasks (e.g. concepts for working out high level code architecture, images for figuring out how 'some assembly required' purchases fit together, words for reviewing possible phrasings for this post, et cetera). Most of the time I think and dream in concepts... bringing symbolic representations of those concepts (i.e. words, pictures, sounds, et cetera) into it requires a higher level of concentration.
Getting back to the main topic... AndrewD has a point. A lot of the site is laid out in the format, 'here is a myth... and here is why it is wrong'. Deliberately so, given that the idea was to organize the information by myth so that it would be easy to locate the corrections. I think that structure has value for the 'database' underlying SkS and should be retained. This series is then best viewed as describing how that information can be most effectively communicated to others.
Of course, my own style (as above) is usually to present discrepancies between the myth and reality. I'm guessing that probably causes some people to dig in more... but it's the way I evaluate information so it's always my natural inclination to prompt others the same way.
jmorpuss is actually presenting something that can be valuable - especially to public speakers. The best of well-known oratory contains a nice sprinkling of visual, auditory and other approaches, variously noted as kinesthetic, holistic, conceptual. The worst of teaching focuses on what students think they prefer - surprisingly, adolescents who spend half their lives with headphones on claim to need this because they learn best with 'auditory stimulation'.
What neuroscience tells us about conveying information is that you use the modality best suited to the content. Geometry and geography rely heavily on visual information, for instance. No need to go into pedagogical arcana here, but we should all bear in mind that graphs are terrific for presenting some information, lots of words are unavoidable for others, even if they have lots of syllables.
Trying to convert absolutely everything into graphs or other pictorial material can actually obscure rather than clarify what you're trying to get across if the content is unsuited to this approach.
Indeed, I've been having a nagging feeling that debunking was only having the effect of repeating the meme. Explaining the facts (even if mentioning the myths as you do it) is a much more effective way of communicating.
In the post there is a one-liner about age. I wasn't sure if it was from this paper or other data, but here is a quote from the referenced paper:
"the more often older adults were told that a claim was false, the more likely they were to remember it erroneously as true after a 3 day delay. The size of this effect is far from negligible. After 3 days, older adults misremembered 28% of false statements as true when they were told once that the statement was false but 40% when told three times that the statement was false. There was no parallel tendency to misremember true information as false."
Very scary stuff... Perhaps incorrectly done fact-checking only reinforces Fox News falsehoods. I think you're right that drowning the falsehood in truth is the way to go in any debunking or response; the less the myth is mentioned, the better. An alternative lead-in to a rebuttal might be not to discuss the untruth at all, but instead to present a new truth: that "person X is speaking untruthfully". I guess that's something along the lines of "Monckton Myths". Have the spreader of untruth be labeled as such, so that the new truth is planted and watered more than the old falsehood...
I would have to assume that the fence-sitters are the ones to be most concerned about, but maybe we should worry about further entrenching AGW skeptics' belief in myths as well.
The effect we highlighted in the studies – that in the absence of specific information to the contrary, people assume that vaguely familiar statements are true – has some limits. The most important constraint for this discussion is probably that it tends to happen when people don’t have either extensive background knowledge on the topic, or motivation to call a statement true (or false). For example, someone with little knowledge about vaccinations might accept the statement “It’s a myth that flu vaccines cause the flu” when they first hear it. For this person, as time passes the “myth” designation will fade from memory, but the core of the proposition will still be familiar, making the statement “Flu vaccines cause the flu” seem true. This change in memory representation won’t affect an expert in vaccines, however, because they can draw on relevant background knowledge to assess the statement. Similarly, someone strongly motivated to argue that the vaccine is harmful will always call it harmful, even if the opposite statement seems vaguely familiar to them.
The upshot is that many people who come to this site probably won’t be adversely affected by the “Myths” organizing structure in the sidebar. People most at risk for the “illusion of truth” effect are those who have little knowledge of the topic other than what they read here. If they intend to read up on the topic, detailed knowledge will probably end up overruling any inference based on experienced familiarity. There are exceptions, but this seems like a reasonable generalization.
The second point that the comments bring to mind is that there are many routes to false or inaccurate beliefs, some of them cognitive, some motivational, and some a combination of the two. For example, sometimes people seek out weak counterattitudinal information in order to argue against it, which gives them practice in taking down weak arguments. The effect is that exposing themselves to opposing points of view sometimes “inoculates” them against countervailing information, rather than modifying their perspective.
Thanks, I look forward to seeing more of these posts.
[JC] Welcome also from me :-) Our treatment of the Familiarity Backfire Effect is deliberately quite brief and simplified in the Debunking Handbook - our aim is to provide a short, practical guide. I am also working with Stephan Lewandowsky and a few other scientists (including your co-author Norbert Schwarz) on a more thorough, scholarly review of research into misinformation.
That last point, that seeking out and arguing against weak opponents can serve to reinforce one's own point of view, is an interesting one.
I wonder, too, if that same paradigm works in a sense with false but emotional arguments, and if it is also aided by a "gang" mentality. That is, a frequent behavior I see on poorly moderated sites (WUWT, Nova, etc.) is that when they either cannot follow the reasoning behind an argument or cannot formulate an adequate response, they instead fall back on insults. Generally, several people will then pile on. The atmosphere turns into a sort of group "look at the fool who's not one of us and doesn't get it" attitude, and all meaningful dialogue screeches to a halt.
Taking this from their point of view, the reinforcement becomes "everyone else is laughing at this guy with me, so we're right and he's wrong, despite the fact that he's proved his point."
Thanks for the comment. That pretty much addressed what I was wondering about. Now I don't need to read Part 4. Just kidding.
Sphaerica makes a good point about emotional arguments, gang mentality, and piling on. Whenever I encounter that sort of behavior, it immediately raises a red flag of irrationality, and unfortunately, I have sometimes seen this coming from people that I agree with, and it makes me cringe.
I generally try to maintain a civil debate with those that I disagree with, although I am not above a snarky retort if I sense someone being disingenuous or intellectually dishonest in their comments.
A typical response of mine might be "do you know of any national academy of science that agrees with you on this issue?"
Seems to work well and not backfire!
I don't recommend people hide any of the myth on purpose except if you have short-term goals. There is no short-cut. If you care about lasting change, the person will have to visit and revisit the myths. It takes time. You should not expect results quickly (much like any human doesn't overcome issues overnight or from one surefire solution).. much like you don't gain deep understanding overnight. If they don't hear the facts and myths juxtaposed from you, they will likely hear it later over and over and not in the context of the facts. To over-ride an actual myth they have adopted, it may take a lot of time and the journey likely will come from within the person.
I agree easy explanations are great, so mentioning the myth alongside a rhyme or easy catchy rebuttal, etc, is valuable. When they think of the myth, you want them to then think of the failure of the myth or the right answer.
Working off contradictions is great because many theories can be internally consistent. To undo the effects of a myth, you want the myth not to float around in its own consistent if limited universe. You want to attach that mythology to contradictions the thinker easily believes. Also, when you want to convince yourself of something, you aren't going to perform every experiment. Contradictions are a great way to eliminate the wrong paths efficiently.. what remains will be the path taken.
In summary, help the thinker both (a) build up a web of interconnected facts and also, via contradictions, (b) help them derail every facet of a mythology (ie, defeat every little related myth). So it's OK to headline a myth if there are catchy and easy arguments that derail that myth. [Ultimately, they have to address the myths in their minds to defeat them.]