Mythbusting - it's harder than you think

The Washington Post reports on research that correcting mythical beliefs is more difficult than you'd think. The interesting finding seems to be that if you repeat the myth in the course of correcting it, people are more likely to forget the correct information and remember the myth!

When University of Michigan social psychologist Norbert Schwarz had volunteers read the CDC flier, however, he found that within 30 minutes, older people misremembered 28 percent of the false statements as true. Three days later, they remembered 40 percent of the myths as factual.

Younger people did better at first, but three days later they made as many errors as older people did after 30 minutes. Most troubling was that people of all ages now felt that the source of their false beliefs was the respected CDC.

The psychological insights yielded by the research, which has been confirmed in a number of peer-reviewed laboratory experiments, have broad implications for public policy. The conventional response to myths and urban legends is to counter bad information with accurate information. But the new psychological studies show that denials and clarifications, for all their intuitive appeal, can paradoxically contribute to the resiliency of popular myths.

The examples they use of popular myths that have become ingrained through repetition are interesting.

This phenomenon may help explain why large numbers of Americans incorrectly think that Saddam Hussein was directly involved in planning the Sept. 11, 2001, terrorist attacks, and that most of the Sept. 11 hijackers were Iraqi. While these beliefs likely arose because Bush administration officials have repeatedly tried to connect Iraq with Sept. 11, the experiments suggest that intelligence reports and other efforts to debunk this account may in fact help keep it alive.

Similarly, many in the Arab world are convinced that the destruction of the World Trade Center on Sept. 11 was not the work of Arab terrorists but was a controlled demolition; that 4,000 Jews working there had been warned to stay home that day; and that the Pentagon was struck by a missile rather than a plane.

So, hear that, framers and mythbusters? If you want to change popular perceptions of science, and of myths about everything from global warming to 9/11 conspiracies, one major thing to remember is not to repeat the myth.

Mayo found that rather than deny a false claim, it is better to make a completely new assertion that makes no reference to the original myth. Rather than say, as Sen. Mary Landrieu (D-La.) recently did during a marathon congressional debate, that "Saddam Hussein did not attack the United States; Osama bin Laden did," Mayo said it would be better to say something like, "Osama bin Laden was the only person responsible for the Sept. 11 attacks" -- and not mention Hussein at all.

There you have it. I admit this would be difficult to do. For the most part, when I take on something that is patently false as part of a skeptical response, I often repeat the claim in order to take it apart. This research would suggest that by merely repeating the myth, I'm shooting myself in the foot.

So the question is, when writing skeptically about myths that people believe and repeat, how do you challenge individuals making the claims without mentioning what claim they made? I'll have to keep this research in mind in the future, I think, and while I'll still mock people for really stupid statements, the focus of skeptical writers should be on providing positive statements of correct information while avoiding repetition of the false information.

This seems to present an impossible problem. What is one to do?

"Position X is a complete sham! A travesty! A pack of lies!"

"Sure, but, uh... You just said 'Position X'. Are you talking about creationism? Global warming denial? Homeopathy?"

"Sorry, I can't tell you, you might start believing in it."

Actually, I suspect the answer might lie in how the false claim is presented. For instance, consider South Park's take on Scientology. Even though they did no debunking at all, but rather simply stated the tenets of Scientology, I doubt very many people came away from that believing in Xenu. So why does repeating a claim in a traditional debunking cause it to stick in somebody's brain, while repeating it in South Park doesn't?

Well, ridicule. The traditional debunking takes a "just the facts" approach to laying out whatever claim is being dissected, but the South Park approach mocks the claim while laying it out. Social pressure is immensely strong in humans, even ones who think they're above it, so it's not unreasonable to think that "You're ridiculous for believing this" (preferably followed with, "and here's why") is more likely to cause a person to turn away from a belief than "This belief is a serious proposition deserving of serious consideration, but it's still wrong" (again, preferably with an explanation of why).

But of course I'm probably predisposed to think this way, being in the de facto anti-Nisbet crowd.

It's very counter-intuitive, though. I mean, I used to read Snopes a lot, as well as watch Mythbusters, but I never felt inclined at all to believe any of the busted myths afterward, even though both describe the myths in detail before debunking (or confirming, occasionally) them. I suppose that's something to chalk up to personality types.

It might have a lot to do with time investment and thoroughness as well.

Here they're working with a flier, whereas Mythbusters and South Park show visual proof and are humorous and memorable.

I agree with your ridicule idea as well. If people associate the idea with being an idiot in their brains they might not repeat the claims so readily.

This is common knowledge in marketing. If you have a pill that cures headaches, but also add that it does not cause drowsiness, does not cause constant urination, does not cause...and so on, people will remember the pill as something that causes drowsiness, causes urination, and so on.

It's a difficult problem.

RealityOn: Apply directly to the forebrain! RealityOn: Apply... okay. I'll stop now.

I've noticed this on a smaller scale myself, combating the junk emails that get sent around. After a time, people just stop sending them to me, while still forwarding them to everyone else they know.

My hypothesis is that people don't really *want* to know the truth, especially when it conflicts with comfortable pre-judgements.

At the risk of making up my science as I go along, I think a key issue is how much people care in the first place about the information you are trying to convey.

For example, I pay no attention to flu vaccines, because I imagine I'm too young to need them. So if you want to bust the myth "Only older people need flu vaccine" with me, you have an uphill struggle, because as soon as you start talking about boring flu vaccines I'm starting to switch off. I'm just not (in the context of a flier you have worthily pressed on me) going to make the mental effort to stash the correct information in the correct place.

Whereas if I were going around scared stiff of the flu, but really didn't need the vaccine, you would have a better chance to convince me because I'd be really concerned, if I thought you were credible on the topic.

In the case of 9/11, I think the logic of what I'm saying is that, to the American people, one swarthy person with facial hair and an un-Christian name is pretty much like another, and who cares who you go after, as long as somebody suffers.

I owe most of what I know about Mormons and Scientologists to South Park. Education can be enjoyable when it's presented right.

By Watt de Fawke (not verified) on 06 Sep 2007 #permalink

You must include the bogus claim as context in a debunking. You can't assume familiarity with the bogus claim, because some people won't have heard of it.

Now that you mention it, most bogus claims today are presented audiovisually, thanks to television, so they are more memorable than their written rebuttals.

So it's easy: we should make our own debunking shows and DVDs. With lots of jokes and explosions.

By Martin Pereyra (not verified) on 06 Sep 2007 #permalink

Hehehe.

Those notions remain widespread even though the federal government now runs Web sites in seven languages to challenge them. Karen Hughes, who runs the Bush administration's campaign to win hearts and minds in the fight against terrorism, recently painted a glowing report of the "digital outreach" teams working to counter misinformation and myths by challenging those ideas on Arabic blogs.

A report last year by the Pew Global Attitudes Project, however, found that the number of Muslims worldwide who do not believe that Arabs carried out the Sept. 11 attacks is soaring -- to 59 percent of Turks and Egyptians, 65 percent of Indonesians, 53 percent of Jordanians, 41 percent of Pakistanis and even 56 percent of British Muslims.

Say no more. The Bush administration's top PR team is on it.

As part of my continuing education I had the privilege of being instructed on skill acquisition and performance by a guy called Doug Swanson. He taught us that telling people what not to do is ineffective for learning; we should always frame our instructions in a way that tells people what we want them to do.
So where we would normally say, "don't kick the ball at the goalkeeper," we now had to say, "put the ball into the corner of the net."
His take was that people ignore the "not" in any sentence and just remember what comes afterwards. And you know, anecdotally speaking, it does seem to work.
At the risk of seeming like an armchair psychologist: when studying what people remember, there is a hierarchy that seems to favour the last thing heard, then the first thing heard, then everything in between.
If the last thing you heard was "Do not believe myth A" and the first thing you heard was "Some people believe myth A,"
then stripping out the "not" and combining what remains with the first thing heard means a double amplification of the myth.

And there's no getting through to some people, regardless. Some years ago I worked in a health clinic, and we were pushing flu shots before Christmas. Most people were amenable, since a dozen or so kids in Colorado had recently died of the flu (and the CDC was hyping the hell out of the shots on TV).

One mom refused the shot with "heck, no! A bunch of kids just died of that!" Sigh.

I think I'll just reread that beer-making article now.

By tourettist (not verified) on 06 Sep 2007 #permalink

The key thing is to stay on message.
For example, rather than talk about how dumb creationism is, it's usually better to talk about how good science is.
Remember, it's the same nuclear physics that makes nuclear power plants work, that also allows us to date the earth at 4.5 billion years old (give or take). If you deny one part of science, you deny it all, and if you promote one part, you promote it all.

By Pseudonym (not verified) on 06 Sep 2007 #permalink

Remember, it's the same nuclear physics that makes nuclear power plants work, that also allows us to date the earth at 4.5 billion years old (give or take). If you deny one part of science, you deny it all, and if you promote one part, you promote it all.

Maybe I'm missing your point but isn't that just selectivity? Of the same kind that's practiced by the AEI, TechCentralDaily, etc?

Guns are the staple of police, military and hunters. Yet there are about 30,000 gunshot deaths in the US. Let us focus on the first three and disregard the fourth. Like we normally do because America of 2007 socially resembles the America of 1787.

Cars take us to work and provide convenience yet they also add to greenhouse gases. Focus on one rather than the other?

If you deny one part of science, you deny it all, and if you promote one part, you promote it all.

We can't separate the good from the bad? I thought that was the intent of regulations. And I don't think you can regulate if you avoid discussing negatives.

Joshua: "it's not unreasonable to think that 'You're ridiculous for believing this' (preferably followed with, 'and here's why') is more likely to cause a person to turn away from a belief than 'This belief is a serious proposition deserving of serious consideration, but it's still wrong'"

The catch here is that ridicule is a double-edged sword. Because people know that it can be used to hide a lack of evidence, they are also wary of it, at least when it comes from people that they don't trust. I could be wrong, but most people seem to use this pair of heuristics with regard to ridicule:

* If ridicule comes from people that I believe are on my side, buy into it.

* If ridicule comes from people that I believe are against me, take it as a sign that they are covering for not having anything of substance to say.

I think the way ridicule might work is by replacing the myth association with another association.

Another strategy I think I heard the Washington Post article writer (not the researcher) discuss, in the context of a campaign smear, was to simply not reference the smear at all.

So if the smear is "Senator X is a liar," instead of going on TV and saying "I am not a liar" (because the "not" tends to get dropped from memory), say "Senator X is a fine, moral, upstanding human being." This doesn't dispute the "liar" part; it merely attempts to associate those positive terms with the senator's name instead of "liar."