Denialist Rhetoric

People argue bad science, pseudoscience and nonsense for a variety of reasons: some religiously motivated, some politically motivated, some out of ignorance, some out of arrogance, some out of emotional needs, some due to psychological problems.

When they encroach onto the scientific turf and argue nonsense within a scientific domain, they use a limited set of rhetorical tools. The exact choice of tools depends on the motivation, as well as on the forum where they advocate the nonsense. Some, the generals in the army of the War On Science, have big soapboxes, e.g., TV, radio and newspapers. Some teach and preach in schools and churches. Some run blogs, and some - the foot soldiers of The War - troll on other people's blogs.

So, when the motivation is political - when they are pushing debunked conservative ideas, from femiphobic anti-abortion and anti-stem-cell-research stances, through the thinly-veiled racism of the War On Terror, to failed economic policies ("trickle-down") and global-warming denial - they mainly use one set of rhetorical strategies.

When the motivation is religious, as in Creationism, the strategies are similar, but not exactly the same. Loony fringe pseudoscience, from the Left or the Right (and sometimes it is difficult to figure out whether it comes from the Left or the Right), appears to employ rhetorical devices very similar to those of the religiously motivated pseudoscience, suggesting that perhaps both share the same underlying emotional disturbances.

Pseudoscientists of various colors, the denialists of reality, have been the topic of a couple of interesting blog posts recently, most notably this one on GiveUp Blog. PZ Myers chimed in as well, adding a couple of other rhetorical devices. A number of commenters also added some good ones, e.g., David Harmon:

-- binary splitting (everything MUST be one way or another, no mixing)

-- idealization and denigration (combines with the previous, e.g., "good" must be perfect; any contamination of "evil" makes something entirely "evil")

-- projection (assigning to others the characteristics they reject in themselves)

and adspar:

Another common tactic is to magnify doubt, which goes along with setting impossible expectations. Chris Mooney mentions numerous examples of this tactic in his book.
If you can't say something is 100% certain, or if your statistics have some margin of error, they jump all over it as if any sliver of doubt undermines a scientific claim.

Prometheus of Photon In The Darkness blog wrote a similar list, The Seven Most Common Thinking Errors of Highly Amusing Quacks and Pseudoscientists, in four installments: Part I, Part II, Part III and Part IV. It was written with great attention to detail and is well worth your time to read.

I do not have much to add to this, though I'd like to see a complete taxonomy of rhetorical strategies, tabulated by which ones are more likely to be used by politically motivated vs. religiously motivated purveyors of nonsense, and by which are more likely to be found on big bully-pulpits vs. in comment threads on blogs.

Recently, when looking at an example of medical quackery (another category of pseudoscience), I identified several more rhetorical strategies, which are all familiar to you, I'm sure:

Reverence for the Past
Reverence for the Ancient Wisdom of the Orient
Naive Scientism
Complexity
Appeal to Mathematics
Persecution Complex (which may foster Secrecy)

What do you think?


I think these are symptoms, not causes. Reverence for the past merely means learning from the past experiences of others (a form of "Choose the best" I talked about in my series that I note you didn't link to but I'm fine with that no really). Likewise scientism, complexity, etc., are all part of a bounded learning algorithm that is simply triggering on the wrong cues.

On the other hand the lists given by PZ Whosits and Give Up Blog are lists of failures of moral fibre, or special pleading, or other fallacies. We can usefully distinguish between failures of learning, and failures of good faith.

While John is good to be forward thinking in terms of how to prevent people from going down these paths, I think our posts deal more with what you do when people have adopted these unscientific arguments, or worse, when there is a big-money PR campaign using these tactics to spread misinformation.

I want people to look at global warming denialism or creationism spread by the George Marshall Institute, Mercatus, Heritage, Cato, DI, whatever, and be able to say, "hey, those are the same bullshit tactics these anti-science guys always use."

In fact, I'm thinking of developing a scoring system for denialism, to rate these anti-science positions as they come out of DI or whatever. I have 5 criteria: (1) conspiracy, (2) selectivity, (3) false experts, (4) impossible expectations/moving goalposts, (5) illogical appeals (analogy, red herrings, etc.). Maybe we can give them a total of ten points: 3 for presenting one or more conspiracies, 1 point (for a total of 3) for each instance of quote-mining or selective data representation, 1 point for a false expert, 1 point for the impossible expectation, and finally 1 point for each illogical appeal, up to 2 points. We can thereby create a "denialism score" between 0 and 10 points to rapidly dismiss the anti-science BS being spread by people who have financial interests in keeping us ignorant, or who are ideologically motivated to attack good science.
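For concreteness, the rubric above could be tallied something like this. This is a minimal sketch under my own reading of the point caps; the function name and tactic labels are illustrative, not from the comment:

```python
# Hypothetical tally for the "denialism score" rubric described above.
# Point caps follow the comment: any conspiracy claim scores a flat 3,
# quote-mining/selectivity scores 1 per instance up to 3, false experts
# up to 1, impossible expectations up to 1, and illogical appeals
# 1 each up to 2 -- a maximum of 10 overall.

def denialism_score(instances):
    """instances maps tactic names to observed counts; missing keys mean 0."""
    score = 3 if instances.get("conspiracy", 0) >= 1 else 0
    score += min(instances.get("selectivity", 0), 3)
    score += min(instances.get("false_experts", 0), 1)
    score += min(instances.get("impossible_expectations", 0), 1)
    score += min(instances.get("illogical_appeals", 0), 2)
    return score

# Example: one conspiracy, four quote-mines, one red herring -> 3 + 3 + 1 = 7
print(denialism_score({"conspiracy": 1, "selectivity": 4, "illogical_appeals": 1}))
```

Capping each category keeps any single tactic from dominating the score, which matches the stated goal of a quick 0-10 rating.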

I like quitter's idea of a scoring system. I'd suggest an additional item, which is a breakdown, or provided examples, under each category. Per my post at his place, one of the problems is that citizens don't know how to analyze content like this. A detailed scoring card would provide both ready-made tools for a specific application and, over time, education in how to think in this analytical way.

Also, it would make it more difficult for the perpetrators to wiggle out of points. For example, one could avoid any hint of conspiracy and automatically score 7 or lower, etc.

The more I see of these people, the less I think of these as mistakes and the more I see them as deliberate lying. In a truth-seeking debate one takes in an opponent's critique and attempts to integrate it, so its weaknesses and strengths can be discussed. In a public pseudo-debate, every word is about winning and the truth is immaterial. Careful analysis of denialists' rhetoric, while avoiding the details of arguments, frequently shows big holes in logic and demonstrates profound disinterest in the pursuit of truth. Behavioral science should be consulted because, I'd argue, the good guys underutilize this approach.

By SkookumPlanet (not verified) on 20 Sep 2006 #permalink

Not sure what the "complexity" thing is about, since I and my profs have said something (for instance, a signalling pathway) is "complex"; but I don't think that's what you meant. But, yes, I recognize all those "tactics".

Sorry - the explanation for that last list is in that last link; this was getting long, so I decided not to go into detailed explanations here.

"I think these are symptoms, not causes."

Agreed! Our Host didn't quote this part, but my suggestions were drawn from clinical psychology. They're examples of "infantile defenses of the ego" -- various universal psychological patterns which appear in infancy, but are "supposed to" (by modern standards) be enveloped and controlled by later-developing parts of the mind. Note that they never really disappear, just get "built over" -- or not, if maturation doesn't proceed "properly".

By the way, "magnifying doubt" and "impossible expectations" are classic examples of the idealization/denigration pattern. Indeed, they're a demonstration of why those two apparently-opposite ideas are really flipsides of the same thing. Likewise "reverence for ..." paired with magical use of scientific terms. The "appeal to complexity" is a species of "withdrawal", which is another of the infantile defenses. Come to think of it, the "straw men" and question-dodging habits fit under yet another, "displacement".

By David Harmon (not verified) on 28 Sep 2006 #permalink

Whoops, "reverence for past, tradition, etc" should have been paired with the refrain that "all those eggheads are full of *$^%, but 'I' or 'MY guru' have the Truth(tm)".

By David Harmon (not verified) on 28 Sep 2006 #permalink

Change Prosecution Complex to Persecution Complex.