The Intersection of Science and Policy: Distrust Is Not Irrational, But Observational

Over the last couple of weeks, Chris Mooney has written several interesting posts and articles about how human cognition affects the incorporation of evidence, especially scientific evidence (e.g., global warming), and what that means for politics. At the back of my mind have been nagging doubts about the assumptions Mooney has been making, doubts crystallized by a post by Timothy Burke. I'm not sure that voters are as irrational as Mooney makes them out to be--overall, I think they're far more 'low information' and focused on a few reasonable, if imperfect and sometimes exploitable, 'rules of thumb' that then guide how other, secondary issues are interpreted (and, unfortunately, global warming is one of those issues).

Regarding the distrust of scientific evidence--which would be more accurately described as distrust of policy and political interpretations of scientific publications--there are actually some legitimate grounds for that distrust. First, scientists have done some extraordinarily sleazy things (e.g., the Tuskegee experiments), not to mention declaring personal opinions, such as eugenics, to be scientifically based. And the popular movie depiction of scientists as either amoral or immoral hasn't helped either.

Burke notes that politics also has sullied the legitimacy of the scientific process (italics mine):

...the interests of political elites and institutional actors within modern states are demonstrably not identical in all or even most instances to the public good, and have a history in their own right of delivering policies which subsequently prove to have unintended, uneven, self-interested or destructive effects. When scientific knowledge gets caught up in that process, it becomes by definition less trustworthy or more worthy of skepticism than research which is not strongly directed towards justifying political or bureaucratic decisions. Add to this the intrusion of businesses and other private institutions with a strong interest in the production (or suppression) of particular kinds of scientific knowledge in relationship to the making of public policy. A historical perspective quickly demonstrates that many claims imbued with the authority of science, deployed in service to policy, have had powerful consequences but a very weak relationship to scientific truths.

Indeed. And the third point has to do with the intersection of the Decline Effect with propaganda and hype (often not generated by the researchers themselves). Because many studies--think of all of the health-oriented observational studies--are after-the-fact (post-hoc) tests of hypotheses, they are often underpowered (e.g., not enough patients) and shouldn't be used to answer those questions. This lack of power means that 'convincing' statistically significant results are often spurious in a biological sense:

Gelman (and he has some good slides over at his post) is claiming, correctly, that if the effect is weak and you don't have enough samples (e.g., subjects enrolled in the study), any statistically significant result will be so much greater than what the biology would provide that it's probably spurious. You might get lucky and have a spurious result that points in the same direction as the real phenomenon, but that's just luck....

But the problem with real-life experimentation is that we often don't have any idea what the outcome should look like. Do I have the Bestest Cancer Drug EVAH!, or simply one that has a small but beneficial effect? Throw in a desire, not always careerist or greedy (cancer does suck), to hope for or overestimate a potentially large effect, and the healthy habit of skepticism is sometimes honored in the breach. Worse, if you're 'data-mining', you often have no a priori assumptions at all!

Note that this is not a multiple comparisons or 'p-value' issue--the point isn't that sometimes you'll get a significant result by chance. The problem has to do with detection: with inadequate study sizes plus weak effects, anything large enough to be detected is spurious, albeit sometimes fortuitously in the right direction.
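The detection problem above can be seen in a minimal simulation (the effect size and sample size below are hypothetical, chosen only to illustrate an underpowered study): if the true effect is weak and the groups are small, the only estimates that clear the significance threshold are the ones that wildly overshoot the true effect.

```python
# Sketch of the "winner's curse" in an underpowered two-group study.
# TRUE_EFFECT and N are made-up numbers for illustration, not from any real study.
import math
import random
import statistics

random.seed(1)

TRUE_EFFECT = 0.1   # weak real group difference, in standard-deviation units
N = 20              # subjects per group -- badly underpowered for this effect
TRIALS = 5000       # number of simulated studies

significant_estimates = []
for _ in range(TRIALS):
    control = [random.gauss(0.0, 1.0) for _ in range(N)]
    treated = [random.gauss(TRUE_EFFECT, 1.0) for _ in range(N)]
    diff = statistics.mean(treated) - statistics.mean(control)
    # Standard error of the difference in means (two independent groups)
    se = math.sqrt(statistics.variance(control) / N + statistics.variance(treated) / N)
    if abs(diff / se) > 1.96:  # the study "detects" a significant effect
        significant_estimates.append(diff)

power = len(significant_estimates) / TRIALS
exaggeration = statistics.mean(abs(d) for d in significant_estimates) / TRUE_EFFECT

print(f"fraction of studies reaching significance (power): {power:.2f}")
print(f"significant estimates exaggerate the true effect by ~{exaggeration:.1f}x")
```

With these numbers, only a small fraction of simulated studies reach significance at all, and the ones that do report effects several times larger than the truth--which is exactly why a single 'convincing' result from a small study deserves skepticism.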

I really don't think we should underestimate just how damaging the constant barrage of bizarre claims, later overturned by further studies, is to the legitimacy of the scientific process. It's incredibly damaging.

This isn't to say that scientific results aren't filtered by political leanings or circumstance (e.g., people might be more receptive to global warming concerns if unemployment were lower). And never underestimate the willingness of large swathes of the American public to rally around the notion of punching Dirty Hippies in the Face.

But criticism of science that is marshaled in support of policy isn't necessarily irrational or a cognitive slip-up, just as economic behavior is often not irrational in a cognitive sense.

By the way, telling people they're being irrational will piss them off. Just saying.


whoa. we're talking totally past each other here. distrust of science really has nothing to do with this--everybody trusts science when it seems to support what they already believe, cites science to support what they believe, goes looking for more science to support what they believe....and attacks science that doesn't seem to support what they believe, while claiming to be pro-science.

"e.g., people might be more receptive to global warming concerns if unemployment was lower"
Doubtful. People with a financial or emotional attachment to denying global warming would just choose a different stick/'argument from result' to poke people with. Try reformulating the statement as "people might be more receptive to vaccines if the amount of mercury in them was lower".
How well did that work out?

Mooney's Mother Jones article is here:

The Science of Why We Don't Believe Science
http://motherjones.com/politics/2011/03/denial-science-chris-mooney

It contains some very provocative ideas.

Basically, humans are emotionally driven more than they are intellectually driven. Thus the phenomenon of very highly educated people who believe things that fall apart under logical analysis.

So it doesn't matter if you went to Harvard or Yale: whether you like Kennedy better than Reagan, or whether you believe there is a God is an emotional issue, not an intellectual one.

That is why proving to someone else why they are wrong never (NEVER) changes their mind.

Example: Obama just released his long-form birth certificate. Did that convince the birthers? No, it did not.
And it never will. They don't believe intellectually; they are emotionally attached to their belief.

Scientists have not figured this out yet.

Yes, I know there's a lot of studies that show this principle, but what I am saying is that in the public sphere, scientists keep getting clobbered because they defend against emotional attacks with logic.

As far as the human brain is concerned, this is like fighting against an M-16 with a BB gun.

We need a few more scientists who can just come out, like PZ for example and say straight to a denialist's face - "We're RIGHT because WE'RE F-CKING RIGHT, IDIOT!!"

There - now you have an emotional RPG to fight back with.

A large part of the problem with the public perception of science simply has to do with how it is reported to the public. Most people do not read the original research papers, but only articles written by non-expert reporters about them. Reporters, especially in the last 30 years, have far less motivation to get the story right than to make it sound sensational so their articles get read and they get attention.

"Scientists expect to find God with new particle accelerator!!!" gets far more attention than the more accurate "scientists hope to find evidence for the Higgs boson after X months or years of accelerator runtime."

With the decline, and frankly destruction, of professional standards in the press in recent decades, business and political interests have no trouble distorting science reporting however they want. The closest many reporters come to the truth, or what their editors allow to be published, is simply gossip of the he said/she said variety. The fact that one side has thousands of dedicated unbiased researchers spending their careers on the issue and the other has half a dozen stuffed shirts being paid by a couple of rich corporations isn't mentioned, and the reporting presents both sides as equally credible.

There are too many examples to count of deliberate and obvious falsehoods being widely spread unchallenged through the media. There are days when I'd give anything to hear a reporter say "Politician X lied today when he claimed Y," instead of simply repeating the obvious lie with "Politician X said today Y."

We used to have more reporters working for major media organizations who would report honestly, instead of mindlessly repeating whatever lies are said. Unfortunately, media consolidation and rules changes have led such reporters either to repeat the falsehoods to keep their jobs or to be fired. This is the real reason that newspapers are dying across the country and online newsblogs have taken off. The blogs certainly have the advantage of prompter publishing, but so many newspapers threw away the trust they had with the public that their entire industry has been crippled.

Why should I pay money to read obvious lies in the local paper when I can quickly visit a variety of blogs online to get a well rounded view of the world? If newspapers still presented the news in a detailed, well researched, unbiased fashion, they wouldn't be dying off.

By DetailsMatter (not verified) on 30 Apr 2011 #permalink

"I really don't think we should underestimate just how damaging the constant barrage of bizarre claims that are then overturned by later studies is to the legitimacy of the scientific process. It's incredibly damaging."

Bizarre claims are not damaging the legitimacy of the scientific process, methodically sorting the shit from the clay is the whole point of the scientific process.

Or as Sagan more eloquently put it - "Science is more than a body of knowledge; it is a way of thinking. I have a foreboding of an America in my children's or grandchildren's time - when the United States is a service and information economy; when nearly all the key manufacturing industries have slipped away to other countries; when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what's true, we slide, almost without noticing, back into superstition and darkness."


I Agree with yogi-one !