Every party needs a pooper, that’s why Scienceblogs invited me.
Enemy? Really? Yes, it can be. Read on; it doesn’t have to be.
A commentary in Nature Nanotechnology discusses the European Environment Agency report, Late Lessons from Early Warnings. The basic idea is to offer recommendations so that nanotech can grow with one lesson firmly in mind: if you ignore risks and get dealt a nasty surprise, the public backlash will doom the whole field. They came up with a list of 12 lessons: some are as old as the hills and probably won’t be followed; all would be welcome if they were.
Among the lessons that we won’t learn from (I’m considering a name change to Cynical Toxicologist) are the ones the commentary groups under “heed the warnings”, “consider wider issues”, and “retain regulatory independence” (hah!). Everybody knows this is what’s supposed to happen, and it doesn’t; that’s not specific to nanotech. There are some that should stop and make you think, however. Lessons 5 and 8 (accounting for real-world conditions and using layperson knowledge) are often dismissed as going beyond the worst-case scenario. Anyone who has ever worked in a factory knows how much of a joke it is to assume that the product conforms to the standard. Talking to the workers on the floor really changes how you think about potential risks. But what I really want to write about is lesson 12…
You say scientists, I say enablers. Let’s call the whole thing off.
My favorite is lesson 12, the “paralysis by analysis” lesson. Everyone likes to ask for more information instead of putting their ass on the line and making a decision. I can’t tell you how many gov’t advisory committees I’ve almost pulled my hair out at because the committee as a whole doesn’t want to make any recommendation other than ‘we need more research’. I can recall three where committee members specifically asked if they had to answer the question put before them, as they’d rather not make judgments, and I can’t recall a single one where the members didn’t want to change the questions posed to them. More data. More research. Discomfort with uncertainty. When science meets public health, it gets ugly. Really ugly. It sounds pretty easy to say you’re going to use a weight-of-evidence approach until you try to apply it to a decision that will have drastic effects on the public, the gov’t, and industry. It’s oh so tempting to pull your ‘more research’ security blanket around you and refuse to make a decision. It’s not only tempting, it’s how we’re trained as scientists: look at data, formulate hypothesis, test hypothesis, repeat. It’s tough to break that mold and replace it with: look at data, formulate hypothesis, act.
Because this ‘more research’ dodge is used so often, the costs in health, money, and manpower are astronomical. Think about the debate over global warming as an example. It is actually at a pretty good stage compared to most scientific debates over a public health issue. Most are stuck doing ever more research to try to confirm something most people already know, while everyone mumbles about what we don’t know. With global warming, at least we also have a large number of scientists saying we need to do something. So here’s my lesson 13 (et al.) and professional plea to other scientists out there: whatever science you practice, take a stand about what you work on; be passionate about people and the data; understand that regulators need advice from advisory boards, not more confusion; understand that making decisions and being open-minded are not mutually exclusive; and most importantly, locate your spine and act in accordance with this discovery, or confirmation, as the case may be. Thank you.