The duties that come with knowledge and uncertainty.

While I hope this hurricane season is a lot less eventful than the last one, it's always good to be ready. To that end, I'm dusting off (and bringing together here) two "classic" posts from the 2005 hurricane season.

As we look to the scientists to tell us what nature may have in store for us, we need to remember how scientists think about uncertainties -- and especially, how important it is to a scientist to avoid going with predictions that have a decent chance of being false. Being wrong may seem almost as bad to the scientist as being under 10 feet of water.

Meanwhile, scientists need to remember that non-scientists ask them for predictions so they can act on those predictions and make prudent decisions. (Except, of course, in the cases where the people who control the resources find it convenient to ignore the relevant scientific information.)

Lindsay Beyerstein at Majikthise put up a post last September that sifted through the matter of what the Federal Emergency Management Agency (FEMA) did or did not know about the likely effects of a hurricane on New Orleans. It's an interesting case of people in FEMA and the Army Corps of Engineers not being on the same page. I resisted the impulse to put it down to evil intent (e.g., changing your story on what you were expecting once your response to what actually happened turns out to be wholly inadequate). Rather, I think a good part of the problem may have come down to how people have dealt with uncertainties, given the interests they feel themselves bound to serve.

Problem #1: Ahead of time you can't be exactly sure what's going to happen. If you have a really good model of how hurricanes of various strengths will interact with your region under specific conditions, you can generate a set of predicted outcomes. You can figure out the worst-case scenario, and how bad the other cases would be. You could, perhaps, develop a good guess as to which outcomes are most likely.
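To make that concrete, here's a toy Monte Carlo sketch in Python. The "model," the probabilities, and the surge numbers are all invented for illustration -- the point is just that a model hands you a distribution of outcomes, not a single answer:

```python
import random

random.seed(42)

# Toy "model": sample possible storm tracks and intensities, then map each
# to a surge height at the levees. All numbers are made up for illustration.
def simulate_one_storm():
    hits_region = random.random() < 0.3           # most tracks miss entirely
    if not hits_region:
        return 0.0
    category = random.choices([1, 2, 3, 4, 5],
                              weights=[30, 30, 20, 15, 5])[0]
    surge_ft = category * 4 + random.gauss(0, 2)  # crude surge estimate
    return max(surge_ft, 0.0)

outcomes = [simulate_one_storm() for _ in range(10_000)]

levee_height_ft = 14  # hypothetical design height
worst_case = max(outcomes)
p_overtop = sum(o > levee_height_ft for o in outcomes) / len(outcomes)

print(f"Worst-case surge: {worst_case:.1f} ft")
print(f"Chance of overtopping the levees: {p_overtop:.1%}")
```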

But then, you have to deal with these predictions.

Problem #2: Preparing for the case you think is most likely may be at odds with making the preparations needed to keep people safe. Even if the majority of projected outcomes have the hurricane missing your region, or hitting it but not damaging the levees, the fact that there is a non-negligible chance the levees are going to get hit might mean it's prudent to reinforce the levees (if there's still time), or to take aggressive measures to get people out of harm's way.

Being "overly cautious", though, might mark you as an hysteric. The folks you're trying to help might be less likely to listen to you next time if the worst case scenario doesn't come to pass. And, the other folks in the business of making models and predicting disasters may give you a hard time for acting as if the worst case scenario was more likely than any good modeler would have seen it was.

Of course, if you're working for the government, there's ...

Problem #3: Preparing to avoid harm has costs -- monetary costs and political costs. Even if you think a particular outcome is really likely, there may be pressure to anticipate a better outcome instead because of the lower costs. There are not endless buckets of money to throw at every possible harm your model predicts, so sometimes we get by on getting lucky.

So, even after you've surmounted the problem of building a good model of reality, when betting on what's going to come to pass, you have to weigh the competing pulls of:

  • wanting to be right
  • not wanting people to get hurt if it's avoidable
  • not wanting to get canned by your governmental overlords

Different people will give these pulls different weights. I think it's important for people to see the potential consequences of giving one of these pulls too much or too little weight, and to think about how it would feel to own those consequences. If hurricane prediction were just a matter of satisfying scientific curiosity, extreme caution in making predictions (e.g., requiring much more data before making a prediction at all) would be easier to justify. But hurricanes matter quite a lot to many people who are not scientists.

* * * * * *

Figuring out the relative merits of minimizing type I vs. type II errors (here, predicting there will be a devastating hurricane when no such hurricane actually comes to pass vs. predicting there won't be a devastating hurricane when one actually does come to pass) is just part of the challenge for scientists, however.
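To see why those two errors don't weigh the same, here's a deliberately crude expected-cost sketch in Python. The probability and the dollar figures are invented for illustration; a real analysis would also have to count lives, lead time, and the credibility cost of false alarms:

```python
# A back-of-the-envelope look at why the two error types aren't symmetric.
# Every number here is invented for illustration.
p_devastating = 0.15          # model's estimated chance of a devastating hit

cost_preparation = 50e6       # evacuate / reinforce, paid whether or not it hits
cost_unprepared_hit = 10e9    # damage if it hits and nothing was done

# Expected cost of each policy, keeping only these two line items:
expected_if_prepare = cost_preparation
expected_if_wait = p_devastating * cost_unprepared_hit

print(f"Prepare now:  ~${expected_if_prepare / 1e6:,.0f}M expected cost")
print(f"Wait and see: ~${expected_if_wait / 1e6:,.0f}M expected cost")
# With costs this lopsided, even a 15% chance argues for acting as if the
# worst case will happen -- which is also exactly what makes the cautious
# forecaster look like a "hysteric" in most of the years when nothing happens.
```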

If you have the knowledge to foresee a really bad outcome -- not just to foresee its possibility, but to foresee that it is reasonably likely -- and if that outcome could hurt a lot of people, you probably have a duty to share that knowledge. Certainly, the people who are at risk might benefit from the knowledge. But, you'd also want to make sure that the people who might have the power to prevent the outcome had that knowledge, too, so they could prevent it.

But if you make that knowledge public, and you explain it very clearly, and you jump up and down to impress upon the people with the resources and the organization to head off catastrophe that they really need to take the risk seriously, doesn't the responsibility shift to the people with the means to do something who hem and haw and say, "Well, I don't know, it's a hypothetical risk and there are other interests we need to attend to ..." ? I mean, do the scientists who see disaster looming have to get the people who can do something in a headlock and make them do the right thing?!!

Or do the people who pooh-pooh the responsible scientists as hysterical doom-sayers have to lie in the bed they've made for themselves?


I think scientists have a very tricky communications job here. If they add too many qualifiers, they're not likely to be taken seriously; if they're too heavy-handed and the worst-case scenario doesn't materialize, then they're alarmist nuts.

Ultimately, I don't think educating the public and executing policy are things that most scientists should be doing. We need more good science educators/popularisers like the late Carl Sagan, or the still-kicking and absolutely magic Attenborough, to convey these messages to the general public in a way that is comprehensible and easily absorbed.