Wrong by 6 orders of magnitude? And you trust the nuclear folks why?

In some of my courses here in the Engineering School, we end up examining the political, cultural, economic, moral, environmental, and technical dimensions of nuclear energy. In the process of reviewing some older material, I once again came upon this wonderful article by Kristin Shrader-Frechette, a philosopher and environmental ethicist at Notre Dame: "Nuclear Technology and Radioactive Waste: A Survey of the Issues," Environmental Ethics 13:4 (1991): 327-343.

In it, she explains some of the background for distrust in the technical expert:

"At the nuclear waste facility containing more plutonium than any other commercial site in the world, Maxey Flats [in Kentucky], experts were wrong by six orders of magnitude when they predicted how fast the stored plutonium would migrate. They said it would take 24,000 years for plutonium to travel one-half inch off-site. It went two miles off-site in ten years."

So, to review:

1/2 inch in 24,000 years?

Or...

2 miles in 10 years?
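
Just to put those two figures on the same scale, here is a quick back-of-envelope check (a minimal Python sketch, using only the numbers from the quote above). Taken at face value, the predicted and observed migration rates differ by a factor of roughly 6 x 10^8 -- if anything, "six orders of magnitude" is generous.

```python
# Back-of-envelope check of the two figures quoted above.
INCHES_PER_MILE = 5280 * 12              # 63,360 inches in a mile

predicted = 0.5 / 24_000                 # in/yr: half an inch in 24,000 years
observed = (2 * INCHES_PER_MILE) / 10    # in/yr: two miles in ten years

print(f"predicted rate: {predicted:.2e} in/yr")          # ~2.1e-05
print(f"observed rate:  {observed:.2e} in/yr")           # ~1.3e+04
print(f"off by a factor of {observed / predicted:.1e}")  # ~6.1e+08
```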

If we are relying on technical calculations alone to resolve issues that are at once technical and non-technical, then we are in great trouble. This, I believe, lies at the core of the problem with technocratic decision-making about nuclear energy. And this, I think, is the issue that frustrates most people about the coming nuclear debate: how do you provide technical, quantifiable certainty -- and thus instill trust in the public, at least on the classical model of what it takes to gain that trust -- with inherently unquantifiable risk-producing technologies?


Inherently unquantifiable, really? Just fire stupid engineers, hire smart ones and don't stand over their shoulders.

By Roman Werpachowski (not verified) on 14 Nov 2006 #permalink

Another of those fiddling problems from confusing millimeters and miles. Good thing they're not rocket scientists.

Oh, wait ...

By Rather not say (not verified) on 14 Nov 2006 #permalink

What was the root cause of the error? From both the technical and public-relations standpoint, that seems to be the most important factor.

The root cause was the judgment and assumptions made by the engineers and the USGS, not stupid engineers. All of the engineers were top notch.

But also, to answer Matt, why would public relations seem to be the problem? I see the problem as one of environmental (both human and non-human) health. Seeking to identify a "root cause" would suggest that there is a rogue, one-time error in the mix, when the issue is far broader than that. In fact, seeking the root cause -- which also suggests a technical root cause (though I realize you didn't say that outright) -- returns the issue to the core of the problem, which is to assume that the solution is strictly technical.

By PR I meant how the public's perception of nuclear power changes based on it.

Most risks people deal with aren't strictly quantifiable - take the risk of an automobile accident, for example. However, even in these cases, the marginal impact of changes can be qualitatively, and sometimes even quantitatively, discussed. That means understanding how technical expertise fails, including the technical challenges specific to nuclear power and what raises and lowers those risks. The technical aspect isn't the whole picture, but without it the discussion of the risks involved is incomplete.

After 10,000 years of practice, our culture has become adept at rationalizing away any risk that stands between us and something we want. Within our culture, any risk assessment can be trumped by the words, "Who cares? This is what I want."
 

The scientists who did the original environmental safety study at Maxey Flats were hired by the company that wished to run it. The scientists said the ground was safe; the company came in and made millions. The rest is (toxic) history.