Having had a couple of comments on this, I realise that some of the required background on Bayesian statistics is waaaay over some people's heads. This is probably no fault of theirs. Let me make some faint attempt at explanation, and James can correct me as needed, and doubtless Lubos will leap in if I leave him an opening.
The issue (at least in this context) is the updating of "prior" information in the light of new information. Prior information means (at least nominally) what you knew about, let us say, the climate sensitivity S before you tried to make any plausible estimates of it. If you want the maths (which is not hard) then James' post applies, as does the rejected paper. In fact let me copy it here, perhaps it will help: f(S|O) = f(O|S)f(S)/f(O). f(S) is the *prior* probability density function (PDF) for the climate sensitivity S. f(O|S) is the likelihood: how probable your observations O are, assuming a given value of S. f(O) is just a normalising constant, so the result integrates to 1. And f(S|O) is the *output*: the PDF of S, taking into account the observations you've just made, and the prior distribution.
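If the formula is still opaque, here is a minimal numerical sketch of it: S discretised on a grid, a U[0,20] prior, and a Gaussian likelihood for an observation of S = 3 +/- 1.5. The particular numbers are illustrative, not anyone's actual estimate.

```python
import numpy as np

# Grid of candidate sensitivities S (avoid exactly 0 for tidiness)
S = np.linspace(0.01, 20.0, 2000)
dS = S[1] - S[0]

# f(S): the "ignorant" uniform prior U[0,20]
prior = np.ones_like(S) / 20.0

# f(O|S): likelihood of observing "S = 3 +/- 1.5",
# modelled (purely for illustration) as a Gaussian in S
likelihood = np.exp(-0.5 * ((3.0 - S) / 1.5) ** 2)

# Bayes' rule: posterior proportional to likelihood * prior;
# dividing by the integral plays the role of f(O)
posterior = likelihood * prior
posterior /= posterior.sum() * dS

# The posterior is a proper PDF (integrates to 1) peaked near S = 3
print(posterior.sum() * dS)
print(S[np.argmax(posterior)])
```

Note that f(O) never needs to be computed separately: it is just whatever constant makes the posterior integrate to 1.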
So the output depends on your obs (as you would hope); and on what you knew before, the prior. The problem is that it is hard or impossible to construct a truly sensible "prior" (as James points out; but I don't think it's particularly contentious). You can construct a pretend "ignorant" prior by asserting that a uniform value between 0 and 20 is ignorant (U[0,20]). But (again, as James points out) this means that your prior thinks it is three times as likely that S is greater than 5 as that it is less than 5. No one believes that. Inevitably, you are using some of your knowledge in constructing the prior. But hopefully not the same knowledge as you use to sharpen it up afterwards.
So the only hope is that your obs are sufficiently well constrained that the prior doesn't matter too much. If your obs were S=3 (with absolute precision), this would be true. But the obs are S=3+/-1.5, where the +/-1.5 hides various different PDFs, so the prior matters. But still, if you apply enough independent obs then it doesn't matter too much, as each one sharpens up the result further. And if you do this, you end up rejecting any reasonable chance of a high climate sensitivity.
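The sharpening-up can be seen by running the update repeatedly, feeding each posterior in as the next prior. The three observation values below are invented for illustration; the point is only that each independent constraint squeezes the high-S tail further.

```python
import numpy as np

S = np.linspace(0.01, 20.0, 2000)
dS = S[1] - S[0]

def update(prior, obs, sigma):
    """One Bayesian update with a Gaussian likelihood (illustrative)."""
    likelihood = np.exp(-0.5 * ((obs - S) / sigma) ** 2)
    post = likelihood * prior
    return post / (post.sum() * dS)

# Start from the uniform U[0,20] prior
post = np.ones_like(S) / 20.0

# Three hypothetical independent constraints, each S = obs +/- 1.5
for obs in [3.2, 2.8, 3.1]:
    post = update(post, obs, 1.5)
    # Probability of a high sensitivity, S > 6, after this update
    print((post[S > 6].sum() * dS))
```

With a single +/-1.5 observation the S > 6 tail is small but non-negligible; after three, it is squeezed towards zero, which is the point of the paragraph above.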
Did that help at all?