Since much of what I write about the responsible conduct of research takes them for granted, it's time that I wrote a basic concepts post explaining the norms of science famously described by sociologist Robert K. Merton in 1942.  Before diving in, here's Merton's description:
The ethos of science is that affectively toned complex of values and norms which is held to be binding on the man of science. The norms are expressed in the form of prescriptions, proscriptions, preferences, and permissions. They are legitimatized in terms of institutional values. These imperatives, transmitted by precept and example and reinforced by sanctions are in varying degrees internalized by the scientist, thus fashioning his scientific conscience or, if one prefers the latter-day phrase, his superego. Although the ethos of science has not been codified, it can be inferred from the moral consensus of scientists as expressed in use and wont, in countless writings on the scientific spirit and in moral indignation directed toward contraventions of the ethos. 
Let's break that down:
- The norms Merton has in mind are statements of what scientists believe they ought to do, what they believe they are permitted to do, and what it would be good for them to do. In other words, by "norms" we're not identifying whatever it is scientists normally do; sometimes what scientists normally do falls short of what they know they should do.
- Scientists aren't handed a rule book that includes the norms Merton is about to describe. Rather, they figure them out by paying attention to other scientists in their community to see what behaviors they reward and what behaviors they punish.
- Members of the scientific community do spend some time talking about the norms -- or at least, gnashing their teeth at what they view as egregious violations of them. But mostly scientists talk about the norms either when a scientist violates them or when someone outside the community of science attacks that community's reputation. When scientists are more or less living by the norms, they tend not to talk about them.
The distance between what we think we ought to do and what we end up actually doing is so much a part of the furniture of our lives (both as members of the tribe of science and as human beings) that I'd like to hold off on objections to the norms in terms of real behavior of scientists for now. I will be writing a follow-up post on scientific anti-norms -- sort of the evil twins to Merton's scientific norms -- at which point comments about the values that seem actually to be guiding scientists' behavior will be welcome.
As a sociologist, Merton was interested in understanding science as a social group. He wasn't primarily concerned with providing some independent justification for how scientists conduct their research. It's worth noting, though, that he seemed to think the norms of science were good ones to have if you're interested in building good knowledge about the world:
The institutional goal of science is the extension of certified knowledge. The technical methods employed toward this end provide the relevant definition of knowledge: empirically confirmed and logically consistent statements of regularities (which are, in effect, predictions). The institutional imperatives (mores) derive from the goals and the methods. The entire structure of technical and moral norms implements the final objective. The technical norm of empirical evidence, adequate and reliable, is a prerequisite for sustained true prediction; the technical norm of logical consistency, a prerequisite for systematic and valid prediction. The mores of science possess a methodological rationale but they are binding, not only because they are procedurally efficient, but because they are believed right and good. They are moral as well as technical prescriptions.
(Bold emphasis added.)
If the tribe of science is a community defined by a common project -- building a body of reliable knowledge about the world and how it works -- the norms Merton identifies are something like the shared values of that community, values that are taken to be essential to the project of the community.
Merton thinks the values of modern science boil down to a set of four norms:
The first norm, universalism, is the idea that the important issue for scientists is the content of claims about the world (or about the phenomena being studied), not the particulars of the people making those claims.
In other words, the tribe of science is committed to investigating knowledge claims made by graduate students as well as those made by Nobel Prize winners, those made by scientists at small colleges as well as those made at famous universities with huge endowments and buckets of grant money, those made by scientists in other countries as well as those made by scientists in one's own country.
Since the shared goal is building a reliable body of knowledge about the world we share, all the scientists engaged in that project are to be treated as capable of contributing. Disregarding another scientist's report because of who he or she is, then, is a breach of the norm of universalism.
The second norm is "communism." Writing in 1942, Merton was careful to put it in scare-quotes. He's not talking about Marxist-Leninist Communism, but about the view that scientific knowledge is a resource to be shared by the whole tribe of science, regardless of which individual scientists produced which particular bits of knowledge.
In other words, ideally, if you establish a finding, you get recognition within the tribe for finding it, you may even get your name on an equation, but then that finding is public knowledge that anyone in the tribe of science can use to build additional knowledge (which itself is to be shared by the tribe of science).
One of the things a scientist needs to do to live up to this norm is to communicate her findings to other scientists. If a result stays in your head, or even your lab notebook, it's not ending up in the shared body of scientific knowledge. Knowledge that isn't made public doesn't help the tribe with its shared project.
The third norm is disinterestedness. One way to think about it is that scientists aren't doing science primarily to get the big bucks, or fame, or attractive dates. Merton's description of this community value is a bit more subtle. He notes that disinterestedness is different from altruism, and that scientists needn't be saints.
The best way to understand disinterestedness might be to think of how a scientist working within her tribe is different from an expert out in the world dealing with laypeople. The expert, knowing more than the layperson, could exploit the layperson's ignorance or his tendency to trust the judgment of the expert. The expert, in other words, could put one over on the layperson for her own benefit. This is how snake oil gets sold.
The scientist working within the tribe of science can expect no such advantage, so trying to put one over on other scientists is a strategy that shouldn't get you far. The knowledge claims you advance will be useful primarily for what they add to the shared body of scientific knowledge; because you are accountable to the other scientists in the tribe, there is no value to be gained from using those claims to play your scientific peers for chumps.
The fourth norm, organized skepticism, is the value that balances universalism. Everyone in the tribe of science can advance knowledge claims, but every such claim that is advanced is scrutinized, tested, tortured to see if it really holds up. The claims that do survive the skeptical scrutiny of the tribe get to take their place in the shared body of scientific knowledge. This is also the norm that makes disinterestedness work -- without organized skepticism, you might actually have a reasonable expectation of putting one over on your scientific peers for personal gain.
Merton sees these four norms as the values that scientists themselves take to be definitive of the scientific enterprise. However, Merton himself identified tendencies pulling in the opposite direction from these norms. He thought instances of scientists criticizing other scientists for not resisting these pulls were good evidence that scientists as a group were serious about the norms.
The countervailing pressures Merton noted came largely from outside the tribe of science. In my promised follow-up post on scientific anti-norms, I'll look at these, plus the anti-norms that seem to come from within the scientific community.
The relevant essay was originally published as "Science and Technology in a Democratic Order," Journal of Legal and Political Sociology 1 (1942), 115-126. I'm working from the version entitled "The Normative Structure of Science" included in Robert K. Merton, The Sociology of Science: Theoretical and Empirical Investigations, University of Chicago Press (1979), 267-278.
pp. 268-269.
p. 270.
Merton was quite an influential person. I referenced him once and was pleased to get a reprint request from Harriet Zuckerman.
...Raising her hand in the back and saying oh-oh and then sitting on it until the anti-norms post... (so I guess I'll leave Polanyi's view of passionate scientists till then, too, darn)
But as a comment on the skepticism one: in Merton's earlier work on Science and the Social Order (originally 1938, starting on p. 254 in your book), he says that this organized skepticism is frequently taken as skepticism of everything, including religion, the government, etc., thus leading to hostility toward science, increased control over science, and decreased funding for science.
Do you think a better case needs to be made to separate scientists' views of science (and approaches toward it) from scientists' views of politics and religion? How does this play out in the new atheism? Any relevance to current science funding?
Dear Dr. Free-Ride: Great post. I use Merton as a start to teaching both undergrads and grads "what is science." I would be interested in how you handle science studies.
Regards, Mr. Natural (aka "Fixed Carbon.")