Definitions

My computer is starting to run slowly in that way that indicates that either Microsoft has released an important update, or it's just been on too long without a reboot. Either way, I need to clear some browser tabs before restarting, and there are a bunch of articles that I thought were too interesting to put in a links dump, but about which I don't quite have a clear enough opinion to write a blog post. These split into two rough groups, both of which are concerned with definitions.

One bunch of posts has to do with the recent poll about science knowledge, showing that a majority of Americans are unable to answer surprisingly basic questions about science.

The definitional aspect comes in because a number of bloggers, among them Mark at Cosmic Variance and Sheril here at ScienceBlogs, have complained (apparently without coordination between them) about media descriptions of this poll as showing a low level of "science literacy." The argument is, basically, that "scientific literacy" should be more about the process of science and the way scientific decisions are made. This poll shows a lack of knowledge of science trivia, but doesn't really test science literacy.

I don't disagree, exactly, but I don't really see how this makes anything better.

I mean, it's true that science is more about process than about specific facts, and this survey is testing knowledge of specific facts. But the facts they're asking about are really basic stuff-- "How long does it take the Earth to revolve about the Sun?" was one, which only 53% of people got right. It's hard to believe that people who can't answer that correctly are going to have a solid grasp of the scientific method.

The other reason to ask these particular questions is that they've been asking them (more or less unchanged) repeatedly over the last twenty-odd years. Imperfect as they are, we've got records going back a ways, and can check whether the public's knowledge of these items has changed over time (it hasn't).

I'd be fine with coming up with some set of questions that really test science-as-process rather than science-as-trivia, and starting to ask those. It's a tricky proposition, of course-- you probably need something like the famous light bulb question at MIT's graduation, and it's tough to come up with those. But the questions currently being asked are not invalid, and do serve some purpose.

The other big definitional question that's been rattling around is the question of who gets to call themselves a scientist. Scicurious has the first post tying the whole thing together, and Janet provides an exhaustive analysis of all the possible ways to take the question.

(This is, in some ways, "Why I Couldn't Make It as a Philosopher" part 3-- I just don't have the patience for defining things at that length.)

As noted previously here, the whole problem is that scientists, unlike doctors, lawyers, and Realtors(tm), do not have a guild system where one must be certified by some central authority to claim membership in the group. Somebody who runs around calling themselves a lawyer without having been admitted to the Bar Association is going to get in trouble, and a doctor who violates professional standards can lose their license to practice medicine, but science doesn't have any such central authority. There's nothing stopping raving loonies from calling themselves "Scientists," and no way to sanction scientists who tip into crankery, provided they avoid outright fraud.

There are a lot of reasons to prefer a more open system for science, but then again, it would simplify this question quite a bit if you had to get a license from the National Academy of Sciences before calling yourself a scientist in public...


Nit: you're not admitted to the Bar Association; those are voluntary membership trade organizations. You're admitted to the bar, which is usually administered by the courts in your state.

Being a member of the American Bar Association, in other words, is not a prerequisite for practicing law.

Carry on.

I think even the famous light bulb question isn't a direct measure of people's understanding of science. It's better as a measure of whether people can apply something they have studied in a new situation. I use questions like the light bulb one, and my favorite, "what causes the seasons," to show that people don't understand stuff they have been told (many times).

There are some tests that look at the nature of science. However, they are sort of long. The ones I can think of right now are EBAPS (http://www2.physics.umd.edu/~elby/EBAPS/home.htm), VASS (http://www.flaguide.org/tools/attitude/views_about_sciences.php), and CLASS (http://www.colorado.edu/sei/class/). The problem with all of these is that they take time to administer.

The California Standards for High School Science are very extensive on content. But they do have a useful chunk on the process of science.

Word version of Science Content Standards (DOC; 255KB; 52pp.)
relevant excerpt:

Investigation & Experimentation - Grades 9 To 12
Science Content Standards.
1. Scientific progress is made by asking meaningful questions and conducting careful investigations. As a basis for understanding this concept and addressing the content in the other four strands, students should develop their own questions and perform investigations. Students will:
a. Select and use appropriate tools and technology (such as computer-linked probes, spreadsheets, and graphing calculators) to perform tests, collect data, analyze relationships, and display data.
b. Identify and communicate sources of unavoidable experimental error.
c. Identify possible reasons for inconsistent results, such as sources of error or uncontrolled conditions.
d. Formulate explanations by using logic and evidence.
e. Solve scientific problems by using quadratic equations and simple trigonometric, exponential, and logarithmic functions.
f. Distinguish between hypothesis and theory as scientific terms.
g. Recognize the usefulness and limitations of models and theories as scientific representations of reality.
h. Read and interpret topographic and geologic maps.
i. Analyze the locations, sequences, or time intervals that are characteristic of natural phenomena (e.g., relative ages of rocks, locations of planets over time, and succession of species in an ecosystem).
j. Recognize the issues of statistical variability and the need for controlled tests.
k. Recognize the cumulative nature of scientific evidence.
l. Analyze situations and solve problems that require combining and applying concepts from more than one area of science.
m. Investigate a science-based societal issue by researching the literature, analyzing data, and communicating the findings. Examples of issues include irradiation of food, cloning of animals by somatic cell nuclear transfer, choice of energy sources, and land and water use decisions in California.
n. Know that when an observation does not agree with an accepted scientific theory, the observation is sometimes mistaken or fraudulent (e.g., the Piltdown Man fossil or unidentified flying objects) and that the theory is sometimes wrong (e.g., the Ptolemaic model of the movement of the Sun, Moon, and planets).

> you probably need something like the famous light bulb question at MIT's graduation

I have just looked at the light bulb video for the first time
mms://media.scctv.net/annenberg/Minds_of_Our_Own_01.wmv
and while this is not central to your point, I have to say that video itself is egregiously manipulative in ways you would typically criticize.

It showed a handful of people, without specifying who was from Harvard and who from MIT (unless you recognize the gowns), only one of whom (the darkest-skinned one, of course...) was depicted as able to light the light bulb.
There were two men and two women at a Harvard graduation (one of whom was wearing an usher outfit, so it's not clear she was a Harvard student), majors unknown. Three of them are shown trying to light a light bulb by attaching a wire from one pole of a battery to one contact of the bulb, and at least one says she can't do it with one wire because she needs to make a closed circuit (i.e., at that moment she was seemingly unable to figure out that she needs to press one end of the bulb directly to the battery). There are two at MIT: a black male shown just at the point of success, and a poorly shot woman (who seems to be wearing an MIT gown) who can't figure it out and apologizes that she's an ME, not an EE.

This was all confusingly intercut, with less than a few seconds for each person, so it's unclear how long anyone spent or whether they eventually figured it out.
But MOST significantly, they do not report whether they asked 10 students, 100 students, or 1000 students, and specifically do not report whether the results depicted were a representative sample (e.g., that 5/6 couldn't do it).
What if they'd asked dozens or hundreds and cherry-picked the only five who were momentarily confused (and moreover even those got it after a few seconds more)?

While the Annenberg/CPB science project that created this film is trying to raise awareness of the need to improve science teaching, their biased methodology teaches precisely the wrong lesson: use of small statistics, unrepresentative sample, biased sampling, lack of control, context, or even partial explanation of actual methodology. If these are the well-intentioned people trying to improve science education, no wonder we're in such poor shape...

By Not Impressed (not verified) on 22 Mar 2009 #permalink

I agree with your point in general, but some of those questions are ridiculous. Like the one about what percentage of the Earth's water is fresh water (spoiler: it's 3%). There are so many rather finely demarcated categories that it misses the point. I answered 4-10% and so was grouped in the 99% of people who could not answer that question "correctly." In fact, by similar splitting of hairs, they missed the Earth's revolution question. Depending on one's definition of a "year," the answer is not one year, it's 365.25 days. I think it would be really difficult to come up with a test of basic science that most people would agree is basic science and that captures whether people have a rough idea of what the right answer is.

To add one more - my favorite one - only 62% of the respondents said that evolution is currently occurring, but 70% of respondents said that humans are influencing the evolution of other species.

Jeff wrote:

In fact, by similar splitting of hairs, they missed the Earth's revolution question. Depending on one's definition of a "year," the answer is not one year, it's 365.25 days.

If you really want to split hairs then the length of the year is dependent on whether you mean a Julian, sidereal, tropical or anomalistic year.
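For anyone who wants to see just how fine that hair-splitting gets, here's a quick sketch using the standard astronomical values for the various year lengths (the figures are textbook approximations, not exact to the last digit):

```python
# Approximate lengths of the various "years," in mean solar days.
# Values are standard astronomical approximations.
YEAR_LENGTHS_DAYS = {
    "Julian": 365.25,         # calendar convention; exact by definition
    "sidereal": 365.25636,    # one orbit relative to the fixed stars
    "tropical": 365.24219,    # equinox to equinox; what calendars track
    "anomalistic": 365.25964, # perihelion to perihelion
}

# The spread among all four definitions is well under half an hour.
spread_days = max(YEAR_LENGTHS_DAYS.values()) - min(YEAR_LENGTHS_DAYS.values())
print(f"spread: {spread_days * 24 * 60:.1f} minutes")
```

So the four definitions disagree by about 25 minutes out of a year, which is exactly the sort of distinction a survey of basic science knowledge can safely ignore.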

The surface of the earth covered in water one was odd, too - because they give the answer for "liquid water", but they asked for water. It's a lot more if you include Antarctica as water, since it's almost all covered in the stuff...

Having been to MIT, I assume that the light bulb answer should have been, "Hang on, I'm a pre-med. I've got to find someone to kick the chair out from under me."