A bit over a year ago, we reported on the removal of evolution from a report by the NSF's governing body, the National Science Board. The NSB is presidentially appointed and Senate-confirmed, and sets broad policy for the NSF. Every other year, it publishes a report on Science and Engineering Indicators for the nation, and Chapter 7's discussion of public science literacy is what I always look at first. I was surprised to find evolution absent from the 2010 edition, and sleuthing by NCSE and others resulted in a report in Science magazine that revealed some disturbing attitudes.
Most troubling was the idea that evolution wasn't a good fit for a report on public attitudes and knowledge on science. One board member expressed a desire that "indicators could be developed that were not as value-charged as evolution." Another member said questions about evolution and the Big Bang were "flawed indicators of scientific knowledge because responses conflated knowledge and beliefs." I was quoted saying, "Discussing American science literacy without mentioning evolution is intellectual malpractice" that "downplays the controversy" over evolution education.
Last Friday, as I was wrangling with the Texas board of education over evolution education, Science printed a follow-up report (paywalled). Reporter Yudhijit Bhattacharjee says:
The board now says that deleting that text was a mistake and that the 2012 edition of Indicators, which comes out in January next year, will contain an analysis of the survey results relating to those questions. But members hope that future surveys will resolve the issue.
"In retrospect, we shouldn't have omitted that text from the 2010 Indicators," says José-Marie Griffiths, who took over as chair of NSB's Indicators subcommittee when the term of the former chair, Louis Lanzerotti, expired in May 2010. Griffiths says the 2012 Indicators will reinstate the section and include a sidebar explaining the board's plans for revamping the survey. And for the first time, the book will include a comparison of knowledge indices--a metric of overall science literacy--measured with and without the evolution and big bang questions.
In other words, evolution will be included, but they still want to rework how evolution is surveyed.
Toward that end, the NSF commissioned two reports from small workshops. One seeks to rather dramatically redefine science literacy, and the other seeks ways to reflect that new framework in surveys. I think changes that dramatic ought to come not from small and somewhat idiosyncratically chosen workshop panels, but from a full consensus report by the National Research Council. That's what I told Bhattacharjee, and I believe science literacy grand master Jon Miller said much the same.
Miller's quotes in Science show his concern: "Miller says the two workshop reports downplay the continuing rejection of evolution by a majority of the U.S. population. 'The idea that a scale [a question on evolution] should be abandoned because Americans are not scoring high enough flies in the face of the most basic principles of scientific measurement,' he says."
I'm quoted, too, emphasizing that my concern is that evolution be included and treated like any other scientific concept:
Joshua Rosenau of the National Center for Science Education in Oakland, California, which has fought to keep creationism out of the science classroom, also finds the reports disheartening. "Whatever the cultural context or reasons for it, rejection of evolution has profound consequences for a person's ability to fully integrate new and existing science into their own lives, to participate in their own medical care and in the 21st century economy," he says. "If NSF's surveys downplay that fundamental concept, they will be measuring science literacy in name only."
I'm open to the possibility that a better measure of evolution literacy can be derived. Indeed, I've been at meetings in the last year with the top researchers developing such measures. That none of those folks were part of, or apparently even aware of, the NSF workshops and reports, doesn't speak well to the reports' ability to capture the current state of the art. If further efforts at the NSF do bring in the insights of evolution literacy researchers, and newer, better survey instruments are put in the field, I won't weep. But if evolution is watered down, or if questions about evolution are handled so differently from other questions as to frustrate side-by-side comparison, I won't be satisfied.
I'm glad to hear that the NSB recognizes that removing evolution from the report was a mistake; that resolves my major concern. As this effort moves forward, I hope that concerned scientists, science educators, science communicators, and citizens will weigh in with the NSB and make sure that they know this is an issue people really care about. I'd especially suggest contacting - politely! - Dr. Griffiths.
There's also a conversation to be had about how NSF - guided by NSB policies - will measure the broader impacts of research. Every NSF research grant comes with money for outreach efforts of various sorts, meant to fund the "broader impacts" component of a grant proposal. That section of the grant has all sorts of valuable uses, and the NSB is limiting the scope of uses to which it can be put. Concerned researchers write to Science that the changes "move too far in the direction of accountability, at the cost of scientific creativity and autonomy." In particular:
the list focuses on economics and national security, but excludes protecting the environment and addressing other social problems. Aside from the consequences of neglecting these areas, this new focus may undermine the attractiveness of STEM [science, technology, engineering, math; I hate this acronym] disciplines to more idealistic students who are interested in meeting human needs rather than fostering economic competitiveness. Second, under the proposed new criteria, applicants and reviewers are restricted to the provided list of national needs, which will complicate efforts to respond to new challenges as they develop. Third, addressing these national needs is now supposed to happen "collectively." This reopens the question of whether each individual proposal must address broader impacts. The new criterion thus replaces vagueness regarding what counts as a broader impact with vagueness regarding who is responsible for addressing broader impacts.
This is of a piece with the changing scope of science literacy suggested in the NSF workshop reports. The first of those reports redefined science literacy, and to the extent that broader impacts funds are meant to influence science literacy, how that concept is defined will have important implications for which outreach efforts get funded. As the scope of science outreach is narrowed, so too will be the institutional understanding of science literacy.
It's especially worrisome to see this retreat by the NSB on broader impacts funding exactly as a new generation of scientists is working to change the relationship between science and society. Younger scientists, whether grad students or young faculty, are seeking out ways to communicate with the public, as we've discussed with the Darwin Road Show before. It's exciting, and I want to see NSF encouraging and supporting such outreach. The future of NSF funding depends on it, but so does the future of the United States as a technological and scientific leader. Already, we're seeing US schools bringing in foreign students to fill classes in key disciplines (based on Science and Engineering Indicators). Already, we're seeing other countries increasing their science literacy relative to the US by key measures. That should worry us all, no matter how we conceptualize science literacy. There are great discoveries to be made in the next century, and the stem cell battles of the last decade will not be the last time new discoveries draw backlash from the public, especially if that public hasn't had a chance to talk with scientists about their work. I hope the NSF and NSB are thinking ahead to those next battles, not redefining science literacy so they can ignore the problems.
One of my former NSF colleagues let me know about this piece. And, I agree with everything you've said.
As the author of four chapter 7s (2000, 2002, 2004, and 2006), I can't recall ever having a problem with the NSB on the chapter's coverage of evolution acceptance. It really wasn't difficult to put that section together: You present the survey result, you state that it hasn't changed over time,* and you compare it with the comparable statistic for other industrialized nations. You also include the results of surveys conducted by other organizations.
In addition, over the two-year period between Indicators reports, I would collect news stories on evolution controversies in the U.S. and other countries, and I would summarize them in a sidebar. Finally, I would check out the National Center for Science Education website to make sure I didn't miss anything.
Here's an excellent point I want to emphasize by repeating it:
"I'm open to the possibility that a better measure of evolution literacy can be derived. Indeed, I've been at meetings in the last year with the top researchers developing such measures. That none of those folks were part of, or apparently even aware of, the NSF workshops and reports, doesn't speak well to the reports' ability to capture the current state of the art."
I, too, know of other researchers who weren't included in the workshops. I would have also included an expert or two on survey question design, because there's no doubt that the NSF question has been used successfully for more than two decades and replicated all over the world. If the question is defective, why did it take so long to notice? So, it doesn't seem like those workshops were particularly useful in providing guidance to NSF.
*The survey was changed from a telephone survey to face-to-face interviews for the 2008 report, so we can't be sure that there's still an intact time series.