In a pair of earlier posts, I looked at the ethical principles Matthew C. Nisbet says should be guiding the framing of science and at examples Nisbet discusses of ethical and unethical framing. Here, I want to consider some lessons we might learn from the framing wars. I'm hopeful that we can gain insight about the folks interested in communicating science, about the various people with whom they're trying to communicate, and perhaps even about the approaches that might be useful (or counterproductive) in trying to sell scientists on the utility of the framing strategy.
This post is not so much a response to Matt's recent post on the ethics of framing as it is to the multi-year brouhaha over framing and its discontents in the science blogosphere.
The lessons I'm taking away from all of this are more along the lines of bite-sized nuggets than a Grand Unified Moral of the Story (although it's quite possible that someone with more patience or insight than I can muster could find the grand unifying thread in these nuggets):
1. Successful communication does not automatically bring with it successful persuasion
You can understand what I'm saying perfectly well and still disagree with me. That's how it goes.
2. Confusing what you know and what you value can only lead to headaches.
Sure, what you know can influence the ends you regard as valuable (not to mention the ends you regard as attainable), and your knowledge may shape your strategies for seeking your chosen ends. But the facts don't speak for themselves. The facts alone cannot tell you what to do.
Doing stuff, obviously, can be useful, even fun. So having interests and values, as well as facts, is not a pathological condition for a human to be in.
3. While scientists qua scientists may feel deeply uncomfortable advocating for particular values or ends, scientists qua human beings can do so.
Maybe they even should do so; this might depend on your particular picture of our general duties to others, what participating in the family of humanity involves, and so forth. But so long as scientists are clear about where the tether of their scientific expertise runs out, they have just as much of a place in a discussion of what's worth valuing as anyone else does.
4. Groups of people -- scientists, non-scientists, religious folk, secular folk -- can be radically heterogeneous in their core values.
You can't assume on the basis of a person's group membership that you know what he or she cares the most about. Nor, for that matter, can you assume on the basis of a person's group membership that you know how best to persuade him or her that a particular policy is worth pursuing. A person's group membership may give you some reasonable hunches, but people will surprise you.
This applies even to people concerned with communicating to the public about science. Not all science communicators seek the same ends. To the extent that their ends might overlap, these ends aren't necessarily prioritized the same way.
5. Better scientific literacy and understanding is an end some people value.
Indeed, some people view it as having more than instrumental value (that is, more than value as a means to achieving some other end). Not all the people who value better scientific literacy and understanding as an end in itself are scientists.
This doesn't mean that everyone values this end. But we shouldn't forget that some people do -- a lot.
6. People can agree about the message they want to communicate and disagree about the best communication strategy.
Some of this has to do with the communication strategies that have worked for us before, or the responses we've gotten in our prior attempts at communication. What strategy will be most successful is in the end an empirical question. However, it seems perfectly plausible that multiple strategies for communicating a given message could be moderately successful.
7. People can agree about the best communication strategy and disagree about the message they want to communicate.
A communication strategy is a tool, and tools can be put to different uses. If you want to use a hammer to build a birdhouse, it doesn't mean that my use of a hammer to build a trapdoor is wrong -- it just means I want to accomplish something different with the hammer than you do.
8. Focus group and polling data may be extremely useful in informing efforts to persuade particular people, but such data are not magical.
There are many ways that the real people you're trying to persuade may differ from the people in even a very well-constructed sample population. If it's the real people you're trying to persuade, then the prior data are advisory at best.
9. Saying to someone who is trying to communicate, "You're doing it wrong!" is not persuasive unless you can actually:
- correctly identify the goal of the communication
- correctly identify the intended audience for the communication
- provide something recognizable as sensible data that bears on whether the communication has been effective.
Here, it is very important to recognize that a prospective critique ("I don't think that will work!") does not guarantee that the attempt to communicate will actually fail -- it might end up working! Whether it does is always clearer after the fact, when there are relevant data on whether it worked.
10. If you're trying to sell a communication strategy to scientists and journalists, you need to give them reasons to think it will work.
You need to be able to explain how to apply that strategy to what the scientists and journalists actually want to communicate. Providing a tool without clear information on how to use it makes it more likely that folks will continue using the tools they're used to, or will use the new tool you're offering in a way you're inclined to label "the wrong way". (Maybe the instructions that came with the tool aren't clear enough. Maybe what you call "the wrong way" to use the tool actually does a better job of accomplishing what the tool-user is trying to accomplish. Every tool is a weapon if you hold it right.)
And, if you want scientists and journalists to set aside their familiar patterns of communication in favor of a new tool, you need to make this new way of approaching communication resonate with their core values as scientists and journalists. If it doesn't, can you blame the scientists and journalists for not getting on board?
Awesome!!!!
Thank you for doing these 3 posts. Should be saved for posterity.
Thank you for this great series. You've really done a bang up job of framing the framing debate ;)
I think that you do have a Grand Unified Theory of Framing here, however. Those ten points go a long way towards informing any efforts at changing the way science is communicated to the public (assuming, that is, that any change is needed beyond there simply being more funding for it -- a position I've not yet been convinced of).
To repeat a vaguely relevant idea inflicted elsewhere... there is a difference between science as a pure philosophical discipline and as an applied anthropological practice.
Thanks for these posts. Although I am critical of the accommodationist stance adopted by the NAS (with no acknowledgement that it is controversial among both philosophers and scientists), I do applaud your careful work to tease out some of the issues. I thought your discussion of the goals being pursued by Dawkins in your Part 2 was exceptionally good.
One thing that you might think about is: When is satire justified? I don't see how Matt's account allows much, if any, room for satire of anybody or any set of beliefs or practices. But surely that is counterintuitive.
Russell,
Who is the object of the communication in satire? I think it would be a third party, which seems to be outside Matt's focus in his ethics.
This does raise the question of whether or not Dawkins, PZ, and others are engaged in science communication or satire. They appear to be more successful in the latter.
Most of these are pretty useful guidelines for communication in general, actually. Excellent posts!