Political opinion polls are very tricky. Answers to questions depend on the order they’re asked in, and on precisely how they are phrased. If you ask people whether they’re in favor of killing unborn children, you’ll get a much different response than if you ask if there’s any situation where women should be allowed to terminate a pregnancy.
What’s even more difficult is assessing public opinion on complex pending legislation. Most polls find that a majority of Americans like the idea of requiring everyone to buy health insurance — but only by a slim margin: 56 to 41 percent. Kevin Drum cited a recent study that asked a follow-up: would you change your mind if low-income families got government assistance to pay for insurance? Fully 34 percent of the naysayers changed their minds and supported the requirement. But Kevin wondered whether that’s a fair question. Wouldn’t some people change their minds no matter what?
So Kevin suggested that someone do a study to see whether some people flip-flop no matter what question is asked. If some people will always flip-flop, that suggests follow-up questions of this sort aren’t very helpful in determining what the “true” public opinion is. We decided to take him up on the suggestion. Last week we created a set of six opinion questions about issues we felt our readers were likely to disagree on. A total of 491 people responded. For each question, we came up with two different follow-ups. So, for example, everyone was asked “Should the United States adopt a government-run health care system based on Medicare?” But each respondent saw only one of the follow-ups:
- Suppose the plan involved a required pay-cut for doctors of 10 percent. Would your opinion change?
- Suppose the plan involved a required pay-increase for doctors of 10 percent. Would your opinion change?
Each of these groups was divided in half again. One group simply answered the questions, while the other group had to keep a running count of the flashes of a flashing square in the corner of their screen while they answered. The hope was that this task would simulate some of the distractions a typical respondent might face while answering polling questions over the phone.
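In other words, the design is a simple 2 × 2 random assignment: one of two follow-ups crossed with a distraction condition. Here’s a minimal Python sketch of that scheme — the function and field names are our own invention for illustration, not the actual survey code:

```python
import random

def assign_respondent(rng: random.Random) -> dict:
    """Assign one respondent to a follow-up version and a distraction condition."""
    return {
        # Which of the two follow-ups this respondent sees (0 or 1)
        "follow_up_version": rng.randrange(2),
        # True -> must count the flashes of the square while answering
        "distraction": rng.random() < 0.5,
    }

# 491 respondents, as in our survey (seeded here just for reproducibility)
rng = random.Random(0)
groups = [assign_respondent(rng) for _ in range(491)]
```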
So, did the flashing square affect whether respondents flip-flopped their answers? Here are the results:
This graph charts the average number of flip-flops each respondent made over the course of all six questions. While people who saw the flash flipped slightly more than the people who did not, the two groups were not significantly different.
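For readers curious how a comparison like this can be checked without assuming much about the data, a permutation test on the difference in mean flip counts is one straightforward option. The sketch below uses made-up flip counts, not our actual survey data:

```python
import random

def permutation_test(a, b, n_iter=10_000, seed=0):
    """Two-sided permutation test on the difference in group means.

    Repeatedly shuffles the pooled data into two groups of the original
    sizes and counts how often the shuffled mean difference is at least
    as large as the observed one. Returns the estimated p-value.
    """
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            count += 1
    return count / n_iter

# Illustrative flip counts per respondent (hypothetical, not our data)
flash = [1, 2, 0, 3, 1, 2, 1, 0, 2, 1]
no_flash = [1, 1, 0, 2, 1, 2, 0, 0, 2, 1]
p = permutation_test(flash, no_flash)
```

A large p-value here would mean the observed difference between the flash and no-flash groups is about what shuffling alone produces — i.e., not significant.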
But to return to Kevin’s question: did we find that a certain number of respondents flipped their answers no matter what the follow-up question was? While it is true that at least one person changed their answer for every question, in some cases very few people did. Consider the responses to the question “Should the United States withdraw all troops from Afghanistan?” Respondents were roughly evenly divided. But depending on the follow-up question, there was a big difference in the proportion of flip-floppers:
While 35 percent of respondents said they’d change their answer if the US kept one base in Afghanistan to address only the terrorist threat, only 4 percent said they’d change their answer to the original question if the US also closed the prison at Guantanamo Bay. And this wasn’t the only question that resulted in a small number of flips. There were a total of four (out of twelve) follow-ups that caused 4 percent or fewer to flip. Just 1 percent of respondents flipped their response about lotteries when asked “Suppose J.K. Rowling had never written the Harry Potter books. Would your opinion change?”
So to answer Kevin’s question, it seems that the baseline level for flip-flopping is quite low — perhaps as low as 1 percent, and certainly below 4 percent.
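A rate that low comes with real uncertainty, and a Wilson score interval is a standard way to put error bars on a small proportion (it behaves better than the usual normal approximation near zero). The numbers below are illustrative: they assume roughly half of our 491 respondents (about 245) saw a given follow-up and about 1 percent of them (2 people) flipped:

```python
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96):
    """95% Wilson score confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

# Hypothetical: 2 flippers out of ~245 respondents who saw this follow-up
lo, hi = wilson_interval(2, 245)
```

With these assumed numbers the interval runs from well under 1 percent to around 3 percent — consistent with a baseline somewhere in the low single digits.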
Interestingly, for nearly every follow-up we asked, the flip-flops were quite evenly distributed: no matter how you answered the original question, you were about equally likely to change your mind. Is this a general phenomenon, or just a coincidence? Since the pattern didn’t hold for every question, coincidence seems the more likely explanation.
Here’s an example where it didn’t happen. The original question was whether recreational drugs should be legalized. When followed up with “Suppose the law also prohibited health insurance companies from providing substance-abuse treatment coverage. Would your opinion change?” the response to the original question had an important effect on flip-flopping:
If you initially wanted to legalize drugs, you were nearly twice as likely to change your opinion in this case as someone who had wanted to keep them illegal.
But how biased was our sample? There’s no doubt that our audience at CogDaily is more educated than the average American. What about politics? We asked readers to rate their political philosophy on a simple scale. Here are the results:
As you can see, CogDaily readers are quite a liberal bunch. Does stated political philosophy affect your likelihood of flip-flopping? You bet:
While the differences between individual groups on this graph aren’t significant, the overall trend is: the more conservative you said you were, the less likely you were to change your answers to the questions on this survey. Does this mean liberals as a group are wishy-washy? Possibly. Or it’s also possible that we simply did a good job of coming up with questions that liberals were likely to be ambivalent about: after all, that was our goal when we created the study.
(Just a reminder: All Casual Fridays studies are non-scientific. This doesn’t mean we can’t use scientific principles to assess what’s going on, but we can’t make general claims based on the results.)