Just to let you know where things stand, I'm in the process of setting up the study. Some of the coding is a bit over my head, because I've never done this sort of thing on the web before. Fellow Science Blogger Razib has been helping me a great deal, but if you have knowledge of how these web page thingamajigs work, and you'd like to help, feel free to send me an email. The coding should be really simple, but I'm web design illiterate. Razib suggested that I save the data using MySQL, which should make it easier for others to access and analyze the data however they please. Below the fold, I describe the basic design of the project, and if you have any suggestions, feel free to send me those too.
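To give a concrete sense of what Razib's MySQL suggestion might look like, here's a minimal sketch of a response table. I'm using Python's built-in sqlite3 module as a stand-in for MySQL (the SQL itself is essentially the same); the table and column names are placeholders I made up, not anything from the actual study.

```python
import sqlite3

# In-memory database for illustration only; a real deployment
# would connect to a MySQL server instead.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE responses (
        participant_id INTEGER,   -- anonymous numeric ID, no IP stored
        concept        TEXT,      -- e.g. "separation of church and state"
        property       TEXT       -- one property the participant listed
    )
""")

# Each row is one property that one participant listed for one concept.
conn.execute(
    "INSERT INTO responses (participant_id, concept, property) VALUES (?, ?, ?)",
    (1, "separation of church and state", "keeps religion out of government"),
)

rows = conn.execute("SELECT concept, property FROM responses").fetchall()
print(rows)
```

Storing one property per row (rather than one participant per row) is what makes the data easy for others to slice and analyze however they please.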
Also, if you're interested in getting an idea of what the project is like, check out this paper by McRae et al., in which they used a feature listing task to derive norms for living and nonliving things concepts. As the paper points out, once you have such norms for a set of concepts, the sky's the limit for what you can do with them.
UPDATE: I want to thank everyone for their input so far. You've been very helpful and you've really helped me to see the potential of blogging for research. In a sense, it seems it's entirely possible to use the blog as a medium for collaboration, both with other psychologists and with nonpsychologists who are interested in the issues being studied. I've chosen 30 concepts to include in this first pass. You'll notice that they are pretty much limited to issues that are important in the U.S. The reason for this is primarily that those are the issues with which I'm familiar. However, if we are able to get a lot of respondents, it will be possible to add concepts in the future, and to look at differences in concept representations internationally. Below are the concepts I've chosen. If you feel that there's something that I absolutely should have included, but didn't, or that is on the list that absolutely should not be, let me know.
separation of church and state
END OF UPDATE
Here's the design. Every study needs informed consent, so the first page people will see will be a short description of the task (listing properties for several concepts). They'll be told that their data will be confidential, and that while the data will be publicly available, no identifying information (e.g., IP addresses) will be. They'll also be told that they can end the study at any time. By clicking a button to continue, they'll be providing consent.
The next page will be designed to gather demographic information: age, sex, race, location, where they heard about the study, political affiliation, political orientation, etc. It'd be nice if we could use an actual political orientation scale, but I'm afraid that might tax people's patience, so for now, I'm going to rely on self-report. If you know of a good, short political orientation scale that might work, let me know. At the end of that page, they'll read instructions for the property listing task. Then they'll click a button to continue.
Then comes the meat of the study. Clicking continue on the demographic page sends them to one of the concept pages, chosen at random. At the top of the page is the name of the concept, and beneath it is a text box where they can enter properties. When they finish with one concept, they'll click next, and it will take them to another. I'm debating with myself whether to have each person do 5 or 10 concepts. In lab experiments, I've done 20 per person, and that takes anywhere from 15 to 45 minutes, which is probably well beyond the attention span of most people on the internet. I'm hoping to limit the length for each person to 15 minutes.
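The random assignment step above can be sketched in a few lines. This is just an illustration: the concept list below is a made-up subset (only "separation of church and state" is from the actual study), and I'm assuming each participant should see 5 distinct concepts with no repeats.

```python
import random

# Hypothetical pool; the real study draws from 30 political concepts.
concepts = [
    "separation of church and state",
    "abortion",
    "gun control",
    "taxes",
    "immigration",
    "welfare",
    "free speech",
]

# random.sample draws without replacement, so each participant
# gets 5 distinct concepts in a random order.
assigned = random.sample(concepts, k=5)
print(assigned)
```

Sampling without replacement matters here: picking each concept independently at random could show the same concept to one person twice.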
After completing the final concept, they'll click a button and it will take them to the debriefing page. This will tell them more about the study, as well as containing contact information for me in case they have any questions or concerns.
And that's it. Again, let me know if you have any suggestions.
Chris, you could do this all in Survey Monkey with a lot less hair-pulling. Yes, the data would be harder for others to access, but you can easily export all the data into an Excel file for mass distribution. Only problem is number of participants -- survey monkey caps it at 100. I pay the $19 a month to get that number up to 1,000/month. After that it's 5 cents a response.
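For what it's worth, the pricing quoted above works out to a simple cost function, assuming the $19/month base with 1,000 included responses and 5 cents per extra response (the figures are just the ones given in the comment):

```python
def monthly_cost(responses, base=19.0, included=1000, per_extra=0.05):
    """Estimated monthly cost under the quoted subscription pricing."""
    extra = max(0, responses - included)
    return base + extra * per_extra

print(monthly_cost(500))   # within the included 1,000 responses
print(monthly_cost(2000))  # 19 + 1000 * 0.05 = 69.0
```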
It might be worth it to you to do a one-month subscription to get the data you need.
I've been following your plans for this study since you first announced it. I too have some problems with Lakoff's pov on this topic. However, it seems that you have another theory and you're looking to design a study to prove it (or to disprove it, I hope).
However, I have seen no mention of the theory (hypothesis) itself - including how it differs from Lakoff's, is better, whatever. Did I miss that?
I think this is a terribly interesting topic but I was hoping for some discussion of the theory itself in your blog - perhaps just to gather some opinions for consideration - before you devise a test to prove (or disprove) it.
Uranius, in a very old post on the old blog, I briefly discussed my perspective (I will look for the old post later), but you're right, I haven't talked about it yet. I'm in the process of writing a series of posts on it, and I don't want to post any of them until I've finished them all. I should start posting them in the next few weeks.
This study is really designed to get an idea of the structure and content of people's political concepts. It's not specifically meant to test any hypotheses. One of the uses to which I plan to put it is to do some modeling work that will give us predictions about the sorts of inferences people will make when provided with different "frames," given the structure of their existing concepts. How that works will be one of the things I discuss in the posts.
Cool. I'll be watching for the post series. Thanks.
Chris, I agree with Dave. There are several reliable survey sites that are really inexpensive (sometimes even free) that would make your web survey very easy to manage. A fellow grad student has done this and collected over 2,000 respondents for her dissertation. She used Question Pro, but I have heard from others that Survey Monkey is also pretty good.
As for the number of concepts, it really depends on how many responses you think you'll get. My gut says to go for fewer concepts per survey and try to get more subjects. I would keep the writing to a minimum--no more than 5 concepts. If you have too many, people may get bored or lazy, and put little effort into your study. This is especially likely, since you are asking people to do more than simply click a box--they'll actually have to type in their responses. Ultimately, quality matters more. If you get a small number of really interesting results, we can continue to push for more subjects. If, however, you get a large amount of junk, then you don't really get anything out of it.
Lastly (I promise that'll be it), I would reorganize your survey. Put the boring demographic questions at the end and let people get started on the main task immediately. This prevents people from becoming bored by your survey, and it ensures that if they do break off prematurely, you'll probably get some data out of it. Also, you won't be priming them with a bunch of other thoughts before you ask them to think about the concepts--just asking them to list their race, gender, party affiliation, ideology, etc., makes a whole slew of concepts accessible in memory that could easily influence your free association task (obviously, the order of the issues will do this too, but hopefully randomization will make it less of an issue). One other thing: make sure that you include some type of counter or progress bar on the survey to let people know that they don't have too much more to go. The last thing you want is to have people get 3/4 of the way through and quit because they feel that the survey could drag on forever.
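The progress counter suggested above is cheap to compute server-side. A minimal sketch, assuming each participant sees a fixed total number of concepts (the function names here are my own invention):

```python
def progress_label(current, total):
    """Text shown above each concept page, e.g. 'Concept 3 of 5'."""
    return f"Concept {current} of {total}"

def progress_percent(current, total):
    """Integer percentage, usable as the width of a simple progress bar."""
    return round(100 * current / total)

print(progress_label(3, 5))
print(progress_percent(3, 5))
```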
Well, even with demographic collection, a self-selected study is worth about as much as you pay the participants - that is, nothing.
Not to mention the fact that by doing it on the internet, you are leaving out a sizable group of people who would be reasonable to include demographically, but who do not use the internet in this way, whether from economic limitation or personal choice.
But, if you think doing a survey of motivated rich and semi-rich bored people is good, well, then, have at it!