Setting goals for a course on Experimental Design and Data Analysis (Course design 1.2)

As introduced yesterday, I'm blogging my way through the SERC tutorial on course design, for a new graduate-only course on experimental design and data analysis. Yesterday, I explained the context and constraints on the course, and today I'm mulling over the course goals. I'm supposed to identify 1-3 over-arching goals for the course and 1-2 ancillary skills goals. Below the fold, I'll share my over-arching goals and how I got to them. But I'm struggling with the ancillary skills goals, dear readers, and I'd love your help.

Task 1.2c: Set one to three overarching goals for your course.
The SERC tutorial lays out some very specific guidelines for the over-arching goals. They should be student-centered (not teacher-centered), involve higher-order thinking skills, be concrete, have measurable outcomes, and provide clear direction for course design. They should be phrased as "Students will be able to..." or "I want students to be able to...". Before we get to the stage of actually writing the course goals, however, the tutorial asks us to read some example goals and assess whether they meet the guidelines above. The tutorial also asks me to think about what I do as a professional in my discipline, in the context of the course. Here's what I came up with:

I identify questions that advance understanding of my field and which I have at least some of the skills and resources to answer. I form collaborations and write proposals to gain the necessary skills and resources to answer the questions. As part of the proposal writing process, I: (1) design field, lab, and modeling studies; (2) hypothesize the expected outcomes of those studies; and (3) plan the sorts of data analysis necessary to answer my questions based on my study design. I carry out the studies, usually as part of a team, adjusting the experimental design as necessary. I then analyze the data, attempt to answer the original question, and formulate next questions so that I can begin the process over again.

I also review proposals and manuscripts from other researchers working in my field, and I evaluate the scientific and technical merits of the project design, data analysis, and interpretation of the results.

Given what I do, here's what I want my students to be able to do:

  1. I want students to be able to evaluate the connections between: (a) knowledge of existing literature and/or preliminary data; (b) research question and hypothesis generation; (c) experimental design; (d) quality of the collected data; (e) methods of data analysis; (f) ability to answer the posed research question.
  2. I want students to be able to work in teams to formulate a research question, design a study to answer the question, and analyze the resulting data using appropriate statistical techniques.
  3. I want students to be able to critique experimental design and data analysis techniques that appear in proposals or the published literature of their field.

The first goal may seem a bit redundant with the scientific method, but what I want my students to understand is a bit more nuanced. I want them to realize that you have to design your data analysis at the same time as the experiment or you won't be able to answer your question. I want them to understand the importance of preliminary data and an intensive literature review for ensuring that the question they are asking is relevant and answerable, and for ensuring that good data is collected once the project is underway. I want them to realize that if you design your experiment in an ad hoc fashion (as so many of us do), you risk collecting data that will be useless for answering the question you really wanted to answer.

The second goal asks them to apply and operationalize the concepts from the class. Maybe I'm getting ahead of myself, but I'm envisioning having the students work in teams to write an NSF-style proposal by the end of the semester. I've also toyed with the idea of asking them to collect some preliminary data to support that proposal, but I think I'm putting the cart before the horse now.

The third goal is another application of the same concepts, but now applied not to their own research but to that of other researchers. I designed this goal to help teach the skill of peer review, something in which I think many graduate students lack adequate training and mentoring.

Do my over-arching goals seem reasonable for a course on experimental design and data analysis, given the constraints and context of the course?

Assuming for the moment that I don't experience a massive commenter uprising against the over-arching goals, what do you think the ancillary skills goals for the course should be? Here's what SERC has to say about them:

What ancillary skills would you like to have your students improve on during your course? Ancillary skills might include writing, quantitative skills, 3-D visualization, self-teaching, peer teaching, oral presentation, working in teams/groups, critically assessing information on the Internet, accessing and reading the professional literature, and so on.

Task 1.3: Set one or two ancillary skills goals for your course.

Which ancillary skills are important enough in the context of your course that you are willing to provide timely feedback and repeated practice so that students actually improve their skills? Be realistic. You can't do everything and still address the content of the course. Choose one or two ancillary skills to work on - ones that you can commit to integrating from the beginning to the end of the semester.

I find myself oddly struggling with this one, maybe because the course seems, by its very nature, to be a skills-oriented course that will touch on many of those things. But how do I pick one or two to focus on? I'd really like to hear your comments, and I'll try to participate in any comment thread discussions. Then in a day or two, I'll have to move on, set the goals, and start choosing content to support those goals.


I think you've answered the question about ancillary skills already: the course seems, by its very nature, to be a skills-oriented course that will touch on many of those things. My reading of the "ancillary skills" question is that it's aimed at courses that focus on scientific content. For example, my goals for structural geology would be full of things about rock deformation, but wouldn't say anything about writing. But I want the students to work on writing throughout the major, so the lab write-ups double as writing assignments.

Given your course goals, maybe improved proposal-writing would be a good ancillary goal. (Especially since some of the students speak English as a second language.) Improving writing involves a lot of work on the part of the professor, though, so you might also want to think about ways to make the students help each other. (Because an unspoken goal should be to keep from killing yourself, too.)

Your course seems very similar to a course I just designed and taught for the first time last semester. Although my field is quite a bit different from yours, the basic goals and skill sets are the same. I had my students work on designing a collaborative study (collecting preliminary data was way too much for this course), but they also wrote individual grant proposals (I only had 5 students, so it wasn't difficult to handle the grading/feedback). I then had them do a mock peer review of each other's grants. The ancillary goals I chose were: writing, collaborative work, and peer review.

My challenges were the diversity of the students' academic backgrounds and that very few had taken a graduate stats course--at all or within the last 3-4 years. Therefore the connection between data analysis design and answering the question was particularly hard to achieve. They simply didn't have the skills for it, and it wasn't possible to give them the skills and still cover the rest of the course material (they do have a series of stats classes they will be taking, but only this one research design course). I believe they left understanding the importance, and they made some headway in being able to approach how one might design the analysis and the experiment to answer the question, but it will be a long time before they have the ability to do so. I have no idea if your students will have similar issues, but it is something to think about.

I want them to realize that you have to design your data analysis at the same time as the experiment or you won't be able to answer your question.

If not before. I'm rather enamored of dependency graphs myself. Among other things, doing the analysis first makes it possible to scope many of the experimental requirements (sample sizes, instrumentation precision, etc.).
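To make that concrete, here is a minimal sketch of what "scoping the experiment from the planned analysis" can look like. It uses Python's statsmodels purely as an illustration (nothing in the post specifies a tool), and the effect size and error rates are hypothetical planning values:

```python
# A minimal sketch (Python/statsmodels chosen only for illustration) of how
# writing down the planned analysis first lets you scope the experiment,
# e.g. the required sample size per group.
from statsmodels.stats.power import TTestIndPower

# Hypothetical planning numbers: a moderate expected effect (Cohen's d = 0.5),
# the conventional alpha = 0.05, and a target power of 0.80.
expected_effect = 0.5   # standardized difference we hope to detect
alpha = 0.05            # acceptable false-positive rate
power = 0.80            # desired probability of detecting the effect

# Planned analysis: a two-sample t-test comparing treatment vs. control.
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=expected_effect,
                                   alpha=alpha, power=power,
                                   alternative='two-sided')
print(f"Samples needed per group: {n_per_group:.0f}")  # ~64 for these numbers
```

Writing even this much down forces you to state the planned test, the effect you care about, and the error rates you will tolerate, and it hands you a defensible sample size before any data are collected.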

Which leads to my suggestion: by all means give them some flawed proposals to rip into. Junior investigators are like baby predators: before they can kill their own prey, they need to practice on some that their parents have wounded for them.

By D. C. Sessions on 15 Jul 2009

Maybe, just maybe, one of the ancillary skill sets would be to use a standard piece of software for their data analysis. If you have students from multiple disciplines, a grounding in how to apply a software package might be a good ancillary skill. That package could be as simple as Excel, or Matlab, or whatever fancy-shmancy stats package you use routinely. I know this isn't a How To Use Software course, but it IS part of your analysis and it does go into your methods.

I see many suggestions to have the students look at flawed proposals and evaluate them. Is there any reasonable way to get them some poor *data* so that they can get their hands on why data collection and analysis design go hand-in-hand? *Especially* for field work, because it's one thing to re-do a lab experiment (not that you WANT to do that either) and an entirely different thing to try to redo a survey of the Indian tiger cub population in spring (for example) because your observer didn't understand how the data was to be analyzed.
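One way to manufacture that kind of "poor data" experience is with a toy simulation. The sketch below (Python with numpy/scipy, entirely made-up numbers, just to illustrate the idea) shows how a paired field design loses most of its statistical power if the observer fails to record which site each measurement came from, so the planned paired analysis is no longer possible:

```python
# A hedged illustration (simulated data, not from the post) of why data
# collection and analysis design go hand-in-hand: a paired design is ruined
# if the observer never records which site each measurement belongs to.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_sites = 12

# Sites differ a lot from one another; the treatment effect is modest.
site_means = rng.normal(10.0, 3.0, n_sites)
before = site_means + rng.normal(0.0, 1.0, n_sites)
after = site_means + 1.5 + rng.normal(0.0, 1.0, n_sites)

# Planned analysis: a paired t-test, which removes site-to-site variation.
t_paired, p_paired = stats.ttest_rel(after, before)

# What you're stuck with if the site labels were never recorded:
# an unpaired test, where site-to-site variation swamps the effect.
t_unpaired, p_unpaired = stats.ttest_ind(after, before)

print(f"paired   p = {p_paired:.3f}")    # usually small: effect detected
print(f"unpaired p = {p_unpaired:.3f}")  # usually much larger: effect lost
```

Students could be handed the "label-free" version of such a dataset first, and only afterwards shown what the planned analysis would have detected.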

What carrie said, although I would vote for R, despite its horrendous learning curve. R is free and open source, so students will be able to use it whenever and wherever. A highly relevant ancillary skill is critiquing research papers - again, some bad science will go a long way here. Finally, you said you're also getting sociologists? Be careful - that's a whole other methodological ball game.

@Perceval: social scientists not sociologists.

More generally, I love the ideas you all are providing. At this point, I'm leaning towards peer review as the primary "ancillary skill" I'll focus on. I'm not sure I have the bandwidth to stress writing skills with this group, and by focusing on peer review I can justify having them critique each other's drafts.

Hi Sciencewoman,

One way to think about learning objectives and course development is to ask yourself three sets of questions:
(1) What do I want students to be able to:

know
think
do, and/or
feel

at the end of the course, and why do I want this? The next question is -

(2) How will I, and the students, know that they "know, think, do and/or feel" what I want them to? And finally,

(3) What can I and the students do to help the students get there?

The first question gets at your goals and learning objectives, the second at your assessment methods (and here I like to think of assessment "for", "as", and "of" learning - but that can be for another post), and the third at what you and the class will do during the term.

The reason I'm saying this is that I think your first goal is actually answering Question #3 and not Question #1. This comes through in your answer to the "why" part of Question #1 -- i.e. your rationale for Goal 1. Your rationale explains that doing an evaluation of "the connections between ...." is a way for students to get to particular realizations and understanding. The realizations and understandings are your ultimate goal. Doing an evaluation is a process that may (or may not) help students get there.

Similarly, the rationale for your third goal suggests that what you really want is for students to improve their peer review skills. Reviewing proposals is one way for students to do this (provided they get clear, formative feedback from you.)

It strikes me that maybe your overarching course goals are something like ...

1. Students will be able to design and propose a research project that will (a) have a relevant and answerable question, and (b) have data collection and analysis procedures that can produce an answer to the research question.

2. Students will understand the following factors that affect the success of a research project, and will successfully apply these factors to their own proposal:
- "you have to design your data analysis at the same time as the experiment or you won't be able to answer your question."
- preliminary data and intensive literature review are important for ensuring that the question they are asking is relevant and answerable and for ensuring that good data is collected once the project is underway.
- "systematic design of an experiment will help to ensure the collection of data that will answer the question you really wanted to answer."

3. Students will improve their skills at peer-reviewing research proposals.

4. Students will work effectively in research teams.

What do you think?

I'm leaning towards peer review as the primary "ancillary skill" I'll focus on.

Wind to your sails, Madam -- even from out here in industry, it can't be overemphasized.

By D. C. Sessions on 15 Jul 2009

Peer review is an excellent idea. I have a colleague who's used it to good effect. When I teach juniors next spring, I'm thinking that their final class project might be a proposal for their senior project, and I'll run the last few class meetings like a review panel, with them critiquing each other's proposals. If nothing else, it's good to get them to think about senior project before, say, spring quarter of senior year.

My colleague is seriously thinking about collaborating with people at other schools, and having their students critique our students' papers while our students critique theirs. The only question mark I have is privacy laws. Can student work be shown to people at other schools? I know that if it were human subjects research there'd be an IRB to sort this out with, but nobody is producing a pedagogical study here (well, not yet, anyway). Removing student names from the papers would be important, obviously (and recent research shows it produces fairer peer review anyway), but I don't know if that's enough.

Anyway, I'm excited to hear that peer review experiments in teaching are catching on.

This may seem very basic, but it is something that was articulated as an ancillary skill by the instructor in a course I took as an upper-level undergrad: the ability, in reading a research report, to extract an understanding of the procedure, results, and conclusion even if the technical details of the stats or math models are unfamiliar; in other words, the art of "reading around" that stuff. If you have a really good ability at this, you can be unfettered by the details and make contributions to any study.

But as for your overarching goals, they seem "spot on" to me. Good stuff!!

By Guppygeek on 15 Jul 2009

As I mentioned, I teach a similar course, though probably at a slightly lower level. My 'key skill' emphasis is on critical thinking - analysing literature, creating research designs, communicating analyses and reasoning - because I find that to be a very weak area. In my context, students who've made it to this grad course are smart, which means that some of them have managed to get good or excellent grades in undergrad classes via hard work, extra reading etc. rather than by genuinely doing the right kind of thinking - coursework does rather tend to allow a hard-working student to achieve good marks without doing top-level thinking, I find, especially in the sciences. I think that's just How It Is. But, in this module, I aim to get them to address that (and have found that that feeds really strongly into their other grad modules, by helping them across that divide from undergrad success to grad success).

The group is generally all English-as-first-language but with very mixed interests and backgrounds - pretty much any kind of science, blue skies or very applied. I teach it via three elements: a seminar series where we discuss controversial topics, a research design seminar and a group project.

The seminar series (5 x 1 hour 40 minute classes): I pick topics that are controversial, poorly defined, political or emotive, and give the students a choice from a list of which ones we will do. For each seminar, we all read two or three contrasting papers - sometimes they totally disagree with each other, sometimes they use very different methodologies to ask questions about the same problem or process. And I encourage the students to do a little extra reading to find some additional examples or voices. In class, we discuss the topic, focusing on identifying the facts, outlining the arguments people are making about the facts, and trying to come to some kind of overview. The students then write 400-word summaries on the topic - short enough that I can give really focused, line-by-line feedback if necessary and hand them back in the next seminar. If the majority have problems with e.g. referencing (sigh, even at this level), paragraphing, or opening and closing sentences, then I spend 15-20 minutes in the next seminar outlining the issue and then getting them to work in small groups to improve each other's work from the last exercise. Then on to the next topic.

The research design element also has 5 x 1 hour 40 minute classes, and is effectively spent preparing the proposal for their master's thesis (which is perhaps less research than in your system? For these students, it's the equivalent of one semester of full-time courses, and they don't go through the same process of proposal defense and upgrade and committee that PhD students do). They have to work with a supervisor to produce a proposal to a standard format, of passing standard, and these classes take them through the different stages of the design.

In parallel with these classes, the students are working on a group project; this was introduced because I found that many of them had real difficulty making the links between reading, writing, designing and DOING, or in talking to their classmates since their backgrounds are so different, so having an actual project ongoing meant that they had common ground and immediate practical experience. Designing these projects is challenging - I get students into groups of 4-7, and usually have to offer three alternative projects. Students then indicate their preference for a project and are sorted into groups on that basis. Projects need to draw on a range of skills, use relatively simple lab methods that they should all have some idea about, and be pretty open-ended - I won't give examples here as they're quite identifying, but email me if you'd like to see some examples (mollimog at gmail).

Students get a half-page brief of the problem, meet once a week with an academic advisor for a short period, keep minutes of group meetings (submitted for very small amounts of credit, but it keeps them on track), give oral presentations of their design, their pilot study (or an interim report) and of their findings to the rest of the group (who are expected to act as peer reviewers), and present a final product (this varies - it has been a talk, a poster, a formal report, a submission-format manuscript) and an essay reflecting on the process of doing science.

I've come to the conclusion that the doing is more useful, especially in helping the weaker students, than having them do more literature review or essay or proposal-type writing - because making a simple design and then carrying it through to a conclusion really emphasises the importance of integrated planning in a way that lectures and reading don't, especially for students who don't learn well that way (and for non-English-native-language students, and indeed for students with dyslexia and related issues, who come through in ever-increasing numbers at the moment).

I look forward to reading the rest of the series!

I agree that teaching students how to use software like R for data analysis and presentation could be nice.

Also, I just wanted to point out that sociologists *are* social scientists!