Projectile Motion, Uncertainty, and a Question of Ethics

We no longer do what is possibly my favorite lab in the intro mechanics class. We've switched to the Matter and Interactions curriculum, and thus no longer spend a bunch of time on projectile motion, meaning there's no longer room for the "target shooting" lab.

It's called that because the culmination of the lab used to be firing small plastic balls across the room and predicting where they would land. In order to make the prediction, of course, you need to know the velocity of the balls leaving the launcher, and making that measurement was the real meat of the lab. The way I used to do it, it also included a nice introduction to error and uncertainty, particularly the difference between random and systematic errors.

I'm sorry to see this one go (I don't think there's any way short of a calendar change to fit it into the new curriculum), and I've toyed with the idea of trying to write it up for some pedagogical journal or another. I'm not sure that would be permissible, though, given the way the thing works.

The whole thing really came from pasting together two previously existing versions of a projectile motion lab. One colleague used to have students measure the velocity of a ball leaving a PASCO launcher by measuring the maximum height reached by the ball. Another used two different timing methods-- students measured the time the ball spent in the air with a stopwatch, and then did essentially the same thing with a high-speed video camera. What I did was to combine the three methods into one lab.

This works really nicely as a way to get at the different kinds of error and uncertainty in a lab. The maximum-height method turns out to be very good-- the projectile launchers fire the ball a bit more than two meters straight up, and students standing on the table can measure the height to within a centimeter or so. The systematic errors in this case are pretty limited-- basically, the only thing that comes in is a parallax effect for those students who are too short to align their eyes with the maximum height. Averaging together ten or so measurements gives a statistical uncertainty of less than a percent.
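For reference, the calculation behind the maximum-height method is a one-liner; here's a minimal sketch in Python, with a made-up height standing in for real class data:

```python
import math

g = 9.81   # m/s^2
h = 2.05   # measured maximum height in meters (hypothetical value, not class data)

# The ball is momentarily at rest at the top of its flight, so v0^2 = 2*g*h
v0 = math.sqrt(2 * g * h)
print(f"launch speed: {v0:.2f} m/s")   # about 6.3 m/s for these numbers
```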

The stopwatch method, on the other hand, is plagued with systematic errors. Individual students will be fairly consistent with their times, but the time measured is highly dependent on individual reaction time and how the students choose to do the timing. Some students will be systematically fast, starting a bit late and stopping a bit early, while others will be systematically slow, starting early and ending late. Averaging together ten times gives a statistical uncertainty that isn't too much worse than the maximum-height measurement, but the two methods generally don't agree within the statistical uncertainty.
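The time-of-flight version of the calculation (the same one the video method uses) is just as short; another sketch, with a made-up flight time:

```python
g = 9.81   # m/s^2
t = 1.30   # total time in the air, in seconds (hypothetical value, not class data)

# For a flight straight up and back down to the launch height, the time to the
# apex is t/2, and the ball decelerates from v0 to zero in that time:
v0 = g * t / 2
print(f"launch speed: {v0:.2f} m/s")   # about 6.4 m/s for these numbers
```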

The video-camera method ends up agreeing very well with the maximum-height measurement, with similar uncertainty (the camera we used typically recorded 250 frames per second). The velocities determined from either of those measurements work well for predicting the range when the balls are launched at some angle, while the stopwatch velocities are much less successful.
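The range prediction itself is standard projectile kinematics; a minimal sketch for the simplest case, with launch and landing at the same height (the numbers are invented; if the launch and landing heights differ, you need the full quadratic in the flight time):

```python
import math

g = 9.81                   # m/s^2
v0 = 6.3                   # launch speed from one of the methods above (m/s, made up)
theta = math.radians(45)   # launch angle (made up)

# Same-height launch and landing: R = v0^2 * sin(2*theta) / g
R = v0**2 * math.sin(2 * theta) / g
print(f"predicted range: {R:.2f} m")   # about 4.0 m for these numbers
```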

Putting all of these together works in a few different ways. The difference between the stopwatch and maximum-height methods clearly shows the difference between random and systematic errors, particularly when students see a full class's worth of measurements, with some students getting a velocity that's too big and others too small, while the maximum-height and video-camera methods always agree within uncertainty. Calculating the velocity gives them some practice with the kinematic equations, in two different forms (the video-camera and stopwatch methods use the same calculation). The multiple measurements involved allow you to introduce the idea of standard deviation as a way of quantifying the uncertainty in a data set, and the calculation of the velocity uncertainty forces them to deal with propagating uncertainty.
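A sketch of the statistics side, with invented numbers standing in for a class data set; the propagation step uses sigma_v = v * sigma_h / (2h), which follows directly from v = sqrt(2gh):

```python
import math
import statistics

g = 9.81  # m/s^2

# Hypothetical maximum-height measurements in meters -- not real class data
heights = [2.04, 2.07, 2.05, 2.06, 2.03, 2.08, 2.05, 2.06, 2.04, 2.07]

h = statistics.mean(heights)
sigma_h = statistics.stdev(heights)        # standard deviation of the data set
sem_h = sigma_h / math.sqrt(len(heights))  # standard error of the mean

v = math.sqrt(2 * g * h)
sigma_v = v * sem_h / (2 * h)              # propagated uncertainty in v

print(f"h = {h:.3f} +/- {sem_h:.3f} m")
print(f"v = {v:.3f} +/- {sigma_v:.3f} m/s")
```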

It's a nice lab, or at least, I think it's a nice lab. I'm a little disappointed to see it drop out of use, and as such, I'm half tempted to use some of my infinite free time to write it up for The Physics Teacher, or some such (assuming that somebody else hasn't beaten me to it, at least).

I'm not sure that would be entirely kosher, though, as the only way to really show how the whole thing works would be to include representative data sets from a couple of classes, to show how the numbers turn out. I have the data (at least, I'm pretty sure I do), because I used to have the students type their results into Excel so I could show them a scatter plot comparing the three methods, but I'm not sure I'd be able to use it. A comment made some time back by a colleague in psychology seemed to suggest that I wouldn't be able to use data from students in my classes without first having cleared the use of those data. Which I didn't, because I was just using the numbers for making a point during class. And, of course, since we're no longer doing that lab, there's no way to get new numbers...

I could be wrong, though, so I'll throw this out there for people who know more about these sorts of ethical questions: would that sort of thing be a potential problem? It seems awfully silly, but it's silly in the manner characteristic of academic bureaucracies, so I can easily believe that it would be forbidden by some rule or another, and it wouldn't be worth the headache.

(Then, of course, there's the fact that I've just described the whole concept on the blog, which I suppose could be seen as just as effective a publication channel, though without a bibliographic entry to put on my annual evaluation form...)


"...I've just described the whole concept on the blog, which I suppose could be seen as just as effective a publication channel..."

And a peer-reviewed one at that, assuming the entry attracts a couple of comments from other physics instructors.

By Johan Larson (not verified) on 26 May 2009 #permalink

Make the data up.

Seriously -- you're not publishing a research result, but making an example. You know the expected averages and deviations from the real labs, so use those as the basis for generating a class-full of made-up measurement data. Just be upfront that you're showing representative but fictional example data and there's no problem. If you need to, you could even make up a few fictional students (Aaron, Bertha and Chang, perhaps?) to use for the description.

Go to your IRB and tell them what you want to do. If you explain that this is data you collected in the course of regular teaching, and that you now see how it could become a pedagogical publication, I don't think they're going to give you a hassle. Getting IRB exempt status for something like this is a walk in the park. And besides, you're not planning on using any student names or identifying characteristics, are you? Psych people tend to be really cautious about stuff like this, to a degree that is a bit overboard for physics education research.

What an interesting idea. Regarding the idea of making it up: what would be the ethics of just adjusting all the data by a centimeter up or down? Then none of the data is real, yet it is still exactly representative. What is the ethical position now? What a weird mess. I do think there would be no problem.

I was going to ask who owns the data: the students? the school? you?

Can data be owned, since they don't represent information or knowledge until they've been (respectively) organized and interpreted?

But then I realized that janne's perfectly right; you don't have to use actual collected data for a pedagogical example. In fact, so that future classes using the lab can't 'cheat' by comparing against pre-existing data, perhaps you shouldn't even provide good data, let alone real data.

I really like that lab. It captures the essence of what students are supposed to learn in the lab. I try something similar (measure v with a ballistic pendulum, then predict where the ball will land when fired as a projectile), but doing it three different ways is really nice.

As for your problem - like G@3 said: ask your IRB.

But I have a better suggestion: Get a bunch of students in your SPS chapter to do the lab for "fun" and give them credit in the acknowledgments section of the paper. And they are far enough along that they might be challenged to make up a 4th measurement method on their own.

Data cannot be owned (i.e., it can't be copyrighted). The only ethical issue is misrepresentation of its origin, which wouldn't be a problem since you would state that the data came from students in the class.

With respect to making it up, I find it more genuine, in terms of teaching the concepts here, to use real data. Of course, the lab guide you are producing is meant to encourage others to perform the lab themselves, in which case they will have their own real data.

I like CCPhysicist's idea of bringing in a group to produce new data and crediting them.

By Bruce Elrick (not verified) on 26 May 2009 #permalink

If the IRB makes a fuss, find the class roll, and ask the alumni association for current addresses to get a SASE postcard for permission. However, if the data is anonymized (you don't indicate which student gave which times, and perhaps using only data for one year without specifying the date), I can't see a problem.

CCPhysicist's suggestion to run the lab again sounds good, too.

You've got students that work for you, right? Isn't that what they're for? To satisfy your every whim? The experiment sounds simple enough that if you're concerned about using data you should be able to make new data pretty quickly.

We used to do a very similar lab with the pre-med students. No high speed cameras, though. Instead we measured how long the projectile blocked a laser beam as it came off the launcher. Our results were never as good as yours, due to wildly inconsistent launchers.
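The calculation there is just the ball diameter divided by the time the beam is blocked; a quick sketch, with made-up numbers:

```python
d = 0.025    # ball diameter in meters (made up)
dt = 0.004   # time the laser beam was blocked, in seconds (made up)

v0 = d / dt  # speed of the ball as it crosses the beam
print(f"launch speed: {v0:.2f} m/s")   # 6.25 m/s for these numbers
```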


The ethical issues here are minor to non-existent.

1) You are not doing experiments ON students (or human or animal subjects). Most psychology projects are, and would require IRB and individual permissions if data from experiments on human subjects were to be released, even de-identified (there is a big difference between balls flying through air and a student having reaction times measured).

2) There is no way students can be identified from this data, because these are not experiments ON students. It's not as if each data point comes with a student's name attached; your data will be something like mean height +/- SD under different measurement conditions. If you are using images to illustrate the set-up and there is a student in the frame, you need their permission (otherwise just get someone to photograph you holding the measuring stick).

3) This is prac class data; the students *don't* own it. This is unlike *research* projects, where there is student intellectual input (and the university usually has a policy on intellectual property arising out of such research), or original assignments/essays, or even the student's prac write-up, where you would be required to get permission to use the original material, with appropriate acknowledgments.

Still, checking with the IRB would be a good idea. They will ask you inane questions to make sure the students can't be identified, but they should let it through with no problems.

By Ian Musgrave (not verified) on 26 May 2009 #permalink

The proper publication would be in "The Physics Teacher." Then other teachers could use it, having something refereed to show their Chairmen for approval.

I love blogs, and so do you, and this likewise filters the people reading this now. But Academe is slow to catch up.

Clark @9:
If your launchers are "inconsistent", the standard deviation is measuring a property of the launcher rather than the inevitable flaws in the measurement process. This can offer you a real teachable moment in a pre-med class, since the data they will encounter in their career will contain inherent variations due to uncontrollable differences between people using a particular drug.

I try to do that with my pre-engineering students in our lab. They will also face the reality that no two batches of concrete are identical.

That's not an error, it is additional data - much like the width of a peak can tell you the lifetime of a transition if the resolution is smaller than the real width.
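If the two contributions are independent, they add in quadrature, so you can back out the launcher's share from the observed spread; a rough sketch, with made-up numbers:

```python
import math

# Hypothetical standard deviations in m/s -- made-up numbers
sigma_total = 0.15        # spread actually observed in the data
sigma_measurement = 0.05  # spread expected from the measurement process alone

# Independent contributions add in quadrature:
# sigma_total^2 = sigma_launcher^2 + sigma_measurement^2
sigma_launcher = math.sqrt(sigma_total**2 - sigma_measurement**2)
print(f"launcher contribution: {sigma_launcher:.3f} m/s")   # about 0.141
```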

By CCPhysicist (not verified) on 30 May 2009 #permalink
