I was going to make this a video tutorial, but it just didn’t work out right. So, here it is in blog post form.
How do you deal with a video that zooms and pans at the same time? You could keep adjusting the coordinate axes AND the scale for each frame – but sometimes that is not practical. Tracker Video has a great tool for handling these types of videos – the calibration point pair. The basic idea is that you identify two points in the scene that should be stationary (part of the background) and track those two points. Tracker will then adjust the coordinates and scale to make these points “stationary”. Here is my tutorial on how to actually do this.
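To get a feel for what Tracker is doing behind the scenes, here is a minimal Python sketch of the geometry (my own illustration, not Tracker’s actual code – the function name and data layout are made up). From the pixel positions of the two background points in the current frame and in a reference frame, you can recover a per-frame scale factor and rotation angle that would make those two points stationary again:

```python
import math

def frame_transform(p1, p2, ref1, ref2):
    """Given the pixel positions of two background points in the current
    frame (p1, p2) and in a reference frame (ref1, ref2), return the
    scale factor and rotation angle (in radians) that map the current
    frame back onto the reference frame.  Hypothetical sketch of the idea
    behind a calibration point pair, not Tracker's implementation."""
    # vector between the two tracked points in each frame
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    rx, ry = ref2[0] - ref1[0], ref2[1] - ref1[1]
    # if the camera zoomed in, the points are farther apart in pixels,
    # so the scale correction is the ratio of the separations
    scale = math.hypot(rx, ry) / math.hypot(dx, dy)
    # if the camera rotated, the vector between the points rotated too
    angle = math.atan2(ry, rx) - math.atan2(dy, dx)
    return scale, angle

# camera zoomed in 2x: the points are now 200 px apart instead of 100 px
s, a = frame_transform((0, 0), (200, 0), (0, 0), (100, 0))
print(s, a)  # prints "0.5 0.0"
```

The camera’s pan (translation) then comes out of where either point sits in the frame after this correction is applied.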
I am going to use this awesome video that I used in a previous post.
Note: If you want to use that exact video, you are going to have to download it. There are several ways to save YouTube videos. I use the NetVideoHunter plugin for Firefox. This may present another problem, though. Typically, this will save a Flash-format movie (.flv). Tracker Video will let you use any video that QuickTime can play. The latest OS X and QuickTime X will play Flash files. If this doesn’t work for you, you will have to convert the .flv to something else (I think you could use VLC or MPEG Streamclip).
First, get the video into Tracker. Just go to Video – Import and find your video. Next, you will want to adjust the video properties. Step forward through the video until you get to the part you want to analyze. In this case, I don’t want to keep adjusting frames while the guy is just standing there waiting to run. Click the little film icon in the lower right of the window.
Here I skipped forward to frame 33, so that is where I want the analysis to start (note: I am using a sub-clip of the original video). I also changed the step size to 4; otherwise, there would be too much clicking and I would go insane. For the next step, I need the scale, angle, and origin to be changeable, so uncheck those boxes.
Now do two things: show the coordinate axes and set the scale of the video. Since the runners are moving in one dimension, I want one of the coordinate axes along that direction. Here are the two buttons that show the coordinate axes and the scale.
Here are my coordinate axes and my scale (hint: 10 yards is about 9.14 meters – or leave it in yards if you like).
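That hint is just the exact yard-to-meter definition. If you want to convert other field distances for your scale, it is one multiplication (a quick sketch):

```python
# One yard is defined as exactly 0.9144 meters, so scaling the
# 10-yard markings on the field is a single multiplication:
YARD_IN_METERS = 0.9144

print(f"{10 * YARD_IN_METERS:.3f} m")  # prints "9.144 m"
```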
Now for the calibration point pairs. Go to “create” and choose “Calibration Point Pair”.
Now you need to pick some points in the background that you can use to track the camera. I’m not sure how it works on Windows, but in Mac OS X you need to hold down the shift key to choose your points. Here are mine. (Hint: after you use the scale tool, you can move it out of the way.)
The points are marked by little “+” signs and may be difficult to see, so I added the arrows. Now I can just step the video forward one frame and move the points back to where they should be. Here is what it looks like after a few frames. (Hint: if you can’t remember where a point was, just go back one frame and look.)
And here is a problem. The calibration point on the left (at the end zone) is about to move out of the frame of the video. What to do? Simple: make another calibration point pair. You want to create the new pair of points BEFORE the old pair goes out of view so that the two pairs can “match up”. Here are my new points.
Here the new points are marked with red arrows and the old ones with blue arrows. Now when I step forward in the video, I can move the new points and Tracker will adjust the old points and the coordinate system. This is a great way to check that everything is working – see whether your old calibration points (or at least one of them) moved to their correct locations. So, just keep doing this. If you need to make more calibration point pairs, do that. Once you are finished, you can go back and mark the location of the object.
One final note. In this video, there may be some perspective problems if you look at the motion over the whole video (because of the large angle through which the camera pans). I fixed this by breaking the motion into three separate sections, each with its own coordinate system. I then matched up the data from the three “runs” to get one continuous data set.
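If you export the position–time data from each section, matching them up just means shifting each section so it agrees with the previous one at a shared frame. A rough Python sketch of that idea (the function and data layout are my own illustration, not part of Tracker; it assumes each run starts at a time the previous run also measured):

```python
def stitch_runs(runs):
    """Join several (t, x) data sets that use different coordinate
    origins into one continuous set.  Each run after the first is
    shifted so that its first point lines up with the previous data
    at the same time.  Hypothetical sketch, not Tracker's export."""
    combined = list(runs[0])
    for run in runs[1:]:
        t0, x0 = run[0]
        # find the already-stitched position at the overlap time t0
        x_prev = next(x for t, x in combined if t == t0)
        offset = x_prev - x0
        # append the rest of this run, shifted into the common frame
        combined.extend((t, x + offset) for t, x in run[1:])
    return combined

# two runs that overlap at t = 1 but use different origins
runs = [[(0, 0), (1, 2)], [(1, 0), (2, 3)]]
print(stitch_runs(runs))  # prints "[(0, 0), (1, 2), (2, 5)]"
```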