My dad and I had a disagreement. We were driving back from a trip and he let me take over the wheel for a while. His complaint was that I was driving too slow and it was driving him crazy. For me, I thought I was driving fast. My typical behavior is to drive 3 mph under the speed limit. That is just how I roll. In this case, I knew he wouldn’t be able to handle this so I went the speed limit (70 mph).
Here is the problem. The speedometer said 70 mph. The gps thingy said the average speed was 69 mph. I think my dad feels that the gps is correct and that car speedometers are intentionally set to read a little high to prevent excessive speeding. I wasn't too sure about this theory. Ok, before I go any further, let me give some definitions.
Velocity

When this term is used, it typically means "instantaneous velocity". In simple terms, the instantaneous velocity is how fast something is moving and in what direction (it's a vector). Just for completeness, you could say the instantaneous velocity is:

v = dr/dt

That is, the rate of change of the position vector with respect to time.
But I really don’t want to talk about this except for the next definition.
Speed

Again, this term usually means "instantaneous speed". Basically, this is how fast something is going. Velocity is a vector, and speed usually means the magnitude of this vector. (If you forgot about vectors, here is a refresher.) There is a problem though. Sometimes the word speed is used to mean the distance traveled over the time, or:

speed = (distance traveled)/(time interval)
Average velocity

This one is going to need a diagram. Basically, the average velocity is the change in position over some time interval:

v-avg = Δr/Δt

I know what you are thinking: but isn't Δr just the distance traveled? Let me draw an overly complicated diagram.
The vector r1 is a vector from the origin to the starting position of the object. The vector Δr points from the initial position to the final position. The black curved line is the path of the object. So, in this case, the magnitude of the average velocity is not the distance divided by the time (if you call the distance the length of that black curved line).
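To see the difference in numbers, here is a quick sketch with made-up values: an object goes 3 miles east and then 4 miles north in 0.1 hours. The path length is 7 miles, but Δr is the straight line from start to finish.

```python
import math

# Hypothetical path: 3 miles east, then 4 miles north, in 0.1 hours.
dx, dy = 3.0, 4.0      # components of delta r, in miles
dt = 0.1               # elapsed time, in hours

path_length = 3.0 + 4.0              # miles actually traveled (7 miles)
displacement = math.hypot(dx, dy)    # |delta r|, the straight-line distance

avg_speed = path_length / dt         # distance over time
avg_velocity_mag = displacement / dt # magnitude of the average velocity

print(avg_speed, avg_velocity_mag)   # 70.0 50.0
```

The two numbers only agree when the path is a straight line, which is exactly the point of the diagram.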
Determining the speed
So, what does a car do? How does it "measure" the speed that it reports? I suspect that the gps and the speedometer use different methods. The speedometer uses the rotation of the wheels to get the speed. If you know the angular velocity of the wheel (ω) and the radius of the wheel (r), then the instantaneous speed is:

v = ωr
Of course, if the wheels are slipping, this method does not work too well. Also, if you have the wrong size tires, this can cause a problem. The gps reports average speed as the magnitude of the average velocity (see above). So, it is possible to have these two devices report a different average speed.
When the gps average speed is reset, it takes about 12 seconds to give a reading. From this, I assume that it is finding the average speed by comparing positions 12 seconds apart. If the road is reasonably straight, this should be a good measure of your average speed. However, if the road is curved, this would give a value that is too low.
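Here is a sketch of how much a curve could matter, assuming the gps just divides the straight-line distance between two fixes by 12 seconds. The curve radius is a made-up number; the car holds a true 70 mph the whole time.

```python
import math

# Assumed scenario: constant 70 mph around a circular curve of radius 500 m,
# with GPS position fixes 12 seconds apart.
true_speed_ms = 70 / 2.237      # about 31.3 m/s
dt = 12.0                       # seconds between fixes
radius = 500.0                  # curve radius in meters (assumed)

arc_length = true_speed_ms * dt            # distance actually driven
theta = arc_length / radius                # angle swept on the curve (rad)
chord = 2 * radius * math.sin(theta / 2)   # straight line between the fixes

gps_speed_mph = (chord / dt) * 2.237
print(round(gps_speed_mph, 1))  # lower than 70, since chord < arc
```

On a straight road the chord and the arc are the same thing, so the gps reading should match the true average speed.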
To settle our argument, we decided to set the cruise control around 70 mph and use a stopwatch. We measured the time to go 5 miles (according to the mile markers on the road). This gave an average speed of 70.5 mph (compared to the gps report of 69 mph). The road was not too curvy, so I suspect the difference is due to something other than the magnitude of the average velocity. Also, it is possible that the mile markers were off. We would have timed a longer distance than 5 miles if we had been more patient.
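The mile-marker arithmetic is just distance over time. The stopwatch time below is back-figured from the 70.5 mph result, since I didn't record the actual reading, so treat it as approximate:

```python
# Average speed from the mile-marker test: 5 miles in a measured time.
distance_miles = 5.0
elapsed_s = 255.3   # roughly what 5 miles at 70.5 mph takes (assumed)

avg_speed_mph = distance_miles / (elapsed_s / 3600)
print(round(avg_speed_mph, 1))  # 70.5
```

A longer timed distance would shrink the effect of a small error in starting and stopping the watch, which is why more miles would have been better.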
Anyway, thanks dad for helping me drive on that trip – even though you were wrong.