Recently we took our hybrid car into the shop for its annual emissions test. In our state, the test is conducted while the car is idling. A hybrid doesn’t actually idle — it shuts the engine off completely. So our car’s emissions were tested at 0 RPM. It may be time to rethink our state’s emissions laws.
There’s another law that might need rethinking in the age of hybrids. Our car’s internal combustion engine often doesn’t start up even when the car is moving at low speeds — it uses electric motors, running nearly silently. This can be dangerous for pedestrians in parking lots and crosswalks: if they can’t hear us, they might not notice us at all, and if we don’t see them, someone could get hurt. Now some states are actually considering legislation requiring cars to make noise even while idling or moving at low speeds.
But how much does noise help us spot objects? Aren’t pedestrians supposed to look for cars, not just listen for them? Aren’t drivers supposed to look for hazards, not hear them? Indeed, there has been some research suggesting that sounds do help us locate objects. However, most of this research has been on directional sounds — a sound from the right helps us spot an object on the right side of the computer screen (for an exception, see this post). Does a sound that’s not from a particular direction still help us notice a change?
A team led by Toemme Noesselt used an extra-fast-response computer display to flash images at 16 volunteers. The displays looked something like this movie (click on the image to play):
For each trial, viewers had to say whether the top or the bottom ring of dots disappeared. It’s easy in this version because your computer display is probably not fast enough to show the actual flashes. In the actual experiment, viewers were first tested to determine how quick a flash they could spot. Usually this was around 15 milliseconds (the flash in my movie, for comparison, was 100 milliseconds). Then they were shown movies like mine, where either the central cross changed to a circle to cue viewers that one of the rings of dots was disappearing, or a tone was played while the ring disappeared, or nothing cued them.
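This kind of threshold-finding is typically done with an adaptive staircase: make the flash shorter after a correct response, longer after a miss, so the duration settles near the point the viewer can just barely detect. Here is a minimal sketch of that idea — the function name, step sizes, and the simulated observer are my own invention for illustration, not the authors’ actual procedure:

```python
import random

def staircase(threshold_ms=15, start_ms=100, trials=200, seed=1):
    """Simulate a weighted up/down staircase: shorten the flash 5 ms
    after a correct response, lengthen it 15 ms after a miss. This
    ratio makes the duration hover near the 75%-correct point, inside
    the 55-80% band described in the experiment. Illustrative only."""
    rng = random.Random(seed)
    duration = start_ms
    history = []  # (flash duration in ms, was the response correct?)
    for _ in range(trials):
        # Crude simulated observer: pure guessing (50%) well below
        # threshold, near-perfect well above it.
        p_correct = 0.5 + 0.5 * min(1.0, max(0.0, duration / threshold_ms - 0.5))
        correct = rng.random() < p_correct
        history.append((duration, correct))
        if correct:
            duration = max(1, duration - 5)   # harder: shorter flash
        else:
            duration += 15                    # easier: longer flash
    return history
```

The asymmetric steps (down 5, up 15) are what pin the staircase to a fixed accuracy level: at equilibrium the expected downward movement equals the expected upward movement, which happens when the observer is correct three times out of four.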
The length of the flash was gradually decreased until viewers could no longer detect it. Then it was increased, so that for the duration of the experiment, participants were between 55 percent and 80 percent accurate. So, did the tone improve viewer accuracy? Here are the results:
The graph shows the difference in accuracy compared to when no cue was given. As you can see, when there was a visual cue, respondents were significantly less accurate than with no cue. With an audio cue, they were significantly more accurate. And when both audio and visual cues were given, the difference from the no-cue condition wasn’t significant (although it was significantly different from both the other conditions). So the audio cue, especially when not paired with a visual cue, made respondents more accurate. If anything, the visual cue made it more difficult for respondents to identify which ring was disappearing.
In two additional experiments, the researchers varied the timing of the audio cues, finding that an early audio cue didn’t help accuracy, but extending the length of the audio cue helped about as much as an audio cue that matched the length of time the ring of dots disappeared.
Remember, regardless of which ring disappeared, the sound always came from the same spot, so it seems that just the fact of a sound accompanying the blink somehow helps focus attention on the change. Since the early cue doesn’t help, it doesn’t appear that the sound is somehow “warning” observers to look for a change. It’s more like the audio simply helps focus attention. And that can be a good thing when a killer hybrid is bearing down on you in a parking lot!
Noesselt, T., Bergmann, D., Hake, M., Heinze, H., & Fendrich, R. (2008). Sound increases the saliency of visual events. Brain Research, 1220, 157–163. DOI: 10.1016/j.brainres.2007.12.060