People often invoke "hand-eye coordination" without appreciating its neural basis. Evidence collected by Andersen and colleagues over the past ten years indicates that different areas of parietal cortex are recruited to represent targets for movements of different effectors, all using a common eye-centered coordinate system. These areas, then, are precisely the ones that contribute to "hand-eye coordination."
In a 1999 issue of Science, Batista, Buneo, Snyder & Andersen reported that 84% of neurons recorded from the "parietal reach region" of 3 macaques showed a peculiar pattern of sensitivity. The critical finding was that these neurons' firing was more correlated across situations where monkeys reached in different physical directions but the same direction relative to gaze than across situations where they reached in the same physical direction but different directions relative to gaze. In other words, the neurons were specific to reaching movements but insensitive to the starting position of the hand; their firing was modulated chiefly by the reach direction relative to the direction of gaze. So this region appears to help transform an object's location from eye-centered coordinates (akin to primary visual cortex) into the coordinates required for reaching.
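To make the logic of that comparison concrete, here is a toy sketch (the tuning function and numbers are invented for illustration, not the authors' analysis) of a cell tuned in eye-centered coordinates: its response depends only on where the target falls relative to gaze, not on where the target sits in the world or where the hand starts.

```python
import numpy as np

def eye_centered_response(target, gaze, preferred=20.0, width=10.0):
    """Toy firing-rate model: Gaussian tuning to the target's position
    relative to gaze (1-D positions in degrees)."""
    retinal = target - gaze  # target location in eye-centered coordinates
    return np.exp(-(retinal - preferred) ** 2 / (2 * width ** 2))

# Different physical targets, same position relative to gaze -> same response
print(eye_centered_response(target=30.0, gaze=10.0))  # retinal = 20, response ~1.0
print(eye_centered_response(target=50.0, gaze=30.0))  # retinal = 20, response ~1.0

# Same physical target, different gaze -> different response
print(eye_centered_response(target=30.0, gaze=30.0))  # retinal = 0, response ~0.14
```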
Supposing that this area represents a kind of "module" for transforming "gaze direction data" to "reach direction data," one can make a strong prediction: these cells should be very sensitive to eye movements (saccades) that intervene between the appearance of the target and the reach, since each saccade changes the target's location in eye-centered coordinates. Of the 34 parietal reach region neurons tested for this capacity, all 34 showed strong modulation by intervening saccades.
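The arithmetic behind that prediction is simple: an eye-centered code must be updated whenever the eyes move, because the saccade shifts the target's retinal coordinate even though nothing in the world has changed. A hypothetical illustration:

```python
# In an eye-centered code, a saccade shifts the remembered target location
# by the negative of the eye displacement (1-D positions in degrees).
target_eye_before = 20.0  # target remembered 20 deg right of fixation
saccade = 15.0            # eyes jump 15 deg rightward before the reach
target_eye_after = target_eye_before - saccade  # now only 5 deg right of fixation
print(target_eye_after)   # 5.0 -- the cell's input has changed, so firing must too
```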
The authors suggest that the nearby lateral intraparietal area, which is known to be involved in planning eye movements, may operate on similar representations of space, allowing computationally and metabolically efficient coordination of hand and eye movements.
Subsequent work from the same group has indicated that a nearby region - the dorsal aspect of Brodmann's area 5 - may represent hand location in eye-centered coordinates. The authors suggest that, computationally, activity from the two areas might simply be "subtracted" to arrive at an object's location in hand-centered coordinates, as would be required for reaching and as is observed in primary motor cortex.
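A back-of-the-envelope sketch of that proposed subtraction (a schematic of the computation with invented numbers, not a claim about neural implementation):

```python
import numpy as np

# Both signals are in eye-centered coordinates (degrees), as the recordings suggest.
target_in_eye_coords = np.array([25.0, 10.0])  # object location relative to gaze
hand_in_eye_coords = np.array([5.0, -15.0])    # hand location relative to gaze

# Subtracting the two yields the object's location relative to the hand --
# the reach vector, roughly the quantity observed in primary motor cortex.
target_in_hand_coords = target_in_eye_coords - hand_in_eye_coords
print(target_in_hand_coords)  # [20. 25.]
```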
Similarly, the anterior intraparietal area is thought to be specific for grasping movements, and it also represents spatial information in eye-centered coordinates. An excellent review covers the possible interactions between - and anatomical proximity of - these areas, given their similar coordinate systems.
Parietal is very cool, and this subject has some zen aspects. For example, is your awareness of the location of your hand in eye-centered coordinates:
a) the current location of your hand in eye-centered coordinates, or
b) a prediction of the next location of your hand in eye-centered coordinates (e.g., a forward model)?
A similar question arises for other areas. The activation properties of M1, for example, are tuned to the kinematics of the movement you are about to make, not the movement in progress. Which do you think you're aware of?
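For what it's worth, option (b) can be made concrete with a toy forward model (purely illustrative; the function and numbers are invented): the prediction comes from an efference copy of the motor command rather than from delayed sensory feedback.

```python
def predict_next_hand_position(current_position, commanded_velocity, dt=0.05):
    """Toy forward model: predict where the hand will be after dt seconds,
    given the motor command, instead of waiting for sensory feedback."""
    return current_position + commanded_velocity * dt

hand_now = 10.0  # hand location, degrees from fixation (eye-centered)
command = 40.0   # commanded hand velocity, deg/s
print(predict_next_hand_position(hand_now, command))  # 12.0
```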
Churchland (2006). "Self-Representation in Nervous Systems"
http://www3.interscience.wiley.com/journal/118875892/abstract?CRETRY=1&…