Monkey controls robotic arm with brain-computer interface

Neuroscientists at the University of Pittsburgh report that they have successfully trained monkeys to feed themselves using a robotic arm controlled by a brain-computer interface (BCI).  The study has been covered extensively in the media, and I've written quite a lot about these devices in the past, so, rather than elaborate on it here, I'll refer you to my previous posts, and to this post by Ed at Not Exactly Rocket Science.  

However, I would argue that the new study has been somewhat overhyped in some of the news stories I've read. According to The Independent, for example, it is "a major breakthrough in the development of robotic prosthetic limbs." In fact, the study builds on research that began about 10 years ago, and John Donoghue and his colleagues at Brown University in Providence, Rhode Island, reported something similar with a human quadriplegic patient nearly two years ago.

Although not revolutionary, the new study does make a significant advance on previous work: the monkeys were able to control the prosthesis far more accurately than the quadriplegic patient in the 2006 study. Whereas the patient's control over the prosthesis was very limited, the monkeys were able to perform more complex manoeuvres. They could, for example, quickly alter the trajectory of the robotic arm if the food reward was moved unexpectedly.

Related:


Velliste, M., et al. (2008). Cortical control of a prosthetic arm for self-feeding. Nature. doi: 10.1038/nature06996. [Abstract]

Hochberg, L. R., et al. (2006). Neuronal ensemble control of prosthetic devices by a human with tetraplegia. Nature 442: 164-171. [Abstract]
