Monkey think, monkey control robot - from 7,000 miles away



This film clip shows how neuroscientists have controlled the movements of a humanoid robot using a brain-computer interface (BCI) implanted in the motor cortex of a monkey.

I've written about BCIs before, so I won't go into details here. For more information about how they work, follow the links at the bottom, and for more about this particular device, there's an article in the NY Times by Sandra Blakeslee.

The main difference here is that the monkey and the robot were more than 7,000 miles apart: the monkey was in Miguel Nicolelis's lab at Duke University Medical Center in North Carolina, while the robot, which was designed by Gordon Cheng and his colleagues, was at the ATR Computational Neuroscience Laboratories in Kyoto, Japan. The monkey's decoded brain activity was transmitted between the two locations via a fast internet connection.
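To make that setup more concrete, here is a minimal sketch of the general kind of pipeline involved: spike counts recorded from motor-cortex electrodes are converted by a decoder into movement commands, which are then streamed over the network to the remote robot controller. Everything in this example is an illustrative assumption, not the actual Duke/ATR software: the channel count, the linear decoder and its weights, the packet format, and the address and port are all placeholders.

import socket
import struct
import random
import time

N_CHANNELS = 8                     # hypothetical number of recording electrodes
N_JOINTS = 2                       # hypothetical robot leg joints (hip, knee)
ROBOT_ADDR = ("127.0.0.1", 9000)   # stand-in for the remote lab's endpoint

# Hypothetical decoding weights: joint velocity = W · spike_counts
WEIGHTS = [[random.uniform(-0.05, 0.05) for _ in range(N_CHANNELS)]
           for _ in range(N_JOINTS)]


def decode(spike_counts):
    """Linear decoder: map one time bin of spike counts to joint velocities."""
    return [sum(w * s for w, s in zip(row, spike_counts)) for row in WEIGHTS]


def stream_commands(n_bins=50, bin_ms=100):
    """Decode each time bin and send the resulting command packet to the robot."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for _ in range(n_bins):
        # Placeholder for real electrode data: random spike counts per bin.
        spikes = [random.randint(0, 20) for _ in range(N_CHANNELS)]
        command = decode(spikes)
        # Pack the joint velocities as floats and send them over the network.
        sock.sendto(struct.pack(f"{N_JOINTS}f", *command), ROBOT_ADDR)
        time.sleep(bin_ms / 1000.0)
    sock.close()


if __name__ == "__main__":
    stream_commands(n_bins=5)

The point of the sketch is simply that once the brain activity has been decoded into a compact stream of movement commands, sending those commands across the Pacific is an ordinary networking problem; the hard part is the decoding, not the distance.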

More like this

Recent highlights from the best in brain blogging: Online experiments at the Harvard Visual Cognition Lab!
Misha at Mind Hacks has a great update on brain-computer interface advances.
Here's a video of a brain-computer interface that's entering clinical trials. Unlike the MRI interface we reported on last week, this one requires an electrode to be embedded in the user's brain.

It just gets ever more exciting, though I can also think of some significant negatives (I can see the outsourcing of machine-control jobs on the horizon). On the other hand, it would also mean a huge leap in how certain machines are controlled.

Monkeys do teleoperate with their limbs constrained, but that takes away the motivation or impetus for the movement, so they are not really trying to move. So is this about keeping the monkey still of its own accord?