Insect-robot interfacing


The Los Angeles Times reports on "Robo-moth", a cleverly designed contraption, built from cheap off-the-shelf parts, which was presented at the Society for Neuroscience meeting in San Diego earlier this week.

Robo-moth is a 6-inch-tall wheeled robot to which a tobacco hornworm moth has been attached. A microelectrode inserted into the insect's brain records the activity of a single visual motion-detection neuron, which exhibits directional selectivity and is involved in stabilizing the visual field during flight.

The moth is immobilized inside a cylinder covered with vertical stripes. When the direction of the cylinder's rotation is switched, the moth shifts its gaze, and the activity of the motion-detection neuron changes accordingly. This activity is amplified and translated into commands that control small motors in the robot's base: when the moth looks to the right, the robot moves in that direction, and vice versa.
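The control scheme described above can be sketched in a few lines of code. This is a hypothetical illustration, not the actual Robo-moth implementation: the firing rate of the direction-selective neuron is sampled over a short window and mapped to differential wheel speeds, with the baseline rate, gain and window size all invented for the example.

```python
# Hypothetical sketch of the Robo-moth control loop: sample the firing
# rate of a single direction-selective neuron, then steer a two-wheeled
# robot by the deviation of that rate from baseline. All parameter
# values here are illustrative, not taken from the actual device.

def spike_rate(spike_times, t_start, t_end):
    """Firing rate (Hz) from spike timestamps in the window [t_start, t_end)."""
    n = sum(1 for t in spike_times if t_start <= t < t_end)
    return n / (t_end - t_start)

def motor_command(rate_hz, baseline_hz=20.0, gain=0.05):
    """Map firing-rate deviation to left/right wheel speeds.

    A rate above baseline steers the robot to the right (left wheel
    speeds up, right wheel slows down); below baseline, the reverse.
    """
    turn = gain * (rate_hz - baseline_hz)  # signed steering term
    forward = 1.0                          # constant forward speed
    left_wheel = forward + turn
    right_wheel = forward - turn
    return left_wheel, right_wheel

# Example: a burst of spikes above the baseline rate turns the robot right.
spikes = [0.01, 0.03, 0.05, 0.07, 0.09, 0.12, 0.15, 0.18]  # seconds
rate = spike_rate(spikes, 0.0, 0.2)      # 8 spikes in 0.2 s -> 40 Hz
left, right = motor_command(rate)        # left wheel faster than right
```

The choice of a rate code over a short window is the simplest plausible readout; a real interface would also need spike detection and amplification stages between the electrode and this logic.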

The device was built by Tim Melano, a Ph.D. student in the Neuromorphic Vision and Robotic Systems Lab at the University of Arizona, which is led by Charles Higgins, an associate professor of neurobiology and of electrical and computer engineering.

Melano, Higgins and other members of the lab are investigating how the visual systems of moths, houseflies and dragonflies process complex patterns of optic flow, and are using this neurobiological knowledge as inspiration for the architecture of neuromorphic VLSI chips - silicon semiconductors consisting of thousands of integrated transistor circuits.
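The directional selectivity that such chips emulate is classically modeled as a Hassenstein-Reichardt elementary motion detector: each photoreceptor signal is correlated with a delayed copy of its neighbor's, and subtracting the two mirror-image correlations yields a signed, direction-selective output. The discrete-time sketch below is a textbook simplification, not the lab's actual circuit, and the one-step delay and signal values are illustrative.

```python
def emd_response(left_signal, right_signal, delay=1):
    """Hassenstein-Reichardt elementary motion detector (discrete time).

    Correlates each photoreceptor signal with a delayed copy of its
    neighbor's; the difference of the two mirror-symmetric correlations
    is positive for left-to-right motion and negative for the reverse.
    """
    out = []
    for t in range(delay, len(left_signal)):
        forward = left_signal[t - delay] * right_signal[t]   # left leads right
        backward = right_signal[t - delay] * left_signal[t]  # right leads left
        out.append(forward - backward)
    return out

# A bright edge moving left-to-right hits the left receptor one time
# step before the right one, so the summed response is positive.
left  = [0, 1, 0, 0]
right = [0, 0, 1, 0]
rightward = sum(emd_response(left, right))   # positive for rightward motion
```

In neuromorphic VLSI, each stage of this detector (photoreception, delay, multiplication, subtraction) maps onto a small analog transistor circuit, which is what allows thousands of such detectors to run in parallel on one chip.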

The device uses a closed-loop control system, in which sensory signals from the moth's visual system drive the movements of the robot. The robot's "brain" is a neuromorphic chip that detects visual motion by reading the activity of the direction-selective neuron and runs the algorithms that generate the motor outputs.

Neuromorphic chips are nowhere near as sophisticated as the biological computational systems on which they are based: the inputs they receive have very low resolution, and the range of outputs they can generate is limited.

But the chips need not be as sophisticated as the real thing in order to perform the same functions, and neuromorphic chips may eventually be incorporated into advanced neural implants designed to augment or even replace human sensory systems.


