Flying moths inspire robotics
March 1, 2013
The hawk moth’s wings are a blur of mottled gray motion as it hovers tethered to a steel rod inside a large white plastic orb. Outside the orb in the darkened room, a projector casts moving patterns of dimmed light onto the sphere’s surface, illuminating the moth’s field of vision with oscillating stripes. …
These changing light patterns create altered visual environments for the moth inside, simulating the real-world visual disruptions the moth might experience when exposed to wind gusts. As the patterns change, the moth makes rapid adjustments to its flight behavior to maintain stability. The moth’s responses to the visual stimuli are detected by a force sensor attached to the end of the steel rod.
These recordings are helping Tonya Muller, a DPhil student in Oxford University’s Department of Zoology, to understand the moth’s remarkable visual-motor system, and identify the mechanisms of visual feedback in insect flight control.
“Understanding vision-based flight control in insects has far-reaching uses in the fields of sensor development, signal processing, and robotics,” says Muller, whose background is in mechanical engineering. Vision is important for information gathering in insects, and up to 50% of an insect’s brain can be composed of visual neurons.
In fact, despite their small brain size, insects can solve extremely sophisticated orientation problems both rapidly and reliably. Yet their eyes are far less sophisticated than our own.
Parallel processing in insects: a robot model
“Insects receive visual information through a relatively noisy, low-resolution sensor. But with this sensor they are able to process information at sufficient speeds to react and respond to unexpected disturbances,” she says.
Insects also assess changes in their environment using information they receive from other sensory organs on their bodies, including antennae, airflow sensors, and wing-load sensors. Studies have shown that insects pre-process and combine the information from these multiple sensory inputs before it reaches the controller.
Current robotic technologies, on the other hand, use serial processing systems in which multiple sensors deliver separate and distinct input to the controller. Robot sensors are also currently designed for a very narrow and pre-defined range of conditions.
These limitations impede the response time of today’s robots and restrict their ability to maintain or regain stability after unforeseen disturbances. For these reasons, discovering how the efficient parallel processing system seen in insects operates is an area of great interest for engineers developing sensory control systems in robotics.
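The contrast between serial polling and insect-style parallel pre-processing can be sketched in code. The sensor names, readings, and weights below are all hypothetical, chosen only to illustrate the idea of fusing several noisy channels into a single pre-processed input before the controller acts:

```python
# Illustrative sketch (hypothetical values throughout): insect-style
# sensing pre-processes and fuses several noisy channels into one
# estimate before the controller acts, rather than handing each raw
# reading to the controller one at a time.

def fuse(readings, weights):
    """Weighted combination of pre-processed sensor channels."""
    total = sum(weights.values())
    return sum(readings[name] * w for name, w in weights.items()) / total

# Hypothetical pitch-disturbance estimates from three sensory channels,
# each already filtered into the same units (radians).
readings = {"vision": 0.12, "antennae": 0.10, "wing_load": 0.15}

# Weight each channel by an assumed confidence (e.g. inverse noise).
weights = {"vision": 0.5, "antennae": 0.3, "wing_load": 0.2}

estimate = fuse(readings, weights)  # single input to the flight controller
```

In a serial design, the controller would instead receive three separate, unreconciled inputs and have to arbitrate between them itself, which is one plausible source of the slower response times described above.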
A neural information processing model
“Insects might just be the perfect neural information processing model for improving sensory technologies and control systems in electronic applications such as robotics. Yet we are only just beginning to understand the basics of the mechanisms and pathways involved. We still don’t know how insects extract visual cues from their environment, which cues are the most important, and how those cues are processed to achieve the fast and efficient flight stabilization that we see,” she says.
By measuring the hawk moth’s flight behavior in response to the visual stimuli presented on the white sphere, these novel experiments are beginning to shed light on these questions. “We can now simulate a 360-degree visual environment for the first time and measure all the forces and moments associated with the moth’s response to a particular stimulus,” Muller says. “This is a huge advance over previous studies that projected visual stimuli in just two dimensions and recorded only a subset of the insects’ motion.”
Preliminary results from the experiments suggest that hawk moths use the angular position and velocity of the projected stripes as primary cues to stabilize their flight. While describing the flight dynamics accurately is an important advance in the field, it is only the first step towards identifying the mechanisms of active visual feedback control in insect flight.
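Responding to both the angular position and the angular velocity of a visual cue resembles classical proportional-derivative (PD) control. The sketch below is purely illustrative, with assumed gain values, and is not the study's model of the moth:

```python
def stabilizing_torque(stripe_angle, stripe_velocity, kp=1.0, kd=0.3):
    """Corrective torque from visual cues: oppose both how far the
    stripes have rotated (position term) and how fast they are
    rotating (velocity term). Gains kp and kd are illustrative,
    not measured values."""
    return -(kp * stripe_angle + kd * stripe_velocity)

# If the visual field has rotated 0.2 rad and is still rotating at
# 0.5 rad/s, the command opposes both the offset and the drift.
torque = stabilizing_torque(0.2, 0.5)
```

The velocity term is what lets such a controller react before the position error grows large, which is one way a noisy, low-resolution sensor could still support fast stabilization.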
The next stage of this work will involve measuring the activity of the moths’ neurons in response to the visual stimuli presented, in order to describe the electrophysiological pathways from the visual sensors to the flight dynamics in this species.
In the future, Muller hopes to be able to use implanted electrodes to measure neural activity in the moths. “The ability to obtain this kind of data remotely from free-flying moths is the cutting-edge of science in this field and a truly exciting prospect,” she says.