How haptics can enhance bionic eyes

Using haptics to improve outcomes for people given visual prosthetics
May 3, 2013

Argus II bionic-eye device (credit: Second Sight)

Haptic devices — technologies that simulate the feel of an object — should be used as early as possible in children fitted with visual prosthetics, as well as in older congenitally blind and late-blind people, George van Doorn and colleagues at Monash University suggest.

A haptic device can provide supplementary or redundant information that allows cross-referencing with the visual input from the prosthetic, they explain. This cross-referencing helps train the brain more effectively to interpret the electrical signals it receives from the device.

The brain can be retrained to “understand” inputs from seemingly odd places. For instance, researchers have coupled a camera-like sensor — in effect, a low-resolution electronic retina — to an electrode array placed on a patient’s tongue, and then helped the patient learn how to interpret patterns of light hitting the sensor, even though the electrical signals reach the brain from receptors in the tongue.

How haptics can help

Artificial retinas currently have very low resolution, with a small array of only a few dozen pixels, whereas a digital camera sensor can have millions of pixels. One can imagine that during the next few years artificial retinas will become more sophisticated and their resolution will increase, the researchers say. The limiting factor is the ability of the brain to be retrained to understand the input from these devices.

Van Doorn and colleagues Barry Richardson and Dianne Wuillemin, experts in virtual reality, bionics and tactile technologies, are now investigating how a haptic device might help. They suggest that exploiting multisensory processes will allow cross-calibration of information from the environment and will also help teach recipients of visual prosthetics to filter out noise, just as the brains of sighted individuals do when looking at an object or scene.
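To see why a second, correlated sense helps, consider the standard textbook account of multisensory integration, in which the brain weights each estimate by its reliability so that the combined estimate is less noisy than either input alone. The Python sketch below illustrates that general principle only; it is not the researchers’ model, and the function name and the variance values are hypothetical choices for illustration.

```python
import numpy as np

# Minimal sketch (not the researchers' method): reliability-weighted fusion of
# a noisy "visual" estimate and a noisy "haptic" estimate of the same quantity.
# All numbers here are illustrative assumptions.

def combine(visual_est, visual_var, haptic_est, haptic_var):
    """Fuse two estimates, weighting each by its inverse variance.
    The fused variance is always lower than either input's variance."""
    w_v = 1.0 / visual_var
    w_h = 1.0 / haptic_var
    fused_est = (w_v * visual_est + w_h * haptic_est) / (w_v + w_h)
    fused_var = 1.0 / (w_v + w_h)
    return fused_est, fused_var

# Example: a coarse prosthetic "view" of an object's width plus a more
# reliable estimate of the same width from exploratory touch.
rng = np.random.default_rng(0)
true_width_cm = 5.0
visual = true_width_cm + rng.normal(0, 2.0)   # low-resolution prosthetic input
haptic = true_width_cm + rng.normal(0, 0.5)   # haptic exploration

est, var = combine(visual, 2.0 ** 2, haptic, 0.5 ** 2)
print(f"visual={visual:.2f} cm, haptic={haptic:.2f} cm, "
      f"fused={est:.2f} cm (variance={var:.2f})")
```

Because the two inputs report on the same object, combining them both sharpens the estimate and reveals which fluctuations are noise — the kind of cross-calibration the researchers argue a haptic device could provide alongside a visual prosthetic.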

These concepts are related to the ability of Braille readers to “see” text and of deaf people to “hear” sign language. There are, however, critical periods in development when the brain is most receptive and plastic. Even poor sensory information is better than none at all, the team explains, provided that the different inputs correlate — from a visual prosthetic and a haptic device, for instance — to tell the same story about the world.