Robot sensing and smartphones help blind navigate

May 2, 2012 | Source: New Scientist

EyeRing (credit: MIT Media Lab)

Engineers at a university in Paris have developed a 3D navigation system for the blind, using a pair of glasses equipped with cameras and sensors like those used in robot exploration.

The system produces a constantly updated 3D map of the wearer’s environment and their position within it, displayed in simplified form on a handheld electronic Braille device.

Two cameras, one on each side of the glasses, generate a 3D image of the scene. A processor analyzes the image, picking out the edges of walls or objects, which it uses to create a 3D map.
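The edge-extraction step might be sketched roughly as follows. This is a minimal illustration, not the researchers’ actual algorithm (which the article does not specify): it treats sharp discontinuities in a stereo depth image as the edges of walls or objects.

```python
import numpy as np

def edges_from_depth(depth, threshold=0.3):
    """Mark cells where depth changes sharply between neighboring
    pixels -- a rough stand-in for picking out the edges of walls
    and objects from the stereo cameras' 3D image."""
    # Depth gradients along both image axes
    gy, gx = np.gradient(depth)
    magnitude = np.hypot(gx, gy)
    # Cells with a large depth discontinuity are treated as edges
    return magnitude > threshold

# A toy 6x6 depth image: a near object (1 m) in front of a far wall (3 m)
depth = np.full((6, 6), 3.0)
depth[2:5, 2:5] = 1.0

edge_map = edges_from_depth(depth)
```

The boolean `edge_map` then plays the role of the simplified map cells that get pushed to the tactile display; a real system would of course work on full-resolution stereo depth at video rate.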

The system’s collection of accelerometers and gyroscopes keeps track of the user’s location and speed. This information is combined with the image to determine the user’s position in relation to other objects. The system generates almost 10 maps per second, which are transmitted to the handheld Braille device to be displayed as a dynamic tactile map.
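The inertial tracking described above amounts to dead reckoning: integrating accelerometer readings at each ~10 Hz update to estimate velocity and position. A simplified sketch (the real system also fuses gyroscope orientation and the camera imagery, which this toy omits):

```python
import numpy as np

def dead_reckon(position, velocity, accel, dt=0.1):
    """One ~10 Hz update step: integrate an acceleration reading
    to advance the wearer's estimated velocity and position."""
    velocity = velocity + accel * dt   # v += a * dt
    position = position + velocity * dt  # x += v * dt
    return position, velocity

# Walk forward under a constant 1 m/s^2 acceleration for one second
pos = np.zeros(2)
vel = np.zeros(2)
for _ in range(10):
    pos, vel = dead_reckon(pos, vel, np.array([1.0, 0.0]))
```

Pure integration like this drifts quickly, which is why the system combines it with the camera-derived map rather than relying on the inertial sensors alone.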

Other new navigation systems for the blind include a University of Nevada project, which combines freely available 2D digital indoor maps with a smartphone’s built-in accelerometer and compass and gives directions in synthetic speech; and MIT Media Lab’s EyeRing, a camera-equipped ring used with headphones. The user points the ring at an object they are holding and uses voice commands to say what they need to know. The ring takes a picture of the object and transmits it wirelessly to a cellphone, where software analyzes the image and reads out the answer in a synthesized voice.
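The EyeRing flow above can be sketched as a simple request/response pipeline. Every function name here is a hypothetical placeholder for illustration only; the article does not document EyeRing’s actual software interfaces.

```python
def recognize_object(photo: bytes, query: str) -> str:
    """Hypothetical stand-in for the phone-side image analysis.
    A real system would run OCR or object recognition here."""
    return "a US dollar bill" if query == "currency" else "unknown"

def eyering_query(photo: bytes, voice_command: str) -> str:
    """Ring photographs the object, the phone analyzes the image,
    and the answer is returned for speech synthesis."""
    answer = recognize_object(photo, voice_command)
    return f"This looks like {answer}."

# The user holds up a banknote and asks the ring about it
spoken = eyering_query(b"fake-image-bytes", "currency")
```

The string returned by `eyering_query` would then be handed to a text-to-speech engine and played through the headphones.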