Adroit Droids

October 29, 2004 | Source: Technology Review

Advances in sensors, software, and computer architecture are beginning to give robots a sense of their “bodies” and of what sorts of actions are safe and useful in their environments.

One of the world’s most advanced robots passed an important test at NASA’s Johnson Space Center in Houston: it learned to use tools to tighten bolts on a wheel. Rather than having to be separately programmed for each of several possible situations, the robot showed it could recover if a tool slipped from its grasp or was moved around—and that it was flexible enough in its routine to tighten the bolts in any order requested.

The key advance is a new framework for robot learning. Software developed by Vanderbilt University roboticist Richard Alan Peters gives the NASA robot, called Robonaut, a short-term memory that lets it keep track of where it is and what it is doing. By correlating actions such as reaching for and grasping a tool with information from its 250 sensors (visual, tactile, and auditory), the robot gets a feel for which movements achieve which kinds of goals. It can then apply that knowledge to acquiring new skills, such as using a different tool.
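To make the idea concrete, here is a minimal sketch, in Python, of the kind of sensorimotor short-term memory the article describes: a sliding window of recent episodes (action, sensor snapshot, goal, outcome) that the robot can query to pick an action for a new goal. All names here (Episode, ShortTermMemory, best_action_for) are hypothetical illustrations, not Robonaut's actual software or any published API.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Episode:
    """One remembered action and the sensor evidence around it."""
    action: str        # e.g. "reach", "grasp", "tighten"
    sensors: dict      # snapshot of visual/tactile/auditory readings
    goal: str          # what the robot was trying to achieve
    succeeded: bool    # whether the goal was reached

class ShortTermMemory:
    """Sliding window of recent episodes, so the robot can track what it
    is doing and which movements tend to achieve which goals."""

    def __init__(self, capacity: int = 50):
        self.episodes = deque(maxlen=capacity)

    def record(self, episode: Episode) -> None:
        self.episodes.append(episode)

    def best_action_for(self, goal: str):
        """Return the action most often associated with success on this goal.
        For an unfamiliar goal (e.g. a new tool), fall back to the most
        similar remembered goal -- a crude stand-in for skill transfer."""
        successes = [e for e in self.episodes if e.goal == goal and e.succeeded]
        if not successes:
            successes = [e for e in self.episodes
                         if e.succeeded and set(goal.split()) & set(e.goal.split())]
        if not successes:
            return None
        counts = {}
        for e in successes:
            counts[e.action] = counts.get(e.action, 0) + 1
        return max(counts, key=counts.get)

# Usage: record experience with a wrench, then ask about an unfamiliar tool.
memory = ShortTermMemory()
memory.record(Episode("reach", {"vision": "wrench at 0.4 m"}, "grasp wrench", True))
memory.record(Episode("grasp", {"tactile": "firm contact"}, "grasp wrench", True))
memory.record(Episode("tighten", {"force": "torque rising"}, "tighten bolt", True))
print(memory.best_action_for("grasp screwdriver"))  # reuses the "grasp wrench" experience
```

The real system correlates continuous sensor streams rather than labeled snapshots, but the design choice is the same: keep recent experience in memory and reuse it when the situation changes, instead of reprogramming the robot for each new case.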

The research could eventually lead to more effective robotic assistants for the elderly and to autonomous robots for exploring battlefields and space.