An autonomous flying robot that avoids obstacles

As smart as a bird in maneuvering around obstacles
October 31, 2012

An autonomous flying robot avoids a tree on the Cornell Arts Quad (credit: Saxena lab)

Able to guide itself through forests, tunnels, or damaged buildings, an autonomous flying robot developed by Ashutosh Saxena, assistant professor of computer science at Cornell University, and his team could have tremendous value in search-and-rescue operations, according to the researchers.

The test vehicle is a quadrotor, a commercially available flying machine with four helicopter rotors. Human controllers can’t always react swiftly enough, and radio signals may not reach everywhere the robot goes. So Saxena and his team programmed quadrotors to navigate hallways and stairwells using 3-D cameras.

But in the wild, these cameras aren’t accurate enough at large distances to plan a route around obstacles. So Saxena is now building on methods he previously developed to turn a flat video camera image into a 3-D model of the environment, using such cues as converging straight lines, the apparent size of familiar objects and what objects are in front of or behind each other — the same cues humans unconsciously use to supplement their stereoscopic vision.
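One of the cues mentioned above, the apparent size of familiar objects, can be illustrated with a simple pinhole-camera calculation (this is only a sketch of the general idea, not the Saxena lab's algorithm; all numbers are made up for illustration):

```python
# Under a pinhole camera model, an object of known real-world size
# appears smaller in the image the farther away it is:
#   distance = real_size * focal_length_in_pixels / apparent_size_in_pixels

def depth_from_apparent_size(real_height_m, focal_px, apparent_height_px):
    """Estimate distance (in metres) to an object of known height."""
    return real_height_m * focal_px / apparent_height_px

# A 10 m tree spanning 200 pixels, seen through a lens with a
# 500-pixel focal length, is roughly 25 m away.
print(depth_from_apparent_size(10.0, 500.0, 200.0))  # 25.0
```

A robot that recognizes an object as a tree, and knows roughly how tall trees are, can thus recover depth from a single flat image, no stereo camera required.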

The Saxena lab has developed learning algorithms that enable miniature aerial vehicles (MAVs) to avoid obstacles using vision, and to successfully navigate indoor and outdoor environments with a wide variety of obstacles (trees, buildings, poles, fences, etc.). The algorithms map visual features from a single image into a 3-D model, which the MAV uses to plan an obstacle-free path. (Credit: Saxena lab)

The researchers have trained the robot with 3-D pictures of obstacles such as tree branches, poles, fences and buildings. The robot’s computer learns the characteristics all the images have in common, such as color, shape, texture and context — a branch, for example, is attached to a tree.
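The training step described above can be caricatured as learning, from labelled examples, which combinations of features mark an obstacle. The toy sketch below (not the lab's actual method) uses invented colour/texture/context feature vectors and a nearest-centroid rule:

```python
# Learn a prototype feature vector for "obstacle" and for "free space"
# from labelled examples, then classify new image regions by which
# prototype they sit closer to. Features and labels are illustrative.

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Illustrative features: [greenness, edge density, height in frame]
obstacles = [[0.8, 0.9, 0.7], [0.7, 0.8, 0.9]]   # e.g. branches, poles
free_space = [[0.2, 0.1, 0.2], [0.3, 0.2, 0.1]]  # e.g. open sky, path

c_obs, c_free = centroid(obstacles), centroid(free_space)

def is_obstacle(features):
    return dist2(features, c_obs) < dist2(features, c_free)

print(is_obstacle([0.75, 0.85, 0.8]))  # True: looks like a branch
print(is_obstacle([0.25, 0.15, 0.1]))  # False: looks like open space
```

The real system learns far richer rules, including context cues such as a branch being attached to a tree, but the principle is the same: shared characteristics of the training images become the test for "obstacle."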

The resulting set of rules for deciding what is an obstacle is burned into a chip before the robot flies. In flight, the robot breaks the current 3-D image of its environment into small chunks based on obvious boundaries, decides which ones are obstacles and computes a path through them as close as possible to the route it has been told to follow, constantly making adjustments as the view changes.
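The in-flight loop above, marking chunks as obstacle or free and threading a path that stays as close as possible to the commanded route, can be sketched with a greedy grid planner. This is only an illustration of the idea, not the robot's real planner:

```python
# Advance one grid column (one step forward) at a time, moving at most
# one row per step, and among the obstacle-free rows pick the one
# nearest the commanded route.

def plan(grid, desired_row, start_row):
    """grid[col][row] is True where an obstacle chunk was detected.
    Returns the row chosen at each column, or None if boxed in."""
    path, row = [], start_row
    for col in grid:
        options = [r for r in (row - 1, row, row + 1)
                   if 0 <= r < len(col) and not col[r]]
        if not options:
            return None  # no safe move from here
        row = min(options, key=lambda r: abs(r - desired_row))
        path.append(row)
    return path

O, _ = True, False
grid = [[_, _, _],   # column 1: all clear
        [_, O, _],   # column 2: obstacle in the middle row
        [_, _, _]]   # column 3: clear again
print(plan(grid, desired_row=1, start_row=1))  # [1, 0, 1]
```

The path dodges the obstacle and then returns to the commanded middle row, mirroring the article's description of constant small adjustments as the view changes.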

It was tested in 53 autonomous flights in obstacle-rich environments — including Cornell’s Arts Quad — succeeding in 51 cases, failing twice because of winds.

Saxena plans to improve the robot’s ability to respond to environmental variations such as wind, and to enable it to detect and avoid moving objects, like real birds; for testing purposes, he suggests having people throw tennis balls at the flying vehicle.

The project is supported by a grant from the Defense Advanced Research Projects Agency.