A smart object-recognition algorithm that doesn’t need humans

January 17, 2014


BYU engineer Dah-Jye Lee has created an algorithm that can accurately identify objects in images or video sequences — without human calibration.

“In most cases, people are in charge of deciding what features to focus on and they then write the algorithm based off that,” said Lee, a professor of electrical and computer engineering. “With our algorithm, we give it a set of images and let the computer decide which features are important.”

Humans need not apply

Not only is Lee’s genetic algorithm able to set its own parameters, but it also doesn’t need to be reset each time a new object is to be recognized: it learns new objects on its own.

Lee likens the idea to teaching a child the difference between dogs and cats. Instead of trying to explain the difference, we show children images of the animals and they learn on their own to distinguish the two. Lee’s object-recognition algorithm does the same thing: instead of telling the computer which features distinguish two objects, the researchers simply feed it a set of images and it learns on its own.
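The idea can be made concrete with a toy sketch. The code below is not Lee’s implementation; the transform pool, the scalar feature summary, the fitness measure and the synthetic “images” are all assumptions made for illustration. What it shares with the approach described in the article is the interface: the program receives only labeled example images and decides for itself which sequence of transforms separates the classes.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy pool of image transforms (stand-ins; the published work uses a
# much richer transform set).
def gradient_magnitude(img):
    gy, gx = np.gradient(img)
    return np.hypot(gx, gy)

def threshold(img):
    return (img > 0.5 * img.max()).astype(float)

def normalize(img):
    span = img.max() - img.min()
    return (img - img.min()) / span if span > 0 else img * 0.0

TRANSFORMS = [gradient_magnitude, threshold, normalize]

def apply_feature(transform_seq, img):
    """A candidate 'feature' here is a sequence of transforms followed by
    a scalar summary (mean response); real ECO features are richer."""
    out = img.astype(float)
    for t in transform_seq:
        out = t(out)
    return out.mean()

def fitness(transform_seq, images, labels):
    """Score a candidate by how well a single cut on its scalar output
    separates the two labeled classes (a crude weak classifier)."""
    vals = np.array([apply_feature(transform_seq, im) for im in images])
    best = 0.0
    for cut in np.unique(vals):
        preds = (vals >= cut).astype(int)
        acc = max((preds == labels).mean(), ((1 - preds) == labels).mean())
        best = max(best, acc)
    return best

# 'Training data': labeled example images only, no hand-picked rules.
# Hypothetical stand-ins: class 0 = clean square, class 1 = noisy square.
def fake_image(noisy):
    img = np.zeros((16, 16))
    img[4:12, 4:12] = 1.0
    if noisy:
        img = img + rng.normal(0.0, 0.8, img.shape)
    return img

images = [fake_image(noisy=(i % 2 == 1)) for i in range(40)]
labels = np.array([i % 2 for i in range(40)])

# The program, not a human, decides which transform sequence works best.
candidates = [rng.choice(len(TRANSFORMS), size=rng.integers(1, 4))
              for _ in range(30)]
scored = [(fitness([TRANSFORMS[i] for i in c], images, labels), list(c))
          for c in candidates]
best_score, best_seq = max(scored, key=lambda s: s[0])
print("best sequence:", [TRANSFORMS[i].__name__ for i in best_seq],
      "separation accuracy:", round(best_score, 3))
```

A full genetic algorithm would go on to recombine and mutate the best candidates rather than just ranking a random pool; a sketch of that evolutionary loop follows the paper’s abstract below.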


Comparison with other object-recognition algorithms

In a study published in the December issue of the journal Pattern Recognition, Lee and his students demonstrate both the independence and the accuracy of their “ECO features” genetic algorithm.

The BYU algorithm tested as well as or better than other top published object-recognition algorithms, including those developed by Rob Fergus of NYU and Thomas Serre of Brown University.

Example images from the Caltech image datasets after being scaled and having color removed (credit: BYU Photo)

Lee and his students fed their object-recognition program four image datasets from Caltech (motorbikes, faces, airplanes and cars) and achieved 100 percent accurate recognition on every dataset. The other well-performing published object-recognition systems scored in the 95–98 percent range.

The team also tested its algorithm on a dataset of fish images from BYU’s biology department that included photos of four species. The algorithm was able to distinguish between the species with 99.4 percent accuracy.
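For reference, the accuracy figures quoted above are the usual fraction-correct measure over a labeled test set. A minimal helper (the function name and the class labels below are illustrative, not taken from the study) might look like this:

```python
import numpy as np

def accuracy(predicted_labels, true_labels):
    """Fraction of test images whose predicted class matches the label."""
    predicted_labels = np.asarray(predicted_labels)
    true_labels = np.asarray(true_labels)
    return float((predicted_labels == true_labels).mean())

# A score of 0.994 means 99.4 percent of the test images were labeled correctly.
print(accuracy(["species_a", "species_a", "species_b"],
               ["species_a", "species_b", "species_b"]))  # 0.666...
```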

Lee said the results show the algorithm could be used for a number of applications, from detecting invasive fish species to identifying flaws in produce such as apples on a production line. More interesting applications might be surveillance and robot vision systems.

“It’s very comparable to other object recognition algorithms for accuracy, but we don’t need humans to be involved,” Lee said. “You don’t have to reinvent the wheel each time. You just run it.”


Abstract of Pattern Recognition paper

This paper presents a novel approach for object detection using a feature construction method called Evolution-COnstructed (ECO) features. Most other object recognition approaches rely on human experts to construct features. ECO features are automatically constructed by uniquely employing a standard genetic algorithm to discover series of transforms that are highly discriminative. Using ECO features provides several advantages over other object detection algorithms including: no need for a human expert to build feature sets or tune their parameters, ability to generate specialized feature sets for different objects, and no limitations to certain types of image sources. We show in our experiments that ECO features perform better than or comparably to hand-crafted state-of-the-art object recognition algorithms. An analysis of ECO features is given, which includes a visualization of ECO features and improvements made to the algorithm.
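The abstract’s phrase “standard genetic algorithm to discover series of transforms” suggests a genome that is simply an ordered list of transform choices and their parameters, evolved with ordinary selection, crossover and mutation. The sketch below shows what such an encoding and its genetic operators might look like; the transform names, parameter ranges, operator rates and population settings are assumptions for illustration, not details taken from the paper.

```python
import random

# Hypothetical encoding: an ECO-style feature is an ordered list of
# (transform_name, parameter) genes, where the output of one transform
# feeds the next. Names and parameter ranges here are illustrative only.
TRANSFORM_POOL = {
    "gradient": (None, None),   # no parameter
    "gaussian_blur": (1, 5),    # assumed kernel-size range
    "threshold": (0.1, 0.9),    # assumed relative-threshold range
    "median_filter": (1, 5),
}

def random_gene(rng):
    name = rng.choice(list(TRANSFORM_POOL))
    lo, hi = TRANSFORM_POOL[name]
    param = None if lo is None else rng.uniform(lo, hi)
    return (name, param)

def random_genome(rng, max_len=5):
    return [random_gene(rng) for _ in range(rng.randint(1, max_len))]

def crossover(a, b, rng):
    """One-point crossover on the two transform sequences."""
    ia, ib = rng.randint(1, len(a)), rng.randint(1, len(b))
    return a[:ia] + b[ib:], b[:ib] + a[ia:]

def mutate(genome, rng, rate=0.2):
    """Replace individual genes with fresh random transforms."""
    return [random_gene(rng) if rng.random() < rate else g for g in genome]

def evolve(fitness_fn, generations=50, pop_size=40, seed=0):
    """Standard GA loop: the only supervision comes from fitness_fn."""
    rng = random.Random(seed)
    pop = [random_genome(rng) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness_fn, reverse=True)
        parents = scored[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            c1, c2 = crossover(a, b, rng)
            children += [mutate(c1, rng), mutate(c2, rng)]
        pop = parents + children[: pop_size - len(parents)]
    return max(pop, key=fitness_fn)

# Toy fitness just to exercise the loop: favor blur-heavy sequences.
best = evolve(lambda g: sum(1 for name, _ in g if name == "gaussian_blur"))
print(best)
```

In practice the fitness function would score each genome by how well the feature it encodes separates labeled example images, as in the earlier sketch, so the only human input is the labeled image set itself.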