Secret Math of Fly Eyes Could Overhaul Robot Vision
By turning the brain cell activity underlying fly eyesight into mathematical equations, researchers have found an ultra-efficient method for pulling motion patterns from raw visual data.
Though they built the system, the researchers don’t quite understand how it works. But however mysterious the equations may be, they could still be used to program the vision systems of miniaturized battlefield drones, search-and-rescue robots, automobile navigation systems and other applications where computing power is at a premium.
“We can build a system that works perfectly well, inspired by biology, without having a complete understanding of how the components interact. It’s a non-linear system,” said David O’Carroll, a computational neuroscientist who studies insect vision at Australia’s University of Adelaide. “The number of computations involved is quite small. We can get an answer using tens of thousands of times fewer floating-point computations than in traditional ways.”
Digital motion-detection systems generally rely on optic-flow algorithms, which track how a scene shifts across the visual field from moment to moment. The best-known of these is the Lucas-Kanade method, which calculates yaw, the side-to-side turning of a moving camera, by comparing, frame by frame, how every pixel in a visual field changes. It’s used for steering and guidance in many experimental unmanned vehicles, but its brute-force approach requires lots of processing power, making it impractical in smaller systems.
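For the curious, the heart of Lucas-Kanade can be sketched in a few lines. The version below, written in Python with NumPy, estimates the motion of a single image patch by solving a small least-squares system built from image gradients; the function name, patch size and test pattern are illustrative choices, not the optimized implementations that actually fly on drones.

```python
# Minimal sketch of the Lucas-Kanade idea: estimate the motion (vx, vy) of a
# small patch from image gradients. Illustrative only; production systems use
# pyramidal, heavily optimized versions.
import numpy as np

def lucas_kanade_patch(prev_frame, next_frame, y, x, half=7):
    """Estimate optic flow (vx, vy) for the patch centered at (y, x)."""
    p, n = prev_frame.astype(float), next_frame.astype(float)
    Iy, Ix = np.gradient(p)          # spatial gradients (rows = y, cols = x)
    It = n - p                       # temporal gradient between the frames
    win = (slice(y - half, y + half + 1), slice(x - half, x + half + 1))
    A = np.stack([Ix[win].ravel(), Iy[win].ravel()], axis=1)
    v, *_ = np.linalg.lstsq(A, -It[win].ravel(), rcond=None)
    return v                         # pixels per frame

# A smooth test pattern shifted one pixel to the right:
ys, xs = np.mgrid[0:64, 0:64]
frame1 = np.sin(xs / 4.0) * np.cos(ys / 5.0)
frame2 = np.roll(frame1, 1, axis=1)
print(lucas_kanade_patch(frame1, frame2, 32, 32))  # approximately [1. 0.]
```

Even this toy version solves a linear system for one neighborhood. Repeating that work for every neighborhood of every frame is what makes the brute-force approach too hungry for small platforms.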
In order to make smaller flying robots, researchers would like to find a simpler way of processing motion. Inspiration has come from the lowly fly, which uses just a relative handful of neurons to maneuver with extraordinary dexterity. And for more than a decade, O’Carroll and other researchers have painstakingly studied the optical flight circuits of flies, measuring their cell-by-cell activity and turning evolution’s solutions into a set of computational principles.
In a paper published Friday in Public Library of Science Computational Biology, O’Carroll and fellow University of Adelaide biologist Russell Brinkworth put these methods to the test.
“A laptop computer uses tens of watts of power. Implementing what we’ve developed can be done with chips that consume just a fraction of a milliwatt,” said O’Carroll.
The researchers’ algorithm is composed of a series of five equations through which data from cameras can be run. Each equation represents tricks used by fly circuits to handle changing levels of brightness, contrast and motion, and their parameters constantly shift in response to input. Unlike Lucas-Kanade, the algorithm doesn’t return a frame-by-frame comparison of every last pixel, but emphasizes large-scale patterns of change. In this sense, it works a bit like video-compression systems that ignore like-colored, unshifting areas.
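The published equations are more involved, but the flavor of the approach can be sketched as a short pipeline: adaptive stages that normalize brightness and contrast against their own recent history, feeding a correlation-based motion detector of the kind long attributed to fly vision, a Hassenstein-Reichardt-style correlator. Everything in the sketch below, from the stage structure to the rate constants, is an illustrative guess at this style of architecture rather than Brinkworth and O’Carroll’s actual model.

```python
import numpy as np

class AdaptiveStage:
    """Divisive gain control whose normalizer tracks its own recent input.
    The form and rate constants are illustrative assumptions."""
    def __init__(self, rate, eps=1e-3):
        self.mean, self.rate, self.eps = None, rate, eps

    def __call__(self, frame):
        if self.mean is None:
            self.mean = frame.copy()
        self.mean += self.rate * (frame - self.mean)  # feedback-driven parameter
        return frame / (self.mean + self.eps)

def reichardt_yaw(prev_frame, frame, shift=1):
    """Crude Hassenstein-Reichardt correlator: multiply each pixel by its
    delayed, shifted neighbor and subtract the mirror-image arm. The sign
    of the sum gives the direction of horizontal (yaw) motion."""
    rightward = frame[:, shift:] * prev_frame[:, :-shift]
    leftward = frame[:, :-shift] * prev_frame[:, shift:]
    return float(np.sum(rightward - leftward))

# Stand-in camera input: a smooth pattern drifting rightward (illustrative).
ys, xs = np.mgrid[0:64, 0:64]
frames = [np.sin((xs - t) / 4.0) * np.cos(ys / 5.0) + 2.0 for t in range(10)]

brightness, contrast = AdaptiveStage(rate=0.05), AdaptiveStage(rate=0.2)
prev = None
for raw in frames:
    f = contrast(brightness(raw))        # adaptive front end
    if prev is not None:
        print("yaw estimate:", round(reichardt_yaw(prev, f), 3))
    prev = f
```

Note how little arithmetic each frame requires: a few multiplies and adds per pixel, with nothing like a per-pixel equation solve anywhere in the loop.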
To test the algorithm, O’Carroll and Brinkworth ran animated sequences of high-resolution natural images through a program of the sort that might operate in a robot. When they compared the algorithm’s motion estimates to the actual motion in the images, they found that it worked across a range of natural lighting conditions, including variations that usually baffle motion detectors.
“It’s amazing work,” said Sean Humbert, a University of Maryland aerospace engineer who builds miniaturized, autonomous flying robots, some of which run on earlier versions of O’Carroll’s algorithm. “For traditional navigational sensing, you need lots of payload to do the computation. But the payload on these robots is very small — a gram, a couple of Tic Tacs. You’re not going to stuff dual-core processors into a couple Tic Tacs. The algorithms that insects use are very simple compared to the stuff we design, and would scale down to small vehicles.”
Intriguingly, the algorithm doesn’t work nearly as well if any one operation is omitted. The whole is greater than the sum of its parts, and O’Carroll and Brinkworth don’t know why. Because the parameters are in constant feedback-driven flux, the model amounts to a cascade of non-linear equations that are difficult to untangle in retrospect, and almost impossible to predict.
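A single adaptive stage already hints at why. Because its gain tracks a running average of its own input, doubling the input does not double the output, and chaining five such stages multiplies the tangle. The toy stage below, whose form and constants are assumptions for illustration, makes the non-linearity concrete:

```python
import numpy as np

def adaptive_stage(x, rate=0.1, eps=1e-3):
    """Toy adaptive stage: divisive gain driven by a running mean of the input.
    Form and constants are assumptions for illustration."""
    m, out = 0.0, []
    for v in x:
        m += rate * (v - m)           # the parameter is in feedback-driven flux
        out.append(v / (m + eps))     # so the input-output map is non-linear
    return np.array(out)

signal = np.sin(np.linspace(0, 10, 200)) + 2.0
once, doubled = adaptive_stage(signal), adaptive_stage(2 * signal)
print(np.allclose(2 * once, doubled))  # False: doubling input != doubling output
```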
“We started with insect vision as an inspiration, and built a model that’s feasible for real-world use, but in doing so, we’ve built a system almost as complicated as the insect’s,” said O’Carroll. “That’s one of the fascinating things here. It doesn’t necessarily lead us to a complete understanding of how the system works, but to an appreciation that nature got it right.”
The researchers drew their algorithm from neural circuits attuned to side-to-side yaw, but O’Carroll said the same types of equations are probably used in computing other optical flows, such as those produced by moving forward and backward through three-dimensional space.
“That’s more challenging,” said O’Carroll. “It may involve a few extra neurons.”
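The distinction is easy to picture. Pure yaw sweeps the whole scene sideways at a roughly uniform rate, while forward flight makes the scene stream outward from a focus of expansion. The toy flow fields below, with the geometry simplified and all names assumed, illustrate the two patterns:

```python
import numpy as np

# Toy flow fields on a 5x5 image grid with centered coordinates.
ys, xs = np.mgrid[0:5, 0:5] - 2

yaw_flow = np.stack([np.ones((5, 5)), np.zeros((5, 5))])  # uniform sideways drift
forward_flow = np.stack([xs, ys]).astype(float)           # expansion from center

print(yaw_flow[:, 2, 2])      # [1. 0.]: the image center moves under yaw
print(forward_flow[:, 2, 2])  # [0. 0.]: the focus of expansion stays put
```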
Images: 1) Flickr/Tambako the Jaguar. 2) PLoS Computational Biology.
See Also:
- Blowflies Get Virtual Reality in Flight Simulator
- Scientists Mimic Beetle’s Liquid Cannon
- To Build a Better Bridge, Make Like a Conch
- Mantis Shrimp Eyes Might Inspire New High-Def Devices
- Secret Law of Flying Could Inspire Better Robots
Citation: “Robust Models for Optic Flow Coding in Natural Scenes Inspired by Insect Biology.” By Russell S. A. Brinkworth, David C. O’Carroll. PLoS Computational Biology, November 6, 2009.
Brandon Keim’s Twitter stream and reportorial outtakes; Wired Science on Twitter. Brandon is currently working on a book about ecosystem and planetary tipping points.
"