A dragonfly’s eyes and brain are the inspiration for a new machine vision system with applications in surveillance, wildlife monitoring, and smart cars.
An article written by Jack Baldwin for Lead South Australia explores mechanical engineering PhD student Zahra Bagheri’s journey to develop a machine vision algorithm that mimics the eyes of a dragonfly.
Bagheri says that “despite having low visual acuity and brains no bigger than a grain of rice, dragonflies are remarkably good at tracking prey…97% of the time while they’re moving at very high speeds in very cluttered environments.”
According to Baldwin, “Bagheri is part of a team of engineers and neuroscientists that have used those methods to develop a machine vision algorithm that can be applied in a virtual reality simulation, allowing an artificial intelligence system to ‘pursue’ an object…[combining] neuroscience, mechanical engineering and computer science.”
“Detecting and tracking small objects against complex backgrounds is a highly challenging task,” Bagheri explains.
Baldwin shares, “this bio-inspired ‘active vision’ system has been tested in virtual reality worlds composed of various natural scenes. [Bagheri’s] team has found that it performs just as robustly as the state-of-the-art engineering target tracking algorithms, while running up to 20 times faster.”
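The paper’s title points to “neuronal facilitation” as the key mechanism: responses near a recent detection are amplified, helping the system keep locking onto a small moving target against clutter. As a purely illustrative sketch (not the team’s actual algorithm — the grid size, gain, and decay parameters here are invented for the example), a facilitation-style tracker over 2D response maps might look like this:

```python
import numpy as np

def gaussian_bump(shape, center, sigma):
    """2D Gaussian centered on `center`, used as a local facilitation boost."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    return np.exp(-((ys - center[0]) ** 2 + (xs - center[1]) ** 2) / (2 * sigma ** 2))

def track(frames, sigma=3.0, gain=2.0, decay=0.7):
    """Toy facilitation tracker: each frame is a 2D response map; responses
    near the previous detection are amplified, biasing the next detection
    toward the pursued target rather than background clutter."""
    h, w = frames[0].shape
    facilitation = np.ones((h, w))  # uniform gain before any detection
    positions = []
    for frame in frames:
        boosted = frame * facilitation
        pos = np.unravel_index(np.argmax(boosted), boosted.shape)
        positions.append(pos)
        # decay the old facilitation toward 1, then boost around the detection
        facilitation = 1.0 + decay * (facilitation - 1.0) \
                       + gain * gaussian_bump((h, w), pos, sigma)
    return positions
```

Feeding in response maps where a bright spot drifts across the frame, the returned positions follow the spot, and the facilitation map trails it like a spotlight; the real system described in the paper operates in closed loop with a virtual pursuer, which this sketch omits.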
The team recognizes that this technology has diverse applications and could eventually be used in surveillance, wildlife monitoring, smart cars, and even bionic vision.
Bagheri is lead author of the paper, titled “Properties of Neuronal Facilitation that Improve Target Tracking in Natural Pursuit Simulations,” which was published in the Journal of the Royal Society Interface.
The original article, written by Jack Baldwin, was published by Lead South Australia in July 2015.