TY - JOUR
T1 - Consistent estimation of rotational optical flow in real environments using a biologically-inspired vision algorithm on embedded hardware
AU - Skelton, Phillip S.M.
AU - Finn, Anthony
AU - Brinkworth, Russell S.A.
PY - 2019/12
Y1 - 2019/12
N2 - Insects are able to freely navigate through ever-changing environments using predominantly visual inputs, while possessing very minimal processing power compared to humans. Not only are they able to move at high velocities and accelerations, but they are also able to achieve extraordinary levels of obstacle avoidance. We begin to emulate this biological behaviour in a robotic application by first modelling how these visual pathways react to separable degrees of freedom within the motion field, specifically rotation in this case. We have built upon an existing biologically-inspired algorithm based on the visual pathway of the hoverfly, and statistically compare results to current state-of-the-art algorithms, all the while performing this on computationally-constrained embedded hardware. We have shown that, using a complex, highly-elaborated representation of the hoverfly visual pathway, rotational optical flow estimation can be achieved with a high level of accuracy, at a level of consistency previously unseen in dense-flow algorithms, and at 100 frames per second on an embedded system. This work forms a fundamental basis for understanding one of the two separable components of insect egomotion (rotation and translation), allowing for consistently accurate rotational velocity estimation and providing a building block towards understanding the translational component of insect vision and the application of biologically-inspired egomotion estimation in autonomous vehicles.
KW - Biologically inspired
KW - Embedded hardware
KW - Optical flow
KW - Robotic sensing
KW - Rotational velocity
UR - http://www.scopus.com/inward/record.url?scp=85074249622&partnerID=8YFLogxK
U2 - 10.1016/j.imavis.2019.09.005
DO - 10.1016/j.imavis.2019.09.005
M3 - Article
AN - SCOPUS:85074249622
VL - 92
JO - Image and Vision Computing
JF - Image and Vision Computing
SN - 0262-8856
M1 - 103814
ER -