Enhancing optical-flow-based control by learning visual appearance cues for flying robots

  • Flying insects employ elegant optical-flow-based strategies to solve complex tasks such as landing or obstacle avoidance. Roboticists have mimicked these strategies on flying robots with only limited success, because optical flow (1) cannot disentangle distance from velocity and (2) is less informative in the highly important flight direction. Here, we propose a solution to these fundamental shortcomings by having robots learn to estimate distances to objects by their visual appearance. The learning process obtains supervised targets from a stability-based distance estimation approach. We have successfully implemented the process on a small flying robot. For the task of landing, it results in faster, smoother landings. For the task of obstacle avoidance, it results in higher success rates at higher flight speeds. Our results yield improved robotic visual navigation capabilities and lead to a novel hypothesis on insect intelligence: behaviours that were described as optical-flow-based and hardwired actually benefit from learning processes.
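The first shortcoming the abstract names can be made concrete with a toy calculation: for a camera approaching a surface, the observed flow divergence is the ratio of approach velocity to distance, so very different velocity/distance pairs yield identical flow. This is a hypothetical illustration of that ambiguity, not the authors' implementation; the function name is chosen for clarity.

```python
# Toy illustration of the optical-flow ambiguity described in the abstract:
# flow divergence D = v / d (approach velocity over distance to the surface),
# so flow alone cannot disentangle distance from velocity.

def flow_divergence(velocity: float, distance: float) -> float:
    """Flow divergence (1/s) seen by a camera approaching a flat surface."""
    return velocity / distance

# A slow, close approach and a fast, far approach produce the same flow:
slow_close = flow_divergence(velocity=0.5, distance=1.0)  # 0.5 1/s
fast_far = flow_divergence(velocity=2.0, distance=4.0)    # 0.5 1/s
print(slow_close == fast_far)  # True: the two situations are indistinguishable
```

This is why an additional cue, such as the learned visual-appearance-based distance estimate described above, is needed to resolve the scale of the scene.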

Metadata
Author: Guido C. H. E. de Croon, Christophe De Wagter, Tobias Seidl
DOI: https://doi.org/10.1038/s42256-020-00279-7
ISSN: 2522-5839
Parent Title (English): Nature Machine Intelligence
Document Type: Article
Language: English
Date of Publication (online): 2021/12/23
Year of first Publication: 2021
Publishing Institution: Westfälische Hochschule Gelsenkirchen Bocholt Recklinghausen
Release Date: 2021/12/23
Volume: 3
Issue: 1
First Page: 33
Last Page: 41
Departments / faculties: Maschinenbau Bocholt (Mechanical Engineering, Bocholt)
Licence: German copyright law applies (Urheberrechtsgesetz)