Robust optical-flow based self-motion estimation for a quadrotor UAV

Volker Grabe, Heinrich H. Bülthoff, Paolo Robuffo Giordano

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

47 Citations (Scopus)

Abstract

Robotic vision has become an important field of research for micro aerial vehicles in recent years. While many approaches to autonomous visual control of such vehicles rely on powerful ground stations, the increasing availability of small and lightweight hardware allows for the design of more independent systems. In this context, we present a robust algorithm able to recover the UAV ego-motion using a monocular camera and on-board hardware. Our method exploits the continuous homography constraint to discriminate among the observed feature points and classify those belonging to the dominant plane in the scene. Extensive experiments on a real quadrotor UAV demonstrate that the estimation of the scaled linear velocity in a cluttered environment improved by 25% compared to previous approaches.
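The continuous homography constraint mentioned in the abstract can be illustrated with a short sketch (this is not the authors' implementation; the motion parameters and plane below are hypothetical). For a calibrated image point x = P/Z observed under camera angular velocity ω and linear velocity v, the optical flow of a point on the plane nᵀP = d is fully determined by the continuous homography matrix H = [ω]× + (1/d) v nᵀ, so features whose measured flow deviates from the flow predicted by H can be flagged as not belonging to the dominant plane:

```python
import numpy as np

def skew(w):
    """3x3 skew-symmetric matrix such that skew(w) @ p == np.cross(w, p)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def measured_flow(x, Z, omega, v):
    """Optical flow of the calibrated point x = P/Z at true depth Z (general scene)."""
    e3 = np.array([0.0, 0.0, 1.0])
    return (np.outer(x, e3) - np.eye(3)) @ (skew(omega) @ x + v / Z)

def predicted_flow(x, H):
    """Flow predicted by the continuous homography H = [omega]x + (v/d) n^T,
    exact only for points on the plane n^T P = d."""
    e3 = np.array([0.0, 0.0, 1.0])
    return (np.outer(x, e3) - np.eye(3)) @ (H @ x)

# Hypothetical self-motion and dominant plane (illustrative values only).
omega = np.array([0.10, -0.05, 0.02])   # rad/s
v     = np.array([0.20, 0.10, -0.30])   # m/s (recoverable only up to scale 1/d)
n, d  = np.array([0.0, 0.0, 1.0]), 2.0  # dominant plane n^T P = d
H = skew(omega) + np.outer(v, n) / d    # continuous homography matrix

x = np.array([0.25, 0.15, 1.0])         # calibrated image point
on_plane_res  = np.linalg.norm(measured_flow(x, 2.0, omega, v) - predicted_flow(x, H))
off_plane_res = np.linalg.norm(measured_flow(x, 3.0, omega, v) - predicted_flow(x, H))
print(on_plane_res < 1e-12, off_plane_res > 1e-3)  # prints: True True
```

Because v enters H only through v/d, the linear velocity is recoverable only up to the unknown plane distance, which is why the abstract refers to the *scaled* linear velocity.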

Original language: English
Title of host publication: 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2012
Pages: 2153-2159
Number of pages: 7
DOIs
Publication status: Published - 2012
Event: 25th IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2012 - Vilamoura, Algarve, Portugal
Duration: 2012 Oct 7 - 2012 Oct 12

Publication series

Name: IEEE International Conference on Intelligent Robots and Systems
ISSN (Print): 2153-0858
ISSN (Electronic): 2153-0866

Other

Other: 25th IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2012
Country/Territory: Portugal
City: Vilamoura, Algarve
Period: 12/10/7 - 12/10/12

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Computer Vision and Pattern Recognition
  • Computer Science Applications
