Descending-stair detection, approach, and traversal with an autonomous tracked vehicle

Joel A. Hesch, Gian Luca Mariottini, Stergios Roumeliotis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

36 Scopus citations

Abstract

This paper presents a strategy for descending-stair detection, approach, and traversal using inertial sensing and a monocular camera mounted on an autonomous tracked vehicle. At the core of our algorithm are vision modules that exploit texture energy, optical flow, and scene geometry (lines) in order to robustly detect descending stairwells during both far- and near-approaches. As the robot navigates down the stairs, it estimates its three-degrees-of-freedom (d.o.f.) attitude by fusing rotational velocity measurements from an on-board tri-axial gyroscope with line observations of the stair edges detected by its camera. We employ a centering controller, derived based on a linearized dynamical model of our system, in order to steer the robot along safe trajectories. A real-time implementation of the described algorithm was developed for an iRobot Packbot, and results from real-world experiments are presented.
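The abstract describes propagating the robot's 3-d.o.f. attitude from tri-axial gyroscope measurements (the prediction step that line observations would later correct). A minimal sketch of that propagation step, using zeroth-order quaternion integration of body-frame angular velocity, is shown below. This is an illustrative assumption about the general technique, not the authors' filter; the function name `quat_propagate` and the scalar-last `[x, y, z, w]` convention are choices made here.

```python
import numpy as np

def quat_propagate(q, omega, dt):
    """Propagate a unit quaternion q = [x, y, z, w] by a body-frame
    angular velocity omega (rad/s) over a timestep dt, using the
    rotation-vector exponential map (zeroth-order integrator)."""
    theta = np.linalg.norm(omega) * dt  # total rotation angle this step
    if theta < 1e-12:
        return q  # negligible rotation; avoid division by zero
    axis = omega / np.linalg.norm(omega)
    # Incremental rotation as a quaternion about the instantaneous axis
    dq = np.concatenate([axis * np.sin(theta / 2.0), [np.cos(theta / 2.0)]])
    # Hamilton product dq ⊗ q (scalar-last convention)
    x1, y1, z1, w1 = dq
    x2, y2, z2, w2 = q
    out = np.array([
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
    ])
    return out / np.linalg.norm(out)  # renormalize against drift
```

In an actual estimator this prediction would be fused with the camera's stair-edge line observations in an update step; here only the gyroscope integration is sketched.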

Original language: English (US)
Title of host publication: IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings
Pages: 5525-5531
Number of pages: 7
DOIs
State: Published - Dec 1 2010
Event: 23rd IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Taipei, Taiwan, Province of China
Duration: Oct 18 2010 - Oct 22 2010

Publication series

Name: IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings

Other

Other: 23rd IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010
Country/Territory: Taiwan, Province of China
City: Taipei
Period: 10/18/10 - 10/22/10
