Research infrastructure for interactive human- and autonomous guidance

Bérénice Mettler, Navid Dadkhah, Zhaodan Kong, Jonathan Andersh

Research output: Contribution to journal › Article


Abstract

This paper describes a research infrastructure set up to exercise and investigate guidance and control capabilities under human and autonomous control modalities. The lab facility is designed to implement tasks that emphasize agent-environment interactions. The overall goal is to characterize these interactions and to apply the gained knowledge to determine interaction models. These can then be used to design guidance and control algorithms as well as human-machine systems. The facility uses miniature rotorcraft as test vehicles with a Vicon motion tracking system and a SensoMotoric gaze tracking system. The facility also includes a high-fidelity simulation system to support larger-scale autonomy and teleoperation experiments. The simulation incorporates the software components and models of the key flight hardware and sensors. The software system was integrated around the Robot Operating System (ROS) to support the heterogeneous processes and data and to allow easy system reconfiguration. The paper describes the research objectives, details of the hardware and software components and their integration, and concludes with a summary of the ongoing research enabled by the lab facility.
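The topic-based publish/subscribe pattern underlying the ROS integration described above can be illustrated with a minimal sketch. This is a conceptual stand-in, not the actual ROS API: the `TopicBus` class and the `/vicon/vehicle_pose` topic name are hypothetical, chosen only to show how heterogeneous processes (e.g. motion capture, control, logging) stay decoupled by communicating through named topics.

```python
# Conceptual sketch of ROS-style topic decoupling (NOT the actual ROS API).
# TopicBus and the topic name below are hypothetical illustrations.
from collections import defaultdict
from typing import Any, Callable, DefaultDict, List


class TopicBus:
    """Routes messages from publishers to subscribers by topic name."""

    def __init__(self) -> None:
        self._subs: DefaultDict[str, List[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
        # Register a callback; the publisher never needs to know who listens.
        self._subs[topic].append(callback)

    def publish(self, topic: str, msg: Any) -> None:
        # Deliver the message to every subscriber of this topic.
        for cb in self._subs[topic]:
            cb(msg)


# Example: a pose producer (e.g. the motion tracker) feeding both a
# logger and a control loop, neither of which knows about the other.
bus = TopicBus()
logged_poses: list = []
bus.subscribe("/vicon/vehicle_pose", logged_poses.append)   # logger
bus.subscribe("/vicon/vehicle_pose", lambda p: None)        # control stub
bus.publish("/vicon/vehicle_pose", {"x": 1.0, "y": 2.0, "yaw": 0.1})
```

In ROS itself this decoupling is what lets the same guidance and control nodes run unmodified against either the simulation or the flight hardware, since each only sees topics, not the processes behind them.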

Original language: English (US)
Pages (from-to): 437-459
Number of pages: 23
Journal: Journal of Intelligent and Robotic Systems: Theory and Applications
Volume: 70
Issue number: 1-4
DOIs
State: Published - Apr 1 2013


Keywords

  • Autonomous
  • Guidance
  • Human-machine
  • Perception
  • Tele-operation
  • UAS
