A visual language for robot control and programming: A human-interface study

Gregory Dudek, Junaed Sattar, Anqi Xu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

We describe an interaction paradigm for controlling a robot using hand gestures. In particular, we are interested in the control of an underwater robot by an on-site human operator. In this context, vision-based control is very attractive, and we propose a robot control and programming mechanism based on visual symbols. A human operator presents engineered visual targets to the robotic system, which recognizes and interprets them. This paper describes the approach and proposes a specific gesture language called "RoboChat". RoboChat allows an operator to control a robot, and even to express complex programming concepts, using a sequence of visually presented symbols encoded into fiducial markers. We evaluate the efficiency and robustness of this symbolic communication scheme by comparing it to traditional gesture-based interaction involving a remote human operator.
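To make the scheme concrete, here is a minimal, hypothetical sketch of the symbolic-command idea the abstract describes: detected fiducial marker IDs are mapped to language tokens, and a token sequence is parsed into primitive robot actions. The token names, marker-ID mapping, and the `REPEAT ... BEGIN ... END` construct are illustrative assumptions, not the actual RoboChat vocabulary; a real system would obtain marker IDs from a fiducial detector (e.g. an ARTag-style library) rather than the stub shown here.

```python
# Hypothetical sketch of RoboChat-style symbolic commanding.
# Marker detection is stubbed: a real system would recover marker IDs
# from camera frames with a fiducial library (ARTag/AprilTag-like).

# Assumed mapping from fiducial marker IDs to language tokens.
TOKEN_MAP = {
    0: "FORWARD",
    1: "TURN_LEFT",
    2: "TURN_RIGHT",
    3: "REPEAT",
    4: "BEGIN",
    5: "END",
    6: "EXECUTE",
}


def tokens_from_marker_ids(marker_ids):
    """Translate a sequence of detected marker IDs into language tokens."""
    return [TOKEN_MAP[m] for m in marker_ids]


def parse_program(tokens):
    """Expand a token sequence into a flat list of primitive actions.

    Supports one illustrative construct, REPEAT <n> BEGIN ... END,
    standing in for the richer programming constructs the paper's
    language offers. Numeric tokens (e.g. "3") are assumed to come
    from dedicated digit markers. Bodies are non-nested in this sketch.
    """
    actions, i = [], 0
    while i < len(tokens):
        tok = tokens[i]
        if tok == "REPEAT":
            count = int(tokens[i + 1])          # digit token after REPEAT
            assert tokens[i + 2] == "BEGIN"     # loop body must open
            j = tokens.index("END", i + 3)      # first END closes the body
            actions.extend(parse_program(tokens[i + 3:j]) * count)
            i = j + 1
        elif tok == "EXECUTE":
            i += 1                               # terminator: run the program
        else:
            actions.append(tok)                  # primitive action token
            i += 1
    return actions
```

For example, presenting markers for `REPEAT 3 BEGIN FORWARD TURN_LEFT END EXECUTE` would expand to three forward/turn-left pairs, illustrating how a short marker sequence can express a looped behaviour instead of three repeated manual commands.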

Original language: English (US)
Title of host publication: 2007 IEEE International Conference on Robotics and Automation, ICRA'07
Pages: 2507-2513
Number of pages: 7
DOIs
State: Published - Nov 27 2007
Event: 2007 IEEE International Conference on Robotics and Automation, ICRA'07 - Rome, Italy
Duration: Apr 10 2007 - Apr 14 2007

Publication series

Name: Proceedings - IEEE International Conference on Robotics and Automation
ISSN (Print): 1050-4729


