TY - GEN
T1 - Charlie rides the elevator - Integrating vision, navigation and manipulation towards multi-floor robot locomotion
AU - Troniak, Daniel
AU - Sattar, Junaed
AU - Gupta, Ankur
AU - Little, James J.
AU - Chan, Wesley
AU - Calisgan, Ergun
AU - Croft, Elizabeth
AU - Van Der Loos, Machiel
PY - 2013
Y1 - 2013
N2 - This paper presents the design, implementation, and experimental evaluation of a semi-humanoid robotic system for autonomous multi-floor navigation. The robot, a Personal Robot 2 named Charlie, is capable of operating an elevator to travel between rooms located on separate floors. Our goal is to create a robotic assistant capable of locating points of interest, manipulating objects, and navigating between rooms in a multi-story environment equipped with an elevator. Taking the elevator requires the robot to (1) map and localize within its operating environment, (2) navigate to an elevator door, (3) press the up or down elevator call button, (4) enter the elevator, (5) press the control button associated with the target floor, and (6) exit the elevator at the correct floor. To that end, this work integrates the robot's advanced sensorimotor capabilities - laser rangefinders, stereo and monocular vision systems, and robotic arms - into a complete, task-driven autonomous system. While the design and implementation of individual sensorimotor processing components is a challenge in and of itself, complete integration in intelligent systems design often presents an even greater challenge. This paper presents our approach to designing the individual components, with a focus on machine vision, manipulation, and systems integration. We present quantitative results from our live robotic system, discuss difficulties faced, and expose potential pitfalls.
AB - This paper presents the design, implementation, and experimental evaluation of a semi-humanoid robotic system for autonomous multi-floor navigation. The robot, a Personal Robot 2 named Charlie, is capable of operating an elevator to travel between rooms located on separate floors. Our goal is to create a robotic assistant capable of locating points of interest, manipulating objects, and navigating between rooms in a multi-story environment equipped with an elevator. Taking the elevator requires the robot to (1) map and localize within its operating environment, (2) navigate to an elevator door, (3) press the up or down elevator call button, (4) enter the elevator, (5) press the control button associated with the target floor, and (6) exit the elevator at the correct floor. To that end, this work integrates the robot's advanced sensorimotor capabilities - laser rangefinders, stereo and monocular vision systems, and robotic arms - into a complete, task-driven autonomous system. While the design and implementation of individual sensorimotor processing components is a challenge in and of itself, complete integration in intelligent systems design often presents an even greater challenge. This paper presents our approach to designing the individual components, with a focus on machine vision, manipulation, and systems integration. We present quantitative results from our live robotic system, discuss difficulties faced, and expose potential pitfalls.
KW - Multi-Floor Navigation
KW - Robot Elevator Operation
KW - Robot Vision
KW - Robotics
KW - Service Robots
UR - http://www.scopus.com/inward/record.url?scp=84883395570&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84883395570&partnerID=8YFLogxK
U2 - 10.1109/CRV.2013.12
DO - 10.1109/CRV.2013.12
M3 - Conference contribution
AN - SCOPUS:84883395570
SN - 9780769549835
T3 - Proceedings - 2013 International Conference on Computer and Robot Vision, CRV 2013
SP - 1
EP - 8
BT - Proceedings - 2013 International Conference on Computer and Robot Vision, CRV 2013
T2 - 10th International Conference on Computer and Robot Vision, CRV 2013
Y2 - 29 May 2013 through 31 May 2013
ER -