TY - GEN
T1 - Controlled active exploration of uncalibrated environments
AU - Smith, Christopher E.
AU - Brandt, Scott A.
AU - Papanikolopoulos, Nikolaos P.
PY - 1994
Y1 - 1994
N2 - Flexible operation of a robotic agent in an uncalibrated environment requires the ability to recover unknown or partially known parameters of the workspace through sensing. Of the sensors available to a robotic agent, visual sensors provide information that is richer and more complete than that provided by other sensors. In this paper we present robust techniques for the derivation of depth from feature points on a target's surface and for the accurate, high-speed tracking of moving targets. We use these techniques in a system that operates with little or no a priori knowledge of the object- and camera-related parameters to robustly determine object-related parameters such as velocity and depth. Such determination of extrinsic environmental parameters is essential for performing higher-level tasks such as inspection, exploration, tracking, grasping, and collision-free motion planning. In both applications, we use the Minnesota Robotic Visual Tracker (a single visual sensor mounted on the end-effector of a robotic manipulator, combined with a real-time vision system) to automatically select feature points on surfaces, to derive an estimate of the environmental parameter in question, and to supply a control vector based upon these estimates to guide the manipulator.
UR - http://www.scopus.com/inward/record.url?scp=0027932003&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0027932003&partnerID=8YFLogxK
DO - 10.1109/cvpr.1994.323900
M3 - Conference contribution
AN - SCOPUS:0027932003
SN - 0818658274
SN - 9780818658273
T3 - Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
SP - 792
EP - 795
BT - Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
PB - IEEE
T2 - Proceedings of the 1994 IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Y2 - 21 June 1994 through 23 June 1994
ER -