Shape morphing-based control of robotic visual servoing

Rahul Singh, Richard M. Voyles, David Littau, Nikolaos P. Papanikolopoulos

Research output: Contribution to journal › Article

Abstract

We present an approach for controlling robotic interactions with objects using synthetic images generated by morphing shapes. In particular, we address the problem of positioning an eye-in-hand robotic system with respect to objects in the workspace for grasping and manipulation. In our formulation, the grasp position (and consequently the approach trajectory of the manipulator) varies with each object. The proposed solution consists of two parts. First, within a model-based object recognition framework, images of the objects taken at the desired grasp pose are stored in a database. An unknown input object (drawn from the family of recognizable objects) is recognized, and its grasp position identified, by morphing its contour to the templates in the database and using the virtual energy expended during the morph as a dissimilarity measure. Second, the images synthesized during the morph are used to guide the eye-in-hand system and execute the grasp. The proposed method requires minimal calibration of the system. Furthermore, it conjoins techniques from shape recognition, computer graphics, and vision-based robot control in a unified engineering framework. Potential applications range from recognition and positioning with respect to partially occluded or deformable objects to planning robotic grasping based on human demonstration.
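The recognition step is described concretely enough to sketch: morph the query object's contour onto each stored template and rank templates by the energy the morph expends, with the intermediate shapes doubling as guidance for the servoing step. The paper's actual morphing algorithm and energy functional are not given in this abstract, so the Python sketch below is a hypothetical simplification: uniform arc-length resampling, a linear point-to-point morph, and summed squared displacement as the energy proxy. The names `resample_contour`, `morph_energy`, and `recognize` are illustrative, not from the paper.

```python
import numpy as np

def resample_contour(points, n=64):
    """Resample a closed 2-D contour to n points, evenly spaced by arc length."""
    points = np.asarray(points, dtype=float)
    closed = np.vstack([points, points[:1]])            # close the loop
    seg = np.linalg.norm(np.diff(closed, axis=0), axis=1)
    arc = np.concatenate([[0.0], np.cumsum(seg)])       # cumulative arc length
    targets = np.linspace(0.0, arc[-1], n, endpoint=False)
    resampled = np.empty((n, 2))
    for k, t in enumerate(targets):
        i = np.searchsorted(arc, t, side="right") - 1   # segment containing t
        frac = (t - arc[i]) / max(seg[i], 1e-12)
        resampled[k] = closed[i] + frac * (closed[i + 1] - closed[i])
    return resampled

def morph_energy(src, dst, steps=10):
    """Linearly morph src onto dst; return (energy, intermediate contours).

    Energy here is the summed squared point displacement across morph steps,
    a stand-in for the paper's virtual-energy dissimilarity measure.
    """
    a, b = resample_contour(src), resample_contour(dst)
    frames = [a + t * (b - a) for t in np.linspace(0.0, 1.0, steps + 1)]
    energy = sum(np.sum((f2 - f1) ** 2) for f1, f2 in zip(frames, frames[1:]))
    return energy, frames

def recognize(query, templates):
    """Pick the template whose morph from the query costs the least energy."""
    scores = {name: morph_energy(query, tpl)[0] for name, tpl in templates.items()}
    return min(scores, key=scores.get), scores
```

In this reading, the `frames` returned for the winning template would serve as the synthetic waypoint shapes that steer the eye-in-hand system toward the stored grasp pose, mirroring the abstract's second step.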

Original language: English (US)
Pages (from-to): 317-338
Number of pages: 22
Journal: Autonomous Robots
Volume: 10
Issue number: 3
State: Published - May 1 2001

Keywords

  • Robotic visual servoing
  • Shape morphing
  • Shape recognition
  • Vision-based grasping
  • Vision-based robot control
