Visual identification of biological motion for underwater human–robot interaction

Junaed Sattar, Gregory Dudek

Research output: Contribution to journal › Article › peer-review

2 Scopus citations


We present an algorithm for underwater robots to visually detect and track human motion. Our objective is to enable human–robot interaction by allowing a robot to follow behind a human moving in (up to) six degrees of freedom. In particular, we have developed a system that allows a robot to detect, track and follow a scuba diver using frequency-domain detection of biological motion patterns. The motion of biological entities is characterized by combinations of periodic motions that are inherently distinctive; this is especially true of human swimmers. By examining the frequency-space response of spatial signals over a number of video frames, we identify signatures pertaining to biological motion. This technique is applied to track scuba divers in underwater domains, typically with the robot swimming behind the diver. The algorithm detects a range of motions, including motion directly away from or towards the camera. Once detected, the motion of the diver relative to the vehicle is tracked using an Unscented Kalman Filter, an approach for non-linear estimation. The efficiency of our approach makes it attractive for real-time operation on board our underwater vehicle, and in future work we intend to track scuba divers in real time with the robot. The paper presents an algorithmic overview of our approach, together with an experimental evaluation based on underwater video footage.
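The core idea of the detection stage, spotting a dominant periodic component (e.g. a swimmer's kick) in the frequency-space response of an image region over time, can be sketched as follows. This is a minimal illustration of the general technique, not the authors' implementation; the frequency band, the per-region mean-intensity signal, and the peak-to-baseline threshold are all assumptions for the sake of the example.

```python
import numpy as np

def detect_periodic_motion(intensity_series, fps,
                           f_low=0.25, f_high=2.0, peak_ratio=4.0):
    """Flag a candidate image region as showing biological motion if its
    per-frame intensity signal has a dominant spectral peak in a
    swimmer-like frequency band.

    intensity_series : 1-D sequence of mean intensities of one region,
                       sampled once per video frame.
    fps              : video frame rate in Hz.
    f_low, f_high    : illustrative band for human swimming gaits (Hz).
    peak_ratio       : illustrative threshold on in-band peak vs.
                       off-band baseline energy.
    """
    x = np.asarray(intensity_series, dtype=float)
    x = x - x.mean()                        # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))       # magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)

    band = (freqs >= f_low) & (freqs <= f_high)
    if not band.any():
        return False

    peak = spectrum[band].max()             # strongest in-band response
    rest = spectrum[~band]
    baseline = rest.mean() if rest.size else 1e-9
    return bool(peak > peak_ratio * baseline)
```

A region dominated by an oscillation inside the band (a flipper kick at roughly 1 Hz, say) yields a sharp in-band peak and is flagged; a static or aperiodic background region is not. In a full system this decision would feed the detected region's position into the non-linear tracker (the UKF mentioned above).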

Original language: English (US)
Pages (from-to): 111-124
Number of pages: 14
Journal: Autonomous Robots
Issue number: 1
State: Published - Jan 1 2018


  • Diver tracking
  • Human–robot collaboration
  • Underwater robotics
  • Visual servoing
  • Visual tracking


