Enhancing feature tracking with gyro regularization

Bryan Poling, Gilad Lerman

Research output: Contribution to journal › Article › peer-review


Abstract

We present a deeply integrated method of exploiting low-cost gyroscopes to improve general purpose feature tracking. Most previous methods use gyroscopes to initialize and bound the search for features. In contrast, we use them to regularize the tracking energy function so that they can directly assist in the tracking of ambiguous and poor-quality features. We demonstrate that our simple technique offers significant improvements in performance over conventional template-based tracking methods, and is in fact competitive with more complex and computationally expensive state-of-the-art trackers, but at a fraction of the computational cost. Additionally, we show that the practice of initializing template-based feature trackers like KLT (Kanade-Lucas-Tomasi) using gyro-predicted optical flow offers no advantage over using a careful optical-only initialization method, suggesting that some deeper level of integration, like the method we propose, is needed in order to realize a genuine improvement in tracking performance from these inertial sensors.
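The abstract describes adding a gyroscope-derived term to the template-tracking energy rather than using the gyro only for initialization. One plausible form of such a regularized energy is sketched below; the quadratic penalty and the weight λ are illustrative assumptions, not the paper's stated formulation:

```latex
% Illustrative gyro-regularized template-tracking energy (assumed form).
% d       : candidate displacement of the tracked feature
% W       : template window; T, I : template patch and current image
% d_gyro  : optical flow predicted from the gyroscope rotation
% lambda  : regularization weight (assumption; not specified in the abstract)
E(\mathbf{d}) = \sum_{\mathbf{x}\in W}
  \bigl(I(\mathbf{x}+\mathbf{d}) - T(\mathbf{x})\bigr)^{2}
  + \lambda \,\bigl\lVert \mathbf{d} - \mathbf{d}_{\mathrm{gyro}} \bigr\rVert^{2}
```

Under this reading, when the template is ambiguous or of poor quality the data term is weak and the gyro prior dominates, whereas initialization-only schemes leave the energy itself unchanged.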

Original language: English (US)
Pages (from-to): 42-58
Number of pages: 17
Journal: Image and Vision Computing
Volume: 50
DOIs
State: Published - Jun 1 2016

Bibliographical note

Funding Information:
We are thankful to the anonymous reviewer and the action editor for their very helpful comments that improved the presentation of the paper and its supplementary material. This work was supported by NSF awards DMS-09-56072 and DMS-14-18386, the University of Minnesota Doctoral Dissertation Fellowship Program, and the Feinberg Foundation Visiting Faculty Program Fellowship of the Weizmann Institute of Science.

Keywords

  • Feature tracking
  • Gyroscopes
  • Inertial sensors
  • Optical flow
