Synchronization of Video Sequences Through 3D Trajectory Reconstruction

Xue Wang, Jian Bo Shi, Hyun Soo Park, Qing Wang

Research output: Contribution to journal › Article


Abstract

We present an algorithm for synchronizing an arbitrary number of videos captured by cameras moving independently in a dynamic 3D scene. Assuming the 3D spatial pose of each camera is known for every frame, we first reconstruct the 3D trajectory of a moving point using a trajectory-basis method, computing the trajectory coefficients for each sequence separately. Point correspondences across sequences are not required; different points may even be tracked in different sequences, provided that every 3D point tracked in the second sequence is a linear combination of a subset of the 3D points tracked in the first sequence. We then propose a robust rank constraint on the coefficient matrices to measure the spatio-temporal alignment quality of every feasible pair of video fragments. Finally, the optimal temporal mapping is found with a graph-based approach. Our algorithm can use both short and long feature trajectories and is robust to mild outliers. We verify the robustness and performance of the proposed approach on synthetic data as well as on challenging real video sequences.
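The abstract only sketches the rank-constraint alignment measure, so the following is a minimal illustrative sketch in Python with NumPy, not the authors' implementation. It assumes a DCT trajectory basis, per-sequence coefficient matrices obtained by least-squares projection, and a hypothetical `alignment_score` that penalizes any increase in rank when the two coefficient matrices are stacked.

```python
import numpy as np

def dct_trajectory_basis(num_frames, num_basis):
    # Columns are the first num_basis DCT-II cosine vectors over time,
    # a common choice of trajectory basis (assumed here for illustration).
    t = np.arange(num_frames)[:, None]
    k = np.arange(num_basis)[None, :]
    return np.cos(np.pi * (2 * t + 1) * k / (2 * num_frames))

def trajectory_coefficients(points_3d, basis):
    # points_3d: (F, P, 3) reconstructed 3D trajectories of P points over F frames.
    # Projects each coordinate trajectory onto the basis by least squares,
    # yielding a (num_basis, 3P) coefficient matrix.
    F, P, _ = points_3d.shape
    flat = points_3d.reshape(F, 3 * P)
    coeffs, *_ = np.linalg.lstsq(basis, flat, rcond=None)
    return coeffs

def alignment_score(coeffs_a, coeffs_b, tol=1e-8):
    # Rank-based alignment measure (illustrative): if sequence B's points are
    # linear combinations of sequence A's points and the fragments are
    # temporally aligned, stacking the coefficient matrices should not
    # increase the rank, so the trailing singular-value energy is near zero.
    stacked = np.hstack([coeffs_a, coeffs_b])
    s_a = np.linalg.svd(coeffs_a, compute_uv=False)
    s_ab = np.linalg.svd(stacked, compute_uv=False)
    rank_a = int(np.sum(s_a > tol * s_a[0]))
    return float(np.sum(s_ab[rank_a:] ** 2) / (np.sum(s_ab ** 2) + 1e-12))
```

Lower scores would indicate better spatio-temporal alignment; the paper's robust formulation of the rank constraint and the graph-based search over fragment pairs are not reproduced here.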

Original language: English (US)
Pages (from-to): 1759-1772
Number of pages: 14
Journal: Zidonghua Xuebao/Acta Automatica Sinica
Volume: 43
Issue number: 10
DOIs
State: Published - Oct 1 2017
Externally published: Yes

Keywords

  • Independently-moving cameras
  • Non-rigid structure from motion
  • Rank constraint
  • Trajectory basis
  • Video synchronization
