Activity recognition using dense long-duration trajectories

Ju Sun, Yadong Mu, Shuicheng Yan, Loong Fah Cheong

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

43 Scopus citations

Abstract

Current research on visual action/activity analysis has mostly exploited appearance-based static feature descriptions, plus statistics of short-range motion fields. The deliberate neglect of dense, long-duration motion trajectories as features is largely due to the lack of a mature mechanism for efficient extraction and quantitative representation of visual trajectories. In this paper, we propose a novel scheme for extracting and representing dense, long-duration trajectories from video sequences, and demonstrate its ability to handle video sequences containing occlusions, camera motion, and nonrigid deformations. Moreover, we test the scheme on the KTH action recognition dataset [1], and show its promise as a general-purpose long-duration motion descriptor for realistic video sequences.
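The core idea the abstract describes, tracking a dense grid of points over many frames rather than relying on short-range motion statistics, can be illustrated with a minimal sketch. The paper's actual extraction mechanism is not specified here, so the following is only an assumed, simplified variant: it advects a dense grid of sample points through a sequence of precomputed per-frame optical-flow fields (the function name and interface are hypothetical, not from the paper).

```python
import numpy as np

def track_dense_trajectories(flow_fields, grid_step=2):
    """Advect a dense grid of points through per-frame flow fields.

    flow_fields : list of (H, W, 2) arrays giving per-pixel (dx, dy)
                  displacement between consecutive frames.
    grid_step   : spacing of the initial dense sampling grid, in pixels.

    Returns an array of shape (num_points, num_frames + 1, 2) holding the
    (x, y) position of every tracked point at every time step.
    """
    H, W, _ = flow_fields[0].shape
    ys, xs = np.mgrid[0:H:grid_step, 0:W:grid_step]
    pts = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)  # (N, 2)
    trajectory = [pts.copy()]
    for flow in flow_fields:
        # Nearest-neighbour lookup of the flow at each point's current
        # position (a real tracker would interpolate sub-pixel flow).
        xi = np.clip(np.round(pts[:, 0]).astype(int), 0, W - 1)
        yi = np.clip(np.round(pts[:, 1]).astype(int), 0, H - 1)
        pts = pts + flow[yi, xi]
        trajectory.append(pts.copy())
    return np.stack(trajectory, axis=1)
```

A resulting trajectory array can then be quantized or otherwise encoded as a long-duration motion descriptor; handling occlusion and camera motion, as the abstract claims, would require additional machinery beyond this sketch.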

Original language: English (US)
Title of host publication: 2010 IEEE International Conference on Multimedia and Expo, ICME 2010
Pages: 322-327
Number of pages: 6
DOIs
State: Published - Nov 22 2010
Externally published: Yes
Event: 2010 IEEE International Conference on Multimedia and Expo, ICME 2010 - Singapore, Singapore
Duration: Jul 19 2010 - Jul 23 2010

Publication series

Name: 2010 IEEE International Conference on Multimedia and Expo, ICME 2010

Conference

Conference: 2010 IEEE International Conference on Multimedia and Expo, ICME 2010
Country: Singapore
City: Singapore
Period: 7/19/10 - 7/23/10

Keywords

  • Action recognition
  • Computer vision
  • Motion trajectories
  • Motion understanding
  • Tracking
  • Video analysis

