Object tracking with probabilistic Hausdorff distance matching

Sang Cheol Park, Seong Whan Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper proposes a new method for extracting and tracking a nonrigid moving object while allowing camera movement. For object extraction, we first detect the object using a watershed segmentation technique and then extract its contour points by approximating the boundary with feature point weighting. For object tracking, we use the contour to estimate the object's motion in the next frame by the maximum likelihood method. The position of the object is estimated with a probabilistic Hausdorff measurement, while the shape variation is modelled with a modified active contour model. The proposed method is highly tolerant of occlusion: the tracking result remains stable unless the object is fully occluded during tracking, so the method can be applied to a wide range of applications.
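The abstract names a probabilistic Hausdorff measurement without giving its exact form. As a rough illustration of the underlying idea, the Python sketch below estimates an object's translation by minimising a partial (quantile-ranked) directed Hausdorff distance between model contour points and feature points in the next frame; taking a ranked distance instead of the maximum is a standard way to make the match tolerant of partial occlusion, as the paper claims for its measure. All names here (`partial_hausdorff`, `match_translation`, `frac`, `search`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def partial_hausdorff(model_pts, image_pts, frac=0.8):
    # Directed partial Hausdorff distance: the frac-quantile (instead of
    # the max) of nearest-neighbour distances from each model point to
    # the image point set. Ranking makes the score robust to partial
    # occlusion and clutter.
    d = np.linalg.norm(model_pts[:, None, :] - image_pts[None, :, :], axis=2)
    nearest = d.min(axis=1)            # closest image point for each model point
    return np.quantile(nearest, frac)

def match_translation(model_pts, image_pts, search=5, frac=0.8):
    # Exhaustive search over integer translations in a small window;
    # returns the shift that minimises the partial Hausdorff distance.
    best_score, best_shift = np.inf, (0, 0)
    for dx in range(-search, search + 1):
        for dy in range(-search, search + 1):
            score = partial_hausdorff(model_pts + np.array([dx, dy]),
                                      image_pts, frac)
            if score < best_score:
                best_score, best_shift = score, (dx, dy)
    return best_shift, best_score

# Toy check: a contour translated by (3, -2) should be recovered.
rng = np.random.default_rng(0)
contour = rng.uniform(0, 100, size=(60, 2))
next_frame_pts = contour + np.array([3.0, -2.0])
shift, score = match_translation(contour, next_frame_pts)
print(shift, round(score, 3))  # expect (3, -2) with score ~0.0
```

In practice the search window would be centred on the position predicted by the motion estimate rather than on the previous position, and the feature points would come from an edge or segmentation map of the new frame.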

Original language: English
Title of host publication: Advances in Intelligent Computing - International Conference on Intelligent Computing, ICIC 2005, Proceedings
Editors: De-Shuang Huang, Xiao-Ping Zhang, Guang-Bin Huang
Publisher: Springer Verlag
Pages: 233-242
Number of pages: 10
ISBN (Print): 9783540282266
Publication status: Published - 2005
Event: International Conference on Intelligent Computing, ICIC 2005 - Hefei, China
Duration: 2005 Aug 23 - 2005 Aug 26

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 3644 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: International Conference on Intelligent Computing, ICIC 2005
Country/Territory: China
City: Hefei
Period: 05/8/23 - 05/8/26

Keywords

  • Convolution

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)
