Robust global motion estimation oriented to video object segmentation


Detailed Description

Bibliographic Details
Published in: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society. - 1992. - Vol. 17 (2008), No. 6, 01 June, pp. 958-967
First author: Qi, Bin (Author)
Other authors: Ghazal, Mohammed; Amer, Aishy
Format: Online article
Language: English
Published: 2008
Access to the parent work: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
Keywords: Evaluation Study; Journal Article; Research Support, Non-U.S. Gov't
Description
Abstract: Most global motion estimation (GME) methods are oriented to video coding while video object segmentation methods either assume no global motion (GM) or directly adopt a coding-oriented method to compensate for GM. This paper proposes a hierarchical differential GME method oriented to video object segmentation. A scheme which combines three-step search and motion parameters prediction is proposed for initial estimation to increase efficiency. A robust estimator that uses object information to reject outliers introduced by local motion is also proposed. For the first frame, when the object information is unavailable, a robust estimator is proposed which rejects outliers by examining their distribution in local neighborhoods of the error between the current and the motion-compensated previous frame. Subjective and objective results show that the proposed method is more robust, more oriented to video object segmentation, and faster than the referenced methods.
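As a loose illustration of the three-step search mentioned in the abstract for initial estimation, the sketch below estimates a purely translational global shift between two grayscale frames by minimizing the mean absolute difference. The function name, the translation-only motion model, and the NumPy setting are assumptions made here for illustration; the paper's method is hierarchical, uses a parametric GM model, and combines this kind of coarse search with motion-parameter prediction before differential refinement.

```python
import numpy as np

def three_step_search(prev, curr, max_step=4):
    """Estimate a translational global shift (dy, dx) from prev to curr.

    Classic three-step search: evaluate the 3x3 neighbourhood of the
    current best shift, halve the step, and repeat until the step
    reaches zero. Translation-only sketch, not the paper's full
    parametric global motion model.
    """
    h, w = prev.shape
    p = prev.astype(np.float64)
    c = curr.astype(np.float64)

    def mad(dy, dx):
        # Mean absolute difference over the region where both frames
        # overlap after shifting prev by (dy, dx).
        ys_c = slice(max(0, dy), min(h, h + dy))
        xs_c = slice(max(0, dx), min(w, w + dx))
        ys_p = slice(max(0, -dy), min(h, h - dy))
        xs_p = slice(max(0, -dx), min(w, w - dx))
        return np.mean(np.abs(c[ys_c, xs_c] - p[ys_p, xs_p]))

    best = (0, 0)
    step = max_step
    while step >= 1:
        # Candidate shifts on a 3x3 grid around the current best,
        # spaced by the current step size.
        candidates = [(best[0] + sy * step, best[1] + sx * step)
                      for sy in (-1, 0, 1) for sx in (-1, 0, 1)]
        best = min(candidates, key=lambda s: mad(*s))
        step //= 2
    return best
```

In the paper's scheme, such a coarse search only seeds the estimate together with motion parameters predicted from previous frames; the hierarchical differential stage with robust outlier rejection then refines the global motion parameters.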
Description: Date Completed 19.06.2008
Date Revised 10.12.2019
Published: Print
Citation Status: MEDLINE
ISSN: 1941-0042
DOI: 10.1109/TIP.2008.921985