Objective evaluation of video segmentation quality
Published in: | IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society. - Vol. 12 (2003), No. 2, pp. 186-200 |
Author: | |
Other authors: | |
Format: | Online article |
Language: | English |
Published: | 2003 |
Parent work: | IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society |
Subject headings: | Journal Article |
Abstract: | Video segmentation assumes a major role in the context of object-based coding and description applications. Evaluating the adequacy of a segmentation result for a given application is a requisite both to allow the appropriate selection of segmentation algorithms and to adjust their parameters for optimal performance. Subjective testing, the current practice for the evaluation of video segmentation quality, is an expensive and time-consuming process. Objective segmentation quality evaluation techniques can alternatively be used; however, it is recognized that, so far, much less research effort has been devoted to this subject than to the development of segmentation algorithms. This paper discusses the problem of video segmentation quality evaluation, proposing evaluation methodologies and objective segmentation quality metrics for individual objects as well as for complete segmentation partitions. Both stand-alone and relative evaluation metrics are developed to cover the cases for which a reference segmentation is missing or available for comparison. |
Description: | Date Completed: 15.12.2009; Date Revised: 01.02.2008; Published: Print; Citation Status: PubMed-not-MEDLINE |
ISSN: | 1941-0042 |
DOI: | 10.1109/TIP.2002.807355 |
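
The abstract distinguishes stand-alone metrics (no reference segmentation available) from relative metrics (a reference segmentation is available for comparison). As a minimal sketch of the relative case only, and not the metric actually proposed in the paper, a simple spatial-accuracy score can be computed as the fraction of pixels whose object label agrees between an estimated mask and a reference mask; the function name and NumPy formulation below are illustrative assumptions.

```python
import numpy as np

def relative_spatial_accuracy(estimated: np.ndarray, reference: np.ndarray) -> float:
    """Toy relative evaluation metric: fraction of pixels whose labels agree
    between an estimated segmentation mask and a reference (ground-truth) mask.

    Illustrative only; this is not the metric developed in the paper.
    """
    if estimated.shape != reference.shape:
        raise ValueError("masks must have the same shape")
    # Per-pixel agreement between the two label maps.
    agree = (estimated == reference)
    return float(agree.mean())

# Example usage with two small binary object masks (hypothetical data).
ref = np.zeros((4, 4), dtype=np.uint8)
ref[1:3, 1:3] = 1                      # reference object occupies a 2x2 block
est = np.zeros((4, 4), dtype=np.uint8)
est[1:3, 1:4] = 1                      # estimated object leaks one extra column
print(relative_spatial_accuracy(est, ref))  # 0.875 (2 of 16 pixels mismatch)
```

A stand-alone metric, by contrast, would have to judge segmentation quality from the video content itself (e.g., contrast or motion coherence along object boundaries), since no reference mask exists to compare against.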