Lossless image compression with multiscale segmentation


Full description

Bibliographic details
Published in: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society. - Vol. 11 (2002), No. 11, pp. 1228-1237
Main author: Ratakonda, Krishna (author)
Other authors: Ahuja, Narendra
Format: Online article
Language: English
Published: 2002
Access to the parent work: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
Subjects: Journal Article
Description
Abstract: This paper is concerned with developing a lossless image compression method that employs an optimal amount of segmentation information to exploit the spatial redundancies inherent in image data. Multiscale segmentation is obtained using a previously proposed transform which provides a tree-structured segmentation of the image into regions characterized by grayscale homogeneity. In the proposed algorithm we prune the tree to control the size and number of regions, thus obtaining a rate-optimal balance between the overhead inherent in coding the segmented data and the coding gain derived from it. Another novelty of the proposed approach is that we use an image model comprising separate descriptions of pixels lying near the edges of a region and those lying in the interior. Results show that the proposed algorithm provides performance comparable to the best available methods and 15-20% better compression than the JPEG lossless compression standard for a wide range of images.
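The rate-optimal pruning idea described in the abstract can be illustrated with a toy sketch. This is not the authors' actual algorithm: the tree structure, per-region bit costs, and the `SPLIT_OVERHEAD` constant are all hypothetical. The core decision it models is real, though: a split is kept only if coding the children separately (plus the side-information overhead of signaling the split) costs fewer bits than coding the merged region with a single model.

```python
from dataclasses import dataclass, field

# Hypothetical cost, in bits, of signaling that a region is split
# into sub-regions (the segmentation side information).
SPLIT_OVERHEAD = 8.0


@dataclass
class Node:
    """A region in a tree-structured segmentation."""
    pixel_cost: float              # bits to code this region with one model
    children: list = field(default_factory=list)


def prune(node: Node) -> float:
    """Bottom-up rate-optimal pruning.

    Recursively computes the cheapest coding cost for each subtree.
    If keeping the split is not cheaper than coding the merged
    region, the children are pruned and the region is coded whole.
    Returns the optimal cost for this subtree.
    """
    if not node.children:
        return node.pixel_cost
    split_cost = SPLIT_OVERHEAD + sum(prune(c) for c in node.children)
    if split_cost >= node.pixel_cost:
        node.children = []         # prune: merged region is cheaper
        return node.pixel_cost
    return split_cost


if __name__ == "__main__":
    # Children code much more cheaply than the parent: keep the split.
    keep = Node(100.0, [Node(20.0), Node(20.0), Node(20.0), Node(20.0)])
    print(prune(keep))             # 8 + 80 = 88.0, children retained

    # Overhead outweighs the gain: prune back to one region.
    drop = Node(30.0, [Node(10.0), Node(10.0), Node(10.0), Node(10.0)])
    print(prune(drop))             # 30.0, children removed
```

In the paper's setting the costs would come from actual entropy estimates of the edge and interior pixel models, but the greedy bottom-up comparison of "split cost plus overhead" against "merged cost" is the balance the abstract refers to.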
Description: Date Completed 20.05.2010
Date Revised 05.02.2008
Published: Print
Citation Status: PubMed-not-MEDLINE
ISSN:1941-0042
DOI:10.1109/TIP.2002.804528