Compressive source separation : theory and methods for hyperspectral imaging

Bibliographic Details
Published in: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society. - 1992. - Vol. 22 (2013), No. 12, 16 Dec., pp. 5096-5110
First author: Golbabaee, Mohammad (Author)
Other authors: Arberet, Simon; Vandergheynst, Pierre
Format: Online article
Language: English
Published: 2013
Access to parent work: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
Subjects: Journal Article; Research Support, Non-U.S. Gov't
Description
Abstract: We propose and analyze a new model for hyperspectral images (HSIs) based on the assumption that the whole signal is composed of a linear combination of a few sources, each of which has a specific spectral signature, and that the spatial abundance maps of these sources are themselves piecewise smooth and therefore efficiently encoded via typical sparse models. We derive new sampling schemes exploiting this assumption and give theoretical lower bounds on the number of measurements required to reconstruct HSI data and recover their source model parameters. This allows us to segment HSIs into their source abundance maps directly from compressed measurements. We also propose efficient optimization algorithms and perform extensive experimentation on synthetic and real datasets, which reveals that our approach can be used to encode HSIs with far fewer measurements and less computational effort than traditional compressive sensing methods.
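The abstract describes a linear mixing model in which the hyperspectral cube factors into a few spectral signatures and their piecewise-smooth abundance maps, observed through compressive measurements. The sketch below illustrates that model only; the sizes, the random Gaussian sampling operator, and the per-pixel least-squares recovery with known signatures are assumptions made here for illustration and are not the authors' sampling scheme or reconstruction algorithm.

```python
# Minimal sketch (not the paper's code) of the linear mixing model:
# the HSI X is A @ S, where S holds a few spectral signatures and
# A holds their abundance maps; Y is a compressed observation of X.
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_bands, n_sources = 32 * 32, 64, 3

# Spectral signatures: one smooth curve per source (illustrative only).
wavelengths = np.linspace(0.0, 1.0, n_bands)
S = np.stack([np.exp(-((wavelengths - c) ** 2) / 0.02)
              for c in (0.2, 0.5, 0.8)])            # (n_sources, n_bands)

# Abundance maps: piecewise-constant, non-negative, summing to one per pixel.
A = np.zeros((n_pixels, n_sources))
A[: n_pixels // 3, 0] = 1.0
A[n_pixels // 3 : 2 * n_pixels // 3, 1] = 1.0
A[2 * n_pixels // 3 :, 2] = 1.0

X = A @ S                                           # (n_pixels, n_bands) HSI

# Compressive measurements: a random Gaussian operator applied per pixel,
# a stand-in for whatever sampling scheme the paper derives.
m = 16                                              # m < n_bands measurements per pixel
Phi = rng.standard_normal((m, n_bands)) / np.sqrt(m)
Y = X @ Phi.T                                       # (n_pixels, m)

# If the signatures S were known, each pixel's abundances could be estimated
# by least squares in the compressed domain: Y.T ≈ (Phi @ S.T) @ A.T.
A_hat, *_ = np.linalg.lstsq(Phi @ S.T, Y.T, rcond=None)
print("relative abundance error:", np.linalg.norm(A_hat.T - A) / np.linalg.norm(A))
```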
Description: Date Completed 16.05.2014
Date Revised 10.10.2013
published: Print-Electronic
Citation Status PubMed-not-MEDLINE
ISSN:1941-0042
DOI:10.1109/TIP.2013.2281405