Bayesian Low Rank Tensor Ring for Image Recovery
Published in: | IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society. - 1992. - Vol. 30 (2021), pages 3568-3580 |
---|---|
Format: | Online article |
Language: | English |
Published: | 2021 |
Parent work: | IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society |
Keywords: | Journal Article |
Abstract: | Low rank tensor ring based data recovery can recover missing image entries in signal acquisition and transformation. The recently proposed tensor ring (TR) based completion algorithms generally solve the low rank optimization problem by the alternating least squares method with predefined ranks, which may easily lead to overfitting when the unknown ranks are set too large and only a few measurements are available. In this article, we present a Bayesian low rank tensor ring completion method for image recovery that automatically learns the low-rank structure of the data. A multiplicative interaction model is developed for low rank tensor ring approximation, where a sparsity-inducing hierarchical prior is placed over the horizontal and frontal slices of the core factors. Compared with most of the existing methods, the proposed one is free of parameter tuning, and the TR ranks can be obtained by Bayesian inference. Numerical experiments, including synthetic data, real-world color images, and the YaleFace dataset, show that the proposed method outperforms state-of-the-art methods, especially in terms of recovery accuracy. | (An illustrative tensor ring reconstruction sketch follows this record.)
---|---|
Description: | Date Revised: 12.03.2021; Published: Print-Electronic; Citation Status: PubMed-not-MEDLINE |
ISSN: | 1941-0042 |
DOI: | 10.1109/TIP.2021.3062195 |
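The abstract above describes a tensor ring (TR) model, in which a tensor is represented by a closed chain of third-order core factors and each entry is recovered as the trace of a product of core slices. The following is a minimal, illustrative Python/NumPy sketch of that TR reconstruction only, under assumed names, ranks, and dimensions; it is not the authors' Bayesian completion algorithm, which additionally places sparsity-inducing hierarchical priors on the slices of the cores and infers the TR ranks by Bayesian inference.

```python
import numpy as np


def tr_to_tensor(cores):
    """Reconstruct a full tensor from a list of tensor-ring (TR) cores.

    Each core G_k has shape (r_k, n_k, r_{k+1}), with r_N == r_0 so the
    chain of cores closes into a ring. Entry-wise,
        X[i_1, ..., i_N] = trace(G_1[:, i_1, :] @ ... @ G_N[:, i_N, :]).
    """
    dims = [core.shape[1] for core in cores]
    full = np.empty(dims)
    for idx in np.ndindex(*dims):
        mat = np.eye(cores[0].shape[0])
        for k, i in enumerate(idx):
            mat = mat @ cores[k][:, i, :]  # multiply in the selected lateral slice of core k
        full[idx] = np.trace(mat)          # closing the ring corresponds to taking the trace
    return full


# Hypothetical example: a 5 x 6 x 7 tensor with TR ranks (2, 3, 4, 2);
# the first and last ranks agree so that the ring closes.
rng = np.random.default_rng(0)
ranks, dims = [2, 3, 4, 2], [5, 6, 7]
cores = [rng.standard_normal((ranks[k], dims[k], ranks[k + 1])) for k in range(len(dims))]
X = tr_to_tensor(cores)
print(X.shape)  # (5, 6, 7)
```

In a completion setting, only the observed entries of the data tensor would be compared against such a reconstruction; in the paper's Bayesian formulation, the hierarchical priors over the horizontal and frontal slices of each core prune redundant slices, so the TR ranks are learned from the data rather than fixed in advance.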