To Fold or Not to Fold: Graph Regularized Tensor Train for Visual Data Completion
| Published in: | IEEE transactions on pattern analysis and machine intelligence. - 1979. - PP(2025), 30 Sept. |
|---|---|
| Lead author: | |
| Other authors: | , , |
| Format: | Online article |
| Language: | English |
| Published: | 2025 |
| Collection: | IEEE transactions on pattern analysis and machine intelligence |
| Subjects: | Journal Article |
| Abstract: | Tensor train (TT) representation has achieved tremendous success in visual data completion tasks, especially when it is combined with tensor folding. However, folding an image or video tensor breaks the original data structure, leading to local information loss, as nearby pixels may be assigned to different dimensions and become far away from each other. In this paper, to fully preserve the local information of the original visual data, we explore not folding the data tensor, and at the same time adopt graph information to regularize local similarity between nearby entries. To overcome the high computational complexity introduced by the graph-based regularization in the TT completion problem, we propose to break the original problem into multiple sub-problems with respect to each TT core fiber, instead of each TT core as in traditional methods. Furthermore, to avoid heavy parameter tuning, a sparsity-promoting probabilistic model is built based on the generalized inverse Gaussian (GIG) prior, and an inference algorithm is derived under the mean-field approximation. Experiments on both synthetic data and real-world visual data show the superiority of the proposed methods. |
| Description: | Date revised: 30.09.2025; published: Print-Electronic; citation status: Publisher |
| ISSN: | 1939-3539 |
| DOI: | 10.1109/TPAMI.2025.3615445 |
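The abstract's motivation, that folding breaks locality because "nearby pixels may be assigned to different dimensions and become far away from each other", can be seen in a minimal sketch (not the paper's code): fold a 256x256 image into an order-8 tensor, a standard preprocessing step before tensor-train decomposition, and inspect two horizontally adjacent pixels.

```python
import numpy as np

# Illustrative sketch, assuming a 256x256 grayscale image whose entries
# are just their flat indices. Folding splits each spatial axis of
# length 256 = 4^4 into four base-4 "digit" axes, giving an order-8
# tensor of shape (4,4,4,4, 4,4,4,4).
img = np.arange(256 * 256).reshape(256, 256)
folded = img.reshape(4, 4, 4, 4, 4, 4, 4, 4)

# Columns 63 and 64 are adjacent pixels in row 0, yet their folded
# (base-4) indices differ in every digit: 63 -> (0,3,3,3), 64 -> (1,0,0,0).
# This is the locality loss that motivates not folding and instead
# regularizing neighbor similarity with a graph.
idx_63 = np.unravel_index(63, (4, 4, 4, 4))
idx_64 = np.unravel_index(64, (4, 4, 4, 4))
print(idx_63, idx_64)  # (0, 3, 3, 3) (1, 0, 0, 0)

# Sanity check: the fold is a pure reshape, so values are preserved.
assert folded[0, 0, 0, 0, 0, 3, 3, 3] == img[0, 63]
```

The same effect occurs for any radix: adjacent indices that cross a digit boundary end up maximally distant in the folded index space, which is why the paper pairs the unfolded tensor with a graph regularizer instead.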