KD-INR : Time-Varying Volumetric Data Compression via Knowledge Distillation-Based Implicit Neural Representation
Traditional deep learning algorithms assume that all data is available during training, which presents challenges when handling large-scale time-varying data. To address this issue, we propose a data reduction pipeline called knowledge distillation-based implicit neural representation (KD-INR) for c...
Published in: IEEE Transactions on Visualization and Computer Graphics, vol. 30, no. 10, 21 Sept. 2024, pp. 6826-6838
Format: Online article
Language: English
Published: 2024
Parent work: IEEE Transactions on Visualization and Computer Graphics
Keywords: Journal Article