Kernelized Hypergraph Neural Networks
| Published in: | IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 47 (2025), no. 10, 1 Sept. 2025, pp. 8938-8954 |
|---|---|
| Main author: | |
| Other authors: | |
| Format: | Online article |
| Language: | English |
| Published: | 2025 |
| Access to collection: | IEEE transactions on pattern analysis and machine intelligence |
| Subjects: | Journal Article |
| Abstract: | Hypergraph Neural Networks (HGNNs) have attracted much attention for high-order structural data learning. Existing methods mainly focus on simple mean-based aggregation or manually combining multiple aggregations to capture multiple kinds of information on hypergraphs. However, those methods inherently lack continuous non-linear modeling ability and are sensitive to varied distributions. Although some kernel-based aggregations on GNNs and CNNs can capture non-linear patterns to some degree, those methods are restricted to low-order correlations and may cause unstable computation during training. In this work, we introduce Kernelized Hypergraph Neural Networks (KHGNN) and its variant, Half-Kernelized Hypergraph Neural Networks (H-KHGNN), which synergize mean-based and max-based aggregation functions to enhance representation learning on hypergraphs. KHGNN's kernelized aggregation strategy adaptively captures both semantic and structural information via learnable parameters, offering a mathematically grounded blend of kernelized aggregation approaches for comprehensive feature extraction. H-KHGNN addresses the challenge of overfitting on less intricate hypergraphs by employing non-linear aggregation selectively in the vertex-to-hyperedge message-passing process, thus reducing model complexity. Our theoretical contributions reveal a bounded gradient for kernelized aggregation, ensuring stability during training and inference. Empirical results demonstrate that KHGNN and H-KHGNN outperform state-of-the-art models across 10 graph/hypergraph datasets, with ablation studies demonstrating the effectiveness and computational stability of our method. (An illustrative aggregation sketch follows this record.) |
| Description: | Date revised: 12.09.2025; published: Print; citation status: PubMed-not-MEDLINE |
| ISSN: | 1939-3539 |
| DOI: | 10.1109/TPAMI.2025.3585179 |
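
The abstract describes a kernelized aggregation that blends mean-based and max-based pooling through learnable parameters, but this record does not reproduce the paper's actual operator. The sketch below is therefore only a generic illustration of one well-known way to interpolate between mean and max aggregation: a generalized (power) mean with a learnable exponent. The class name `PowerMeanAggregation`, the initialization, and the clamping choices are all assumptions made for the example, not KHGNN's definition.

```python
import torch
import torch.nn as nn

class PowerMeanAggregation(nn.Module):
    """Generalized (power) mean with a learnable exponent p.

    For p = 1 this reduces to plain mean aggregation; as p grows
    large it approaches max aggregation, so a single learnable
    scalar can interpolate between the two regimes the abstract
    mentions.
    """

    def __init__(self, init_p: float = 1.0, eps: float = 1e-6):
        super().__init__()
        # Learnable exponent; clamped to p >= 1 in forward() so the
        # operator stays between mean (p = 1) and max (p -> infinity).
        self.p = nn.Parameter(torch.tensor(float(init_p)))
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_members, feature_dim) features of the vertices
        # incident to one hyperedge, assumed non-negative (e.g. after
        # a ReLU) so that x ** p is well defined.
        p = self.p.clamp(min=1.0)
        x = x.clamp(min=self.eps)  # avoid 0 ** p and unbounded gradients
        return x.pow(p).mean(dim=0).pow(1.0 / p)

# Hypothetical usage: pool 5 vertex features into one hyperedge feature.
agg = PowerMeanAggregation(init_p=2.0)
members = torch.rand(5, 16)   # 5 incident vertices, 16-dim features
edge_feat = agg(members)      # shape: (16,)
```

Clamping both the inputs and the exponent is what keeps the pooling gradient bounded in this toy version, loosely echoing the bounded-gradient property the abstract claims for the paper's kernelized aggregation; the real KHGNN operator and its stability analysis should be taken from the article itself.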