Adaptive hypergraph neural network integrating multisource structural information

QI Lin, BAO Peng, LIU Zhongyi, LI Liang

Journal of Tsinghua University(Science and Technology) ›› 2026, Vol. 66 ›› Issue (5) : 991-1004.

DOI: 10.16511/j.cnki.qhdxxb.2026.21.003
BIG DATA


Abstract

[Objective] With the widespread application of graph neural networks (GNNs) in domains such as social networks, bioinformatics, and recommender systems, complex high-order structural relationships embedded in graphs have become increasingly important factors determining model expressiveness and performance. Traditional GNN models are mostly built on pairwise edges, which limits their ability to effectively capture high-order interactions among multiple nodes. In recent years, hypergraph neural networks (HGNNs) have significantly extended the modeling capacity of GNNs by introducing hyperedges, enabling the representation of multi-way interactions and more expressive topological patterns. However, existing methods typically rely on a single hyperedge construction strategy, which restricts their ability to simultaneously capture the heterogeneity and structural diversity of different types of high-order relationships. This often leads to insufficient information fusion and limited structural generalization, rendering these methods inadequate for real-world graph data that exhibit multi-level and heterogeneous high-order semantics, such as overlapping communities or functional substructures. To address these challenges, we propose a multisource adaptive HGNN (MSA-HGNN), a unified and flexible high-order representation framework that integrates diverse structural information with structural awareness. [Methods] The proposed method introduces a multisource hyperedge construction framework that incorporates various structural types, including pairwise connectivity, k-hop neighborhoods, motif patterns, and frequent subgraphs. This design ensures comprehensive coverage of high-order associations across different semantic granularities and topological scales, thereby enhancing the completeness of structural modeling. These diverse sources complement one another and jointly reflect latent node dependencies from multiple relational perspectives. 
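The multisource hyperedge construction described above can be illustrated with a toy sketch. The code below is not the paper's implementation; the function names and the exact k-hop/motif extraction rules are illustrative assumptions. It builds three hyperedge sets from one small graph: pairwise edges as 2-node hyperedges, k-hop neighborhoods as one hyperedge per node, and triangle motifs as 3-node hyperedges.

```python
# A minimal sketch (assumed names, not the paper's code) of building
# multisource hyperedges: pairwise edges, k-hop neighborhoods, and
# triangle motifs, each expressed as a list of frozensets of nodes.

def pairwise_hyperedges(edges):
    """Each ordinary edge becomes a 2-node hyperedge."""
    return [frozenset(e) for e in edges]

def khop_hyperedges(adj, k):
    """One hyperedge per node: the node plus everything within k hops (BFS)."""
    hyperedges = []
    for start in adj:
        seen = {start}
        frontier = {start}
        for _ in range(k):
            frontier = {v for u in frontier for v in adj[u]} - seen
            seen |= frontier
        hyperedges.append(frozenset(seen))
    return hyperedges

def triangle_hyperedges(adj):
    """Each triangle motif (3-clique) becomes a 3-node hyperedge."""
    triangles = set()
    for u in adj:
        for v in adj[u]:
            for w in adj[u] & adj[v]:       # common neighbors close a triangle
                triangles.add(frozenset({u, v, w}))
    return list(triangles)

# Toy graph: a triangle 0-1-2 plus a pendant node 3 attached to node 2.
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}

sources = {
    "pairwise": pairwise_hyperedges(edges),
    "2-hop": khop_hyperedges(adj, 2),
    "motif": triangle_hyperedges(adj),
}
```

Each source captures a different relational scale: pairwise hyperedges preserve the original topology, k-hop hyperedges widen the receptive field, and motif hyperedges expose dense local substructures.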
On this basis, we design a hyperedge fusion strategy that utilizes a learnable weight matrix to adaptively adjust the relative importance of different structural sources. This strategy enhances the discriminative capability of structure integration and effectively mitigates interference from semantic conflicts among heterogeneous structures. Furthermore, during the high-order information propagation stage, we incorporate an attention mechanism to assign structure-aware weights and context-sensitive relevance scores to node pairs across different high-order structures. This enhances the directionality, precision, and robustness of feature aggregation, particularly under noisy or incomplete structural conditions. [Results] We conducted extensive experiments on five benchmark datasets covering homogeneous and heterogeneous graph scenarios, including three citation networks (Cora, Citeseer, and PubMed) and two heterogeneous graphs (House and Senate). The results demonstrate that MSA-HGNN achieves the highest classification accuracy across all datasets. Specifically, on the Cora dataset, it outperforms the leading baseline, ED-HNN, by 2.6%, while maintaining comparable or superior performance on Citeseer and PubMed. On the more challenging heterogeneous datasets, MSA-HGNN improves accuracy by 12.1% and 6.7% over HyperGCN on House and Senate, respectively, demonstrating strong generalization ability. Further ablation studies confirm the necessity of each structural component and validate the effectiveness of the adaptive fusion mechanism for integrating multisource high-order structures. [Conclusions] This study presents a unified, adaptive, and structure-aware approach for modeling high-order relations in hypergraphs. By integrating multiple hyperedge construction sources with learnable fusion and attention-based propagation, MSA-HGNN effectively captures diverse structural semantics and addresses the limitations of single-source hypergraph modeling. 
Extensive experimental results demonstrate the model's superior robustness and generalization capability. These findings suggest that multisource structural integration is a promising direction for high-order graph representation learning and establish a solid foundation for developing more expressive and generalizable GNNs.
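As a rough illustration of the adaptive fusion idea summarized above, the sketch below softmax-normalizes learnable per-source logits and mixes one propagation step per structural source. All names (`source_adjacency`, `fuse_propagate`) are assumptions, and a simple row-normalized clique-expansion propagation stands in for the paper's attention-based aggregation.

```python
import numpy as np

# Hedged sketch of adaptive multisource fusion: each structural source s
# contributes an incidence matrix H_s (nodes x hyperedges); learnable logits
# become softmax weights alpha_s, and features propagate as
# X' = sum_s alpha_s * A_s @ X, with A_s a row-normalized adjacency.

def source_adjacency(H):
    """Clique-expansion adjacency from an incidence matrix, row-normalized."""
    A = H @ H.T                                  # node-node co-membership counts
    deg = A.sum(axis=1, keepdims=True)
    return A / np.maximum(deg, 1e-12)            # rows sum to 1 (if node covered)

def fuse_propagate(H_list, X, logits):
    """One adaptively weighted propagation step over multiple sources."""
    alpha = np.exp(logits) / np.exp(logits).sum()  # softmax source weights
    out = np.zeros_like(X)
    for a, H in zip(alpha, H_list):
        out += a * source_adjacency(H) @ X
    return out, alpha

# Toy example: 4 nodes, two sources (two small hyperedges vs. one global one).
H1 = np.array([[1, 0], [1, 1], [0, 1], [0, 1]], dtype=float)
H2 = np.ones((4, 1))
X = np.ones((4, 1))
out, alpha = fuse_propagate([H1, H2], X, logits=np.array([0.0, 1.0]))
```

Because each per-source adjacency is row-stochastic and the weights form a convex combination, a constant feature vector passes through unchanged; in training, the logits would be learned jointly with the rest of the network so that informative sources receive larger weights.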

Key words

multisource structure fusion / hypergraph convolution / graph neural networks / high-order structure modeling / graph representation learning

Cite this article

QI Lin, BAO Peng, LIU Zhongyi, LI Liang. Adaptive hypergraph neural network integrating multisource structural information[J]. Journal of Tsinghua University(Science and Technology). 2026, 66(5): 991-1004 https://doi.org/10.16511/j.cnki.qhdxxb.2026.21.003

References

[1] YADATI N, NIMISHAKAVI M, YADAV P, et al. HyperGCN: A new method of training graph convolutional networks on hypergraphs[C]// Proceedings of the 33rd International Conference on Neural Information Processing Systems. Vancouver, BC, Canada: Curran Associates Inc., 2019.
[2] GAO Y, FENG Y F, JI S Y, et al. HGNN+: General hypergraph neural networks[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45(3): 3181-3199.
[3] ARYA D, GUPTA D, RUDINAC S, et al. HyperSAGE: Generalizing inductive representation learning on hypergraphs[C]// Advances in Neural Information Processing Systems. Red Hook, NY, USA: Curran Associates, Inc., 2019: 1509-1520.
[4] DONG Y H, SAWIN W, BENGIO Y. HNHN: Hypergraph networks with hyperedge neurons[EB/OL]. (2020-06-22)[2026-01-17]. https://arxiv.org/abs/2006.12278.
[5] FENG Y F, YOU H X, ZHANG Z Z, et al. Hypergraph neural networks[C]// Proceedings of the AAAI Conference on Artificial Intelligence. Palo Alto, CA, USA: AAAI Press, 2019: 3558-3565.
[6] BAI S, ZHANG F H, TORR P H S. Hypergraph convolution and hypergraph attention[J]. Pattern Recognition, 2021, 110: 107637.
[7] ZOU M H, GAN Z X, WANG Y T, et al. UniG-Encoder: A universal feature encoder for graph and hypergraph node classification[J]. Pattern Recognition, 2024, 147: 110115.
[8] HEIN M, SETZER S, JOST L, et al. The total variation on hypergraphs-learning on hypergraphs revisited[C]// Proceedings of the 27th International Conference on Neural Information Processing Systems. Red Hook, NY, USA: Curran Associates Inc., 2013: 2427-2435.
[9] LI P, MILENKOVIC O. Submodular hypergraphs: p-Laplacians, Cheeger inequalities and spectral clustering[C]// Proceedings of the 35th International Conference on Machine Learning (ICML). Stockholm, Sweden: International Machine Learning Society, 2018: 4690-4719.
[10] HUANG J, YANG J. UniGNN: A unified framework for graph and hypergraph neural networks[C]// Proceedings of the 30th International Joint Conference on Artificial Intelligence (IJCAI). Montreal, Canada: International Joint Conferences on Artificial Intelligence Organization, 2021: 2563-2569.
[11] CHIEN E, PAN C, PENG J H, et al. You are AllSet: A multiset function framework for hypergraph neural networks[C/OL]// Proceedings of the 10th International Conference on Learning Representations. (2022-01-29)[2026-01-17]. https://openreview.net/forum?id=hpBTIv2uy_E.
[12] LEE J, LEE Y, KIM J, et al. Set transformer: A framework for attention-based permutation-invariant neural networks[C]// Proceedings of the 36th International Conference on Machine Learning (ICML). Long Beach, CA, USA: PMLR, 2019: 3744-3753.
[13] WU H R, LI N S, ZHANG J, et al. Collaborative contrastive learning for hypergraph node classification[J]. Pattern Recognition, 2024, 146: 109995.
[14] TANG J, WU S, SUN J M. Confluence: Conformity influence in large social networks[C]// Proceedings of the 19th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. Chicago, Illinois, USA: Association for Computing Machinery, 2013: 347-355.
[15] JIANG B, LIANG J G, SHA Y, et al. Retweeting behavior prediction based on one-class collaborative filtering in social networks[C]// Proceedings of the 39th International ACM SIGIR Conference on Research and Development in Information Retrieval. Pisa, Italy: Association for Computing Machinery, 2016: 977-980.
[16] SUN X G, CHENG H, LIU B, et al. Self-supervised hypergraph representation learning for sociological analysis[J]. IEEE Transactions on Knowledge and Data Engineering, 2023, 35(11): 11860-11871.
[17] SHEIKHPOUR R, BERAHMAND K, MOHAMMADI M, et al. Sparse feature selection using hypergraph Laplacian- based semi-supervised discriminant analysis[J]. Pattern Recognition, 2025, 157: 110882.
[18] GAO Y, ZHANG Z Z, LIN H J, et al. Hypergraph learning: Methods and practices[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, 44(5): 2548-2566.
[19] HUANG Y C, LIU Q S, METAXAS D. Video object segmentation by hypergraph cut[C]// Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). Miami, FL, USA: IEEE, 2009: 1738-1745.
[20] GAO Y, WANG M, TAO D C, et al. 3-D object retrieval and recognition with hypergraph analysis[J]. IEEE Transactions on Image Processing, 2012, 21(9): 4290-4303.
[21] WANG M, LIU X L, WU X D. Visual classification by l1-hypergraph modeling[J]. IEEE Transactions on Knowledge and Data Engineering, 2015, 27(9): 2564-2574.
[22] FANG Y C, ZHENG Y D. Metric learning based on attribute hypergraph[C]// Proceedings of the 2017 IEEE International Conference on Image Processing (ICIP). Beijing, China: IEEE, 2017: 3440-3444.
[23] HUANG S, ELHOSEINY M, ELGAMMAL A, et al. Learning hypergraph-regularized attribute predictors[C]// Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). Boston, MA, USA: IEEE, 2015: 409-417.
[24] JOSLYN C A, AKSOY S, ARENDT D, et al. Hypergraph analytics of domain name system relationships[C]// 17th International Workshop on Algorithms and Models for the Web Graph. Warsaw, Poland: Springer, 2020: 1-15.
[25] YANG D Q, QU B Q, YANG J, et al. Revisiting user mobility and social relationships in LBSNs: A hypergraph embedding approach[C]// Proceedings of the World Wide Web Conference (WWW). San Francisco, CA, USA: Association for Computing Machinery, 2019: 2147-2157.
[26] FRANZESE N, GROCE A, MURALI T M, et al. Hypergraph-based connectivity measures for signaling pathway topologies[J]. PLoS Computational Biology, 2019, 15(10): e1007384.
[27] KLAMT S, HAUS U U, THEIS F. Hypergraphs and cellular networks[J]. PLoS Computational Biology, 2009, 5(5): e1000385.
[28] ZU C, GAO Y, MUNSELL B, et al. Identifying high order brain connectome biomarkers via learning on hypergraph[C]// 7th International Workshop on Machine Learning in Medical Imaging. Athens, Greece: Springer, 2016: 1-9.
[29] SEN P, NAMATA G, BILGIC M, et al. Collective classification in network data[J]. AI Magazine, 2008, 29(3): 93-106.
[30] CHODROW P S, VELDT N, BENSON A R. Generative hypergraph clustering: From blockmodels to modularity[J]. Science Advances, 2021, 7(28): eabh1303.
[31] FOWLER J H. Legislative cosponsorship networks in the US House and Senate[J]. Social Networks, 2006, 28(4): 454-465.
[32] KIPF T N, WELLING M. Semi-supervised classification with graph convolutional networks[C]// Proceedings of the 5th International Conference on Learning Representations (ICLR). Toulon, France: ICLR, 2017: 2713-2726.
[33] VELIČKOVIĆ P, CUCURULL G, CASANOVA A, et al. Graph attention networks[C]// Proceedings of the 6th International Conference on Learning Representations (ICLR). Vancouver, BC, Canada: ICLR, 2018: 2920-2931.
[34] WANG P H, YANG S H, LIU Y Y, et al. Equivariant hypergraph diffusion neural operators[C]// Proceedings of the 11th International Conference on Learning Representations (ICLR). Kigali, Rwanda: ICLR, 2023: 16983-17006.