Multi-level Graph Knowledge Contrastive Learning

Haoran Yang, Yuhao Wang, Xiangyu Zhao*, Hongxu Chen, Hongzhi Yin, Qing Li*, Guandong Xu*

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

1 Citation (Scopus)

Abstract

Graph Contrastive Learning (GCL) stands as a potent framework for unsupervised graph representation learning that has gained traction across numerous graph learning applications. The effectiveness of GCL relies on generating high-quality contrasting samples, enhancing the model's ability to discern graph semantics. However, the prevailing GCL methods face two key challenges that degrade model performance: 1) introducing noise during graph augmentations and 2) requiring additional storage for generated samples. In this paper, we propose novel approaches, GKCL (i.e., Graph Knowledge Contrastive Learning) and DGKCL (i.e., Distilled Graph Knowledge Contrastive Learning), that leverage multi-level graph knowledge to create noise-free contrasting pairs. This framework not only addresses the noise-related challenges but also circumvents excessive storage demands. Furthermore, our method incorporates a knowledge distillation component to optimize the trained embedding tables, reducing the model's scale while ensuring superior performance, particularly in scenarios with smaller embedding sizes. Comprehensive experimental evaluations on three public benchmark datasets underscore the merits of our proposed method and elucidate its properties, in particular how performance varies with different embedding sizes and how the distillation weight affects overall performance.
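To make the two components described in the abstract concrete, the sketch below illustrates, in PyTorch, the general shape of such an objective: an InfoNCE-style contrastive loss between two embedding views (e.g., node embeddings and a knowledge-derived view) plus a distillation term, weighted by a distillation weight, that pulls a smaller student embedding table toward a larger trained teacher table. This is a minimal illustration under assumed names and design choices (info_nce, total_loss, alpha, the MSE distillation form, and shared embedding dimensionality are all assumptions), not the authors' implementation of GKCL/DGKCL.

```python
# Minimal sketch, NOT the paper's implementation: a contrastive term on paired
# embedding views plus a weighted knowledge-distillation term on embedding tables.
import torch
import torch.nn.functional as F


def info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """InfoNCE-style loss: row i of z1 and row i of z2 form the positive pair;
    all other rows act as negatives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau              # pairwise cosine similarities scaled by temperature
    labels = torch.arange(z1.size(0))       # i-th row should match i-th column
    return F.cross_entropy(logits, labels)


def total_loss(z_node: torch.Tensor,
               z_knowledge: torch.Tensor,
               student_emb: torch.Tensor,
               teacher_emb: torch.Tensor,
               alpha: float = 0.3) -> torch.Tensor:
    """Contrastive term on (node, knowledge-view) pairs plus an alpha-weighted
    distillation term; `alpha` plays the role of a distillation weight.
    Assumes student and teacher embeddings share the same dimensionality."""
    l_con = info_nce(z_node, z_knowledge)
    l_kd = F.mse_loss(student_emb, teacher_emb.detach())
    return l_con + alpha * l_kd


if __name__ == "__main__":
    n, d = 128, 64
    z_node, z_knowledge = torch.randn(n, d), torch.randn(n, d)
    student = torch.randn(n, d, requires_grad=True)
    teacher = torch.randn(n, d)
    print(total_loss(z_node, z_knowledge, student, teacher).item())
```

In practice, the distillation weight (here `alpha`) trades off how strongly the compact embedding table is anchored to the teacher versus how freely it is shaped by the contrastive objective, which is the kind of sensitivity the paper's experiments examine.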
Original language: English
Pages (from-to): 8829-8841
Journal: IEEE Transactions on Knowledge and Data Engineering
Volume: 36
Issue number: 12
Online published: 26 Sept 2024
DOIs
Publication status: Published - Dec 2024

Funding

This work was supported in part by the National Natural Science Foundation of China under Grant 62072257, in part by Australian Research Council under Grant DP22010371 and Grant LE220100078, in part by Hong Kong Research Grants Council under the General Research Fund under Grant 15200023, in part by Research Impact Fund under Grant R1015-23, in part by APRC - CityU New Research Initiatives under Grant 9610565, Start-up Grant for New Faculty of CityU, in part by CityU - HKIDS Early Career Research under Grant 9360163, in part by Hong Kong ITC Innovation and Technology Fund Midstream Research Programme for Universities Project under Grant ITS/034/22MS, in part by Hong Kong Environmental and Conservation Fund under Grant 88/2022, in part by SIRG - CityU Strategic Interdisciplinary Research under Grant 7020046, in part by Huawei (Huawei Innovation Research Program), Tencent (CCF-Tencent Open Fund, Tencent Rhino-Bird Focused Research Program), Ant Group (CCF-Ant Research Fund, Ant Group Research Fund), Alibaba (CCF-Alimama Tech Kangaroo Fund under Grant 2024002), CCF-Bai Chuan-Ebtech Foundation Model Fund, and Kuaishou.

Research Keywords

  • Graph Contrastive Learning
  • Graph Representation Learning
  • Knowledge Distillation
