This repository contains the implementation of CTGC (Contrastive Graph Condensation), a self-supervised graph condensation approach designed to efficiently handle diverse downstream tasks.
All experiments are implemented in Python 3.9 with PyTorch 1.11.0.
To install the required dependencies, run:

```
$ pip install -r requirements.txt
```

The following are step-by-step instructions for our proposed method, CTGC.
Step 1: Set the dataset folder path with the `--data_dir` argument (default: `./data/`).

- For Cora and CiteSeer, the code downloads the datasets directly from PyG (see the loading sketch below).
- For Ogbn-arxiv and Reddit, we use the datasets provided by GraphSAINT, which are available on Google Drive or BaiduYun (code: f1ao).
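For reference, the PyG download amounts to roughly the following. This is a minimal sketch assuming the standard `Planetoid` loader; the root path simply mirrors the `--data_dir` default, and the print line is illustrative:

```python
# Minimal sketch: fetching Cora/CiteSeer through PyG's Planetoid loader.
# The root path mirrors the --data_dir default; adapt as needed.
from torch_geometric.datasets import Planetoid

dataset = Planetoid(root="./data/", name="Cora")  # "CiteSeer" also works
data = dataset[0]
print(data.num_nodes, data.num_edges, dataset.num_classes)
```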
Step 2: Prepare the dataset and perform eigenvalue decomposition by running:

```
$ bash ./scr/run_preprocess.sh
```

- Split data files will be saved in the folder `./dataset_split/`.
- Eigenvectors and eigenvalues will be saved in the folder `./save_eigen/`.
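Conceptually, this step computes the leading eigenpairs of the graph Laplacian. Below is a minimal sketch of that computation, assuming a normalized Laplacian and SciPy's sparse solver; the toy graph and the number of eigenpairs `k` are illustrative, not the repository's settings:

```python
# Minimal sketch of spectral preprocessing: the k smallest eigenpairs of the
# normalized graph Laplacian. All names and values here are illustrative.
import scipy.sparse as sp
from scipy.sparse.csgraph import laplacian
from scipy.sparse.linalg import eigsh

adj = sp.random(1000, 1000, density=0.01, format="csr")
adj = adj + adj.T                      # symmetrize the toy adjacency matrix
L = laplacian(adj, normed=True)        # L = I - D^{-1/2} A D^{-1/2}
eigenvalues, eigenvectors = eigsh(L, k=128, which="SM")
print(eigenvalues[:5])                 # smallest eigenvalue is ~0
```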
Step 3: Pre-train the semantic relay model using the following command:

```
$ bash ./scr/run_pretrain.sh
```

The pre-trained relay model will be saved in the folder `./save_pretrain_model/`.
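Since CTGC is contrastive (see the citation below), the pre-training objective belongs to the InfoNCE family. The sketch below is a generic version of such a loss, not the repository's exact objective; the temperature, embedding sizes, and random inputs are placeholders:

```python
# Minimal sketch of an InfoNCE-style contrastive loss over two views of the
# same nodes. Generic illustration only; hyperparameters are placeholders.
import torch
import torch.nn.functional as F

def info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """z1, z2: [N, d] embeddings of two augmented views of the same N nodes."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau             # [N, N] cosine similarities
    labels = torch.arange(z1.size(0))      # positive pairs on the diagonal
    return F.cross_entropy(logits, labels)

loss = info_nce(torch.randn(256, 64), torch.randn(256, 64))
```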
Step 4: Iteratively train the semantic and structural relay models by executing:

```
$ bash ./scr/run_relay.sh
```

The trained relay models and centroid embeddings will be stored in the directory `./save_pretrain_model/`.
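The alternating update behind this step can be pictured as follows. This is a schematic sketch only: the two `nn.Linear` stand-ins, the mean-squared placeholder losses, and the iteration count are all hypothetical, not the repository's models or objectives:

```python
# Schematic sketch of alternating optimization: update one relay model while
# treating the other's outputs as fixed targets, then swap. All modules and
# losses here are placeholders.
import torch

semantic_relay = torch.nn.Linear(64, 32)      # placeholder model
structural_relay = torch.nn.Linear(64, 32)    # placeholder model
opt_sem = torch.optim.Adam(semantic_relay.parameters(), lr=1e-3)
opt_str = torch.optim.Adam(structural_relay.parameters(), lr=1e-3)
x = torch.randn(128, 64)                      # toy node features

for _ in range(100):
    # (1) semantic step: structural outputs act as fixed targets
    with torch.no_grad():
        target = structural_relay(x)
    loss_sem = (semantic_relay(x) - target).pow(2).mean()
    opt_sem.zero_grad(); loss_sem.backward(); opt_sem.step()
    # (2) structural step: semantic outputs act as fixed targets
    with torch.no_grad():
        target = semantic_relay(x)
    loss_str = (structural_relay(x) - target).pow(2).mean()
    opt_str.zero_grad(); loss_str.backward(); opt_str.step()
```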
Step 5: Generate the condensed graph by running:

```
$ bash ./scr/run_condense.sh
```

The generated graph will be saved in the folder `./save_condensed_data/`.
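The typical downstream use of a condensed graph is to train a small GNN on it and then evaluate on the original test set. The sketch below shows that pattern with toy tensors; the GCN architecture, sizes, and training schedule are illustrative and not tied to the repository's evaluation scripts:

```python
# Minimal sketch: train a small GCN on a (toy) condensed graph. The tensors
# and hyperparameters are stand-ins for what run_condense.sh produces.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv

class GCN(torch.nn.Module):
    def __init__(self, in_dim: int, hid: int, n_cls: int):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hid)
        self.conv2 = GCNConv(hid, n_cls)

    def forward(self, x, edge_index):
        return self.conv2(F.relu(self.conv1(x, edge_index)), edge_index)

x = torch.randn(50, 64)                       # 50 condensed nodes, 64-d features
edge_index = torch.randint(0, 50, (2, 200))   # toy condensed edges
y = torch.randint(0, 7, (50,))                # 7 classes

model = GCN(64, 32, 7)
opt = torch.optim.Adam(model.parameters(), lr=0.01)
for _ in range(200):
    loss = F.cross_entropy(model(x, edge_index), y)
    opt.zero_grad(); loss.backward(); opt.step()
```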
The results for the pre-trained relay model, the semantic relay model, and the condensed graph (Steps 3–5) are recorded in CSV files in the folder `./results_proposed/`.
To evaluate the models trained on the original graph, run:

```
$ bash ./scr/run_whole.sh
```

The results are recorded in the folder `./results_whole/`.
If you find this repository useful, please cite our paper:

```
@inproceedings{gao2024contrastive,
  title={Contrastive graph condensation: Advancing data versatility through self-supervised learning},
  author={Gao, Xinyi and Li, Yayong and Chen, Tong and Ye, Guanhua and Zhang, Wentao and Yin, Hongzhi},
  booktitle={Proceedings of the 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining},
  year={2025}
}
```
