This GammaGL example implements the model proposed in the paper Multi-Scale Contrastive Siamese Networks for Self-Supervised Graph Representation Learning.
Author's code: https://github.com/GRAND-Lab/MERIT
This example was implemented by Ziyu Zheng
The example supports the 'Cora', 'Citeseer' and 'Pubmed' datasets:
Dataset | # Nodes | # Edges | # Classes |
---|---|---|---|
Cora | 2,708 | 10,556 | 7 |
Citeseer | 3,327 | 9,228 | 6 |
Pubmed | 19,717 | 88,651 | 3 |
--input_dim int Input dimension. Default is 1433.
--out_dim int Output dimension. Default is 512.
--proj_size int Encoder output dimension. Default is 512.
--proj_hid int Encoder hidden dimension. Default is 4096.
--pred_size int MLP output dimension. Default is 512.
--pred_hid int MLP hidden dimension. Default is 4096.
--drop_edge_rate_1 float Drop edge ratio 1. Default is 0.2.
--drop_edge_rate_2 float Drop edge ratio 2. Default is 0.2.
--drop_feature_rate_1 float Drop feature ratio 1. Default is 0.5.
--drop_feature_rate_2 float Drop feature ratio 2. Default is 0.5.
--dataset_path str Path to save the dataset. Default is r'../'.
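As a rough sketch of how these flags map onto an argument parser, the snippet below declares them with the defaults listed above (the parser structure is illustrative; it is not copied from `merit_trainer.py`, and flags such as `--dataset`, `--epochs`, `--lr` and `--beta` from the commands below are omitted):

```python
import argparse

def build_parser():
    # Hypothetical parser mirroring the flag list above.
    p = argparse.ArgumentParser(description="MERIT example flags (sketch)")
    p.add_argument("--input_dim", type=int, default=1433)
    p.add_argument("--out_dim", type=int, default=512)
    p.add_argument("--proj_size", type=int, default=512)
    p.add_argument("--proj_hid", type=int, default=4096)
    p.add_argument("--pred_size", type=int, default=512)
    p.add_argument("--pred_hid", type=int, default=4096)
    p.add_argument("--drop_edge_rate_1", type=float, default=0.2)
    p.add_argument("--drop_edge_rate_2", type=float, default=0.2)
    p.add_argument("--drop_feature_rate_1", type=float, default=0.5)
    p.add_argument("--drop_feature_rate_2", type=float, default=0.5)
    p.add_argument("--dataset_path", type=str, default=r"../")
    return p

# Parsing an empty argv yields the documented defaults.
args = build_parser().parse_args([])
```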
In the paper (as well as the authors' repo), training is performed on the full graph.
# use paddle backend
# Cora by GammaGL
TL_BACKEND=paddle python merit_trainer.py --dataset cora --epochs 500 --drop_edge_rate_1 0.2 --drop_edge_rate_2 0.2 --drop_feature_rate_1 0.5 --drop_feature_rate_2 0.5 --lr 3e-4 --beta 0.5
# Citeseer by GammaGL
TL_BACKEND=paddle python merit_trainer.py --dataset citeseer --epochs 500 --drop_edge_rate_1 0.4 --drop_edge_rate_2 0.4 --drop_feature_rate_1 0.5 --drop_feature_rate_2 0.5 --lr 3e-4 --beta 0.6
# use tensorflow backend
# Cora by GammaGL
TL_BACKEND=tensorflow python merit_trainer.py --dataset cora --epochs 500 --drop_edge_rate_1 0.2 --drop_edge_rate_2 0.2 --drop_feature_rate_1 0.5 --drop_feature_rate_2 0.5 --lr 3e-4 --beta 0.5
# Citeseer by GammaGL
TL_BACKEND=tensorflow python merit_trainer.py --dataset citeseer --epochs 500 --drop_edge_rate_1 0.4 --drop_edge_rate_2 0.4 --drop_feature_rate_1 0.5 --drop_feature_rate_2 0.5 --lr 3e-4 --beta 0.6
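The `--drop_edge_rate_*` and `--drop_feature_rate_*` flags control the two augmented graph views that MERIT contrasts. A minimal NumPy sketch of this kind of masking is shown below; the function names are illustrative and are not the GammaGL API:

```python
import numpy as np

def drop_edges(edge_index, drop_rate, rng):
    # edge_index: (2, E) array of source/target node ids.
    # Keep each edge independently with probability 1 - drop_rate.
    keep = rng.random(edge_index.shape[1]) >= drop_rate
    return edge_index[:, keep]

def drop_features(x, drop_rate, rng):
    # Zero out whole feature dimensions with probability drop_rate,
    # applying the same column mask to every node.
    mask = rng.random(x.shape[1]) >= drop_rate
    return x * mask

rng = np.random.default_rng(0)
edges = np.array([[0, 1, 2, 3],
                  [1, 2, 3, 0]])
feats = rng.random((4, 8))

# Two views with different drop rates, as in the commands above.
view1 = drop_edges(edges, 0.2, rng), drop_features(feats, 0.5, rng)
view2 = drop_edges(edges, 0.2, rng), drop_features(feats, 0.5, rng)
```

Each run of the augmentation produces a different random view, which is what gives the Siamese networks two distinct inputs to contrast.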
Dataset | Cora | Citeseer | Pubmed |
---|---|---|---|
Author's Code | 83.1 | 74.0 | 80.2 |
GammaGL(tf) | 84.3 | 72.2 | --.- |
GammaGL(paddle) | 83.1 | --.- | --.- |