NeuroPrim

An attention-based model for solving three spanning tree problems (DCMST, MRCST, STP) in Euclidean space. Implementation of the paper:

Shi Yuchen, Han Congying, Guo Tiande. NeuroPrim: An Attention-based Model for Solving NP-hard Spanning Tree Problems. SCIENCE CHINA Mathematics, 2023.


Dependencies

  • Python 3.10
  • PyTorch 2.0.1
  • TensorBoard 2.10.0
  • tqdm 4.65.0
  • wandb 0.15.4
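
Assuming the standard PyPI package names, the dependencies can be installed with pip, for example:

pip install torch==2.0.1 tensorboard==2.10.0 tqdm==4.65.0 wandb==0.15.4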

Quick Start

Training

With the default settings of --num_batches 2500 and --num_epochs 100, training each problem on instances with 20 vertices takes about 10 hours on a single RTX 3090 GPU.

python generate_data.py --graph_size 20
python train.py --graph_size 20 --degree_constrain 2
python train.py --graph_size 20 --cost_function_key mrcst
python train.py --graph_size 20 --is_stp True
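
Note that --degree_constrain 2 restricts every vertex of the output tree to degree at most 2, so a feasible DCMST solution is a Hamiltonian path. A minimal sketch of such a feasibility check (hypothetical helper, not part of this repository):

from collections import Counter

def satisfies_degree(edges, d):
    # count the degree of every vertex in the edge list
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return all(c <= d for c in deg.values())

# a path on 4 vertices meets the d = 2 constraint
assert satisfies_degree([(0, 1), (1, 2), (2, 3)], d=2)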

Testing

Run test.ipynb to see the performance of pretrained models on random data.
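
As a classical point of comparison on the same kind of random Euclidean instances, the unconstrained minimum spanning tree can be computed with Prim's algorithm. A minimal NumPy sketch (hypothetical code, not part of this repository):

import numpy as np

def prim_mst(points):
    # return the MST edges of the complete Euclidean graph on `points`
    n = len(points)
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    in_tree = np.zeros(n, dtype=bool)
    in_tree[0] = True
    best = dist[0].copy()  # cheapest known connection of each vertex to the tree
    parent = np.zeros(n, dtype=int)
    edges = []
    for _ in range(n - 1):
        v = np.argmin(np.where(in_tree, np.inf, best))  # nearest outside vertex
        edges.append((int(parent[v]), int(v)))
        in_tree[v] = True
        closer = dist[v] < best  # shorter connections through the new vertex
        best[closer] = dist[v][closer]
        parent[closer] = v
    return edges

points = np.random.rand(20, 2)  # 20 vertices in the unit square, as with --graph_size 20
edges = prim_mst(points)
print(sum(np.linalg.norm(points[u] - points[v]) for u, v in edges))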

Example Solutions

Example solutions for DCMST (d = 2), MRCST, and STP are shown as images in the repository.


Citation

If you find our paper or repository helpful for your research or project, please consider citing our paper using the following BibTeX entry:

@article{shi2023neuroprim,
  author = {Shi, Yuchen and Han, Congying and Guo, Tiande},
  title = {{NeuroPrim}: An Attention-based Model for Solving NP-hard Spanning Tree Problems},
  journal = {SCIENCE CHINA Mathematics},
  year = {2023},
  url = {https://www.sciengine.com/SCM/doi/10.1007/s11425-022-2175-5},
  doi = {10.1007/s11425-022-2175-5}
}
