SweepNet

Mingrui Zhao · Yizhi Wang · Fenggen Yu · Changqing Zou · Ali Mahdavi-Amiri
European Conference on Computer Vision (ECCV), 2024

arXiv | Project Page

This is the official implementation for SweepNet.

Environment Setup

We recommend using Anaconda for managing the environment. Execute the following commands to create and activate the SweepNet environment:

conda create --name sweepnet python=3.8
conda activate sweepnet
pip install -r requirements.txt
pip install torch_geometric
pip install pyg_lib torch_scatter torch_sparse torch_cluster torch_spline_conv -f https://data.pyg.org/whl/torch-2.2.0+cu121.html
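After installation, a quick sanity check (a sketch; adjust the module list as needed) confirms the key packages are importable before training:

```python
import importlib.util

def check_modules(names):
    """Map each module name to whether it can be found in this environment."""
    return {name: importlib.util.find_spec(name) is not None for name in names}

# Modules installed by the commands above.
for name, ok in check_modules(["torch", "torch_geometric", "torch_scatter"]).items():
    print(f"{name}: {'OK' if ok else 'MISSING'}")
```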

Dataset

Download the preprocessed GC-objects and quadruped datasets from Hugging Face or OneDrive.

Unzip the files and place them in ./data.
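Assuming the downloads are zip archives (the archive names here are placeholders), extraction can be scripted with the standard library — a minimal sketch:

```python
import pathlib
import zipfile

def extract_to_data(archive_path, data_dir="./data"):
    """Unzip one dataset archive into the data directory, creating it if needed."""
    pathlib.Path(data_dir).mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(archive_path) as zf:
        zf.extractall(data_dir)

# e.g. extract_to_data("gc_objects.zip")  # placeholder file name
```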

Run the following command to enable branch-wise initialisation.

python misc/get_skeletons.py

The GC-objects models are sourced from OreX, GCD, and various internet sources. The quadruped dataset is adapted from this paper.

Please cite these sources if you use our processed dataset.

Neural Sweeper

Download the pretrained checkpoint of the neural sweeper here and place the checkpoint in the ./neural_sweeper/ckpt directory.

The training data is available on Hugging Face and OneDrive. Please refer to our paper for data preparation details.

Train the Model

Voxel input

Execute the following command to train SweepNet on a single shape with voxel input. The results will be stored in the ./results directory.

python train.py --config_path ./configs/default.json

Pointcloud input

Execute the following command to train SweepNet on a single shape with point cloud input. The results will be stored in the ./results directory.

python train.py --config_path ./configs/pcd.json

Execute the following command to train SweepNet with different numbers of available primitives over the GC-objects dataset. The results will be stored in the ./results directory.

bash script/ablate_num_prim.sh

Configuration options, including the target shape name, can be adjusted within the ./configs/default.json file.
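The keys below are hypothetical placeholders — open ./configs/default.json to see the actual field names. A small sketch for deriving per-shape variants of a base config:

```python
import json

# Hypothetical field names; the real config file may use different keys.
base_cfg = {"shape_name": "chair", "input_type": "voxel", "num_primitives": 8}

def make_variant(cfg, **overrides):
    """Return a copy of the config with the given fields overridden."""
    out = dict(cfg)
    out.update(overrides)
    return out

variant = make_variant(base_cfg, shape_name="quadruped_01")
print(json.dumps(variant, indent=2))
```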

Produce sweep surfaces from primitive parameters

Execute the command below to reproduce sweep surfaces from their parameters.

python misc/produce_model_from_parameters.py

Try some other primitive parameters we provided on the project page for fun! 😄
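For intuition, a sweep surface is produced by moving a 2D profile along an axis curve. A stdlib-only sketch (not the repository's implementation) that samples a circle of varying radius swept along a straight z-axis:

```python
import math

def sweep_circle_along_z(radius_fn, height, n_u=16, n_v=16):
    """Sample points on the surface swept by a circle whose radius varies with z."""
    points = []
    for i in range(n_v):
        z = height * i / (n_v - 1)       # position along the (straight) axis
        r = radius_fn(z)                 # profile scaling at this station
        for j in range(n_u):
            a = 2.0 * math.pi * j / n_u  # angle around the profile circle
            points.append((r * math.cos(a), r * math.sin(a), z))
    return points

pts = sweep_circle_along_z(lambda z: 1.0 + 0.25 * math.sin(z), height=2.0)
```

SweepNet's primitives generalise this idea to curved axes and richer, learnable profiles.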

Limitations

SweepNet prefers models exhibiting sweep elements and may converge to local optima under different initializations. If the abstraction result is unsatisfactory, try multiple runs to obtain a better fit:

bash script/train_multiple_runs.sh
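A hypothetical sketch of what such a multi-run driver does (the provided script may differ): build several identical train.py invocations, reusing the flag from the commands above, and launch them in sequence:

```python
import subprocess

def build_commands(n_runs, config="./configs/default.json"):
    """Build one train.py command per run; each can be launched with subprocess.run."""
    return [["python", "train.py", "--config_path", config] for _ in range(n_runs)]

for cmd in build_commands(3):
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # uncomment to actually launch
```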

Updates: implemented branch-wise initialisation for faster inference and stabilised convergence.

Acknowledgements

Our codebase is developed based on ExtrudeNet, POCO, and CAPRI-NET. The data pre-processing code is available at D2CSG.

Related Works

If you are interested in sweeping-based shape representation, please also check out SECAD-Net with neural profiles and ExtrudeNet with Bézier polygon profiles.

Citation

If you find our work interesting, please cite:

@inproceedings{zhao2024sweepnet,
  title={SweepNet: Unsupervised Learning Shape Abstraction via Neural Sweepers},
  author={Zhao, Mingrui and Wang, Yizhi and Yu, Fenggen and Zou, Changqing and Mahdavi-Amiri, Ali},
  booktitle={European Conference on Computer Vision (ECCV)},
  year={2024},
  organization={Springer}
}
