Neural Garment Dynamics via Manifold-Aware Transformers


This repository provides the implementation of Manifold-Aware Transformers, a novel neural network architecture for predicting the dynamics of garments. It accompanies our paper Neural Garment Dynamics via Manifold-Aware Transformers, published at EUROGRAPHICS 2024.

Prerequisites

This code has been tested on Ubuntu 20.04. Before starting, set up your Anaconda environment with:

conda env create -f environment.yml
conda activate manifold-aware-transformers

Alternatively, you may install the following packages (and their dependencies) manually:

  • numpy == 1.23.1 (numpy.bool was removed in later versions, which causes an error when loading the SMPL model)
  • pytorch == 2.0.1
  • scipy >= 1.10.1
  • cholespy == 1.0.0
  • scikit-sparse == 0.4.4
  • libigl == 2.4.1
  • tensorboard >= 2.12.1
  • tqdm >= 4.65.0
  • chumpy == 0.70
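If you install packages manually, a quick sanity check can save a confusing failure later: numpy.bool was removed in NumPy 1.24, which breaks unpickling the SMPL model. The helper below is a hypothetical snippet (not part of this repo) that checks whether an installed NumPy version predates that removal:

```python
# Hypothetical helper (not part of this repo): numpy.bool was removed in
# NumPy 1.24, so any version below 1.24 works with the SMPL loader.
def smpl_compatible(numpy_version: str) -> bool:
    major, minor = (int(x) for x in numpy_version.split(".")[:2])
    return (major, minor) < (1, 24)

print(smpl_compatible("1.23.1"))  # True  (pinned version above)
print(smpl_compatible("1.24.0"))  # False (numpy.bool removed)
```

Run it against `numpy.__version__` after activating your environment.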

Quick Start

We provide several pre-trained models trained on different datasets. Download the pre-trained models and the example sequences from Google Drive, then extract them into the pre-trained and data directories, respectively, directly under the project root.

Garment Prediction

Run demo.sh.

The network's prediction will be stored in [path to pre-trained model]/sequence/prediction.pkl. The corresponding ground truth and body motion are stored in gt.pkl and body.pkl, respectively. See the Mesh Sequence Format and Visualization section below for the specification and visualization of the predicted mesh sequence.

Mesh Sequence Format and Visualization

We use a custom format to store a sequence of meshes. The specific format can be found in the function write_vert_pos_pickle() in mesh_utils.py.
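As a rough, self-contained illustration of the idea (the actual layout is whatever write_vert_pos_pickle() emits; the array shape below is an assumption, not the repo's format), a mesh sequence boils down to pickled per-frame vertex positions:

```python
import os
import pickle
import tempfile

import numpy as np

# Assumed layout for illustration only: a (frames, vertices, 3) array of
# per-frame vertex positions. See write_vert_pos_pickle() in mesh_utils.py
# for the format actually used by this repo.
sequence = np.zeros((10, 5000, 3), dtype=np.float32)

path = os.path.join(tempfile.gettempdir(), "sequence_demo.pkl")
with open(path, "wb") as f:
    pickle.dump(sequence, f)

with open(path, "rb") as f:
    loaded = pickle.load(f)
print(loaded.shape)  # (10, 5000, 3)
```

The same pickle.load() pattern applies when inspecting prediction.pkl, gt.pkl, or body.pkl produced by the demo.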

We provide a plugin for visualizing the mesh sequences directly in Blender, available here. It is based on the STOP-motion-OBJ plugin by @neverhood311.

Evaluation

Coming soon.

Data Preprocessing

The input garment geometry is decimated to improve runtime efficiency. This step requires a separate module implemented in C++.

Our model takes as input the signed distance function from the garment geometry to the body geometry. It is computed on the fly at inference time using a highly optimized GPU implementation.
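For intuition only (this is not the repo's GPU implementation), the signed-distance convention — negative inside the body, zero on its surface, positive outside — can be sketched with an analytic sphere standing in for the body geometry:

```python
import numpy as np

def sphere_sdf(points, center, radius):
    # Signed distance to a sphere: negative inside, zero on the surface,
    # positive outside — the usual SDF sign convention.
    return np.linalg.norm(points - center, axis=-1) - radius

center = np.zeros(3)
garment_pts = np.array([[0.0, 0.0, 0.0],   # inside the body proxy
                        [2.0, 0.0, 0.0]])  # outside
print(sphere_sdf(garment_pts, center, 1.0))  # [-1.  1.]
```

For an actual mesh-to-mesh signed distance on the CPU, the libigl Python bindings (already in the dependency list) provide igl.signed_distance.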

Inference expects data in the same format, so the preprocessing steps below can also be used to prepare data for inference.

VTO dataset

Please download the VTO dataset from here, and run the following command to preprocess the data:

python parse_data_vto.py --data_path_prefix=[path to downloaded vto dataset] --save_path=[path to save the preprocessed data]

Cloth3D dataset

Coming soon.

Train from scratch

Coming soon.

Acknowledgments

The code in dataset/smpl.py is adapted from SMPL by @CalciferZh.

Citation

If you use this code for your research, please cite our paper:

@article{Li2024NeuralGarmentDynamics,
  author  = {Li, Peizhuo and Wang, Tuanfeng Y. and Kesdogan, Timur Levent and Ceylan, Duygu and Sorkine-Hornung, Olga},
  title   = {Neural Garment Dynamics via Manifold-Aware Transformers},
  journal = {Computer Graphics Forum (Proceedings of EUROGRAPHICS 2024)},
  volume  = {43},
  number  = {2},
  year    = {2024},
}
