Stars
The simplest, fastest repository for training/finetuning medium-sized GPTs.
ChatGLM-6B: An Open Bilingual Dialogue Language Model
Pretrain, finetune ANY AI model of ANY size on 1 or 10,000+ GPUs with zero code changes.
A minimal PyTorch re-implementation of OpenAI GPT (Generative Pretrained Transformer) training
☁️ Build multimodal AI applications with cloud-native stack
Open source code for AlphaFold 2.
Implementation of Graph Convolutional Networks in TensorFlow
A quickstart and benchmark for pytorch distributed training.
An Open-Source Library for Training Binarized Neural Networks
[ICLR 2025] Code for the paper "Implicit Search via Discrete Diffusion: A Study on Chess"
A simple utility to create user-specified git commit hashes