Stars
A collection of AWESOME things about mixture-of-experts
Machine Learning Engineering Open Book
PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538
Implementation of Soft MoE, proposed by Brain's Vision team, in PyTorch
[ECCV 2024] Official PyTorch implementation of "Switch Diffusion Transformer: Synergizing Denoising Tasks with Sparse Mixture-of-Experts"
A beautiful, simple, clean, and responsive Jekyll theme for academics
BenchMARL is a library for benchmarking Multi-Agent Reinforcement Learning (MARL). BenchMARL lets you quickly compare different MARL algorithms, tasks, and models while being systematically ground…
This is the official implementation of Multi-Agent PPO (MAPPO).
PoolFormer: MetaFormer Is Actually What You Need for Vision (CVPR 2022 Oral)
Adaptive FNO transformer - official PyTorch implementation
[NeurIPS 2021] [T-PAMI] Global Filter Networks for Image Classification
[TMLR] "SC2 Benchmark: Supervised Compression for Split Computing"
Official code for Conformer: Local Features Coupling Global Representations for Visual Recognition
[NeurIPS 2019] This is the code repo of our novel passport-based DNN ownership verification schemes, i.e., we embed a passport layer into various deep learning architectures (e.g., AlexNet, ResNet) for…
PyTorch implementations of Generative Adversarial Networks.
LPIPS metric. pip install lpips
This is the official implementation of NNSplitter (ICML'23)
A Collection of Variational Autoencoders (VAE) in PyTorch.
This is a term project for ELE851 - Detection & Estimation Theory - Spring 2021
[NeurIPS 2022] DataMUX: Data Multiplexing for Neural Networks
Code for paper "Orthogonal Convolutional Neural Networks".
《代码随想录》LeetCode problem-solving guide: a recommended order for 200 classic problems, 600,000+ words of detailed illustrated explanations, video breakdowns of the tricky parts, 50+ mind maps, and solutions in C++, Java, Python, Go, JavaScript, and more. Never feel lost learning algorithms again! 🔥🔥 Take a look and you'll wish you had found it sooner! 🚀