- 2025.06: 🎉 Our paper FREE-Merging has been accepted by ICCV 2025.
This is the official implementation of our paper: FREE-Merging: Fourier Transform for Model Merging with Lightweight Experts. We achieve nearly cost-free model merging with lightweight task-specific experts.
We provide code for merging ViT models, language models, and Large Language Models. We will gradually open-source merging code for more models and continue to improve our code framework.
The workflow of FREE-Merging involves two main steps. First, FR-Merging applies high-pass filtering to remove harmful specialized information from each model, thereby constructing a high-quality merged backbone. Second, it extracts lightweight task-specific experts, which are dynamically added during inference to mitigate the impact of task conflicts.
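The two steps above can be sketched as follows. This is a minimal illustrative example, not the repository's implementation: the function names, the 1-D FFT filtering, and the `keep_ratio`/`sparsity` hyperparameters are all assumptions for exposition; the actual code operates on full model checkpoints.

```python
import numpy as np

def fr_merge(task_vectors, keep_ratio=0.7):
    """Step 1 sketch (FR-Merging): high-pass filter each task vector in the
    Fourier domain, then average the filtered vectors into a merged backbone.
    keep_ratio is an illustrative hyperparameter, not the paper's value."""
    filtered = []
    for tv in task_vectors:
        spectrum = np.fft.fft(tv)
        freqs = np.fft.fftfreq(tv.size)
        # Zero out the low-frequency band (the "harmful specialized" part
        # in this toy analogy); keep the high-frequency remainder.
        cutoff = (1.0 - keep_ratio) * np.abs(freqs).max()
        spectrum[np.abs(freqs) < cutoff] = 0.0
        filtered.append(np.fft.ifft(spectrum).real)
    return np.mean(filtered, axis=0)

def extract_expert(task_vector, sparsity=0.9):
    """Step 2 sketch: a lightweight expert that keeps only the
    largest-magnitude entries of a task vector (top-k by absolute value)."""
    k = int(task_vector.size * (1.0 - sparsity))
    threshold = np.sort(np.abs(task_vector))[::-1][max(k - 1, 0)]
    return np.where(np.abs(task_vector) >= threshold, task_vector, 0.0)

# Toy usage: three "task vectors" standing in for fine-tuned weight deltas.
rng = np.random.default_rng(0)
task_vectors = [rng.normal(size=64) for _ in range(3)]
backbone = fr_merge(task_vectors)          # shared merged backbone
expert = extract_expert(task_vectors[0])   # sparse expert, added at inference
```

At inference time, the backbone would be combined with the expert matching the incoming task (e.g. `merged = base_weights + backbone + expert`), so each expert costs only its sparse nonzero entries.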
Please follow task_vectors and EMR-Merging to install the dependencies.
```
torch==2.3.1
torchvision==0.18.1
transformers==4.37.2
transformers-stream-generator==0.0.5
numpy==1.26.4
triton==2.3.1
typer==0.12.3
visualizer==0.0.1
wandb==0.18.5
fastapi==0.111.0
fastapi-cli==0.0.4
```
Our implementation references the code below; we thank the authors for their work.
- Task-Vectors: Editing Models with Task Arithmetic
- EMR-Merging: Tuning-Free High-Performance Model Merging
- Twin-Merging: Dynamic Integration of Modular Expertise in Model Merging
- Ties-Merging: https://github.com/prateeky2806/ties-merging/tree/main
- MergeKit: arcee-ai/mergekit, tools for merging pretrained large language models
- BEiT-3: https://github.com/microsoft/unilm/tree/master/beit3
If you find this project helpful, feel free to cite our paper:
```bibtex
@article{zheng2024free,
  title={FREE-Merging: Fourier Transform for Model Merging with Lightweight Experts},
  author={Zheng, Shenghe and Wang, Hongzhi},
  journal={arXiv preprint arXiv:2411.16815},
  year={2024}
}
```
