MB-TaylorFormer V2: Improved Multi-branch Linear Transformer Expanded by Taylor Formula for Image Restoration
This repository contains the official implementation of the following paper:
MB-TaylorFormer V2: Improved Multi-branch Linear Transformer Expanded by Taylor Formula for Image Restoration
Zhi Jin, Yuwei Qiu, Kaihao Zhang, Chenxi Wang, Hongdong Li, Wenhan Luo*
Paper Link: [official link]
Architecture of MB-TaylorFormer V2. (a) MB-TaylorFormer V2 consists of a multi-branch hierarchical design built on multi-scale patch embedding. (b) Multi-scale patch embedding embeds coarse-to-fine patches. (c) T-MSA++ achieves linear computational complexity.
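The key idea behind T-MSA++, using the Taylor formula to approximate softmax attention so that complexity scales linearly with the number of tokens, can be sketched as follows. This is a minimal NumPy illustration of first-order Taylor-expanded attention only; the function name is illustrative, and the actual T-MSA++ module in this repository differs (e.g. multi-branch design and additional corrections):

```python
import numpy as np

def taylor_linear_attention(Q, K, V):
    """Single-head attention with the softmax kernel exp(q.k)
    approximated by its first-order Taylor expansion 1 + q.k.

    This factorization computes K^T V once (d x d), so the cost is
    O(N * d^2) in the token count N, instead of O(N^2 * d) for the
    full softmax attention. Shapes: Q, K are (N, d); V is (N, d_v).
    """
    N, d = Q.shape
    Q = Q / np.sqrt(d)  # usual scaled dot-product normalization
    # Numerator: sum_j (1 + q.k_j) v_j = sum_j v_j + Q @ (K^T V)
    kv = K.T @ V                    # (d, d_v), computed once
    num = V.sum(axis=0) + Q @ kv    # (N, d_v)
    # Denominator: sum_j (1 + q.k_j) = N + Q @ (sum_j k_j)
    den = N + Q @ K.sum(axis=0)     # (N,)
    return num / den[:, None]
```

Note that, unlike the true softmax kernel, the first-order term `1 + q.k` is not guaranteed to be positive; handling this is one of the issues the full method addresses beyond this sketch.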
See INSTALL.md for the installation of the dependencies required to run MB-TaylorFormer V2.
Download Links: [Baidu Netdisk] password: pami
| Task | Training Instructions | Testing Instructions |
|---|---|---|
| Dehazing | Link | Link |
| Deraining | Link | Link |
| Desnowing | Link | Link |
| Denoising | Link | Link |
| Deblurring | Link | Link |
If you find our repo useful for your research, please consider citing our paper:
@article{jin2025mb,
title={MB-TaylorFormer V2: Improved Multi-branch Linear Transformer Expanded by Taylor Formula for Image Restoration},
author={Jin, Zhi and Qiu, Yuwei and Zhang, Kaihao and Li, Hongdong and Luo, Wenhan},
journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
year={2025}
}