
MB-TaylorFormer V2: Improved Multi-branch Linear Transformer Expanded by Taylor Formula for Image Restoration

This repository contains the official implementation of the following paper:

MB-TaylorFormer V2: Improved Multi-branch Linear Transformer Expanded by Taylor Formula for Image Restoration
Zhi Jin, Yuwei Qiu, Kaihao Zhang, Chenxi Wang, Hongdong Li, Wenhan Luo*

Paper Link: [official link]

Overview

[Figure: overall_structure] Architecture of MB-TaylorFormer V2. (a) MB-TaylorFormer V2 uses a multi-branch hierarchical design built on multi-scale patch embedding. (b) Multi-scale patch embedding embeds coarse-to-fine patches. (c) T-MSA++ attains linear computational complexity.
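The linear complexity of T-MSA++ comes from the Taylor expansion of the softmax kernel: approximating exp(q·k) by its low-order Taylor series lets the attention product be reassociated as Q(KᵀV), avoiding the N×N attention matrix. A minimal NumPy sketch of first-order Taylor linear attention (the function name and the single-head, unnormalized form are illustrative assumptions, not the repository's implementation):

```python
import numpy as np

def taylor_linear_attention(Q, K, V):
    """First-order Taylor expansion of softmax attention.

    Replacing exp(q·k) with 1 + q·k lets K^T V be computed once
    and reused for every query, so the cost is O(N d^2) in sequence
    length N instead of the O(N^2 d) of standard attention.
    Assumes Q, K are scaled so 1 + q·k stays positive (in practice
    queries/keys are typically normalized before this step).
    """
    N, d = Q.shape
    # Numerator: sum_j (1 + q_i·k_j) v_j = sum_j v_j + q_i (K^T V)
    kv = K.T @ V                        # (d, d), computed once
    num = V.sum(axis=0) + Q @ kv        # (N, d)
    # Denominator: sum_j (1 + q_i·k_j) = N + q_i (K^T 1)
    den = N + Q @ K.sum(axis=0)         # (N,)
    return num / den[:, None]
```

For moderate sequence lengths this is numerically identical to forming the full kernel matrix `1 + Q @ K.T` and normalizing row-wise; the reordering only changes the cost, not the result.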

Installation

See INSTALL.md for the installation of dependencies required to run MB-TaylorFormerV2.

Prepare pretrained models

Download link: [Baidu Netdisk] (password: pami)

Training and Evaluation

Task       | Training Instructions | Testing Instructions
Dehazing   | Link                  | Link
Deraining  | Link                  | Link
Desnowing  | Link                  | Link
Denoising  | Link                  | Link
Deblurring | Link                  | Link

Citation

If you find our repo useful for your research, please consider citing our paper:

@article{jin2025mb,
title={MB-TaylorFormer V2: Improved Multi-branch Linear Transformer Expanded by Taylor Formula for Image Restoration},
author={Jin, Zhi and Qiu, Yuwei and Zhang, Kaihao and Li, Hongdong and Luo, Wenhan},
journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
year={2025}
}

Acknowledgments

This code is based on Restormer and MPViT.
