SPIFFNet

SPIFFNet Codes

Cross-Spatial Pixel Integration and Cross-Stage Feature Fusion Based Transformer Network for Remote Sensing Image Super-Resolution

Official Pytorch implementation of the paper "Cross-Spatial Pixel Integration and Cross-Stage Feature Fusion Based Transformer Network for Remote Sensing Image Super-Resolution".

Abstract: Remote sensing image super-resolution (RSISR) plays a vital role in enhancing spatial details and improving the quality of satellite imagery. Recently, Transformer-based models have shown competitive performance in RSISR. To mitigate the quadratic computational complexity of global self-attention, various methods constrain attention to a local window, improving efficiency. Consequently, the receptive field of a single attention layer is inadequate, leading to insufficient context modeling. Furthermore, while most Transformer-based approaches reuse shallow features through skip connections, relying solely on these connections treats shallow and deep features equally, impeding the model's ability to characterize them. To address these issues, we propose a novel Transformer architecture called the Cross-Spatial Pixel Integration and Cross-Stage Feature Fusion Based Transformer Network (SPIFFNet) for RSISR. Our proposed model effectively enhances global cognition and understanding of the entire image, facilitating efficient integration of features across stages. The model incorporates cross-spatial pixel integration attention (CSPIA) to introduce contextual information into a local window, while cross-stage feature fusion attention (CSFFA) adaptively fuses features from the previous stage to improve feature expression in line with the requirements of the current stage. We conducted comprehensive experiments on multiple benchmark datasets, demonstrating the superior performance of the proposed SPIFFNet in terms of both quantitative metrics and visual quality compared with state-of-the-art methods.
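The exact CSPIA and CSFFA modules are defined in the paper and in this repository's code. Purely as an illustration of the general idea behind cross-stage fusion — gating previous-stage features with an adaptively computed weight instead of adding them through a plain skip connection — here is a toy NumPy sketch. The shapes, the global pooling, and the sigmoid gate are illustrative assumptions, not the actual SPIFFNet implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def naive_cross_stage_fusion(prev_feat, curr_feat, w):
    """Toy stand-in for cross-stage feature fusion: rather than a
    plain skip connection (prev + curr), compute a per-channel gate
    conditioned on the current stage and use it to re-weight the
    previous stage's features before merging them.

    prev_feat, curr_feat: (C, H, W) feature maps
    w: (C, C) toy projection producing the gate logits
    """
    # Global average pooling over spatial dimensions -> (C,)
    pooled = curr_feat.mean(axis=(1, 2))
    # Per-channel gate in (0, 1), conditioned on the current stage
    gate = sigmoid(w @ pooled)
    # Re-weight previous-stage features channel-wise, then merge
    return curr_feat + gate[:, None, None] * prev_feat
```

A plain skip connection corresponds to forcing the gate to 1 for every channel; the point of the adaptive gate is that the current stage decides how much of each previous-stage channel it needs.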

Requirements

  • Python 3.6+
  • PyTorch >= 1.6
  • torchvision >= 0.7.0
  • einops
  • matplotlib
  • opencv-python (cv2)
  • scipy
  • tqdm
  • scikit-image
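Assuming a pip-based environment, the requirements above can be installed with something like the following (the package names for `cv2` and `scikit` are our best guesses — `opencv-python` and `scikit-image` respectively):

```shell
pip install "torch>=1.6" "torchvision>=0.7.0" einops matplotlib opencv-python scipy tqdm scikit-image
```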

Installation

Clone or download this repository and install the aforementioned requirements:

cd codes

Train

Download the UCMerced dataset [Baidu Drive, password: terr] [Google Drive] and the AID dataset [Baidu Drive, password: id1n] [Google Drive]. Both have been split into train/val/test sets; the original images serve as the HR references, and the corresponding LR images are generated by bicubic down-sampling.
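The datasets already ship with pre-generated LR images. For reference only, a down-sampling step of this kind can be sketched with SciPy's cubic-spline zoom (an approximation of bicubic resampling — the exact kernel used to build the datasets is not specified here):

```python
import numpy as np
from scipy.ndimage import zoom

def make_lr(hr, scale):
    """Generate a low-resolution image from an HR reference by
    cubic-spline down-sampling (order=3, bicubic-like).

    hr: (H, W) or (H, W, C) float array; H and W must be divisible by scale.
    """
    h, w = hr.shape[:2]
    assert h % scale == 0 and w % scale == 0, "HR size must be divisible by scale"
    # Down-scale spatial dimensions only; leave any channel axis untouched
    factors = (1.0 / scale, 1.0 / scale) + (1,) * (hr.ndim - 2)
    return zoom(hr, factors, order=3)
```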

# x4
python demo_train.py --model=SPIFFNET --dataset=UCMerced --scale=4 --train_patch_size=192 --test_patch_size=256 --ext=img --save=SPIFFNETx4_UCMerced
# x3
python demo_train.py --model=SPIFFNET --dataset=UCMerced --scale=3 --train_patch_size=144 --test_patch_size=255 --ext=img --save=SPIFFNETx3_UCMerced
# x2
python demo_train.py --model=SPIFFNET --dataset=UCMerced --scale=2 --train_patch_size=96  --test_patch_size=256 --ext=img --save=SPIFFNETx2_UCMerced

Download the trained models used in this paper [Baidu Drive, password: abbt].

The train/val data paths are set in `data/__init__.py`.

Test

The test data path and the save path can be edited in demo_test.py

# x4
python demo_test.py --model=SPIFFNET --scale=4 --test_patch_size=256 --test_block=True
# x3
python demo_test.py --model=SPIFFNET --scale=3 --test_patch_size=255 --test_block=True
# x2
python demo_test.py --model=SPIFFNET --scale=2 --test_patch_size=256 --test_block=True

Evaluation

Compute the evaluation results in terms of PSNR and SSIM; the SR/HR paths can be edited in calculate_PSNR_SSIM.py:

cd metric_scripts 
python calculate_PSNR_SSIM.py
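calculate_PSNR_SSIM.py implements the metrics used in the paper. As a reference for what is being computed, a minimal PSNR function for 8-bit images looks like this (SSIM is more involved and is typically taken from a library such as scikit-image):

```python
import numpy as np

def psnr(sr, hr, max_val=255.0):
    """Peak signal-to-noise ratio between an SR result and its HR
    reference: 10 * log10(MAX^2 / MSE). Inputs are arrays of the
    same shape; identical images give infinity."""
    sr = sr.astype(np.float64)
    hr = hr.astype(np.float64)
    mse = np.mean((sr - hr) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)
```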

Acknowledgements

This code is built on HSENET (PyTorch) and TRANSENET (PyTorch). The LAM results in this paper are generated with LAM_DEMO (PyTorch). We thank the authors for sharing their code.
