APANet: Asymmetrical Parallax Attention Network for Efficient Stereo Image Deraining

Chenglong Wang, Tao Yan*, Weilong Huang, Xianglong Chen, Ke Xu, Xiaojun Chang

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

1 Citation (Scopus)

Abstract

Recently, several stereo image deraining methods have been proposed to recover clean backgrounds from rainy stereo images by exploring and exploiting intra- and inter-view information. Although these methods have achieved great progress, they under-utilize the parallax information of the input images and do not take advantage of existing high-quality and abundant single-image rainy datasets for learning. In this paper, we propose an effective and efficient network, named Asymmetrical Parallax Attention Network (APANet), for stereo image deraining. Specifically, to fully exploit the parallax information, we first adopt an External Attention Module (EAM), which consists of an external attention block with two learnable memories and a gated feed-forward network, to achieve a better feature representation by incorporating the correlations between all samples. Subsequently, we propose an Asymmetrical Parallax Attention Module (APAM) to efficiently exploit the cross-attention between the features separately extracted from the left and right views, which filters out useless stereo feature relationships with a well-designed mask calculated from the parallax information (the positional information of each matched pixel pair within a stereo image). For training our network, we also construct an unpaired real-world stereo rainy image dataset, called StereoRealRain, which consists of video clips comprising 11,803 image pairs. Moreover, we introduce a Single-to-Stereo Image Deraining Distillation strategy that transfers the knowledge learned from single-image deraining to stereo image deraining, improving the generalization ability of our network. Extensive experiments conducted on synthetic and real-world stereo rainy datasets demonstrate the effectiveness of our method.
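
The abstract gives only a high-level description of APAM, so the following PyTorch sketch is an illustration of our own (not the authors' code) of how a parallax-derived mask might gate cross-view attention between left- and right-view features. The module name, the disparity input, the tolerance parameter tol, and the assumption of horizontally rectified stereo pairs are all hypothetical.

import torch
import torch.nn as nn


class ParallaxMaskedCrossAttention(nn.Module):
    # Hypothetical sketch: left-view features attend to right-view features
    # row by row (along the epipolar line of a rectified stereo pair), and a
    # binary mask built from a disparity estimate suppresses attention to
    # columns that the parallax information marks as unmatched.
    def __init__(self, channels):
        super().__init__()
        self.to_q = nn.Conv2d(channels, channels, 1)  # queries from left view
        self.to_k = nn.Conv2d(channels, channels, 1)  # keys from right view
        self.to_v = nn.Conv2d(channels, channels, 1)  # values from right view
        self.scale = channels ** -0.5

    def forward(self, feat_left, feat_right, disparity, tol=2):
        # feat_left, feat_right: (B, C, H, W); disparity: (B, H, W) in pixels.
        b, c, h, w = feat_left.shape
        q = self.to_q(feat_left).permute(0, 2, 3, 1)   # (B, H, W_left, C)
        k = self.to_k(feat_right).permute(0, 2, 3, 1)  # (B, H, W_right, C)
        v = self.to_v(feat_right).permute(0, 2, 3, 1)  # (B, H, W_right, C)

        # Row-wise cross-view attention scores: (B, H, W_left, W_right).
        attn = torch.matmul(q, k.transpose(-1, -2)) * self.scale

        # Parallax mask: a left pixel at column x is expected to match the
        # right-view column x - disparity; keep only columns within tol.
        cols = torch.arange(w, device=feat_left.device, dtype=torch.float32)
        expected = cols.view(1, 1, w, 1) - disparity.unsqueeze(-1)   # (B, H, W_left, 1)
        mask = (cols.view(1, 1, 1, w) - expected).abs() <= tol       # (B, H, W_left, W_right)

        attn = attn.masked_fill(~mask, torch.finfo(attn.dtype).min)
        attn = attn.softmax(dim=-1)

        out = torch.matmul(attn, v).permute(0, 3, 1, 2)  # back to (B, C, H, W)
        return feat_left + out  # residual fusion of right-view information


# Toy usage with random tensors and a coarse disparity estimate.
left = torch.randn(1, 64, 32, 96)
right = torch.randn(1, 64, 32, 96)
disp = torch.randint(0, 16, (1, 32, 96)).float()
fused = ParallaxMaskedCrossAttention(64)(left, right, disp)  # (1, 64, 32, 96)

The intuition behind such masking is the efficiency argument in the abstract: restricting cross-view attention to a narrow band of disparities filters out useless stereo feature relationships and avoids computing correlations between pixel pairs that cannot correspond in a rectified stereo image.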

© 2025 IEEE. All rights reserved, including rights for text and data mining, and training of artificial intelligence and similar technologies.
Original language: English
Pages (from-to): 101-115
Journal: IEEE Transactions on Computational Imaging
Volume: 11
Online published: 8 Jan 2025
DOIs
Publication status: Published - 2025

Research Keywords

  • deep learning
  • rain removal
  • stereo image
