Joint Sparse Representations and Coupled Dictionary Learning in Multisource Heterogeneous Image Pseudo-Color Fusion

Long Bai, Shilong Yao, Kun Gao*, Yanjun Huang, Ruijie Tang, Hong Yan, Max Q.-H. Meng, Hongliang Ren*

*Corresponding author for this work

Research output: Journal Publications and Reviews (RGC 21 - Publication in refereed journal, peer-reviewed)

10 Citations (Scopus)
66 Downloads (CityUHK Scholars)

Abstract

Considering that the coupled dictionary learning (CDL) method can capture a reasonable linear mathematical relationship between source images, we propose a novel CDL-based synthetic aperture radar (SAR) and multispectral pseudo-color fusion method. First, the traditional Brovey transform is employed as a preprocessing step on the paired SAR and multispectral images. Then, CDL is used to capture the correlation between the preprocessed image pairs based on the dictionaries generated from the source images via enforced joint sparse coding. Afterward, the joint sparse representation over the pair of dictionaries is utilized to construct an image mask by calculating the reconstruction errors, thereby generating the final fused image. Experimental results on SAR images from the Sentinel-1 satellite and multispectral images from the Landsat-8 satellite show that the proposed method achieves superior visual effects and excellent quantitative indicators in terms of spectral distortion, correlation coefficient, mean square error (MSE), natural image quality evaluator (NIQE), blind/referenceless image spatial quality evaluator (BRISQUE), and perception-based image quality evaluator (PIQE). © 2023 The Authors.
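The two core operations described in the abstract, the Brovey-transform preprocessing and the reconstruction-error-based mask over a pair of coupled dictionaries, can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the array shapes, the epsilon term, and the helper names are assumptions, and the dictionaries and joint sparse codes are taken as already learned.

```python
import numpy as np

def brovey_transform(ms, pan, eps=1e-8):
    """Classical Brovey transform: rescale each multispectral band by the
    ratio of the high-resolution intensity (here the SAR image) to the
    per-pixel sum of the multispectral bands.

    ms:  (H, W, B) multispectral bands
    pan: (H, W)    SAR intensity image
    """
    band_sum = ms.sum(axis=-1, keepdims=True) + eps  # avoid divide-by-zero
    return ms * pan[..., None] / band_sum

def reconstruction_error_mask(patches_a, patches_b, dict_a, dict_b, codes):
    """Given coupled dictionaries and shared joint sparse codes, mark each
    patch according to which source it reconstructs with lower error.

    patches_a, patches_b: (N, d) vectorized patches from the two sources
    dict_a, dict_b:       (d, K) coupled dictionaries
    codes:                (N, K) joint sparse coefficients
    """
    err_a = np.linalg.norm(patches_a - codes @ dict_a.T, axis=1)
    err_b = np.linalg.norm(patches_b - codes @ dict_b.T, axis=1)
    return err_a <= err_b  # True where source A reconstructs better
```

The boolean mask would then be reshaped to image geometry and used to select, per region, which source dominates the final pseudo-color composite.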
Original language: English
Pages (from-to): 30620-30632
Journal: IEEE Sensors Journal
Volume: 23
Issue number: 24
Online published: 23 Oct 2023
DOIs
Publication status: Published - 15 Dec 2023

Funding

This work was supported in part by the National Natural Science Foundation of China (NSFC) under Grant U2241275 and Grant 61827814; in part by the Beijing Natural Science Foundation under Grant Z190018; in part by the China High-Resolution Earth Observation System Project under Grant 52-L10D01-0613-20/22; in part by the Hong Kong Research Grants Council (RGC) under Grant CRF C4026-21GF, Grant CRF C4063-18G, Grant GRF 14203323, Grant GRF 14216022, Grant GRF 14211420, and Grant GRS 3110167; in part by the NSFC/RGC Joint Research Scheme under Grant N_CUHK420/22; in part by the Shenzhen–Hong Kong–Macau Technology Research Programme (Type C) under Grant 202108233000303; in part by Guangdong Basic and Applied Basic Research Foundation under Grant 2021B1515120035; and in part by the City University of Hong Kong under Grant 11204821.

Research Keywords

  • Brovey transform
  • coupled dictionary learning (CDL)
  • Dictionaries
  • Image fusion
  • multispectral image
  • Principal component analysis
  • pseudo-color fusion
  • Radar polarimetry
  • remote sensing
  • synthetic aperture radar (SAR)
  • Transforms

Publisher's Copyright Statement

  • This full text is made available under CC-BY-NC-ND 4.0. https://creativecommons.org/licenses/by-nc-nd/4.0/

RGC Funding Information

  • RGC-funded
