Semantic-Aware Gated Fusion Network For Interactive Colorization

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review

1 Scopus citation

Author(s)
Zhang, Jie; Xiao, Yi; Zhenga, Yan et al.

Related Research Unit(s)

Detail(s)

Original language: English
Title of host publication: Proceedings of the 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Publisher: IEEE
Number of pages: 5
ISBN (Electronic): 978-1-7281-6327-7
ISBN (Print): 978-1-7281-6328-4
Publication status: Published - 2023

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
ISSN (Print): 1520-6149
ISSN (Electronic): 2379-190X

Conference

Title: 48th IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2023)
Location: Rodos Palace Luxury Convention Resort
Place: Greece
City: Rhodes Island
Period: 4 - 10 June 2023

Abstract

Deep neural networks underpin many successful colorization methods, including automatic, interactive, and exemplar-based approaches. Among them, interactive methods with global and/or local inputs are probably the most flexible way to accurately add colors to a gray image. However, due to the sparseness of input-semantic correspondences, existing methods have difficulty distributing inputs into the correct regions. Moreover, they simply add or concatenate the features of different inputs in the network before color reconstruction, which cannot balance the influences of the different inputs. To this end, we propose a novel interactive colorization network that explicitly builds input-semantic correspondences with an attention mechanism and introduces a gated feature fusion module to balance the influences of global and local inputs. We further apply a differentiable histogram loss to impose a smooth influence of the global inputs. Extensive experiments demonstrate that our method can flexibly control the results and outperforms other state-of-the-art interactive methods. © 2023 IEEE.
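The gated feature fusion mentioned in the abstract can be illustrated with a minimal sketch: a learned gate weighs the global-input features against the local-input features before reconstruction, instead of simply adding or concatenating them. All names, shapes, and parameters below (`gated_fusion`, `w`, `b`) are illustrative assumptions, not the authors' implementation, which is not detailed in this record.

```python
import numpy as np

def sigmoid(x):
    """Element-wise logistic function, used to keep gate values in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(f_global, f_local, w, b):
    """Blend global- and local-input feature vectors with a learned gate.

    gate  = sigmoid(W @ [f_global; f_local] + b)   # per-channel weights
    fused = gate * f_global + (1 - gate) * f_local

    Hypothetical sketch: `w` and `b` stand in for the parameters a real
    network would learn; shapes are chosen for a single feature vector.
    """
    concat = np.concatenate([f_global, f_local], axis=-1)
    gate = sigmoid(concat @ w + b)                 # values in (0, 1)
    return gate * f_global + (1.0 - gate) * f_local
```

With zero-initialized parameters the gate is 0.5 everywhere, so the fused features are the mean of the two inputs; training would shift the gate toward whichever input is more reliable in each region.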

Research Area(s)

  • attention mechanism, feature fusion, interactive colorization, semantic-aware

Bibliographic Note

Full text of this publication does not contain sufficient affiliation information. With consent from the author(s) concerned, the Research Unit(s) information for this record is based on the existing academic department affiliation of the author(s).

Citation Format(s)

Semantic-Aware Gated Fusion Network For Interactive Colorization. / Zhang, Jie; Xiao, Yi; Zhenga, Yan et al.
Proceedings of the 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2023. (ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings).
