RaFE: Generative Radiance Fields Restoration

Zhongkai Wu, Ziyu Wan, Jing Zhang*, Jing Liao, Dong Xu

*Corresponding author for this work

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-reviewed

1 Citation (Scopus)

Abstract

NeRF (Neural Radiance Fields) has demonstrated tremendous potential in novel view synthesis and 3D reconstruction, but its performance is sensitive to input image quality, and it struggles to achieve high-fidelity rendering when provided with low-quality sparse input viewpoints. Previous methods for NeRF restoration are tailored to a specific degradation type, overlooking the generality of restoration. To overcome this limitation, we propose a generic radiance fields restoration pipeline, named RaFE, which applies to various types of degradations, such as low resolution, blurriness, noise, compression artifacts, or their combinations. Our approach leverages the success of off-the-shelf 2D restoration methods to recover the multi-view images individually. Instead of reconstructing a blurred NeRF by averaging inconsistencies, we introduce a novel approach using Generative Adversarial Networks (GANs) for NeRF generation to better accommodate the geometric and appearance inconsistencies present in the multi-view images. Specifically, we adopt a two-level tri-plane architecture, where the coarse level remains fixed to represent the low-quality NeRF, while a fine-level residual tri-plane, added to the coarse level, is modeled as a distribution with a GAN to capture potential variations in restoration. We validate RaFE on both synthetic and real cases for various restoration tasks, demonstrating superior performance in both quantitative and qualitative evaluations and surpassing other 3D restoration methods specific to a single task. Please see our project website zkaiwu.github.io/RaFE. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.
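The two-level tri-plane idea described in the abstract can be sketched as follows. This is an illustrative toy in NumPy, not the authors' implementation: the class name, resolutions, channel count, and nearest-neighbour sampling are all assumptions; a real system would use bilinear interpolation and decode the sampled features with a small MLP.

```python
import numpy as np

class TwoLevelTriplane:
    """Toy sketch of a two-level tri-plane: a fixed coarse tri-plane
    representing the low-quality NeRF, plus a fine-level residual
    tri-plane (in RaFE, sampled from a GAN) added on top."""

    def __init__(self, res=32, channels=8, seed=0):
        rng = np.random.default_rng(seed)
        # Three axis-aligned feature planes (XY, XZ, YZ) per level.
        self.coarse = rng.standard_normal((3, res, res, channels))
        self.coarse.flags.writeable = False  # coarse level stays fixed
        # Residual planes: zero here; generated by the GAN in RaFE.
        self.residual = np.zeros((3, res, res, channels))
        self.res = res

    def _sample_plane(self, plane, u, v):
        # Nearest-neighbour lookup for brevity; real tri-planes
        # use bilinear interpolation.
        i = np.clip((u * (self.res - 1)).astype(int), 0, self.res - 1)
        j = np.clip((v * (self.res - 1)).astype(int), 0, self.res - 1)
        return plane[i, j]

    def query(self, xyz):
        """Query features at normalized 3D points in [0, 1]^3."""
        x, y, z = xyz[:, 0], xyz[:, 1], xyz[:, 2]
        planes = self.coarse + self.residual  # fine = coarse + residual
        feats = (self._sample_plane(planes[0], x, y)
                 + self._sample_plane(planes[1], x, z)
                 + self._sample_plane(planes[2], y, z))
        return feats  # would be decoded to density/color by an MLP

tp = TwoLevelTriplane()
pts = np.random.default_rng(1).random((5, 3))
print(tp.query(pts).shape)  # (5, 8)
```

Keeping the coarse planes frozen and modeling only a residual distribution lets the generator focus on the restoration details that vary across plausible outputs, rather than re-learning the whole scene.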
Original language: English
Title of host publication: Computer Vision – ECCV 2024 - 18th European Conference, Proceedings
Editors: Aleš Leonardis, Elisa Ricci, Stefan Roth, Olga Russakovsky, Torsten Sattler, Gül Varol
Publisher: Springer, Cham
Pages: 163-179
Volume: Part LXVII
ISBN (Electronic): 9783031728556
ISBN (Print): 9783031728549
DOIs
Publication status: Published - 2025
Event: 18th European Conference on Computer Vision (ECCV 2024) - MiCo Milano, Milan, Italy
Duration: 29 Sept 2024 – 4 Oct 2024
https://eccv.ecva.net/

Publication series

Name: Lecture Notes in Computer Science
Volume: 15125
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 18th European Conference on Computer Vision (ECCV 2024)
Abbreviated title: ECCV2024
Place: Italy
City: Milan
Period: 29/09/24 – 4/10/24
Internet address

Bibliographical note

Full text of this publication does not contain sufficient affiliation information. With consent from the author(s) concerned, the Research Unit(s) information for this record is based on the existing academic department affiliation of the author(s).

Funding

This work was supported in part by the Hong Kong Research Grants Council General Research Fund (17203023), in part by The Hong Kong Jockey Club Charities Trust under Grant 2022-0174, in part by the Startup Fund and the Seed Fund for Basic Research for New Staff from The University of Hong Kong, in part by the funding from UBTECH Robotics, in part by a GRF grant from the Research Grants Council (RGC) of the Hong Kong Special Administrative Region, China [Project No. CityU 11208123], and in part by the National Natural Science Foundation of China (62132001).

Research Keywords

  • 3D Restoration
  • Generative Model
  • Neural Radiance Fields
  • Neural Rendering

RGC Funding Information

  • RGC-funded
