Virtual view synthesis for the nonuniform illuminated between views in surgical video

Research output: Journal Publications and Reviews (RGC: 21, 22, 62), publication in refereed journal, peer-reviewed

2 Scopus Citations

Detail(s)

Original language: English
Number of pages: 21
Journal / Publication: Multimedia Tools and Applications
Online published: 8 Mar 2021
Publication status: Online published - 8 Mar 2021

Abstract

For high-quality virtual view synthesis of surgical video, a Weighted Autoregressive Interpolation (WAI) algorithm and an Adaptively-enhanced Hole Filling (AHF) algorithm are proposed to reduce the artifacts caused by up-sampling and to relieve the luma difference between views. First, high-quality up-sampled reference views are obtained with the WAI algorithm, which introduces a Piecewise Autoregressive (PAR) model and additionally weights pixels by their distance; this improves the precision of the virtual view while preserving texture edges. Next, in the AHF stage, the intermediate view with more structural detail is selected as the template, and the other intermediate view is calibrated to it, relieving the luma difference. A Nearest-background Hole Filling (NHF) algorithm then blends the two intermediate views, selecting only background pixels to fill the remaining holes. Combining WAI with AHF improves the visual quality of the synthesized surgical video. In terms of objective quality, experimental results show that the proposed algorithm achieves a PSNR that is 0.5841 dB higher on average than that of the VSRS 1D-Fast algorithm. In terms of subjective quality, the proposed method reduces artifacts and yields higher visual quality for the synthesized virtual views of the surgical video.
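The nearest-background hole-filling step can be illustrated with a minimal sketch. This is not the paper's implementation; it is a simplified Python version of the general idea, where for each hole pixel the nearest known neighbours along the four axes are collected and the one lying farthest away (largest depth value, taken here to mean background) supplies the fill colour. All names and the depth convention are assumptions.

```python
import numpy as np

def nearest_background_fill(view, depth, hole_mask):
    """Fill hole pixels with the colour of the nearest known neighbour
    that lies in the background.

    Assumes larger depth values mean farther from the camera; invert the
    comparison if the depth map uses the opposite convention.
    """
    filled = view.copy()
    h, w = hole_mask.shape
    ys, xs = np.nonzero(hole_mask)
    for y, x in zip(ys, xs):
        best = None  # (depth, colour) of the best candidate so far
        # Scan left, right, up, and down for the nearest non-hole pixel.
        for dy, dx in ((0, -1), (0, 1), (-1, 0), (1, 0)):
            ny, nx = y + dy, x + dx
            while 0 <= ny < h and 0 <= nx < w and hole_mask[ny, nx]:
                ny += dy
                nx += dx
            if 0 <= ny < h and 0 <= nx < w:
                # Prefer the candidate with the largest depth (background),
                # so foreground colours do not bleed into the hole.
                if best is None or depth[ny, nx] > best[0]:
                    best = (depth[ny, nx], view[ny, nx])
        if best is not None:
            filled[y, x] = best[1]
    return filled
```

On a toy grayscale frame with a 2x2 hole bordered by a near region (small depth) and a far region (large depth), the hole is filled entirely from the far side, which is the behaviour the NHF step aims for.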

Research Area(s)

  • Virtual view synthesis, Interpolation, Hole filling, Surgical video, Nonuniform illumination