ACCURATE LIGHT FIELD DEPTH ESTIMATION VIA AN OCCLUSION-AWARE NETWORK

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review

28 Scopus Citations

Author(s)

Guo, Chunle; Jin, Jing; Hou, Junhui et al.

Related Research Unit(s)

Detail(s)

Original language: English
Title of host publication: 2020 IEEE International Conference on Multimedia and Expo (ICME)
Publisher: Institute of Electrical and Electronics Engineers, Inc.
ISBN (electronic): 9781728113319
ISBN (print): 9781728113326
Publication status: Published - Jul 2020

Publication series

Name: Proceedings - IEEE International Conference on Multimedia and Expo
Volume: 2020-July
ISSN (print): 1945-7871
ISSN (electronic): 1945-788X

Conference

Title: 2020 IEEE International Conference on Multimedia and Expo (ICME 2020)
Location: Virtual
Place: United Kingdom
City: London
Period: 6 - 10 July 2020

Abstract

Depth estimation is a fundamental problem for light field based applications. Although recent learning-based methods have proven effective for light field depth estimation, they still struggle in occlusion regions. In this paper, by leveraging an explicitly learned occlusion map, we propose an occlusion-aware network that is capable of estimating accurate depth maps with sharp edges. Our main idea is to separate depth estimation on non-occlusion and occlusion regions, as they exhibit different properties with respect to the light field structure, i.e., obeying and violating the angular photo-consistency constraint, respectively. To this end, our network involves three modules: the occlusion region detection network (ORDNet), the coarse depth estimation network (CDENet), and the refined depth estimation network (RDENet). Specifically, ORDNet predicts the occlusion map as a mask, while, under the guidance of the resulting occlusion map, CDENet and RDENet focus on depth estimation in non-occlusion and occlusion areas, respectively. Experimental results show that our method achieves better performance on the 4D light field benchmark, especially in occlusion regions, when compared with current state-of-the-art light field depth estimation algorithms.
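The following is a minimal PyTorch sketch of the three-module pipeline described in the abstract: ORDNet predicts an occlusion mask, CDENet estimates a coarse depth map, RDENet refines depth around occlusions, and the two depth maps are fused under the guidance of the mask. All module internals, layer choices, input formats, and the mask-based fusion rule are illustrative assumptions; the paper's actual architectures, view selection, and training losses are not reproduced here.

# Hypothetical sketch only; names and architectures are assumptions, not the authors' implementation.
import torch
import torch.nn as nn

class SimpleCNN(nn.Module):
    """Placeholder backbone: a few conv layers mapping stacked light-field views to a 1-channel map."""
    def __init__(self, in_channels, out_channels=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, out_channels, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

class OcclusionAwareDepthNet(nn.Module):
    """ORDNet -> occlusion mask; CDENet -> coarse depth for non-occluded regions;
    RDENet -> refined depth for occluded regions; mask-guided fusion of the two."""
    def __init__(self, num_views):
        super().__init__()
        self.ordnet = SimpleCNN(num_views)          # predicts occlusion probability map
        self.cdenet = SimpleCNN(num_views)          # coarse depth where photo-consistency holds
        self.rdenet = SimpleCNN(num_views + 2)      # refines depth given views + coarse depth + mask

    def forward(self, views):
        # views: (B, num_views, H, W) stack of sub-aperture images
        occ = torch.sigmoid(self.ordnet(views))     # (B, 1, H, W), 1 = occluded
        coarse = self.cdenet(views)                 # (B, 1, H, W) coarse depth
        refined = self.rdenet(torch.cat([views, coarse, occ], dim=1))
        # Fuse: trust the coarse estimate where the photo-consistency constraint is obeyed,
        # and the refined estimate where it is violated (occlusion regions).
        return (1.0 - occ) * coarse + occ * refined

if __name__ == "__main__":
    model = OcclusionAwareDepthNet(num_views=9)
    lf_stack = torch.randn(1, 9, 128, 128)          # dummy light-field view stack
    depth = model(lf_stack)
    print(depth.shape)                              # torch.Size([1, 1, 128, 128])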

Research Area(s)

  • Light fields, depth estimation, deep neural network, occlusion

Bibliographic Note

Research Unit(s) information for this publication is provided by the author(s) concerned.

Citation Format(s)

ACCURATE LIGHT FIELD DEPTH ESTIMATION VIA AN OCCLUSION-AWARE NETWORK. / Guo, Chunle; Jin, Jing; Hou, Junhui et al.
2020 IEEE International Conference on Multimedia and Expo (ICME). Institute of Electrical and Electronics Engineers, Inc., 2020. 9102829 (Proceedings - IEEE International Conference on Multimedia and Expo; Vol. 2020-July).
