Illumination direction estimation for augmented reality using a surface input real valued output regression network

Research output: Journal Publications and Reviews › Publication in refereed journal › peer-review

8 Scopus Citations


Detail(s)

Original language: English
Pages (from-to): 1700-1716
Journal / Publication: Pattern Recognition
Volume: 43
Issue number: 4
Publication status: Published - Apr 2010

Abstract

Due to the low cost of capturing depth information, it is worthwhile to reduce the illumination ambiguity by employing scene depth information. In this article, a neural computation approach is reported that estimates the illuminant direction from the scene reflectance map. Since the reflectance map recovered from the depth map and image is a variable-sized point cloud, we propose to parameterize it as a two-dimensional polynomial function. A novel network model is then presented that maps a continuous function (the reflectance map) to a vectorial output (the illuminant direction). Experimental results show that the proposed model works well on both synthetic and real scenes. © 2009 Elsevier Ltd. All rights reserved.
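The sketch below illustrates the parameterization idea described in the abstract, not the paper's actual method: a variable-sized reflectance-map point cloud is reduced to a fixed-length descriptor by least-squares fitting of a two-dimensional polynomial, and a placeholder linear regressor (standing in for the paper's surface-input network, whose architecture is not given here) maps that descriptor to a unit illuminant-direction vector. All function names, the polynomial degree, and the weights W, b are illustrative assumptions.

```python
import numpy as np

def fit_reflectance_polynomial(points, degree=3):
    """Fit a 2-D polynomial R(x, y) = sum_{i+j<=degree} c_ij * x^i * y^j
    to a variable-sized reflectance-map point cloud by least squares.

    points : (N, 3) array of (x, y, reflectance) samples.
    Returns a fixed-length coefficient vector, usable as a network input
    regardless of how many samples the scene provides.
    """
    x, y, r = points[:, 0], points[:, 1], points[:, 2]
    terms = [(i, j) for i in range(degree + 1)
                    for j in range(degree + 1 - i)]
    A = np.column_stack([x ** i * y ** j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(A, r, rcond=None)
    return coeffs

def estimate_direction(coeffs, W, b):
    """Toy linear regressor from polynomial coefficients to a unit
    illuminant-direction vector (a stand-in for the paper's network)."""
    d = W @ coeffs + b
    return d / np.linalg.norm(d)

# Usage with synthetic data (hypothetical values, for illustration only)
rng = np.random.default_rng(0)
pts = rng.random((500, 3))            # fake (x, y, reflectance) samples
c = fit_reflectance_polynomial(pts)   # fixed-length descriptor
W = rng.standard_normal((3, c.size))  # placeholder weights
b = np.zeros(3)
print(estimate_direction(c, W, b))    # unit illuminant-direction estimate
```

The fixed-length coefficient vector is what makes a standard regression model applicable: it removes the dependence on the number of points recovered from each depth map and image pair.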

Research Area(s)

  • Illuminant direction estimation, Neural network with functions as input, Surface input pattern