3D Gaze Estimation for Head-Mounted Eye Tracking System with Auto-Calibration Method

Research output: Journal Publications and Reviews (RGC: 21, 22, 62) › Publication in refereed journal › peer-review

4 Scopus Citations

Author(s)

Related Research Unit(s)

Detail(s)

Original language: English
Article number: 9107144
Pages (from-to): 104207-104215
Journal / Publication: IEEE Access
Volume: 8
Online published: 3 Jun 2020
Publication status: Published - 2020

Link(s)

Abstract

The general challenges of 3D gaze estimation for head-mounted eye tracking systems are the inflexible marker-based calibration procedure and significant depth estimation errors. In this paper, we propose a 3D gaze estimation approach with an auto-calibration method. To acquire an accurate 3D structure of the environment, an RGBD camera is used as the scene camera of our system. By adopting a saliency detection method, saliency maps are computed from scene images, and 3D salient pixels in the scene are treated as potential 3D calibration targets. A 3D eye model is built from eye images to determine gaze vectors. By combining the 3D salient pixels and the gaze vectors, auto-calibration is achieved with our calibration method. Finally, the 3D gaze point is obtained from the calibrated gaze vectors and the point cloud generated by the RGBD camera. The experimental results show that the proposed system achieves an average accuracy of 3.7° over a range of 1 m to 4 m indoors and 4.0° outdoors. The proposed system also presents a great improvement in depth measurement, which is sufficient for tracking users' visual attention in real scenes.
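The abstract only outlines the pipeline at a high level; as a rough illustration, the sketch below shows one way the two core steps could look in Python: a least-squares (Kabsch-style) alignment of eye-model gaze directions with directions toward matched 3D salient points, followed by a ray/point-cloud intersection to recover the 3D gaze point. The function names, the assumption that gaze directions and salient targets are already matched per frame, and the fixed eye-origin and distance-threshold parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_eye_to_scene_rotation(gaze_dirs, salient_points, eye_origin=np.zeros(3)):
    """Kabsch-style fit of the rotation aligning eye-model gaze directions
    with the directions from an assumed eye origin to matched 3D salient points.

    gaze_dirs      : (N, 3) gaze vectors in eye-model coordinates.
    salient_points : (N, 3) matched 3D salient pixels in scene coordinates.
    Returns R such that R @ gaze_dir points toward the corresponding target.
    """
    target_dirs = salient_points - eye_origin
    target_dirs = target_dirs / np.linalg.norm(target_dirs, axis=1, keepdims=True)
    g = gaze_dirs / np.linalg.norm(gaze_dirs, axis=1, keepdims=True)

    # Cross-covariance of the two unit-direction sets.
    H = g.T @ target_dirs
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R


def gaze_point_from_cloud(ray_origin, ray_dir, point_cloud, max_offset=0.05):
    """Return the point-cloud point closest to the calibrated gaze ray,
    or None if nothing lies within max_offset metres of the ray."""
    ray_dir = ray_dir / np.linalg.norm(ray_dir)
    rel = point_cloud - ray_origin
    t = rel @ ray_dir                              # signed distance along the ray
    dist = np.linalg.norm(rel - np.outer(t, ray_dir), axis=1)
    dist[t <= 0] = np.inf                          # ignore points behind the eye
    idx = int(np.argmin(dist))
    return point_cloud[idx] if dist[idx] <= max_offset else None
```

In this sketch, restricting candidates to points in front of the eye and within a small offset of the ray ties the depth of the estimated gaze point to the RGBD point cloud rather than to vergence, which is one plausible way to obtain the improved depth measurement the abstract reports.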

Research Area(s)

  • 3D gaze estimation, Auto-calibration, Head-mounted gaze tracking system, Saliency maps
