Hand-writing motion tracking with vision-inertial sensor fusion: Calibration and error correction

Research output: Journal Publications and Reviews › Publication in refereed journal › Peer-review

11 Scopus Citations



Original language: English
Pages (from-to): 15641-15657
Journal / Publication: Sensors (Switzerland)
Issue number: 9
Publication status: Published - 25 Aug 2014



The purpose of this study was to improve the accuracy of real-time ego-motion tracking through inertial and vision sensor fusion. Because web-based vision sensors support only low sampling rates while inertial sensor errors accumulate over time, ego-motion tracking with vision sensors is commonly afflicted by slow update rates, whereas motion tracking with inertial sensors suffers from rapid deterioration in accuracy with time. This paper starts with a discussion of algorithms developed for calibrating the two relative rotations of the system using only one reference image. Next, the stochastic noises associated with the inertial sensor are identified using Allan variance analysis and modeled according to their characteristics. Finally, the proposed models are incorporated into an extended Kalman filter for inertial and vision sensor fusion. Compared with results from conventional sensor fusion models, we show that ego-motion tracking can be greatly enhanced using the proposed error correction model. © 2014 by the authors; licensee MDPI, Basel, Switzerland.
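The Allan variance analysis mentioned in the abstract is a standard way to separate an inertial sensor's stochastic error sources (angle random walk, bias instability, rate random walk) by how the averaged variance scales with cluster time. As a minimal sketch of the general technique — not the authors' specific implementation — an overlapping Allan variance of a recorded gyro rate signal can be computed as follows; the function name and parameters are illustrative:

```python
import numpy as np

def allan_variance(omega, fs, max_m=None):
    """Overlapping Allan variance of an inertial rate signal.

    omega : 1-D array of rate samples (e.g. gyro output in rad/s)
    fs    : sampling frequency in Hz
    Returns (taus, avar): cluster times and the Allan variance at each.
    """
    n = len(omega)
    theta = np.cumsum(omega) / fs          # integrate rate to angle
    if max_m is None:
        max_m = n // 4                     # need 2*m samples of look-ahead
    # logarithmically spaced cluster sizes m, deduplicated
    ms = np.unique(np.logspace(0, np.log10(max_m), 100).astype(int))
    taus = ms / fs
    avar = np.empty(len(ms))
    for i, m in enumerate(ms):
        # second difference of the integrated signal over clusters of size m
        d = theta[2 * m:] - 2 * theta[m:n - m] + theta[:n - 2 * m]
        avar[i] = np.sum(d ** 2) / (2 * taus[i] ** 2 * (n - 2 * m))
    return taus, avar
```

On a log-log plot of Allan deviation versus tau, a slope of -1/2 identifies angle random walk, a flat floor indicates bias instability, and a slope of +1/2 indicates rate random walk; reading off these regions is what allows each noise term to be modeled according to its characteristics before entering the Kalman filter.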

Research Area(s)

  • Human motion tracking, Inertial sensor calibration, MEMS-based motion tracking, Sensors fusion, Stochastic error modeling, Vision-based motion tracking

