Human computer interface for quadriplegic people based on face position/gesture detection

Zhen-Peng Bian, Junhui Hou, Lap-Pui Chau, Nadia Magnenat-Thalmann

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review

3 Citations (Scopus)

Abstract

This paper proposes a human-computer interface for quadriplegic people using a single depth camera. The nose position is used to control the cursor, while the status of the mouth provides click commands. Both the nose position and the mouth status are detected with a randomized decision tree algorithm. Experimental results show that the proposed interface is comfortable, easy to use, and robust, and that it outperforms existing assistive technology.
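The abstract's control scheme (nose displacement drives the cursor; a sustained open mouth triggers a command) can be illustrated with a minimal sketch. All function names, the dead-zone/gain mapping, and the frame-count debounce below are assumptions for illustration, not the authors' implementation:

```python
# Hypothetical sketch of the control loop described in the abstract.
# The detected nose position drives the cursor; the mouth status
# (open/closed) triggers a click. The dead zone and linear gain are
# illustrative choices, not taken from the paper.

def nose_to_cursor(nose_xy, ref_xy, gain=8.0, dead_zone=3.0):
    """Map nose displacement from a reference point to a cursor delta.

    A dead zone suppresses jitter from small involuntary head
    movements; beyond it, displacement is scaled linearly by `gain`.
    """
    def scale(d):
        if abs(d) <= dead_zone:
            return 0.0
        return gain * (d - dead_zone if d > 0 else d + dead_zone)

    dx = nose_xy[0] - ref_xy[0]
    dy = nose_xy[1] - ref_xy[1]
    return scale(dx), scale(dy)

def mouth_command(open_frames, threshold=5):
    """Treat a mouth held open for at least `threshold` consecutive
    frames as a click command, to avoid accidental triggers."""
    return open_frames >= threshold
```

In a real system the per-frame nose and mouth detections would come from the depth-image classifier (the paper uses a randomized decision tree); the sketch only shows how those detections could be turned into cursor motion and commands.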
Original language: English
Title of host publication: MM 2014 - Proceedings of the 2014 ACM Conference on Multimedia
Publisher: Association for Computing Machinery
Pages: 1221-1224
ISBN (Print): 9781450330633
DOIs
Publication status: Published - 3 Nov 2014
Externally published: Yes
Event: 2014 ACM Conference on Multimedia, MM 2014 - Orlando, United States
Duration: 3 Nov 2014 - 7 Nov 2014

Conference

Conference: 2014 ACM Conference on Multimedia, MM 2014
Place: United States
City: Orlando
Period: 3/11/14 - 7/11/14

Research Keywords

  • Assistive technology
  • Computer access
  • Hands-free interface
  • Human-computer interaction (HCI)
  • Quadriplegic
  • Severe disabilities
  • Vision-based
