Human computer interface for quadriplegic people based on face position/gesture detection

Research output: Chapters, Conference Papers, Creative and Literary Works (RGC: 12, 32, 41, 45); 32_Refereed conference paper (with ISBN/ISSN); peer-reviewed

2 Scopus Citations

Author(s)

  • Zhen-Peng Bian
  • Junhui Hou
  • Lap-Pui Chau
  • Nadia Magnenat-Thalmann

Detail(s)

Original language: English
Title of host publication: MM 2014 - Proceedings of the 2014 ACM Conference on Multimedia
Publisher: Association for Computing Machinery, Inc
Pages: 1221-1224
ISBN (Print): 9781450330633
Publication status: Published - 3 Nov 2014
Externally published: Yes

Conference

Title: 2014 ACM Conference on Multimedia, MM 2014
Place: United States
City: Orlando
Period: 3 - 7 November 2014

Abstract

This paper proposes a human-computer interface for quadriplegic people that uses a single depth camera. The nose position is employed to control the cursor, while the status of the mouth provides the click commands. The detection of the nose position and mouth status is based on a randomized decision tree algorithm. Experimental results show that the proposed interface is comfortable, easy to use, and robust, and that it outperforms existing assistive technology.
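
To make the control scheme described in the abstract concrete, the following minimal Python sketch shows one plausible way per-frame detections (nose position plus a binary mouth status) could be turned into cursor motion and click commands. It is not the authors' implementation: the class and parameter names (FrameObservation, gain, dead_zone, click_frames) are assumptions for illustration, and the randomized-decision-tree detector itself is abstracted away behind the per-frame observation input.

from dataclasses import dataclass

@dataclass
class FrameObservation:
    nose_x: float      # detected nose position in the depth frame (pixels)
    nose_y: float
    mouth_open: bool   # mouth status from the per-frame classifier

class CursorController:
    """Hypothetical mapping from face detections to cursor commands."""

    def __init__(self, gain=2.0, dead_zone=2.0, click_frames=10):
        self.gain = gain                  # pixel-to-cursor amplification
        self.dead_zone = dead_zone        # ignore jitter below this displacement
        self.click_frames = click_frames  # frames the mouth must stay open to click
        self.ref = None                   # reference (neutral) nose position
        self.open_count = 0

    def update(self, obs: FrameObservation):
        """Return (dx, dy, click) for one frame of detections."""
        if self.ref is None:              # first frame defines the neutral pose
            self.ref = (obs.nose_x, obs.nose_y)
        dx = obs.nose_x - self.ref[0]
        dy = obs.nose_y - self.ref[1]
        if abs(dx) < self.dead_zone:
            dx = 0.0
        if abs(dy) < self.dead_zone:
            dy = 0.0

        # A mouth held open for several consecutive frames issues one click.
        self.open_count = self.open_count + 1 if obs.mouth_open else 0
        click = self.open_count == self.click_frames
        return self.gain * dx, self.gain * dy, click

if __name__ == "__main__":
    ctrl = CursorController(click_frames=2)
    # Fake detections standing in for per-frame detector output.
    frames = [FrameObservation(320, 240, False),
              FrameObservation(328, 238, True),
              FrameObservation(330, 239, True)]
    for obs in frames:
        print(ctrl.update(obs))

The dead zone and the dwell-style click threshold are common choices in hands-free pointing interfaces to suppress head tremor and accidental clicks; the actual parameters and filtering used in the paper may differ.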

Research Area(s)

  • Assistive technology, Computer access, Hands-free interface, Human-computer interaction (HCI), Quadriplegic, Severe disabilities, Vision-based

Citation Format(s)

Human computer interface for quadriplegic people based on face position/gesture detection. / Bian, Zhen-Peng; Hou, Junhui; Chau, Lap-Pui; Magnenat-Thalmann, Nadia.

MM 2014 - Proceedings of the 2014 ACM Conference on Multimedia. Association for Computing Machinery, Inc, 2014. p. 1221-1224.
