Person authentication using ASM based lip shape and intensity information

L. L. Mok, W. H. Lau, S. H. Leung, S. L. Wang, H. Yan

Research output: Refereed conference paper (with host publication), peer-reviewed

17 Citations (Scopus)

Abstract

An authentication system based solely on visual lip information is advantageous, since a person's uttering characteristics and manner are unique to the individual and difficult to imitate. This paper presents a study of using lip shape-based and intensity features for person authentication. These features are derived from a 14-point Active Shape Model (ASM) lip model using Principal Component Analysis (PCA). The differential changes of the feature parameters, which reflect the uttering characteristics, are also considered in the study. A database containing the visual utterances of 40 speakers has been generated, each utterance lasting 3 seconds. The visual features are extracted from this database and a Hidden Markov Model (HMM) classifier is used to perform the analysis. It is observed that the best authentication result is obtained when the first 8 modes of the intensity profile are used together with the lip shape-based parameters. © 2004 IEEE.
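The feature-extraction pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the lip-contour data here is synthetic, and the exact ASM fitting, intensity-profile sampling, and HMM training stages are omitted. It shows the PCA step (projecting 14-point lip shape vectors onto the leading modes) and the differential (delta) features that capture uttering dynamics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: T frames of a 14-point lip contour, with (x, y) per
# point flattened into a 28-dimensional shape vector. In the paper these
# vectors would come from ASM fitting on video frames.
T, D = 300, 28
shapes = rng.normal(size=(T, D))

# PCA: centre the data, eigendecompose the covariance, keep leading modes.
mean = shapes.mean(axis=0)
centred = shapes - mean
cov = centred.T @ centred / (T - 1)
eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]         # sort by descending variance
modes = eigvecs[:, order[:8]]             # keep the first 8 principal modes

# Project each frame onto the modes to obtain compact shape parameters.
params = centred @ modes                  # shape (T, 8)

# Differential (delta) features reflect the frame-to-frame uttering
# dynamics; the first frame's delta is padded with zeros.
deltas = np.vstack([np.zeros((1, 8)), np.diff(params, axis=0)])

features = np.hstack([params, deltas])    # (T, 16) feature vector per frame
print(features.shape)
```

In the paper, per-frame feature vectors like these (shape parameters plus intensity-profile modes) would then be fed to an HMM classifier for the authentication decision.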
Original language: English
Title of host publication: Proceedings - International Conference on Image Processing, ICIP
Pages: 561-564
Volume: 1
DOIs
Publication status: Published - 2004
Event: 2004 International Conference on Image Processing, ICIP 2004, Singapore
Duration: 18 Oct 2004 - 21 Oct 2004

Publication series

Name
Volume: 1
ISSN (Print): 1522-4880

Conference

Conference: 2004 International Conference on Image Processing, ICIP 2004
Place: Singapore
Period: 18/10/04 - 21/10/04
