
Lip-sync in human face animation based on video analysis and spline models

Sy-Sen Tang, Alan Wee-Chung Liew, Hong Yan

    Research output: Refereed conference paper (with host publication), peer-reviewed

    Abstract

    Human facial animation is an interesting and difficult problem in computer graphics. In this paper, a novel B-spline (NURBS) muscle system is proposed to simulate 3D facial expressions and talking animation. The system extracts lip shape parameters from video of a real person's lip movements and uses them to drive the appropriate muscles to form different phonemes. The muscles are constructed from non-uniform rational B-spline curves, based on anatomical knowledge. By using different numbers of control points on the muscles, more detailed facial expressions and mouth shapes can be simulated. We demonstrate the flexibility of our model by simulating different emotions and by lip-syncing a talking head to a video using the automatically extracted lip parameters.
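    The muscle curves described above are non-uniform rational B-splines. As an illustration of the underlying machinery (not the authors' implementation), the sketch below evaluates a NURBS curve C(u) = Σ N_{i,p}(u) w_i P_i / Σ N_{i,p}(u) w_i via the Cox-de Boor recursion; the control points, weights, and knot vector are example values chosen here, not taken from the paper.

    ```python
    # Hedged sketch: evaluating a 2D NURBS curve, the curve type used for the
    # spline muscle model. All numeric values below are illustrative only.

    def basis(i, p, u, knots):
        """B-spline basis function N_{i,p}(u) via Cox-de Boor recursion."""
        if p == 0:
            # Half-open interval convention; valid for u in [knots[0], knots[-1]).
            return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
        left = 0.0
        if knots[i + p] != knots[i]:
            left = (u - knots[i]) / (knots[i + p] - knots[i]) * basis(i, p - 1, u, knots)
        right = 0.0
        if knots[i + p + 1] != knots[i + 1]:
            right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                     * basis(i + 1, p - 1, u, knots))
        return left + right

    def nurbs_point(u, ctrl, weights, knots, degree):
        """Evaluate the rational curve C(u) at parameter u."""
        num_x = num_y = den = 0.0
        for i, ((px, py), w) in enumerate(zip(ctrl, weights)):
            b = basis(i, degree, u, knots) * w
            num_x += b * px
            num_y += b * py
            den += b
        return (num_x / den, num_y / den)

    # Example: a quadratic "muscle" curve with 4 control points. Raising the
    # interior weights pulls the curve toward those control points, which is
    # one way extra shape control can be obtained.
    ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
    weights = [1.0, 2.0, 2.0, 1.0]
    knots = [0, 0, 0, 0.5, 1, 1, 1]   # clamped knot vector for degree 2
    print(nurbs_point(0.0, ctrl, weights, knots, 2))  # → (0.0, 0.0), the first control point
    ```

    Adding control points (with a correspondingly extended knot vector) refines the curve locally, which matches the paper's observation that more control points yield more detailed mouth shapes.
    
    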
    Original language: English
    Title of host publication: Proceedings - 10th International Multimedia Modelling Conference, MMM 2004
    Pages: 102-108
    Publication status: Published - 2004
    Event: 10th International Multimedia Modelling Conference, MMM 2004 - Brisbane, Australia
    Duration: 5 Jan 2004 - 7 Jan 2004

    Conference

    Conference: 10th International Multimedia Modelling Conference, MMM 2004
    Place: Australia
    City: Brisbane
    Period: 5/01/04 - 7/01/04

