
Perception of linear and nonlinear motion properties using a FACS validated 3D facial model


Reference:

Cosker, D., Krumhuber, E. and Hilton, A., 2010. Perception of linear and nonlinear motion properties using a FACS validated 3D facial model. In: APGV '10 Proceedings of the 7th Symposium on Applied Perception in Graphics and Visualization. New York, U.S.A.: ACM, pp. 101-108.

Related documents:

This repository does not currently have the full-text of this item.
You may be able to access a copy if URLs are provided below.

Official URL:

http://dx.doi.org/10.1145/1836248.1836268

Abstract

In this paper we present the first Facial Action Coding System (FACS) validated model based on dynamic 3D scans of human faces for use in graphics and psychological research. The model consists of FACS Action Unit (AU) based parameters and has been independently validated by FACS experts. Using this model, we explore the perceptual differences between linear facial motions, represented by a linear blend shape approach, and real facial motions that have been synthesized through the 3D facial model. Through numerical measures and visualizations, we show that this latter type of motion is geometrically nonlinear in terms of its vertices. In experiments, we explore the perceptual benefits of nonlinear motion for different AUs. Our results are insightful for designers of animation systems both in the entertainment industry and in scientific research. They reveal a significant overall benefit to using captured nonlinear geometric vertex motion over linear blend shape motion. However, our findings suggest that not all motions need to be animated nonlinearly. The advantage may depend on the type of facial action being produced and the phase of the movement.
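
The abstract contrasts a linear blend shape baseline, in which each vertex travels a straight line between its neutral and peak positions, with captured motion whose vertex trajectories are geometrically nonlinear. The following Python sketch is purely illustrative and is not taken from the paper: the function and variable names (linear_blendshape, nonlinearity, neutral, peak, captured) are hypothetical, and the deviation measure is just one simple way to quantify how far a captured trajectory departs from the straight blend shape path, assumed here for illustration rather than reproducing the authors' numerical measures.

    import numpy as np

    def linear_blendshape(neutral, peak, alpha):
        # Linear blend shape: every vertex moves on the straight line
        # between its neutral and peak (AU fully activated) position.
        return (1.0 - alpha) * neutral + alpha * peak

    def nonlinearity(captured_frames):
        # Mean deviation of captured vertex trajectories from the chord
        # joining the first and last frame; zero means the captured
        # motion is geometrically linear in its vertices.
        start, end = captured_frames[0], captured_frames[-1]
        direction = end - start                      # per-vertex chord, shape (V, 3)
        lengths = np.linalg.norm(direction, axis=1, keepdims=True)
        unit = np.divide(direction, lengths, out=np.zeros_like(direction),
                         where=lengths > 0)
        deviations = []
        for frame in captured_frames:
            offset = frame - start                   # displacement so far
            along = (offset * unit).sum(axis=1, keepdims=True) * unit
            deviations.append(np.linalg.norm(offset - along, axis=1).mean())
        return float(np.mean(deviations))

    # Toy example: 3 vertices, 5 captured frames that bulge off the chord.
    rng = np.random.default_rng(0)
    neutral = rng.standard_normal((3, 3))
    peak = neutral + rng.standard_normal((3, 3))
    captured = np.stack([
        linear_blendshape(neutral, peak, a) + 0.05 * np.sin(np.pi * a)
        for a in np.linspace(0.0, 1.0, 5)
    ])
    print("mean off-chord deviation:", nonlinearity(captured))

Dropping the sinusoidal bulge term makes the captured frames coincide with the blend shape path, and the printed deviation falls to zero, which is the sense in which linear blend shape motion is geometrically linear per vertex.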

Details

Item Type: Book Sections
Creators: Cosker, D., Krumhuber, E. and Hilton, A.
DOI: 10.1145/1836248.1836268
Departments: Faculty of Science > Computer Science
Status: Published
ID Code: 20915
