Example-based natural three-dimensional expression generation for virtual human faces

Description
Title: Example-based natural three-dimensional expression generation for virtual human faces
Authors: Zhu, Lijia
Date: 2006
Abstract: This thesis proposes a methodology that produces natural facial expressions for 3D human face models using an expression databank. Firstly, guided by specified feature point motions, an approach is introduced for generating facial expressions by blending examples from the databank; the optimized blending weights are obtained implicitly by a Genetic Algorithm (GA). Secondly, a consistent parameterization technique is presented that models the desired structured human face from raw laser-scanned face data; based on this technique, we construct a facial motion databank consisting of consistent facial meshes. Thirdly, rather than producing novel facial expressions for a different subject from scratch, an efficient method is presented to retarget facial motions from one person to another. Finally, these techniques are combined to reconstruct 3D facial motions (surface motions) from motion capture data (feature point motions).
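
The abstract's first step, finding blending weights over databank examples so that the blended feature points match specified feature point motions, can be sketched as below. This is a minimal illustration only: the databank shapes, feature point indices, GA parameters, and all variable names are assumptions for the sketch and are not taken from the thesis.

```python
import numpy as np

# Hypothetical databank: K example expression meshes, each with V 3-D vertices,
# plus the indices of the tracked feature points. All sizes are illustrative.
rng = np.random.default_rng(0)
K, V = 8, 200
examples = rng.normal(size=(K, V, 3))                  # example expression meshes
feature_idx = rng.choice(V, size=15, replace=False)    # feature point indices
target_feat = rng.normal(size=(len(feature_idx), 3))   # specified feature motions

def blend(weights):
    """Linearly blend the example meshes with the given weights."""
    return np.tensordot(weights, examples, axes=1)

def fitness(weights):
    """Error between blended feature points and the target feature motions."""
    return np.linalg.norm(blend(weights)[feature_idx] - target_feat)

# A minimal genetic algorithm over the blending weight vectors.
pop = rng.random((40, K))                              # initial population
for generation in range(200):
    pop /= pop.sum(axis=1, keepdims=True)              # normalize to a convex blend
    errors = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(errors)[:10]]             # selection: keep the fittest
    children = []
    for _ in range(30):
        a, b = parents[rng.integers(10, size=2)]
        child = np.where(rng.random(K) < 0.5, a, b)    # uniform crossover
        child += rng.normal(scale=0.05, size=K) * (rng.random(K) < 0.1)  # mutation
        children.append(np.clip(child, 0.0, None))
    pop = np.vstack([parents, children])

best = pop[np.argmin([fitness(w) for w in pop])]
best /= best.sum()
reconstructed_surface = blend(best)                    # full surface motion estimate
print("best feature-point error:", fitness(best))
```

Once the weights are found from the sparse feature points alone, the same weights applied to the full example meshes yield the dense surface motion, which is the sense in which the blend is "guided" by the feature point motions.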
URL: http://hdl.handle.net/10393/27435
DOI: http://dx.doi.org/10.20381/ruor-12083
Collection: Thèses, 1910 - 2010 // Theses, 1910 - 2010
Files: MR25848.PDF (7.01 MB, Adobe PDF)