
Example-based natural three-dimensional expression generation for virtual human faces

dc.contributor.author: Zhu, Lijia
dc.date.accessioned: 2013-11-07T18:14:12Z
dc.date.available: 2013-11-07T18:14:12Z
dc.date.created: 2006
dc.date.issued: 2006
dc.degree.level: Masters
dc.degree.name: M.C.S.
dc.description.abstract: This thesis proposes a methodology that produces natural facial expressions for 3D human face models using an example databank. First, guided by specified feature point motions, an approach is introduced for generating facial expressions by blending the examples in an expression databank; the optimized blending weights are obtained implicitly by a Genetic Algorithm (GA). Second, a consistent parameterization technique is presented to model the desired structured human face from raw laser-scanned face data; based on this technique, we construct a facial motion databank consisting of consistent facial meshes. Third, rather than producing novel facial expressions for a different subject from scratch, an efficient method is presented to retarget facial motions from one person to another. Finally, these techniques are combined to reconstruct 3D facial motions (surface motions) from motion capture data (feature point motions).
dc.format.extent: 100 p.
dc.identifier.citation: Source: Masters Abstracts International, Volume: 45-05, page: 2537.
dc.identifier.uri: http://hdl.handle.net/10393/27435
dc.identifier.uri: http://dx.doi.org/10.20381/ruor-12083
dc.language.iso: en
dc.publisher: University of Ottawa (Canada)
dc.subject.classification: Computer Science.
dc.title: Example-based natural three-dimensional expression generation for virtual human faces
dc.type: Thesis
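The first step of the abstract — blending databank examples under GA-optimized weights so the result matches specified feature point motions — can be sketched in miniature. The following is an illustrative toy, not the thesis's implementation: the 2D feature points, the averaging crossover, the Gaussian mutation, and all parameters are assumptions made for the sake of a small runnable example (the thesis works with 3D laser-scanned meshes).

```python
import random

# Hypothetical "expression databank": each example is a list of 2D feature
# points. The real databank holds 3D facial meshes with consistent topology.
EXAMPLES = [
    [(0.0, 0.0), (1.0, 0.0)],   # e.g. neutral
    [(0.0, 1.0), (1.0, 1.0)],   # e.g. brows raised
    [(0.5, 0.5), (1.5, 0.5)],   # e.g. shifted
]

def blend(weights):
    """Linearly blend the example feature points with the given weights."""
    n_pts = len(EXAMPLES[0])
    return [tuple(sum(w * ex[i][d] for w, ex in zip(weights, EXAMPLES))
                  for d in range(2))
            for i in range(n_pts)]

def error(weights, target):
    """Sum of squared distances between blended and target feature points."""
    return sum((bx - tx) ** 2 + (by - ty) ** 2
               for (bx, by), (tx, ty) in zip(blend(weights), target))

def normalize(w):
    """Keep weights a convex combination (nonnegative, summing to 1)."""
    s = sum(w)
    return [x / s for x in w] if s > 0 else [1.0 / len(w)] * len(w)

def ga_optimize(target, pop_size=40, generations=200, seed=0):
    """Toy GA: elitist truncation selection, averaging crossover, mutation."""
    rng = random.Random(seed)
    pop = [normalize([rng.random() for _ in EXAMPLES])
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda w: error(w, target))
        survivors = pop[: pop_size // 2]          # best half survives
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # crossover
            if rng.random() < 0.3:                         # mutation
                i = rng.randrange(len(child))
                child[i] = max(0.0, child[i] + rng.gauss(0, 0.1))
            children.append(normalize(child))
        pop = survivors + children
    return min(pop, key=lambda w: error(w, target))

# Target feature points lying halfway between the first two examples.
target = [(0.0, 0.5), (1.0, 0.5)]
best = ga_optimize(target)
print("weights:", best, "error:", error(best, target))
```

The GA searches weight space implicitly, as the abstract describes: no analytic solve is needed, only the feature-point error of each candidate blend.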

Files

Original bundle

Name: MR25848.PDF
Size: 6.85 MB
Format: Adobe Portable Document Format