
Example-based Computer-Generated Facial Mimicry

Glyn Cowe
BSc Mathematics, University of Bath (1997)

Submitted to the Department of Psychology, University College London, in partial fulfilment of the requirements for the degree of Doctor of Philosophy.

January 2003

Abstract

Computer-generated faces, or avatars, are becoming increasingly sophisticated, but they are still visually unrealistic, particularly in motion, and their control remains problematic. Previous work has implemented complex three-dimensional polygonal models, often generated from laser scans, with intricate hard-coded muscle models for the actuation of speech and expression. Driving the avatar through mimicry, or performance-driven animation, involves tracking a real actor's facial movements and associating them with analogues on the model. Motion is usually tracked through markers physically attached to the actor's face or by locating natural feature boundaries. Here, complex three-dimensional models are avoided by taking an image-based approach. Novel techniques are presented for automatically creating and driving photo-realistic moveable face models, generated from example footage of a face in motion. Image changes for each frame, coupled with dense motion fields extracted using an optic flow algorithm, are analysed to extract a set of basis actions by application of principal components analysis. These techniques yield a virtual avatar onto which the movements of an actor can automatically be projected for convincing performance-driven animation, with no need for markers.

Contents

Example-based Computer-Generated Facial Mimicry
Abstract
Contents
Chapter 1 - Introduction
  1.1 - Applications for Photo-realistic Computer-Generated Facial Mimicry
  1.2 - History of Facial Animation
  1.3 - Computer-Generated Facial Animation
  1.4 - Example-based Facial Mimicry
Chapter 2 - Modelling the Face
  2.1 - Computer-Generated Models of Faces
    2.1.1 Surface representations
    2.1.2 Volume representations
    2.1.3 Dynamic Representations
    2.1.4 Summary
  2.2 - Biological Representations of Faces
    2.2.1 Specialist Mechanisms
    2.2.2 Separate Modules
    2.2.3 3D Representation
    2.2.4 Invariance
    2.2.5 Features or configuration?
    2.2.6 Principal sources of variance in the face
    2.2.7 Encoding of Motion
  2.3 - Biologically Inspired Facial Representation
    2.3.1 Eigenfaces
    2.3.2 Conclusion
Chapter 3 - Related Work
  3.1 - Motion Capture
    3.1.1 Key-framing
    3.1.2 Dot tracking
    3.1.3 Deformable contours
    3.1.4 Feature tracking
    3.1.5 Optic flow
    3.1.6 3D tracking
    3.1.7 Muscle Tracking
  3.2 - Two-dimensional Facial Animation
    3.2.1 Example-based modelling
    3.2.2 Prototyping
    3.2.3 Videoconferencing
    3.2.4 Lip-synching
  3.3 - Principal Components Analysis on Faces
    3.3.1 Images of Faces
    3.3.2 Separated Shape and Texture
    3.3.3 Laser Scanned Heads
    3.3.4 Dot Tracking Data
    3.3.5 Dynamic Laser-scanned Heads
    3.3.6 Facial Motion from Optic Flow
  3.4 - Summary
Chapter 4 - Example-based Generation of Facial Avatars
  4.1 - An Example-based Approach to Modelling Faces
    4.1.1 Facial motion as pixel-wise intensity variations
    4.1.2 Centring the Examples
    4.1.3 Principal Components Analysis
    4.1.4 First principal component
    4.1.5 Second principal component
    4.1.6 The remaining principal components
    4.1.7 Reducing computation
  4.2 - Vectorising Faces by Warping
    4.2.1 Warping a reference
    4.2.2 The Multi-channel Gradient Model (McGM)
    4.2.3 Results
  4.3 - Vectorising Faces by Morphing
    4.3.1 Image Morphing
    4.3.2 Morph Vectorisation Procedure
    4.3.3 Results
  4.4 - Summary
Chapter 5 - Facial Movement Analysis
  5.1 - Existing Facial Action Coding Schemes
    5.1.1 The Facial Action Coding Scheme (FACS)
    5.1.2 FACS+
    5.1.3 MPEG-4
  5.2 - PCA for Coding Facial Action
    5.2.1 Previous Applications of PCA in Facial Motion Analysis
    5.2.2 Methods
    5.2.3 Comparison of results
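The analysis summarised in the abstract — vectorising each example frame as pixel intensities together with dense optic-flow fields, and extracting a set of basis actions by principal components analysis — can be sketched as follows. This is a minimal illustrative sketch, not the implementation described in the thesis; the function and variable names (build_basis_actions, frames, flows_x, flows_y, n_components) are assumptions introduced here for illustration only.

```python
import numpy as np

# Minimal sketch (not the thesis implementation): build an example-based
# face model by applying PCA to vectorised frames. Each example frame is
# assumed to be represented as its pixel intensities concatenated with the
# dense horizontal and vertical optic-flow fields relative to a reference.

def build_basis_actions(frames, flows_x, flows_y, n_components=10):
    """frames, flows_x, flows_y: arrays of shape (n_frames, height, width)."""
    n_frames = frames.shape[0]
    # Vectorise: one row per example (intensities + flow components).
    X = np.hstack([a.reshape(n_frames, -1) for a in (frames, flows_x, flows_y)])
    # Centre the examples on their mean (the average face vector).
    mean_vec = X.mean(axis=0)
    Xc = X - mean_vec
    # PCA via SVD; rows of Vt are the principal components ("basis actions").
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    basis = Vt[:n_components]
    # Project each example onto the basis to obtain its low-dimensional code.
    codes = Xc @ basis.T
    return mean_vec, basis, codes

# A new performance frame, vectorised in the same way, can then be projected
# onto the basis and resynthesised as mean_vec + code @ basis, which is the
# sense in which the avatar is driven by mimicry without markers.
```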