In Partial Fulfillment of the Requirements for the Degree of
Doctor of Philosophy
Will present his pre-defense
Most current computer facial animation approaches focus largely on the accuracy or efficiency of their algorithms, or on how to optimally utilize pre-collected facial motion data. However, human perception, the ultimate measuring stick of the visual fidelity of synthetic facial animations, has not been effectively exploited in these approaches. In this research, we propose a novel perceptually guided computational framework for computer facial animation that bridges objective facial motion patterns with subjective perceptual outcomes. First, we will construct a facial perceptual metric using a hybrid of region-based facial motion analysis and statistical learning techniques. This metric can automatically measure the emotional expressiveness of a facial motion sequence. Second, we will incorporate the metric into various facial animation algorithms to develop perceptually guided applications such as expressive speech animation synthesis, facial animation editing, and facial motion transfer. Through a comparative user study, we will compare the perceptually guided algorithms with traditional facial animation algorithms in order to analyze how the introduced approaches increase the perceptual believability of synthesized facial animations.
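To illustrate the general idea of a region-based facial perceptual metric, the following is a minimal conceptual sketch, not the proposed method itself: landmark motion is partitioned into hypothetical facial regions, simple per-region motion statistics are extracted, and a linear model maps them to an expressiveness score. The region partition, feature choice, and weights here are all illustrative assumptions; in the proposed framework the mapping would be learned from subjective perceptual ratings.

```python
import math

# Hypothetical partition of 12 landmark indices into facial regions
# (illustrative only; a real system would use a detailed landmark scheme).
REGIONS = {
    "brows": [0, 1, 2, 3],
    "eyes":  [4, 5, 6, 7],
    "mouth": [8, 9, 10, 11],
}

def region_features(frames):
    """Per-region motion features: mean and standard deviation of
    frame-to-frame landmark displacement magnitudes.

    `frames` is a list of frames, each a list of (x, y) landmark positions.
    """
    feats = {}
    for name, idx in REGIONS.items():
        disps = []
        for prev, cur in zip(frames, frames[1:]):
            for i in idx:
                dx = cur[i][0] - prev[i][0]
                dy = cur[i][1] - prev[i][1]
                disps.append(math.hypot(dx, dy))
        if not disps:  # fewer than two frames: no motion information
            feats[name] = (0.0, 0.0)
            continue
        mean = sum(disps) / len(disps)
        var = sum((d - mean) ** 2 for d in disps) / len(disps)
        feats[name] = (mean, math.sqrt(var))
    return feats

def expressiveness_score(feats, weights):
    """Linear perceptual metric: weighted sum of region features.
    `weights` is one (w_mean, w_std) pair per region, in sorted
    region-name order; the values are placeholders for weights that
    would be fit to subjective ratings via statistical learning."""
    return sum(w_m * m + w_s * s
               for (_, (m, s)), (w_m, w_s)
               in zip(sorted(feats.items()), weights))
```

Under this sketch, a motion sequence with larger or more varied displacements in perceptually important regions (e.g., the mouth during speech) receives a higher expressiveness score, which is the kind of objective-to-subjective bridge the proposed metric formalizes.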