Ujir, Hamimah (2013). 3D facial expression classification using a statistical model of surface normals and a modular approach. University of Birmingham. Ph.D.
UjirH13PhD.pdf (PDF, redacted version, 3MB)
Abstract
Following the success of 3D face recognition, the face processing community is now trying to establish good 3D facial expression recognition. Facial expressions provide communication cues from which we can interpret mood, meaning and emotion at the same time. With current advanced 3D scanner technology, direct anthropometric measurements (i.e. the comparative study of sizes and proportions of the human body) are easily obtainable, and such scanners offer 3D geometric data well suited to 3D face processing studies. Instead of using the raw 3D facial points, we extract their derivative, which gives us 3D facial surface normals. We construct a statistical model of the variations in facial shape due to changes in the six basic expressions, using 3D facial surface normals as the feature vectors. In particular, we are interested in how such facial expression variations manifest themselves as changes in the field of 3D facial surface normals. We employ a modular approach in which each module contains the facial features of a distinct facial region. Decomposing the face into several modules promotes the learning of local facial structure, so the most discriminative variation of the facial features in each module is emphasized. We decompose the face into six modules and carry out expression classification for each module independently. We construct a Weighted Voting Scheme (WVS) to infer the emotion underlying a collection of modules, with module weights determined by the AdaBoost learning algorithm. In our experiments, using 3D facial surface normals as the feature vector yields better facial expression classification performance than 3D facial points and 3D distance measurements, under both the WVS and a Majority Voting Scheme (MVS). The results suggest that surface normals do indeed produce comparable results, particularly for the six basic facial expressions with no intensity information.
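The sketch below is a minimal illustration of the two ideas named in the abstract, not the thesis's actual pipeline: per-vertex surface normals computed from raw 3D facial points on a triangle mesh, and a weighted voting scheme that combines per-module expression predictions. The statistical model of normal fields and the AdaBoost training of the module weights are omitted; the module predictions and weights in the usage example are hypothetical values chosen only for demonstration.

```python
import numpy as np

def vertex_normals(vertices, faces):
    """Per-vertex unit surface normals for a triangle mesh,
    accumulated from adjacent face normals (area-weighted)."""
    v = np.asarray(vertices, dtype=float)   # (N, 3) raw 3D facial points
    f = np.asarray(faces, dtype=int)        # (M, 3) triangle vertex indices
    # Face normals: cross product of two edge vectors (length ~ 2 * triangle area).
    fn = np.cross(v[f[:, 1]] - v[f[:, 0]], v[f[:, 2]] - v[f[:, 0]])
    vn = np.zeros_like(v)
    for k in range(3):                      # scatter-add each face normal to its vertices
        np.add.at(vn, f[:, k], fn)
    norms = np.linalg.norm(vn, axis=1, keepdims=True)
    return vn / np.clip(norms, 1e-12, None) # unit normals used as feature vectors

def weighted_vote(module_predictions, module_weights, n_classes=6):
    """Weighted Voting Scheme: each facial module casts a vote for one of the
    six basic expressions; votes are scaled by per-module weights (e.g. learned
    with AdaBoost) and the highest-scoring class wins."""
    scores = np.zeros(n_classes)
    for pred, w in zip(module_predictions, module_weights):
        scores[pred] += w
    return int(np.argmax(scores))

# Usage example: one triangle and six hypothetical module votes/weights.
verts = [[0, 0, 0], [1, 0, 0], [0, 1, 0]]
tris = [[0, 1, 2]]
print(vertex_normals(verts, tris))          # every vertex normal points along +z
print(weighted_vote([3, 3, 1, 3, 0, 3],
                    [0.9, 0.7, 0.3, 0.8, 0.2, 0.6]))  # -> 3 (class with most weighted votes)
```

With a Majority Voting Scheme, all weights would simply be set to 1, which is the comparison baseline mentioned in the abstract.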
Type of Work: Thesis (Doctorates > Ph.D.)
Award Type: Doctorates > Ph.D.
Supervisor(s):
Licence:
College/Faculty: Colleges (2008 onwards) > College of Engineering & Physical Sciences
School or Department: School of Engineering, Department of Electronic, Electrical and Systems Engineering
Funders: None/not applicable
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
URI: http://etheses.bham.ac.uk/id/eprint/4371