Face, Age and Gender Recognition Using Local Descriptors

Title: Face, Age and Gender Recognition Using Local Descriptors
Authors: Mousa Pasandi, Mohammad Esmaeel
Date: 2014
Abstract: This thesis focuses on the area of face processing and aims at designing a reliable framework to facilitate face, age, and gender recognition. A Bag-of-Words framework has been optimized for the task of face recognition by evaluating different feature descriptors and different bag-of-words configurations. More specifically, we choose a compact set of features (e.g., descriptors, window locations, window sizes, dictionary sizes, etc.) in order to produce the highest possible rate of accuracy. Experiments on a challenging dataset show that our framework achieves a better level of accuracy when compared to other popular approaches such as dimension reduction techniques, edge detection operators, and texture and shape feature extractors. The second contribution of this thesis is the proposition of a general framework for age and gender classification. Although the vast majority of the existing solutions focus on a single visual descriptor that often encodes only a certain characteristic of the image regions, this thesis aims at integrating multiple feature types. For this purpose, feature selection is employed to obtain more accurate and robust facial descriptors. Once descriptors have been computed, a compact set of features is chosen, which facilitates facial image processing for age and gender analysis. In addition to this, a new color descriptor (CLR-LBP) is proposed, and the results obtained are shown to be comparable to those of other pre-existing color descriptors. The experimental results indicate that our age and gender framework outperforms other proposed methods when examined on two challenging databases, where face objects are present with different expressions and levels of illumination. This achievement demonstrates the effectiveness of our proposed solution and allows us to achieve higher accuracy than the existing state-of-the-art methods.
URL: http://hdl.handle.net/10393/31747
Collection: Thèses, 2011 - // Theses, 2011 -
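
The bag-of-words pipeline described in the abstract (dense local descriptors, a learned visual dictionary, and a word-frequency histogram as the face signature) can be illustrated with a minimal sketch. This is not the thesis's implementation: the gray-level patch histogram below is a simple stand-in for the LBP/SIFT-style descriptors the thesis evaluates, and all window, step, and dictionary sizes are illustrative, not the tuned values the author selected.

```python
import numpy as np

def extract_patch_descriptors(image, win=8, step=4):
    """Densely sample square windows and describe each one by its
    normalized gray-level histogram (a stand-in for LBP/SIFT)."""
    descs = []
    h, w = image.shape
    for y in range(0, h - win + 1, step):
        for x in range(0, w - win + 1, step):
            patch = image[y:y + win, x:x + win]
            hist, _ = np.histogram(patch, bins=16, range=(0, 256))
            descs.append(hist / hist.sum())
    return np.array(descs)

def build_dictionary(descriptors, k=8, iters=20, seed=0):
    """Plain k-means over the pooled descriptors to learn a
    k-word visual dictionary (the 'dictionary size' tuned in the thesis)."""
    rng = np.random.default_rng(seed)
    centers = descriptors[rng.choice(len(descriptors), k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(descriptors[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = descriptors[labels == j].mean(axis=0)
    return centers

def encode_bow(descriptors, centers):
    """Assign each descriptor to its nearest visual word and return the
    normalized word-frequency histogram used as the image signature."""
    dists = np.linalg.norm(descriptors[:, None] - centers[None], axis=2)
    labels = dists.argmin(axis=1)
    hist = np.bincount(labels, minlength=len(centers)).astype(float)
    return hist / hist.sum()

# Toy usage on a random stand-in for a face crop.
rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(32, 32))
descs = extract_patch_descriptors(img)
words = build_dictionary(descs, k=8)
signature = encode_bow(descs, words)
```

Signatures produced this way can be compared with any histogram distance and fed to a standard classifier; the thesis's contribution lies in selecting which descriptors, window layouts, and dictionary sizes yield the best accuracy.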