PROJECT TITLE:
Neutral Face Classification Using Personalized Appearance Models for Fast and Robust Emotion Detection - 2015
Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for many supervised learning-based facial expression recognition methods. This is because supervised methods cannot accommodate all appearance variability across faces with respect to race, pose, lighting, facial biases, and so on, within a limited amount of training data. Moreover, processing each and every frame to classify emotions is unnecessary, since the user stays neutral for the majority of the time in typical applications such as video chat or photo album/web browsing. Detecting the neutral state at an early stage, and thereby bypassing those frames in emotion classification, saves computational power. In this paper, we propose a lightweight neutral-versus-emotion classification engine that acts as a pre-processor to conventional supervised emotion classification approaches. It dynamically learns the neutral appearance at key emotion (KE) points using a statistical texture model created from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on the statistical texture model. Robustness to dynamic shifts of KE points is achieved by evaluating similarities over a subset of neighborhood patches around each KE point, using prior information about the directionality of the specific facial action units acting on that KE point. The proposed method consequently improves emotion recognition (ER) accuracy while simultaneously reducing the computational complexity of the ER system, as validated on multiple databases.
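The core idea above (a per-user texture model at KE points, with patch-similarity voting to flag neutral frames) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the patch extraction, the normalized cross-correlation similarity, and the thresholds `sim_thresh` and `vote_frac` are all assumptions chosen for clarity.

```python
import numpy as np

def build_neutral_model(reference_patches):
    # Per-KE-point mean/std texture model from reference neutral frames.
    # reference_patches: array of shape (n_frames, n_points, h, w).
    mean = reference_patches.mean(axis=0)
    std = reference_patches.std(axis=0) + 1e-6  # avoid division by zero
    return mean, std

def ncc(a, b):
    # Normalized cross-correlation between two equally sized patches.
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def is_neutral(frame_windows, model, shifts, sim_thresh=0.8, vote_frac=0.7):
    # Declare a frame neutral when enough KE-point patches match the model.
    # frame_windows: (n_points, H, W) search windows, one per KE point.
    # shifts: candidate (dy, dx) offsets per point; in the paper this subset
    # would be guided by the directionality of the facial action units
    # acting on that KE point (here it is just a caller-supplied list).
    mean, _ = model
    n_points, h, w = mean.shape
    votes = 0
    for i in range(n_points):
        best = -1.0
        for dy, dx in shifts:
            patch = frame_windows[i][dy:dy + h, dx:dx + w]
            if patch.shape != (h, w):
                continue  # shifted patch falls outside the search window
            best = max(best, ncc(patch, mean[i]))
        if best >= sim_thresh:
            votes += 1
    return votes >= vote_frac * n_points
```

A frame classified as neutral by this gate would simply be skipped, so the heavier supervised emotion classifier only runs on the remaining frames.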