FaceReader™ is the premier software tool for facial analysis. It automatically detects specific properties in facial images, such as facial expressions.
FaceReader has been trained to classify the expressions happy, sad, angry, surprised, scared, and disgusted. Paul Ekman described these emotional categories as the basic or universal emotions.
In addition, FaceReader recognizes a neutral state and can analyze contempt. It also calculates Action Units, valence, arousal, gaze direction, head orientation, and personal characteristics such as gender and age.
The software immediately analyzes your data (live, video, or still images), saving valuable time. The option to record audio alongside video makes it possible to hear what participants say – for example, during human-computer interactions, or while watching stimuli.
Prof. Dr. E. Bartkiene | Lithuanian University of Health Sciences, Lithuania
Many researchers have turned to automated facial expression analysis software to provide a more objective assessment of emotions. FaceReader works for all ethnicities and for ages 3 and older. Baby FaceReader was developed for infants ranging in age from 6 to 24 months old.
The software works in three consecutive steps: it first finds the face in the image, then models the face to capture its key features, and finally classifies the facial expression.
Common types of input sources for FaceReader include video files, live analysis using a webcam, and still images.
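The consecutive steps above can be sketched as a simple pipeline. The functions below are illustrative stubs (a brightest-pixel "detector" and a dummy scoring rule), not the actual FaceReader implementation:

```python
# Illustrative sketch of a three-step expression-analysis pipeline.
# All logic here is a hypothetical stand-in, not FaceReader's algorithms.

EXPRESSIONS = ["happy", "sad", "angry", "surprised",
               "scared", "disgusted", "neutral"]

def find_face(frame):
    """Step 1: locate the face. Here the brightest pixel stands in
    for a real face detector; frame is a 2D list of grayscale values."""
    return max(
        ((r, c) for r, row in enumerate(frame) for c in range(len(row))),
        key=lambda rc: frame[rc[0]][rc[1]],
    )

def model_face(frame, centre):
    """Step 2: build a (toy) feature vector around the detected face."""
    r, c = centre
    return [frame[r][c], r, c]

def classify_expression(features):
    """Step 3: map features to per-expression scores (dummy rule)."""
    intensity = features[0] / 255
    scores = {e: 0.0 for e in EXPRESSIONS}
    scores["happy"] = intensity
    scores["neutral"] = 1 - intensity
    return scores

frame = [[0, 10], [20, 255]]
face = find_face(frame)
scores = classify_expression(model_face(frame, face))
print(face, max(scores, key=scores.get))
```

A real system would replace each stub with a trained model, but the control flow – detect, model, classify – stays the same.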
The Social Media Lab in Mons, Belgium brings together researchers, students, professionals, and professors from different disciplines (communication, marketing, journalism, computer science, etc.). Together they try to understand the digital world, train themselves in the use of new technology, and learn more and advice about new professional practices.
FaceReader’s main output is a classification of the facial expressions of your test participant. These results are visualized in several different charts, such as line graphs, bar graphs, or pie charts showing the percentage per emotion.
FaceReader also calculates valence and arousal. Valence indicates whether the emotional state of the subject is positive or negative. Arousal indicates whether the subject is active or calm.
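The kind of summary behind those charts can be sketched as follows. The percentage per emotion is taken here as the share of frames on which each expression is dominant, and valence is assumed to be the "happy" intensity minus the strongest negative intensity; FaceReader's exact formulas may differ:

```python
# Sketch of summarising frame-by-frame classifications into per-emotion
# percentages (as a pie chart would show) and a simple valence estimate.
# The valence formula is an assumption for illustration.

NEGATIVE = {"sad", "angry", "scared", "disgusted"}

def emotion_percentages(frames):
    """frames: list of dicts mapping expression -> intensity (0..1).
    Returns the percentage of frames on which each expression dominates."""
    counts = {}
    for scores in frames:
        dominant = max(scores, key=scores.get)
        counts[dominant] = counts.get(dominant, 0) + 1
    return {e: 100 * n / len(frames) for e, n in counts.items()}

def valence(scores):
    """'happy' intensity minus the strongest negative intensity, in [-1, 1]."""
    return scores.get("happy", 0.0) - max(scores.get(e, 0.0) for e in NEGATIVE)

frames = [
    {"happy": 0.8, "sad": 0.1, "neutral": 0.1},
    {"happy": 0.2, "sad": 0.6, "neutral": 0.2},
    {"happy": 0.1, "sad": 0.1, "neutral": 0.8},
    {"happy": 0.7, "sad": 0.2, "neutral": 0.1},
]
print(emotion_percentages(frames))  # happy 50%, sad 25%, neutral 25%
print(valence(frames[0]))           # approximately 0.7
```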
The circumplex model of affect describes the distribution of emotions in a 2D circular space, containing arousal and valence dimensions.