Analyze facial expressions to find emotions

Analyze emotions in inter-personal communication

Tuesday, 19 July, 2011

The human face provides a number of signals essential for inter-personal communication. It is one of our most direct means of communication and allows us to recognize someone’s affective state and intentions.

“Happy, sad, angry, disgusted, scared, and surprised” are emotional categories also known as the ‘basic emotions’ or ‘universal emotions’ [1]. Sometimes just one look can say more than a thousand words.

How do we reliably measure this kind of communication?

Although a researcher can observe emotional cues live, he or she will often miss crucial information.

Direct observation of facial expressions may therefore not provide enough detail. Working with video offers the possibility to code facial expressions afterwards, for example with the Facial Action Coding System (FACS) [2].

Using FACS, human coders can manually code almost any facial expression. However, the procedure is time-consuming, and becoming an experienced FACS coder requires training and practice.
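To make the idea of FACS coding concrete, here is a minimal sketch of matching observed Action Units (AUs) against emotion prototypes. The AU combinations below are simplified, commonly cited examples, not the full FACS scheme, and the matching rule (majority overlap) is an assumption for illustration only.

```python
# Illustrative sketch only: simplified AU-to-emotion prototypes,
# not the full FACS coding scheme.
EMOTION_PROTOTYPES = {
    "happy":     {6, 12},               # cheek raiser + lip corner puller
    "sad":       {1, 4, 15},            # inner brow raiser + brow lowerer + lip corner depressor
    "surprised": {1, 2, 5, 26},         # brow raisers + upper lid raiser + jaw drop
    "angry":     {4, 5, 7, 23},         # brow lowerer + lid tighteners + lip tightener
    "disgusted": {9, 15},               # nose wrinkler + lip corner depressor
    "scared":    {1, 2, 4, 5, 20, 26},  # brow raise/lower + lid raiser + lip stretcher + jaw drop
}

def match_emotion(observed_aus):
    """Return the prototype emotion whose AU set best overlaps the observed AUs."""
    def score(emotion):
        proto = EMOTION_PROTOTYPES[emotion]
        return len(proto & observed_aus) / len(proto)
    best = max(EMOTION_PROTOTYPES, key=score)
    # Require more than half of the prototype AUs before committing to a label.
    return best if score(best) > 0.5 else "neutral"

print(match_emotion({6, 12}))        # happy
print(match_emotion({1, 2, 5, 26}))  # surprised
```

A real FACS coder also scores AU intensity and timing, which this toy matcher ignores.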

Software for facial expression analysis

Another option is to automate your experiment. Using software, facial expressions can be recognized with an accuracy between 85% and 96% [3], so working with a video camera and emotion recognition software is a realistic option to consider. Of course, you can also interview your test participants about their feelings or emotions, but their answers may be colored by what is considered socially desirable. Often it is a combination of methods and techniques that delivers the most complete data.
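As a rough illustration of what automated analysis adds, the sketch below aggregates hypothetical frame-by-frame emotion labels, as such software might output for a short clip, into clip-level proportions. The labels and data here are made up for illustration; they are not FaceReader output.

```python
from collections import Counter

# Hypothetical per-frame labels for a short video clip (made-up data).
frame_labels = ["neutral", "happy", "happy", "surprised", "happy", "happy", "neutral"]

def summarize(labels):
    """Aggregate frame-level classifications into clip-level proportions."""
    counts = Counter(labels)
    total = len(labels)
    return {emotion: round(n / total, 2) for emotion, n in counts.most_common()}

print(summarize(frame_labels))  # {'happy': 0.57, 'neutral': 0.29, 'surprised': 0.14}
```

Summaries like this are what make software-based coding fast compared with manual frame-by-frame FACS coding.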

FaceReader

FaceReader methodology white paper

Request the FREE FaceReader methodology white paper to learn more about facial expression analysis theory.

  • Learn what FaceReader is and how it works
  • Learn how the calibration works
  • Get insight into the quality of the analysis and output

References

  1. Ekman, P. (1970). Universal facial expressions of emotion. California Mental Health Research Digest, 8, 151-158.
  2. Ekman, P., Friesen, W. V., & Hager, J. C. (2002). Facial action coding system. Salt Lake City: Research Nexus.
  3. Bijlstra, G., & Dotsch, R. (2011). FaceReader 4 emotion classification performance on images from the Radboud Faces Database. Unpublished manuscript retrieved from http://www.gijsbijlstra.nl/ and http://ron.dotsch.org/.