
Emotion Analysis

FaceReader is the premier software tool for facial analysis. It automatically detects specific properties in facial images, such as facial expressions.

Why you should use it

  • Automated analysis of facial expressions brings clear insights into the effect of different stimuli on emotions.
  • Very easy to use: save valuable time and resources.
  • Objective and unobtrusive observations.
  • Easy integration with eye tracking data and physiology data.

Publications with Noldus

FaceReader is used in over 1,000 publications!


Facial expression recognition software

FaceReader has been trained to classify the expressions happy, sad, angry, surprised, scared, and disgusted. Paul Ekman described these six emotional categories as the basic or universal emotions.

Additionally, FaceReader can recognize a 'neutral' state and analyze 'contempt'. It also calculates Action Units, valence, arousal, gaze direction, head orientation, and personal characteristics such as gender and age.

The software immediately analyzes your data (live, video, or still images), saving valuable time. The option to record audio as well as video makes it possible to hear what people have been saying – for example, during human-computer interactions, or while watching stimuli.

FaceReader Emotion Analysis

FaceReader is the most reliable software tool for facial expression analysis.

Customer testimonial

"FaceReader software is very promising and sufficiently accurate to detect differences in facial emotion expressions induced by different tastes of food for different mood groups (with and without depressive disorder)."

Prof. Dr. E. Bartkiene  |  Lithuanian University of Health Sciences, Lithuania


Emotion analysis with FaceReader

Many researchers have turned to automated facial expression analysis software for an objective assessment of emotions. FaceReader works for all ethnicities and for ages 3 and older. Baby FaceReader was developed for infants ranging in age from 6 to 24 months.

The software works by following these consecutive steps:

  1. Face finding - FaceReader finds an accurate position of the face using the popular Viola-Jones algorithm.
  2. Modeling - An accurate 3D model of the face is created, describing over 500 key points.
  3. Deep face classification - This popular Artificial Intelligence method allows FaceReader to classify the face directly from image pixels, using an artificial neural network to recognize patterns. This allows the software to analyze the face even if part of it is hidden.
  4. Classification - An artificial neural network, trained on over 10,000 images, classifies the basic emotional expressions and a number of properties, including gaze direction, head orientation, and characteristics such as gender and age.
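The four steps above can be sketched as a simple processing chain. This is an illustrative outline only; the function names, stub values, and scores below are hypothetical and do not reflect FaceReader's actual internals or API.

```python
# Hypothetical sketch of the four analysis steps: find the face,
# fit a key-point model, then classify the expression from scores
# such as a trained network might produce.

def find_face(image):
    """Step 1: locate the face (FaceReader uses Viola-Jones).
    Returns a stubbed bounding box (x, y, width, height)."""
    return (40, 30, 120, 120)

def model_face(image, box):
    """Step 2: fit a face model described by over 500 key points.
    Stubbed here as a 25x20 grid of placeholder points in the box."""
    x, y, w, h = box
    return [(x + (i % 25) * w / 25, y + (i // 25) * h / 20)
            for i in range(500)]

def classify_expression(scores):
    """Steps 3-4: a trained network outputs a score per expression;
    the highest-scoring label is reported."""
    return max(scores, key=scores.get)

# Example scores, one per expression category.
scores = {"happy": 0.71, "sad": 0.05, "angry": 0.03, "surprised": 0.10,
          "scared": 0.02, "disgusted": 0.01, "neutral": 0.08}
print(classify_expression(scores))  # happy
```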

Types of input sources commonly used with FaceReader include video analysis, live analysis using a webcam, and still images.

FaceReader is the complete solution for facial expression analysis!

Tiffany Andry: "I like to use these tools that seem quantitative to make qualitative results."

The Social Media Lab in Mons, Belgium brings together researchers, students, professionals, and professors from different disciplines (communication, marketing, journalism, computer science, etc.). Together they try to understand the digital world, train themselves in the use of new technology, and learn about and share advice on new professional practices.


Classifying facial expressions

FaceReader’s main output is a classification of the facial expressions of your test participant. These results are visualized in several chart types, such as line graphs, bar graphs, and pie charts, which show the percentage per emotion.
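The "percentage per emotion" behind such a pie chart can be illustrated with a short calculation. The per-frame dominant-emotion labels below are invented sample data, not FaceReader's actual export format.

```python
# Compute the share of each dominant emotion across analyzed frames,
# the kind of breakdown a per-emotion pie chart would display.
from collections import Counter

frames = ["happy", "happy", "neutral", "surprised", "happy", "neutral"]
counts = Counter(frames)
percentages = {emotion: round(100 * n / len(frames), 1)
               for emotion, n in counts.items()}
print(percentages)  # {'happy': 50.0, 'neutral': 33.3, 'surprised': 16.7}
```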

FaceReader also calculates valence and arousal. Valence indicates whether the emotional state of the subject is positive or negative. Arousal indicates how active or calm the subject is.

The circumplex model of affect describes the distribution of emotions in a 2D circular space, containing arousal and valence dimensions.
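A minimal sketch of placing a measurement in that 2D space follows. The valence heuristic here (positive intensity minus the strongest negative intensity) is one common convention, not necessarily FaceReader's exact formula, and the intensity values are hypothetical.

```python
# Map emotion intensities to a valence value, then name the
# circumplex quadrant given an arousal value (both in [-1, 1]).

def valence(intensities):
    """Positive intensity minus the strongest negative intensity."""
    negatives = ("sad", "angry", "scared", "disgusted")
    return intensities["happy"] - max(intensities[e] for e in negatives)

def quadrant(val, arousal):
    """Describe the position in the circumplex model of affect."""
    v = "positive" if val >= 0 else "negative"
    a = "active" if arousal >= 0 else "inactive"
    return f"{v} valence, {a} arousal"

intensities = {"happy": 0.6, "sad": 0.1, "angry": 0.05,
               "scared": 0.02, "disgusted": 0.03}
print(quadrant(valence(intensities), 0.4))  # positive valence, active arousal
```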



Ready to kick start your research?

Want better insights and faster results? Contact us now to learn how we can help you!