Learn more about FaceReader

Want to learn more about FaceReader? Discover in-depth information in our white papers, product videos, and customer success stories.

You'll also find relevant publications, as well as product overviews for your research area.

FaceReader customer success stories

Studying user behavior and interactions

Researchers at the Social Media Lab analyze how users interact with social media.

Learn what your users need

This type of research helps civil society organizations and other professionals identify their customers' needs and implement new tools.

Efficient coding of social behavior

At the Social Behavior Lab at Western University, Dr. Erin Heerey explores human behavior during social interactions.

Save hours of manual coding

Frame-by-frame expression analysis for her project would have taken 800 hours of manual coding. FaceReader did it in only 14 hours!

The role of sensory evaluation

At Virginia Tech Food Science & Technology, researchers use FaceReader to capture how people respond to food products.

Observe unconscious responses

Studying unconscious responses helps researchers gain insight into the effects of flavor, sensory quality, and nutritional value of food.

FaceReader webinars

Project Analysis Module and custom expressions in FaceReader

In this webinar, you'll learn more about what you can achieve with FaceReader.

  • Learn about new modeling methods
  • Watch a demonstration of the Project Analysis Module
  • Learn how you can use custom expressions

The psychophysiology of emotion

In this webinar, you'll learn about the relationship between emotional states and psychophysiological measures.

  • Discover how evolution shaped our brains
  • Learn how to measure heart rate and skin conductance
  • Observe patterns in physiological responses to certain emotions

Reading materials

FaceReader methodology

Learn more about how FaceReader classifies facial expressions.

You'll also discover what types of data you can collect for your research and how the software is validated.

Facial Action Coding System

The Facial Action Coding System (FACS) was developed by Paul Ekman and Wallace Friesen.

This model describes different Action Units, the smallest visible units of muscular activity in the face. Action Unit 12 (the lip corner puller), for example, is a key component of a smile.

Custom expressions

In FaceReader, you can combine different metrics to create your own expressions.

Learn how to create custom expressions and get inspired by examples from other researchers.

FaceReader Online

Looking for a way to study participants remotely, from any location?

Discover the benefits of FaceReader Online and read about best practices when designing your study.

Featured blog posts

How emotions are made

Neuroscience research shows that emotions are created in our brains.

It's how our brains give meaning to our experiences and sensations. Learn more in this blog post.

5 tips to optimize your facial expression analyses

Emotion data allows researchers to gain in-depth insights into complex human behaviors.

These 5 tips will help you get the best results from your facial expression analysis!

Using Baby FaceReader for automated analysis of infant emotions

What if you had a way to understand a baby's unspoken needs?

This study highlights the benefits of analyzing facial expressions in infants.

FaceReader videos

FaceReader classifications demo

See for yourself how FaceReader classifies facial expressions!

Affective attitudes in FaceReader

Measure interest, boredom, and confusion with FaceReader's Action Unit Module.

Baby FaceReader

Measure the facial expressions of an infant automatically.

Relevant publications

Krause, F.; Franke, N. (2023). Understanding Consumer Self-Design Abandonment: A Dynamic Perspective. Journal of Marketing. https://doi.org/10.1080/00140139.2022.2157493.
De Wijk, R.; Kaneko, D.; Dijksterhuis, G.; van Bergen, G.; Vingerhoeds, M.; Visalli, M.; Zandstra, E. (2022). A preliminary investigation on the effect of immersive consumption contexts on food-evoked emotions using facial expressions and subjective ratings. Food Quality and Preference. https://doi.org/10.1016/j.foodqual.2022.104572.
Märtin, C., Bissinger, B.C., & Asta, P. (2021). Optimizing the digital customer journey - Improving user experience by exploiting emotions, personas and situations for individualized user interface adaptations. Journal of Consumer Behaviour, 1-12. https://doi.org/10.1002/cb.1964.
Talen, L. & den Uyl, T.E. (2021). Complex Website Tasks Increase the Expression Anger Measured with FaceReader Online. International Journal of Human–Computer Interaction. https://doi.org/10.1080/10447318.2021.1938390.
Bourret, M., Ratelle, C.F., Plamondon, A. & Boisclair Châteauvert, G. (2023). Dynamics of parent-adolescent interactions during a discussion on career choice: The role of parental behaviors and emotions. Journal of Vocational Behavior, 141. https://doi.org/10.1016/j.jvb.2022.103837.
Liu, S.; Wang, Y.; Song, Y. (2023). Atypical facial mimicry for basic emotions in children with autism spectrum disorder. Autism Research, 16, 1375-1388.
Zaharieva, M.; Salvadori, E.; Messinger, D.; Visser, I.; Colonnesi, C. (2024). Automated facial expression measurement in a longitudinal sample of 4- and 8-month-olds: Baby FaceReader 9 and manual coding of affective expressions. Behavior Research Methods. https://doi.org/10.3758/s13428-023-02301-3.
Malfait, A.; Puyvelde, M.; Detaille, F.; Neyt, X.; Waroquier, F. & Pattyn, N. (2023). Unveiling Readiness of Medical First Responders in Simulation Trainings: Insights beyond Queries. In: Jay Kalra (eds) Emerging Technologies in Healthcare and Medicine. AHFE International Conference. AHFE Open Access, vol 116. AHFE International, USA. https://doi.org/10.1177/01939459241233360.
Meng, Q. et al. (2020). On the effectiveness of facial expression recognition for evaluation of urban sound perception. Science of The Total Environment, 710, 135484, ISSN 0048-9697. https://doi.org/10.1016/j.scitotenv.2019.135484.
Yang, L.; Chen, X.; Guo, Q.; Zhang, J.; Luo, M.; Chen, Y.; Zou, X.; Xu, F. (2022). Changes in facial expressions in patients with Parkinson’s disease. Computer Speech & Language, 72(3). https://doi.org/10.1016/j.csl.2021.101286.