FaceReader

Resources


Download our white papers

FaceReader Methodology

FaceReader can detect facial expressions and offers a number of extra classifications. Do you want to learn more about how FaceReader works? Read on!

Best practices FaceReader Online

Do you want to know how online facial expression analysis service FaceReader Online can benefit your UX and market research? Read on!

Custom expressions

Use Action Units (AUs) and other metrics to create custom expressions in FaceReader. This white paper will provide some best practices and advice to help you along.
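The white paper describes FaceReader's own workflow for this; purely to illustrate the underlying idea, here is a hypothetical Python sketch of a custom expression defined as a weighted combination of AU intensities. The AU names, weights, and clamping are invented for illustration and are not FaceReader's actual formula.

```python
# Hypothetical sketch: score a custom expression as a weighted sum of
# Action Unit (AU) intensities on a 0.0-1.0 scale. The weights below
# are illustrative only, not FaceReader's internal definition.

def custom_expression(au_intensities, weights):
    """Combine per-frame AU intensities into a single custom-expression score."""
    score = sum(au_intensities.get(au, 0.0) * w for au, w in weights.items())
    return max(0.0, min(1.0, score))  # clamp to the [0, 1] range

# Example: a rough "confusion" cue built from brow lowerer (AU4) and
# lid tightener (AU7), with a smile cue (AU12) counting against it.
confusion_weights = {"AU4": 0.6, "AU7": 0.4, "AU12": -0.5}
frame = {"AU4": 0.8, "AU7": 0.5, "AU12": 0.1}
print(custom_expression(frame, confusion_weights))
```

In practice, such expressions are defined and tuned inside FaceReader itself; the sketch only shows why weighting and combining AU intensities yields a continuous score per frame.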



Eye Tracking & FaceReader

Both eye tracking and facial expression analysis add substantial power to your research by providing information about attention and emotion.

Facial Action Coding System (FACS)

Coding facial actions enables greater awareness of subtle facial behaviors. Recent advances in computer vision have made reliable automated facial action coding possible.

EyeReader in FaceReader Online

Find out more about the methodology and accuracy of webcam-based eye tracking technology within FaceReader Online by downloading this white paper.



FaceReader product videos

FaceReader Classifications Demo

Discover the real-time modeling capabilities of FaceReader.

Affective attitudes in FaceReader

Using the Action Unit Module, FaceReader is able to detect the affective attitude 'confusion'.

Baby FaceReader

Baby FaceReader enables you to recognize the facial expressions of an infant automatically!


Download our free product overviews

Psychology research

An AV lab makes superior behavior recording and analysis possible, facilitating educational research, developmental psychology studies, and infant behavior studies.

Education and Training

Training and education help individuals acquire and improve skills, and feedback is essential. Discover which tools support you in this.

User experience research

Improve your user experience testing and research facility with integrated and easy-to-use equipment and software from Noldus. Unprecedented capabilities!



Neuroscience research

Tools and integrated lab solutions for research in cognitive workload, language acquisition, and developmental behavior studies. Eye tracking, video, and physiology.

Consumer behavior research

Really understand your customers by gaining real insights into their behavior. Observational studies shed light on unconscious decision making. Learn more!

Healthcare research (interactive)

Medical professionals, nurses, and students benefit from training in a simulation lab. Let us help you choose the tools you need for your education and research.




FaceReader measures emotions

What does your face say?

Curious what emotions your face shows? Upload a photo here and our FaceReader software will analyze it for emotionality. The better the picture quality, the better the results: test only photos in which your face is clearly visible and the lighting is sufficient.

Try it now and find out what your face says.



Customer success stories

How to measure emotions in the Uses and Acceptability Lab

Jean-Marc Diverrez, bcom Institute of Research and Technology

Sensory evaluation - Food science & technology

Prof. Susan Duncan, Virginia Tech

Social Media Lab: How people make sense of data

Tiffany Andry, University of Louvain



References

  • Dupré, D. et al. (2020). A performance comparison of eight commercially available automatic classifiers for facial affect recognition. PLoS ONE, 15(4):e0231968. https://doi.org/10.1371/journal.pone.0231968.
  • Märtin, C., Bissinger, B.C., & Asta, P. (2021). Optimizing the digital customer journey - Improving user experience by exploiting emotions, personas and situations for individualized user interface adaptations. Journal of Consumer Behavior, 1-12. https://doi.org/10.1002/cb.1964.
  • Meng, Q. et al. (2020). On the effectiveness of facial expression recognition for evaluation of urban sound perception. Science of The Total Environment, 710, 135484, ISSN 0048-9697. https://doi.org/10.1016/j.scitotenv.2019.135484.
  • Rogers, J. (2022). Ethnicity & FaceReader 9 - A FairFace Case Study. In A. Spink, J. Barski, A.-M. Brouwer, G. Riedel, & A. Sil (Eds.), Proceedings of the joint 12th International Conference on Methods and Techniques in Behavioral Research and 6th Seminar on Behavioral Methods (Vol. 2), held online, May 18-20, 2022. www.measuringbehavior.org. https://doi.org/10.6084/m9.figshare.20066849. ISBN: 978-90-74821-94-0.
  • Stöckli, S. et al. (2018). Facial expression analysis with AFFDEX and FACET: A validation study. Behav Res Methods, 50(4), 1446-1460. https://doi.org/10.3758/s13428-017-0996-1.
  • Talen, L. & den Uyl, T.E. (2021). Complex Website Tasks Increase the Expression Anger Measured with FaceReader Online. International Journal of Human–Computer Interaction. https://doi.org/10.1080/10447318.2021.1938390.


Recent blog posts

How emotions are made

Neuroscience research in the past decades has shown that our brain gives meaning to our experiences and sensations, through concepts such as emotions.

How to study human behavior

Many people are fascinated by human behavior. Why do we act the way we do? How is our behavior influenced, or measured? And why is behavioral change so difficult?

How the ability to manage emotions shapes perception of risk

Can our ability to recognize and control our emotions determine how dangerous we perceive certain hazards to be and whether or not we think we are at risk?