Emotions are what make us human, and we all experience them. The brain creates them in relation to our interests and past experiences: they signal that our interests are at stake, in a positive or negative way. Emotions cause your cortex to pay attention, since the brain's main task is to keep us alive and well.
Emotion data provides crucial insights, allowing researchers to study complex human behaviors in greater depth. Emotions play a role in all kinds of matters: in the decision whether or not to buy something, in food choices, and in how we interact with others.
Facial expression recognition software
Facial expression analysis software like FaceReader™ is ideal for collecting this emotion data. The software automatically analyzes the expressions happy, sad, angry, surprised, scared, and disgusted.
Additionally, FaceReader can recognize a 'neutral' state and analyze 'contempt'. It also calculates Action Units, valence, arousal, gaze direction, head orientation, and personal characteristics such as gender and age.
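To illustrate how per-frame expression intensities relate to a single valence score, here is a minimal sketch. It assumes a commonly used definition (valence as the intensity of the 'happy' expression minus the intensity of the strongest negative expression); the field names and the frame dictionary are illustrative, not FaceReader's actual output format.

```python
# Negative expressions that pull valence downward (illustrative set).
NEGATIVE = ("sad", "angry", "scared", "disgusted")

def valence(intensities):
    """Compute a valence score in [-1, 1] from a dict mapping
    expression name -> intensity in [0, 1]."""
    return intensities.get("happy", 0.0) - max(
        intensities.get(e, 0.0) for e in NEGATIVE
    )

# One hypothetical analyzed video frame:
frame = {"happy": 0.10, "sad": 0.05, "angry": 0.60, "surprised": 0.20}
print(round(valence(frame), 2))  # prints -0.5: a clearly negative moment
```

A positive score indicates a predominantly pleasant expression, a negative score an unpleasant one; arousal would be derived separately from the overall activation of the face.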
Objective assessment of emotions
Many researchers have turned to FaceReader for an objective assessment of emotions. It is used worldwide at more than 900 universities (including 6 out of 8 Ivy League universities), research institutes, and companies in many markets, such as psychology, consumer research, user experience, human factors, and neuromarketing.
Using the software eliminates observer bias, and because it analyzes your data immediately and is very easy to use, it saves a huge amount of valuable time. According to a validation study, FaceReader shows the best performance of the major software tools for emotion classification currently available.
All emotions, whether they are suppressed or not, are likely to have a physical effect. Biometric research will bring these effects to the surface by studying subconscious processes related to attention, cognition, emotion, and physiological arousal.
For example, combine eye tracking with facial expression analysis to find out where a participant was looking when frustrated, or which element in a video surprised the participant the most. Here, perfect synchronization is key to matching the measurement of attention or cognitive workload to emotions. Check out CubeHX for more information.
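The synchronization step above can be sketched in a few lines: given two timestamped streams on a shared clock, pair each gaze sample with the nearest emotion sample. This is a generic illustration, not Noldus's actual API; the sample data and labels are made up.

```python
import bisect

def nearest(timestamps, t):
    """Return the index of the timestamp closest to t.
    timestamps must be sorted in ascending order."""
    i = bisect.bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # Pick whichever neighbor is closer in time.
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

# Illustrative streams: (time in seconds, gaze target) and
# (time in seconds, dominant emotion for that frame).
gaze = [(0.00, "logo"), (0.52, "price"), (1.01, "button")]
emotion = [(0.10, "neutral"), (0.55, "surprised"), (1.05, "happy")]

emo_times = [t for t, _ in emotion]
for t, target in gaze:
    _, label = emotion[nearest(emo_times, t)]
    print(f"{t:.2f}s: looking at {target} while {label}")
```

In practice the two devices must share a clock (or be aligned with a sync signal) for this pairing to be meaningful, which is why synchronization accuracy matters so much in multimodal setups.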
Tiffany Andry: "I like to use these tools that seems quantitative to make qualitative results."
The Social Media Lab in Mons, Belgium brings together researchers, students, professionals, and professors from different disciplines (communication, marketing, journalism, computer science, etc.). Together they try to understand the digital world, train themselves in the use of new technology, and learn about and share advice on new professional practices.
A diverse collection of scientific articles citing Noldus products are published in renowned journals. The following list is only a small selection of scientific publications. Please contact us if you need more reference material.
- Bartkiene, E.; Steibliene, V.; Adomaitiene, V.; Juodeikiene, G.; Cernauskas, D.; Lele, V.; Klupsaite, D.; Zadeike, D.; Jarutiene, L. & Guiné, R.P.F. (2019). Factors Affecting Consumer Food Preferences: Food Taste and Depression-Based Evoked Emotional Expressions with the Use of Face Reading Technology. BioMed Research International, 4, 1-10. https://doi.org/10.1155/2019/2097415
- Danner, L.; Sidorkina, L.; Joechl, M. & Duerrschmid (2014). Make a face! Implicit and explicit measurement of facial expressions elicited by orange juices using face reading technology. Food Quality and Preference. https://doi.org/10.1016/j.foodqual.2013.01.004
- Stöckli, S.; Schulte-Mecklenbeck, M.; Borer, S. & Samson, A.C. (2018). Facial expression analysis with AFFDEX and FACET: A validation study. Behavior Research Methods, 50 (4), 1446-1460.
- Vergura, D.T. & Luceri, B. (2018). Product packaging and consumers’ emotional response. Does spatial representation influence product evaluation and choice? Journal of Consumer Marketing. https://doi.org/10.1108/JCM-12-2016-2021