When we see another person wearing a face mask, we have to rely on the facial signals that remain visible: the eyes and eyebrows. Looking someone in the eye is an important part of communication.
What is the best way to communicate an important message such as 'Stay at home'? Mauri's research team used FaceReader to compare emotional reactions to three short videos containing this message.
What happens when we’re in pain, real physical pain, but we cannot tell someone where or how badly it hurts? We can look at facial expressions!
Within the field of human factors and usability, frustration poses an interesting challenge. It can be a barrier to learning. So how can we measure frustration in order to minimize it?
Is there a relationship between food choice and a person’s mood? Bartkiene et al. examined the factors that influence our food choice, using facial expression analysis.
The SUKIPANI smile is an exercise to train the muscles you use while smiling. Dr. Sugahara explains the effects of these muscle movements and uses FaceReader to analyze the smiles.
Nowadays, heart rate and heart rate variability can be measured remotely, without attaching any devices to the test participant, using remote photoplethysmography (RPPG). What is RPPG and how does it work?
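The core idea behind RPPG can be sketched in a few lines: blood flow causes tiny color changes in facial skin with each heartbeat, so the dominant frequency of the (detrended) color signal from a face video reveals the heart rate. The sketch below is a simplified illustration on a synthetic signal, not the algorithm FaceReader uses; the frame rate, signal model, and frequency band are all assumptions for demonstration.

```python
import numpy as np

np.random.seed(0)

# Assumed camera frame rate and a synthetic 10-second recording.
fps = 30.0
t = np.arange(0, 10, 1 / fps)

# Simulate the mean green-channel intensity of the face region per frame:
# a 1.2 Hz (72 bpm) pulse component buried in a baseline plus noise.
heart_hz = 1.2
green = 0.5 + 0.01 * np.sin(2 * np.pi * heart_hz * t) \
            + 0.002 * np.random.randn(t.size)

# Remove the slow baseline, then find the dominant frequency within a
# plausible heart-rate band (0.7-4.0 Hz, i.e. 42-240 bpm).
signal = green - green.mean()
freqs = np.fft.rfftfreq(signal.size, d=1 / fps)
power = np.abs(np.fft.rfft(signal)) ** 2
band = (freqs >= 0.7) & (freqs <= 4.0)
bpm = 60 * freqs[band][np.argmax(power[band])]
print(round(bpm))  # recovers roughly 72 bpm
```

In practice an RPPG pipeline also needs face tracking, skin-pixel selection, and more robust signal separation than a single color channel, but the frequency-domain step above is what turns pixel values into beats per minute.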
Hearing an infant cry can cause negative emotions, which can impact the way we respond. Researchers Riem and Karreman instructed parents to apply specific emotion regulation strategies in response to infant crying.
Measuring or assessing emotions is not always straightforward and easy. How do we view the nature of emotions in the first place?
In a previous blog titled “How emotions are made”, I outlined how neuroscience research over the past decades has shown that our brain gives meaning to our experiences and sensations through concepts such as emotions.