Measuring experiential, behavioral, and physiological outputs

Posted by Annelies Querner-Verkerk on Fri 26 Sep. 2014 - 2 minute read

In a romantic relationship, it is undoubtedly important to show support when one’s partner shares his or her accomplishments and positive life events. Retelling and reliving such events can evoke certain emotions, but the listener’s response often shapes the storyteller’s attitude as well. To simulate this process, researcher Samuel Monfort and colleagues created a structured social interaction task for couples: 1) a positive event experienced by one partner, 2) disclosure of that event to the other partner, and 3) a clearly communicated capitalization response, ranging from actively destructive to enthusiastic, supportive, and constructive.

Integrating multiple modalities

Monfort and his team set out to capture a full range of emotional responses and therefore measured experiential (subjective feelings), behavioral (facial-motor activity), and physiological (skin conductance) outputs. Behavior and physiology are closely linked; as Patrick Zimmerman and his colleagues explain, more and more researchers now see the benefit of combining behavioral observations with other types of data such as heart rate, blood pressure or eye movements. By integrating multiple modalities, researchers achieve a more complete picture of the phenomena being studied.

Observing couples

A total of 69 couples participated in Monfort’s study. During lab interactions, couples were separated into cubicles with no eye contact or talking. Each experiment consisted of partner A successfully completing a computer task, then sharing this success with partner B. Partner B could choose from four possible responses: active-constructive (e.g., ‘‘Wonderful! You did a great job!’’), passive-constructive (‘‘Ok. Good.’’), active-destructive (‘‘I bet the task wasn’t very hard’’) or passive-destructive (‘‘Not much happening here’’).

Monfort et al. used FaceReader software to analyze each participant's facial expressions. They explain that this software automatically calculates a compound index of facial behavior: valence. Valence is computed as the intensity of the happy expression minus the intensity of whichever negative emotion (sadness, fear, anger, or disgust) is strongest at a given moment. Monfort also measured skin conductance via a pair of electrodes taped to digits II and III of the non-dominant hand.
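The valence index described above can be illustrated with a short sketch. This is not FaceReader's actual implementation; it is a minimal illustration of the stated formula, assuming expression intensities are given on a 0–1 scale and that the function and dictionary names are hypothetical.

```python
def valence(intensities):
    """Valence = happiness intensity minus the intensity of the
    strongest negative emotion (sadness, fear, anger, disgust)
    at a given moment. Missing expressions default to 0."""
    negatives = ("sadness", "fear", "anger", "disgust")
    strongest_negative = max(intensities.get(e, 0.0) for e in negatives)
    return intensities.get("happiness", 0.0) - strongest_negative

# Example frame: mildly happy with some residual sadness and anger.
frame = {"happiness": 0.6, "sadness": 0.2, "anger": 0.1}
score = valence(frame)  # 0.6 minus the strongest negative (0.2)
```

A positive score indicates a predominantly happy expression; a negative score indicates that a negative emotion dominates at that moment.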

Partner's response

It was hypothesized that supportive comments such as “Great job” or “Wonderful” would result in increases in happy facial expressions and positive emotions, decreases in sympathetic nervous system activity (skin conductance response), and fewer negative emotions in both partners. While Monfort et al. did not see any clear direct physiological effects, their results on the subjective and behavioral measurements suggested that supportive capitalization responses did indeed lead to greater positive emotion and less negative emotion (felt emotions) in both partners, and an increase in positive facial expressions on the part of the receiver. 

Effects were expected to be stronger for the person receiving supportive capitalization responses than for the giver, and this proved to be true: emotions were most strongly affected when a person received verbal support. Monfort’s sample included only couples with high relationship satisfaction, so it would be interesting to replicate this experiment with couples reporting low relationship satisfaction, to test whether these findings extend to individuals in different types of relationships.


Project funded by the National Science Center (Poland), grant no. N N106 0167 40, awarded to Lukasz D. Kaczmarek (Adam Mickiewicz University).

Image and movie courtesy of Monfort, S. S., Kaczmarek, L. D., Kashdan, T. B., Drążkowski, D., Kosakowski, M., Guzik, P., Krauze, T., & Gracanin, A.
