NoldusHub
Resources
Because today’s scientists need to gain deeper insights, a major trend in behavioral research has been the adoption of multimodal measurements. Multimodal research is an innovative and interdisciplinary approach to studying human behavior. It combines multiple modalities, ranging from speech, gestures, and eye tracking to physiological measures such as skin conductance and pupil dilation, which can be used to measure cognitive load. To help you out, we've collected useful resources about multimodal research and behavior research labs: from recent blog posts and white papers to videos, and more.
Videos to watch
Why is data integration important?
Only if your data is in sync can you draw the proper conclusions from multiple data streams.
Integrating multiple data streams
In this video we'll show you why a Noldus lab is a great choice when you want to obtain data from multiple data streams.
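To illustrate why synchronization matters, here is a minimal, hypothetical Python sketch that aligns video-coded behavior events with a skin-conductance signal on a shared session clock; the column names, sampling rate, and values are assumptions for illustration, not the actual NoldusHub data format.

# Minimal sketch: aligning two hypothetical data streams on a shared session clock.
# Column names, sampling rate, and values are illustrative assumptions.
import pandas as pd

# A 32 Hz skin-conductance stream and a sparse stream of video-coded events,
# both time-stamped in seconds since the start of the session.
eda = pd.DataFrame({
    "time_s": [i / 32 for i in range(320)],                # 10 seconds of samples
    "eda_microsiemens": [2.0 + 0.01 * i for i in range(320)],
})
events = pd.DataFrame({
    "time_s": [1.5, 4.2, 8.7],
    "behavior": ["smile", "frown", "smile"],
})

# merge_asof pairs each event with the nearest preceding EDA sample,
# so both streams can be interpreted on one common timeline.
aligned = pd.merge_asof(events.sort_values("time_s"),
                        eda.sort_values("time_s"),
                        on="time_s", direction="backward")
print(aligned)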
Research in Educational Psychology at Leipzig University
The newly installed eye tracker enables them to include reading research in their studies.
Blog posts: Multimodal research
Five studies showing the power of multimodal data in behavioral research
Whereas in the past a questionnaire or a video observation might have been sufficient to answer a research question, today’s scientists need to gain deeper insights. Multimodal data can help with that. Find out more in this blog post.
Biometric Research: the study of subconscious processes
Many researchers worldwide study both explicit and implicit behavior. After all, there is more to behavior than what the eye can see, such as subconscious processes related to attention, cognition, emotion, and physiological arousal. Read more.
Neuromarketing research: Innovative research methods and techniques
In the field of neuromarketing, several different technologies are used to measure changes in brain activity, outward expressions such as facial expressions, and changes in one's physiological state. Read more.
Download our free white papers
Blog posts: How emotions are made
Emotions are created in the brain: neuroscience research over the past decades has shown that our brain gives meaning to our experiences and sensations through concepts such as emotions. Here are three blog posts about measuring emotions:
Blog posts: What can you use eye tracking for?
Eye tracking adds substantial power to your lab setup. At its simplest, it records whether someone looked at a given object or not. With more complex analysis, it can provide all sorts of information about a subject's mental state and the tasks they are carrying out. Here are three blog posts with more information about eye tracking and lab setups:
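Before you dive into those posts, here is a minimal, hypothetical Python sketch of the simplest case described above: checking whether recorded gaze points fall inside a rectangular area of interest. The coordinates, sample data, and names are assumptions for illustration, not output from any Noldus tool.

# Minimal sketch: counting gaze samples inside a rectangular area of interest (AOI).
# Coordinates, sample data, and names are illustrative assumptions.

# AOI defined in screen pixels: (left, top, right, bottom)
aoi = (400, 200, 800, 500)

# A few hypothetical gaze samples as (x, y) screen coordinates
gaze_samples = [(420, 250), (950, 100), (610, 480), (300, 650)]

def in_aoi(point, box):
    """Return True if a gaze point lies inside the AOI bounding box."""
    x, y = point
    left, top, right, bottom = box
    return left <= x <= right and top <= y <= bottom

hits = sum(in_aoi(p, aoi) for p in gaze_samples)
print(f"{hits} of {len(gaze_samples)} gaze samples fell on the object")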
Download our free product overviews
Blog posts: What is cognitive neuroscience?
Cognitive neuroscience: The basics
Cognitive neuroscience is the science where the ‘dry’ and the ‘wet’ parts of brain research overlap: dry stands for the cognitive part, while wet stands for the slippery organ itself, the brain, consisting of different lobes. Read more in this blog post.
Cognitive neuroscience: Emotions
Most of our emotional moments are caused by electrical signals in the brain, which lead to an increase in hormones and create an instant feeling of happiness, sadness, or anger. This blog post zooms in on a more specific part of cognitive neuroscience: emotions.
Cognitive neuroscience: Behavior
Researchers typically combine brain and behavioral observations. They can present visuals on a screen, have the subject play a game, or ask them to solve a puzzle. In this way, they typically observe a correlation between behavior and brain activity. Read more.