Integration of observational data and data from other sensors (e.g. physiological signals) is an important topic in behavioral research, especially in human factors (e.g. automotive) and applied psychology (e.g. consumer testing). Expectations have risen in recent years: whereas manual synchronization and integration of data was sufficient not long ago, nowadays substantial automation of this process is expected. The system integration theme follows this trend by focusing on automated data integration, i.e. the development of real-time interfaces for simulators, eye trackers and physiological data acquisition systems in projects such as ADVICE (automotive) and FOCOM (consumer behavior). Another topic in this theme is data analysis, aimed at new methods for interpreting combinations of data modalities. For instance, we are working on real-time assessment of mental workload through pattern recognition in EEG signals.
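As a simplified illustration of the kind of pattern recognition involved, consider one widely used workload index: the ratio of theta-band (4-8 Hz) to alpha-band (8-13 Hz) EEG power, computed over short signal windows. The sketch below uses a naive DFT on a synthetic one-second window; the sampling rate, band limits and index are assumptions for this example, not the project's actual algorithm.

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Power of a signal in the band [f_lo, f_hi) Hz, via a naive DFT."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq < f_hi:
            coef = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                       for t in range(n))
            power += abs(coef) ** 2
    return power

fs = 128  # Hz, assumed EEG sampling rate
# Synthetic 1-second window: strong 6 Hz (theta) plus weak 10 Hz (alpha)
eeg = [2.0 * math.sin(2 * math.pi * 6 * t / fs)
       + 0.5 * math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]

workload_index = band_power(eeg, fs, 4, 8) / band_power(eeg, fs, 8, 13)
print(round(workload_index, 1))  # theta-dominant window -> prints 16.0
```

In a real-time setting this computation would run repeatedly on a sliding window of the incoming EEG stream, with an FFT replacing the naive DFT for speed.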
Traditionally, scoring the behaviors of test participants or patients was done manually using pen and paper. We have come a long way since then. Nowadays, overt behavioral responses can be coded digitally with The Observer XT, while external signals, originating either from the human body (e.g. ECG, GSR, EMG, EEG) or from a system (e.g. car simulator data), are simultaneously acquired using dedicated data acquisition systems. The integration of the video streams, observational events and external/physiological data is vital for interpreting a person’s response to test conditions or a particular treatment. From the researcher’s perspective, some important requirements for integrating observational and external data are:
- The observational and external data should be perfectly synchronized.
- Integration and synchronization of observational and external data should be automated as much as possible.
- The system should offer analysis of combinations of data modalities, e.g. behavioral events during a particular physiological state, or vice versa.
- Data analysis and report generation should also be automated, making the results available to the researcher as soon as possible after the test.
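The first and third requirements can be sketched in miniature: once both streams carry timestamps on a common clock, an externally acquired signal can be read out at the exact moments behavioral events were coded, by linear interpolation between samples. The data values and function name below are made up for illustration; this is not The Observer XT's internal mechanism.

```python
def resample_at(times, values, query_times):
    """Linearly interpolate a timestamped signal at the given query times.

    `times` must be strictly increasing; query times outside the sampled
    range are clamped to the first/last sample.
    """
    out = []
    for t in query_times:
        if t <= times[0]:
            out.append(values[0])
        elif t >= times[-1]:
            out.append(values[-1])
        else:
            # find the pair of samples bracketing t
            i = 1
            while times[i] < t:
                i += 1
            t0, t1 = times[i - 1], times[i]
            v0, v1 = values[i - 1], values[i]
            frac = (t - t0) / (t1 - t0)
            out.append(v0 + frac * (v1 - v0))
    return out

# Example: GSR sampled at 2 Hz, behavioral events coded at arbitrary times
gsr_t = [0.0, 0.5, 1.0, 1.5, 2.0]   # sample timestamps (s)
gsr_v = [4.0, 4.2, 4.8, 5.0, 4.6]   # conductance (microsiemens)
event_t = [0.25, 1.25, 3.0]         # event timestamps (s)
print(resample_at(gsr_t, gsr_v, event_t))  # approximately [4.1, 4.9, 4.6]
```

The same idea generalizes to the reverse query in the third requirement: selecting behavioral events that fall inside time intervals where a physiological signal meets some criterion.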
What research projects such as ADVICE, FOCOM and NeuroSIPE (see the pictures below) have in common is that observations, video streams and a large amount of external data need to be synchronized, integrated and analyzed efficiently, with minimal user involvement. To this end, we are developing methods and tools for real-time integration and data analysis, which will result in new integrated systems and solutions for research in various fields.
In the DRIVOBS project, driver behavior in a car simulator (Cruden) is observed using video, eye tracking, physiological measurements and system identification in complex driving scenarios. Simulator events and the signals from camera, sensors and eye tracker are synchronized in The Observer XT for integrated analysis. From this we can learn how drivers use vision, motion and other information to control their vehicles. The ADVICE project goes one step further: behavior and physiology are analyzed in real time and translated into feedback to the driver. The intended result is a suite of automotive simulation products with integrated driver observation and feedback, to aid the development of vehicle dynamics control systems, active safety systems, infotainment systems and human-machine interfaces.
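The real-time analysis-to-feedback step can be reduced to a very small closed loop: each incoming workload estimate is mapped to a feedback decision for the simulator. The threshold, labels and function name below are hypothetical, chosen only to make the loop concrete; they do not come from ADVICE.

```python
WORKLOAD_THRESHOLD = 1.5  # assumed index above which the driver counts as overloaded

def feedback_for(workload_index):
    """Map a real-time workload estimate to a simulator feedback decision."""
    if workload_index > WORKLOAD_THRESHOLD:
        return "suppress-infotainment"  # reduce non-critical distractions
    return "normal"

# Simulated stream of workload estimates arriving in real time
stream = [0.8, 1.2, 1.9, 2.4, 1.1]
decisions = [feedback_for(w) for w in stream]
print(decisions)
# -> ['normal', 'normal', 'suppress-infotainment', 'suppress-infotainment', 'normal']
```

In practice the decision logic would be richer (hysteresis, multiple feedback channels), but the structure, estimate in, feedback out, on every update, is the same.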
In the FOCOM project, we are working on the design of a “virtual shop simulator”, which combines an immersive virtual reality environment with measurement of eye gaze, physiology (galvanic skin response, heart rate) and brain responses (EEG and NIRS) in a consumer-friendly test environment. The Observer XT will serve as the data integration and analysis platform.