MultiCom is a multidisciplinary team involved in the design and evaluation of interactive systems. Its members include researchers, computer scientists, a statistician, and ergonomists. This range of skills allows MultiCom to take part in research projects and industrial contracts concerning new technologies.
Our methodology is user-centered and is supported by a dedicated experimentation platform for observing and capturing the behavior of users interacting with specific electronic devices. The platform features a large experiment room (100 m²) well suited to simulating small-scale environments (e.g. a shop, classroom, or house). Two one-way mirrors, as well as cameras and microphones embedded in the experiment room ceiling, enable evaluators to observe and record all relevant user activities from the observation room. A second, smaller experiment room is dedicated to studying, with an eye tracker, the gaze strategies of users interacting with a web site or any other application. The modular, flexible architecture of the platform has been designed to support experiments in varied contexts such as home automation, e-learning, intelligent environments with RFID technology, and museums.
Eye tracking studies are performed in a dedicated room. Typical applications are software and web site evaluation. MultiCom uses eye tracking both in contracts for industrial clients (e.g. air traffic control) and in research projects (e.g. facial expression recognition).
In this Wizard-of-Oz setup, a user writes information on a sheet of paper while, in real time, a "wizard" writes the same information on the PDA (the wizard takes control of the PDA over the network and observes the scene via a camera), so that the user believes the electronic pen really exists. This technique is used to test the future functionality of an interactive system and avoids writing complex software if the functionality in question is not accepted by users.
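The core of such a setup is a simple relay: the wizard retypes what the user wrote, and the text is pushed over the network to the device under test. The sketch below is purely illustrative (it is not MultiCom's actual software; the loopback address, port assignment, and function names are assumptions), using a local socket to stand in for the PDA link:

```python
# Minimal Wizard-of-Oz relay sketch (hypothetical, not MultiCom's real tool):
# a stub "device" receives whatever text the "wizard" sends over the network.
import socket
import threading

def run_device(received, ready, port_box):
    """Stands in for the PDA: records whatever the wizard sends it."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
    port_box.append(srv.getsockname()[1])
    srv.listen(1)
    ready.set()                          # signal that the device is listening
    conn, _ = srv.accept()
    while (data := conn.recv(1024)):     # empty bytes => wizard disconnected
        received.append(data.decode())
    conn.close()
    srv.close()

def wizard_send(port, lines):
    """The wizard retypes, line by line, what the user wrote on paper."""
    with socket.create_connection(("127.0.0.1", port)) as s:
        for line in lines:
            s.sendall((line + "\n").encode())

received, ready, port_box = [], threading.Event(), []
t = threading.Thread(target=run_device, args=(received, ready, port_box))
t.start()
ready.wait()
wizard_send(port_box[0], ["meeting at 10 am", "buy train ticket"])
t.join()
print("".join(received))
```

In a real deployment the stub would be replaced by software on the device that renders the incoming text as if it had been recognized from the electronic pen.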
Another feature of our lab is the use of participatory design, whereby all the stakeholders in a future interactive product or service (designers, engineers, ergonomists, market researchers, and end users) are involved in the design cycle. MultiCom has designed specific tools to instrument these sessions.
For manual annotation of video streams after an experiment, we use The Observer. For instance, in the ACE project ("Agent Conversationnel Expressif", in English: Expressive Conversational Agent), we annotated the gestures, postures, dialogue phases, and facial expressions of an actress simulating a conversational agent on the web, whose eventual purpose is to help visitors navigate a commercial site. The goal of the annotation was to identify which sequences of postures, gestures, and facial expressions are associated with which dialogue phases (e.g. give information, ask a question, offer a choice, give advice, wait for an answer), in order to provide examples to the graphic designer in charge of animating the future web agent.
For this purpose, different Video Play Lists (VPL) were extracted with The Observer, e.g. all sequences where the actress was waiting for the visitor's answer, or giving information about a product. These video sequences were intended to help the graphic designer give the web agent convincingly human behavior.
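Conceptually, a play list is just the set of annotated time segments matching a given dialogue phase. The sketch below illustrates that idea only; The Observer is a commercial Noldus tool, and this code does not reproduce its file format or API (the field names and the sample annotations are invented for illustration):

```python
# Illustrative sketch of play-list extraction from manual video annotations.
from dataclasses import dataclass

@dataclass
class Annotation:
    start: float          # seconds from the start of the video
    end: float
    dialogue_phase: str   # e.g. "give information", "wait for answer"
    gesture: str
    facial_expression: str

def video_play_list(annotations, phase):
    """Return the (start, end) segments annotated with the given phase."""
    return [(a.start, a.end) for a in annotations if a.dialogue_phase == phase]

# Invented sample log, standing in for an exported annotation file.
log = [
    Annotation(0.0, 3.2, "give information", "open palms", "smile"),
    Annotation(3.2, 5.0, "ask question", "head tilt", "raised brows"),
    Annotation(5.0, 9.1, "wait for answer", "hands folded", "neutral"),
    Annotation(9.1, 12.4, "give information", "pointing", "smile"),
]

print(video_play_list(log, "give information"))  # → [(0.0, 3.2), (9.1, 12.4)]
```

Each returned segment can then be cued in the video player, which is what makes the extracted sequences directly usable by the graphic designer.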
For another project, manual video annotation with The Observer was performed to analyze four focus groups on the usage of new technologies. Each focus group gathered people with a specific sociological profile, such as new-technology enthusiasts or detractors. The goal of the observation was to understand the social interaction between the participants according to their profiles.
Automatic real-time annotation
We use a variety of techniques for automated real-time data collection. The type of data collected differs from one experiment to another. For example, we have started using the Noldus software uLog Pro to automatically collect keypresses, mouse movements, and any text typed on the keyboard. This allows us to collect information easily during human-computer interactions. In other cases, during usability evaluations of mobile devices "in the wild", we record the user's position and actions on the device, system feedback, and the device's localization in the building. All of these data are monitored and stored in the observation room thanks to wireless technologies (HF or WiFi). This digital data collection relies on our specialized software bus "Usybus", which enables real-time annotation of the events in The Observer. Results of this work were presented at CHI 2006 by Francis Jambon.
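The general pattern is publish/subscribe: collectors publish timestamped interaction events on a bus, and the observation room subscribes to the topics it wants to log or annotate. The actual Usybus protocol is not described here, so the sketch below is only a hypothetical in-process illustration of that pattern (class, topic, and field names are all assumptions):

```python
# Hypothetical publish/subscribe sketch in the spirit of a software bus for
# real-time event collection (not the actual Usybus implementation).
import time
from collections import defaultdict

class EventBus:
    """Tiny in-process bus that timestamps each event before delivery."""
    def __init__(self, clock=time.monotonic):
        self._subs = defaultdict(list)   # topic -> list of handlers
        self._clock = clock

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, **payload):
        event = {"topic": topic, "t": self._clock(), **payload}
        for handler in self._subs[topic]:
            handler(event)
        return event

# An observation-room logger subscribes to the event types it wants to record.
log = []
bus = EventBus()
bus.subscribe("keypress", log.append)
bus.subscribe("position", log.append)

# Collectors on the device side publish events as the user interacts.
bus.publish("keypress", key="a")
bus.publish("position", room="hall", x=2.5, y=1.0)
print([e["topic"] for e in log])  # → ['keypress', 'position']
```

In a distributed setting the same pattern runs over the network (e.g. the wireless links mentioned above), with the logger feeding the timestamped events into the annotation tool.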
MultiCom - CLIPS IMAG Laboratory
220, rue de la chimie
F-38041 Grenoble cedex