To develop context-aware systems that help people with eating-related disorders such as obesity, eating behavior must first be recognized automatically. Progress in automatic eating behavior recognition requires data for training state-of-the-art machine learning algorithms and for evaluating their accuracy.
The iEatSet dataset contains recordings of 12 participants, each recorded five times over 5 days while having different meals. Ground-truth labels are provided for testing and validating future eating-behavior recognition algorithms, as well as other activity-recognition algorithms. The dataset contains:
- Compressed and synchronized RGB videos of the recordings from the IP cameras and the Kinect.
- The calibration data of each camera including the intrinsic and extrinsic parameters.
- 13-bit depth data from the Kinect.
- The raw, calibrated, and synchronized 48-bit data from all four IMUs.
- The labelled annotations.
- The timestamps generated by all the sensors.
- Accompanying software to read the data.
All data streams are time-synchronized.
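As a sketch of how the per-sensor timestamps could be used to align the synchronized streams, the snippet below pairs each video frame with the nearest IMU sample by timestamp. The sampling rates, timestamp format, and function names are illustrative assumptions, not the dataset's actual layout or API.

```python
import bisect

def nearest_indices(ref_ts, query_ts):
    """For each query timestamp, return the index of the closest
    timestamp in ref_ts (ref_ts must be sorted ascending)."""
    out = []
    for t in query_ts:
        i = bisect.bisect_left(ref_ts, t)
        if i == 0:
            out.append(0)
        elif i == len(ref_ts) or t - ref_ts[i - 1] <= ref_ts[i] - t:
            out.append(i - 1)
        else:
            out.append(i)
    return out

# Hypothetical timestamps in seconds: a 100 Hz IMU stream and
# a 30 fps video stream (rates assumed for illustration only).
imu_ts = [k / 100.0 for k in range(1000)]   # 0.00 .. 9.99 s
frame_ts = [k / 30.0 for k in range(300)]   # 0.000 .. 9.967 s

# For each video frame, the index of the nearest IMU sample.
matches = nearest_indices(imu_ts, frame_ts)
```

With both streams on a common clock, this kind of nearest-timestamp lookup is enough to attach an IMU reading to every video frame; interpolation between the two neighboring samples would be a natural refinement.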
Read more about the iEatSet dataset in these articles:
- Kakra, V.D.; Aa, van der, N.P.; Noldus, L.P.J.J.; Amft, O. Recognising eating behaviour in restaurants.
- Kakra, V.D.; Aa, van der, N.P.; Noldus, L.P.J.J.; Amft, O. (2014). A multimodal benchmark tool for automated eating behaviour recognition. Proceedings of Measuring Behavior 2014.