The Multimodal Multisensory Dataset in support of Musculoskeletal Disorder (MMD-MSD) research

The Multimodal Multisensory Dataset in support of Musculoskeletal Disorder (MMD-MSD) research consists of recordings of 100 persons made with a video recorder, a camera, and wrist-worn sensors that capture physiological signals (PPG, EDA, skin temperature) as well as motion (a three-axis accelerometer). MMD-MSD opens new opportunities for modelling the body stance (sitting posture and movements), physiological state (stress level, attention, emotional arousal and valence), and performance (success rate on the Stroop test) of people working with a computer.
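As an illustration of how such wrist-sensor streams might be prepared for modelling, the sketch below computes simple windowed statistics over a one-dimensional physiological signal. The 4 Hz sampling rate and 10-second windows are illustrative assumptions, not specifications of the MMD-MSD recordings.

```python
# Hedged sketch: windowed feature extraction from a 1-D physiological
# signal (e.g. EDA). The sampling rate (fs=4 Hz) and window length
# (10 s) are illustrative assumptions, not MMD-MSD specifications.
from statistics import mean, stdev

def windowed_features(signal, fs=4, window_s=10):
    """Split `signal` into non-overlapping windows and return a list
    of (mean, standard deviation) tuples, one per complete window."""
    step = fs * window_s
    feats = []
    for start in range(0, len(signal) - step + 1, step):
        window = signal[start:start + step]
        feats.append((mean(window), stdev(window)))
    return feats
```

Per-window statistics of this kind are a common starting point for stress- and arousal-level classifiers built on EDA or skin-temperature signals.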

A subset of the MMD-MSD dataset (5 participants) is available here. The entire dataset contains the recordings of 100 volunteers and is available upon request. Please contact the Head of the Dept. of Communication Technology at the Technical University of Varna.

A detailed description of the dataset is available in: Markova, V.; Ganchev, T.; Filkova, S.; Markov, M. "MMD-MSD: A Multimodal Multisensory Dataset in Support of Research and Technology Development for Musculoskeletal Disorders." Algorithms 2024, 17(5), 187. https://www.mdpi.com/1999-4893/17/5/187, https://doi.org/10.20944/preprints202404.0508.v1

Emotional and Cognitive State Tracking Dataset (ETICS)

The ETICS dataset was collected from 21 participants. It consists of three .csv files for each individualization level in the second part of the test and, correspondingly, three files for the third part (with task adaptation). These files contain the following features:

  • Timestamp / Stimuli / Answer (user response) / Performance / Speed / Reaction time – from the application manager;
  • Timestamp / Arousal / Valence / Attention / Angry / Disgust / Fear / Happy / Neutral / Sad / Surprise – from the MorphCast platform;
  • A fusion of the above two streams, matched by timestamp at the moment of the user response.
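The timestamp-based fusion described above can be sketched as a nearest-in-time match between the two record streams. The field names (`timestamp`, `answer`, `arousal`) are illustrative assumptions, not the exact column names used in the ETICS .csv files.

```python
# Hedged sketch: fuse application-manager responses with MorphCast
# records by attaching, to each response, the MorphCast record whose
# timestamp is closest in time. Field names are illustrative
# assumptions, not the actual ETICS column names.
def fuse_by_timestamp(responses, morphcast):
    """For each response dict, merge in the nearest-in-time MorphCast
    dict, prefixing its keys with 'mc_' to avoid collisions."""
    fused = []
    for resp in responses:
        nearest = min(morphcast,
                      key=lambda m: abs(m["timestamp"] - resp["timestamp"]))
        fused.append({**resp,
                      **{f"mc_{k}": v for k, v in nearest.items()}})
    return fused
```

A production pipeline would likely also enforce a maximum time gap and handle empty streams; this sketch only shows the core matching rule.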

In addition, synchronized high-resolution video (with audio) from the PC camera, capturing the user's face and facial expressions, and a screen-capture video from the application (the stimuli of the cognitive test) are recorded in .mp4 format.

The ETICS dataset is available for download here.

Please cite the ETICS dataset as follows:

M. Markov, Y. Kalinin, V. Markova, T. Ganchev, "Towards Implementation of Emotional Intelligence in Human–Machine Collaborative Systems," Special Issue "Application of Artificial Intelligence in the New Era of Communication Networks," Electronics 2023, 12(18), 3852. https://doi.org/10.3390/electronics12183852

Cognitive Load, Affect and Stress dataset (CLAS)

The CLAS (Cognitive Load, Affect and Stress) dataset was conceived as a freely accessible repository purposely developed to support research on the automated assessment of certain states of mind and of a person's emotional condition. It is intended to support RTD activities aimed at developing intelligent human–computer interaction (HCI) interfaces that incorporate functionalities for the automated recognition of human emotions, the automated detection of stress conditions, and the automated assessment of the degree of concentration, cognitive load, and momentary cognitive capacity, and that can account for some personality traits related to the ability to quickly solve logical and mathematical problems under strict time constraints.

The CLAS dataset is available through the IEEE DataPort repository: link

Please cite this dataset as follows:

V. Markova, T. Ganchev and K. Kalinkov, "CLAS: A Database for Cognitive Load, Affect and Stress Recognition," 2019 International Conference on Biomedical Innovations and Applications (BIA), 2019, pp. 1-4, doi: https://doi.org/10.1109/BIA48344.2019.8967457.

Motion-Capture Dataset in support of Musculoskeletal Disorders (MCD-MSD) research

The Motion-Capture Dataset in support of Musculoskeletal Disorders research (MCD-MSD) can be downloaded here.

The MCD-MSD dataset consists of motion capture data, images, videos, and models, created in an experimental setup that uses the Perception Neuron motion-capture system and the Axis Neuron software tool.  Specifically, in the current research, we consider three main recording scenarios:

  • SK – regular working posture when using a standard keyboard;
  • EK – regular working posture when using an ergonomic keyboard;
  • EKC – correct working posture when using an ergonomic keyboard.
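When working with recordings grouped by these scenario codes, it can be convenient to map the codes to readable labels. The filename convention in the sketch below (scenario code as the filename prefix, e.g. "EKC_subj03.bvh") is an illustrative assumption, not part of the MCD-MSD documentation.

```python
# Hedged sketch: mapping MCD-MSD scenario codes to labels. The
# filename convention (code before the first underscore) is an
# illustrative assumption, not a documented property of the dataset.
SCENARIOS = {
    "SK": "regular posture, standard keyboard",
    "EK": "regular posture, ergonomic keyboard",
    "EKC": "correct posture, ergonomic keyboard",
}

def scenario_label(filename):
    """Return the scenario description for a recording whose filename
    starts with the scenario code followed by an underscore."""
    code = filename.split("_", 1)[0]
    return SCENARIOS.get(code, "unknown scenario")
```

Such a mapping makes it straightforward to turn a directory listing into labelled examples for posture-classification experiments like the one cited below.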

Please cite the MCD-MSD dataset as follows:

Feradov, F.; Markova, V.; Ganchev, T., “Automated Detection of Improper Sitting Postures in Computer Users Based on Motion Capture Sensors”, Computers 2022, 11, 116. https://doi.org/10.3390/computers11070116