Emotional and Cognitive State Tracking Dataset (ETICS)

The ETICS dataset was collected from 21 participants. It consists of three .csv files for each individualization level from the second part of the test and, correspondingly, three files for the third part (with task adaptation). These files contain the following features:

  • Timestamp / Stimuli / Answer (user response) / Performance / Speed / Reaction time – from the application manager;
  • Timestamp / Arousal / Valence / Attention / Angry / Disgust / Fear / Happy / Neutral / Sad / Surprise – from the MorphCast platform;
  • Data fusion of the above two streams, matched by timestamp at the moment of the user response.
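The timestamp-based fusion of the two streams can be sketched as follows. This is a minimal illustration, not the actual ETICS processing code; the field names (`timestamp`, `answer`, `valence`, etc.) are assumptions for the example.

```python
# Hypothetical sketch of timestamp-based fusion: for each user response,
# attach the emotion record whose timestamp is closest to the response.
# Field names are assumptions; the actual ETICS files may differ.
import bisect

def fuse(responses, emotions):
    """Match each response to the nearest-in-time emotion record."""
    emo_ts = [e["timestamp"] for e in emotions]  # assumed sorted ascending
    fused = []
    for r in responses:
        i = bisect.bisect_left(emo_ts, r["timestamp"])
        # consider the two neighbouring emotion records and keep the nearer one
        candidates = [j for j in (i - 1, i) if 0 <= j < len(emotions)]
        j = min(candidates, key=lambda k: abs(emo_ts[k] - r["timestamp"]))
        # response fields take precedence on name clashes (e.g. timestamp)
        fused.append({**emotions[j], **r})
    return fused

responses = [{"timestamp": 10.2, "answer": "B", "reaction_time": 0.8}]
emotions = [{"timestamp": 10.0, "valence": 0.4, "arousal": 0.6},
            {"timestamp": 10.5, "valence": 0.3, "arousal": 0.7}]
print(fuse(responses, emotions))
```

Here the response at t = 10.2 is paired with the emotion record at t = 10.0, the nearer of its two neighbours.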

In addition, synchronized high-resolution video (with audio) from the PC camera, capturing the user's face and facial expressions, and a screen-capture video of the application (the stimuli of the cognitive test) are recorded in .mp4 format.

ETICS was used in the manuscript M. Markov, Y. Kalinin, V. Markova, T. Ganchev, "Towards implementation of emotional intelligence in human-machine collaborative systems," submitted to Electronics, Special Issue "Application of Artificial Intelligence in the New Era of Communication Networks," 2023 (under review). Download

Cognitive Load, Affect and Stress dataset (CLAS)

The CLAS (Cognitive Load, Affect and Stress) dataset was conceived as a freely accessible repository purposely developed to support research on the automated assessment of certain states of mind and of a person's emotional condition. This resource is intended to support RTD activities aimed at the development of intelligent human-computer interaction (HCI) interfaces with functionalities for the automated recognition of human emotions, the automated detection of stress conditions, and the automated assessment of the degree of concentration, cognitive load, and momentary cognitive capacity. Such interfaces can also account for some personality traits related to the ability to quickly solve logical and mathematical problems under strict time constraints.

The CLAS dataset is available through the IEEEDataPort repository: link

Please cite this dataset as follows:

V. Markova, T. Ganchev and K. Kalinkov, "CLAS: A Database for Cognitive Load, Affect and Stress Recognition," 2019 International Conference on Biomedical Innovations and Applications (BIA), 2019, pp. 1-4, doi: https://doi.org/10.1109/BIA48344.2019.8967457.

Motion-Capture Dataset in support of Musculoskeletal Disorders research

The Motion-Capture Dataset in support of Musculoskeletal Disorders research (MCD-MSD) can be downloaded here.

The MCD-MSD dataset consists of motion-capture data, images, videos, and models created in an experimental setup that uses the Perception Neuron motion-capture system and the Axis Neuron software tool. Specifically, in the current research, we consider three main recording scenarios:

  • SK – regular working posture when using a standard keyboard;
  • EK – regular working posture when using an ergonomic keyboard;
  • EKC – correct working posture when using an ergonomic keyboard.
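The scenario codes above can serve as class labels when the recordings are used for posture classification. Below is a minimal, hypothetical sketch of mapping recordings to their scenario; the file-naming convention (scenario code as filename prefix) and file extension are assumptions for the example, not the documented MCD-MSD layout.

```python
# Hypothetical sketch: derive a scenario label from a recording's filename.
# The naming convention (e.g. "EKC_subject03.bvh") is an assumption.
SCENARIO_LABELS = {
    "SK": "regular_standard_keyboard",
    "EK": "regular_ergonomic_keyboard",
    "EKC": "correct_ergonomic_keyboard",
}

def label_for(filename):
    """Return the scenario label for a recording; longest code is
    checked first so 'EKC' is not mistaken for 'EK'."""
    for code in sorted(SCENARIO_LABELS, key=len, reverse=True):
        if filename.startswith(code):
            return SCENARIO_LABELS[code]
    raise ValueError(f"unknown scenario in {filename!r}")

print(label_for("EKC_subject03.bvh"))  # correct_ergonomic_keyboard
```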

Please cite the MCD-MSD dataset as follows:

Feradov, F.; Markova, V.; Ganchev, T., "Automated Detection of Improper Sitting Postures in Computer Users Based on Motion Capture Sensors," Computers 2022, 11, 116. https://doi.org/10.3390/computers11070116