Advances in Emotion Recognition and Modeling

The field of emotion recognition and modeling is moving towards a more nuanced understanding of human emotions, one that accounts for individual differences, contextual factors, and temporal dynamics. Recent studies highlight the value of integrating physiological, psychological, and behavioral signals to improve recognition accuracy, and multimodal datasets combining eye-tracking, EEG, and personality assessments have been shown to sharpen emotion modeling (a minimal fusion sketch follows the list below). Research has also turned to the temporal dynamics of emotional processes, including facial mimicry and emotional impulsivity. Together, these advances point towards more accurate, adaptive, and individualized emotion recognition systems. Noteworthy papers include:

  • A study on modelling emotions in face-to-face settings, which demonstrated the benefit of combining eye-tracking data, temporal dynamics, and personality traits to enhance emotion detection.
  • The introduction of a novel multimodal emotion recognition dataset, EVA-MED, which captures a broad spectrum of affective responses while accounting for individual differences.
  • A study exploring the temporal dynamics of facial mimicry in emotion processing using action units, which revealed significant variation across emotions and correlations with personality traits.
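
The multimodal-fusion idea referenced above can be illustrated with a minimal sketch. This is not the method of any of the cited papers: the feature names, dimensions, synthetic data, and choice of a simple feature-level fusion with a logistic-regression classifier are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Illustrative assumption: synthetic features standing in for real recordings.
rng = np.random.default_rng(0)
n = 500
eye = rng.normal(size=(n, 8))          # e.g. fixation/saccade statistics per trial
eeg = rng.normal(size=(n, 32))         # e.g. band-power features per channel
personality = rng.normal(size=(n, 5))  # e.g. Big Five trait scores
labels = rng.integers(0, 2, size=n)    # placeholder binary valence label

# Feature-level (early) fusion: concatenate modalities into one vector per trial.
X = np.hstack([eye, eeg, personality])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, random_state=0
)

# Standardize, then fit a simple classifier on the fused features.
scaler = StandardScaler().fit(X_train)
clf = LogisticRegression(max_iter=1000).fit(scaler.transform(X_train), y_train)
print("held-out accuracy:", clf.score(scaler.transform(X_test), y_test))
```

With real data, each modality would first be reduced to trial-level features (and the chance-level accuracy on the synthetic labels above simply confirms the pipeline runs); the papers listed under Sources additionally model temporal dynamics and individual differences rather than relying on plain concatenation.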

Sources

Modelling Emotions in Face-to-Face Setting: The Interplay of Eye-Tracking, Personality, and Temporal Dynamics

EVA-MED: An Enhanced Valence-Arousal Multimodal Emotion Dataset for Emotion Recognition

Exploring the Temporal Dynamics of Facial Mimicry in Emotion Processing Using Action Units

Core Components of Emotional Impulsivity: A Mouse-Cursor Tracking Study

Reading Decisions from Gaze Direction during Graphics Turing Test of Gait Animation

CMED: A Child Micro-Expression Dataset
