Recognising Stress Semantics, Speech and Gestures
This paper investigates how speech and gestures convey stress, and how they can be used for automatic stress recognition. As a first step, we examine how humans use speech and gestures to convey stress. In particular, for both speech and gestures, we distinguish between stress conveyed by the intended semantic message (e.g. spoken words for speech, symbolic meaning for gestures) and stress conveyed by the modulation of either speech or gestures (e.g. intonation for speech, speed and rhythm for gestures).
WP-02
Year of Publication:
2015
@article{sensecare:505,
  title = {Recognising Stress Semantics, Speech and Gestures},
  year = {2015},
}
Workpackages
WP2 Affective Computing (AC) & Machine Learning
Related Articles
ACM Physiological Computing and BCI
Erin Treacy Solovey, Daniel Afergan, Evan M. Peck, Samuel W. Hincks, Robert J. K. Jacob
Multiple Arousal Theory and EDA Asymmetry
Rosalind W. Picard, Szymon Fedor, Yadid Ayzenberg
MIT Media Lab AC Current and Past Project
MIT researchers led by Rosalind Picard
Dynamic Emotion Wheel: Emotion Self-Report and Awareness Tool
Swiss Centre for Affective Sciences
The Center for the Study of Emotion and Attention VITAL
Dr. Peter J. Lang, Ph.D., University of Florida
Automatic Detection of Engagement Kinect
Hamed Monkaresi, Nigel Bosch, Rafael Calvo, Sidney D'Mello
Auto Detection of Chronic Pain Expressions
Min S. H. Aung, Sebastian Kaltwang, Bernardino Romera-Paredes, Brais Martinez, Aneesha Singh, Matteo Cella, Michel Valstar, Hongying Meng, Andrew Kemp, Moshen Shafizadeh, Aaron C. Elkins, Natalie Kanakam, Amschel de Rothschild de, Nick Tyler, Paul J. Watson, Amanda C. de C. Williams de, Maja Pantic, Nadia Bianchi-Berthouze