Nonlinguistic vocalizations from online amateur videos for emotion research: A validated corpus

Author

Summary, in English

This study introduces a corpus of 260 naturalistic human nonlinguistic vocalizations representing nine emotions: amusement, anger, disgust, effort, fear, joy, pain, pleasure, and sadness. The recognition accuracy in a rating task varied greatly per emotion, from <40% for joy and pain, to >70% for amusement, pleasure, fear, and sadness. In contrast, the raters’ linguistic–cultural group had no effect on recognition accuracy: The predominantly English-language corpus was classified with similar accuracies by participants from Brazil, Russia, Sweden, and the UK/USA. Supervised random forest models classified the sounds as accurately as the human raters. The best acoustic predictors of emotion were pitch, harmonicity, and the spacing and regularity of syllables. This corpus of ecologically valid emotional vocalizations can be filtered to include only sounds with high recognition rates, in order to study reactions to emotional stimuli of known perceptual types (reception side), or can be used in its entirety to study the association between affective states and vocal expressions (production side).
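The abstract reports that supervised random forest models, trained on summary acoustic features such as pitch, harmonicity, and syllable spacing and regularity, matched human recognition accuracy. The sketch below is not the authors' code; it only illustrates, with hypothetical feature names and placeholder data, how such a classification and the inspection of acoustic predictors might be set up.

# Minimal sketch (assumed workflow, not the published analysis): random forest
# classification of emotion labels from per-vocalization acoustic features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
emotions = ["amusement", "anger", "disgust", "effort", "fear",
            "joy", "pain", "pleasure", "sadness"]

# Placeholder data standing in for acoustic measurements of the 260 sounds
# (e.g., mean pitch, harmonicity, syllable spacing, syllable regularity).
n_sounds = 260
feature_names = ["pitch", "harmonicity", "syllable_spacing", "syllable_regularity"]
X = rng.normal(size=(n_sounds, len(feature_names)))   # hypothetical features
y = rng.choice(emotions, size=n_sounds)                # hypothetical labels

clf = RandomForestClassifier(n_estimators=500, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)              # cross-validated accuracy
print(f"Mean recognition accuracy: {scores.mean():.2f}")

# Feature importances suggest which acoustic predictors drive classification.
clf.fit(X, y)
print(dict(zip(feature_names, clf.feature_importances_.round(3))))

With real acoustic measurements in place of the random placeholders, the cross-validated accuracy could be compared directly against the human raters' per-emotion recognition rates reported in the study.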

Department/s

Publishing year

2017-04-29

Language

English

Pages

758-771

Publication/Series

Behavior Research Methods

Volume

49

Issue

2

Document type

Journal article

Publisher

Springer

Topic

  • Psychology (excluding Applied Psychology)

Keywords

  • Emotion
  • Nonlinguistic vocalizations
  • Naturalistic vocalizations
  • Acoustic analysis

Status

Published

Research group

  • LUCS Cognitive Zoology Group

ISBN/ISSN/Other

  • ISSN: 1554-3528