TY - CONF
T1 - Correlations Between Musical Descriptors and Emotions Recognized in Beethoven’s Eroica
T2 - Ninth Triennial Conference of the European Society for the Cognitive Sciences of Music (ESCOM)
Y1 - 2015
A1 - Erika S. Trent
A1 - Emilia Gómez
KW - classical music
KW - emotion
KW - music description
KW - music information retrieval
KW - personalization
AB -

Investigations of music and emotion have identified broad musical elements that influence the emotions recognized by listeners, such as timbre, rhythm, melody, and harmony. However, few studies have examined the correlation between quantifiable musical descriptors and their associated emotions, and fewer still have focused on how listeners’ demographic and musical backgrounds influence the emotions they recognize. In this preliminary study, participants rated how strongly they recognized the six GEMS emotions (transcendence, peacefulness, power, joyful activation, tension, and sadness) while listening to excerpts from Beethoven’s Eroica. Musical descriptors (loudness, brightness, noisiness, tempo/rhythm, harmony, and timbre) were also extracted from each excerpt. Results indicate significant correlations between emotional ratings and musical descriptors, notably positive correlations between key clarity and peacefulness/joyful activation ratings, and negative correlations between key clarity and tension/sadness ratings. Key clarity refers to the key strength associated with the best key candidate; as such, these results suggest that listeners recognize positive emotions in music with a straightforward key, whereas they recognize negative emotions in music with a less clear sense of key. The second part of the study computed correlations between demographics and emotional ratings to determine whether people of similar demographic and musical backgrounds recognized similar emotions. The results indicate that naïve listeners (i.e., younger subjects and subjects with less frequent exposure to classical music) experienced more similar emotions from the same musical excerpts than did other subjects. Our findings contribute to a quantitative understanding of how musical descriptors and listeners’ backgrounds correlate with the emotions recognized by listeners.
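As a rough illustration of the kind of analysis described in this abstract, the sketch below computes Pearson correlations between per-excerpt descriptor values and mean GEMS emotion ratings. The file name, column names, and significance threshold are assumptions for illustration only, not the authors’ actual data or pipeline.

# Hypothetical sketch: correlate per-excerpt musical descriptors with mean
# GEMS emotion ratings. The CSV layout and column names are assumed.
import pandas as pd
from scipy.stats import pearsonr

descriptors = ["loudness", "brightness", "noisiness", "key_clarity"]
emotions = ["transcendence", "peacefulness", "power",
            "joyful_activation", "tension", "sadness"]

# One row per excerpt: extracted descriptors plus mean listener ratings.
df = pd.read_csv("eroica_excerpts.csv")  # hypothetical file

for d in descriptors:
    for e in emotions:
        r, p = pearsonr(df[d], df[e])
        if p < 0.05:  # report only nominally significant correlations
            print(f"{d} vs {e}: r = {r:+.2f}, p = {p:.3f}")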

JF - Ninth Triennial Conference of the European Society for the Cognitive Sciences of Music (ESCOM)
CY - Manchester, UK
UR - http://phenicx.upf.edu/system/files/publications/0168TrentGomez-ESCOM2015.pdf
ER -

TY - JOUR
T1 - Personality Correlates for Digital Concert Program Notes
JF - UMAP 2015, Springer LNCS 9146
Y1 - 2015
A1 - Tkalčič, Marko
A1 - Ferwerda, Bruce
A1 - Hauger, David
A1 - Schedl, Markus
KW - classical music
KW - digital program notes
KW - personality
ER -

TY - CONF
T1 - Beat Tracking from Conducting Gestural Data: A Multi-Subject Study
T2 - Proceedings of the 2014 International Workshop on Movement and Computing
Y1 - 2014
A1 - Sarasua, Alvaro
A1 - Guaus, Enric
KW - beat tracking
KW - classical music
KW - conducting
KW - expressive performance
KW - motion capture
AB -

The musical conductor metaphor has been broadly used in the design of musical interfaces where users control the expressive aspects of the performance by imitating the movements of conductors. Most of the time, there are predefined rules for the interaction to which users have to adapt. Other works have focused on studying the relation between conductors' gestures and the resulting performance of the orchestra. Here, we study how different subjects move when asked to conduct along to classical music excerpts, focusing on how the beat of the performance influences their movements. Twenty-five subjects were asked to conduct along to three classical music fragments and were recorded with a commercial depth-sensing camera. We evaluated predicted beats against ground-truth annotations obtained from a score-performance alignment by an expert musicologist, using a modified F-measure that accounts for different beat-anticipation tendencies across subjects. The results show that these tendencies can be exploited to improve user adaptation in the design of conducting musical interfaces.
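The abstract does not spell out the exact modification to the F-measure. One plausible reading is that each subject's systematic anticipation (or delay) relative to the annotated beats is estimated and compensated before matching; the sketch below illustrates that idea. The 70 ms tolerance and the median-offset compensation are assumptions for illustration, not the authors’ definition.

# Hypothetical sketch: beat-tracking F-measure with per-subject offset
# compensation, so a consistent tendency to anticipate the beat is not penalized.
import numpy as np

def f_measure(predicted, reference, tol=0.07):
    # A predicted beat is a hit if it falls within `tol` seconds of a
    # not-yet-matched reference beat.
    predicted = np.sort(np.asarray(predicted, dtype=float))
    reference = np.sort(np.asarray(reference, dtype=float))
    matched = np.zeros(len(reference), dtype=bool)
    hits = 0
    for p in predicted:
        candidates = np.where(~matched & (np.abs(reference - p) <= tol))[0]
        if candidates.size:
            matched[candidates[0]] = True
            hits += 1
    precision = hits / len(predicted) if len(predicted) else 0.0
    recall = hits / len(reference) if len(reference) else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

def offset_compensated_f_measure(predicted, reference, tol=0.07):
    # Estimate the subject's median offset to the nearest annotated beat and
    # remove it before scoring (assumed compensation step, for illustration).
    predicted = np.asarray(predicted, dtype=float)
    reference = np.asarray(reference, dtype=float)
    offsets = [p - reference[np.argmin(np.abs(reference - p))] for p in predicted]
    return f_measure(predicted - np.median(offsets), reference, tol)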

JF - Proceedings of the 2014 International Workshop on Movement and Computing
PB - ACM
CY - Paris, France
SN - 978-1-4503-2814-2
UR - http://doi.acm.org/10.1145/2617995.2618016
ER -