TY - CONF
T1 - Crowdsourcing Audience Perspectives on Classical Music
T2 - International Workshop on Multimedia Artworks Analysis (MMArt) at IEEE ICME
Y1 - 2016
A1 - Cynthia C. S. Liem
JF - International Workshop on Multimedia Artworks Analysis (MMArt) at IEEE ICME
PB - IEEE
CY - Seattle, WA, USA
ER -
TY - CONF
T1 - Classical Music on the Web - User Interfaces and Data Representations
T2 - Proceedings of the 16th International Society for Music Information Retrieval Conference, ISMIR 2015, Málaga, Spain, October 26-30, 2015
Y1 - 2015
A1 - Martin Gasser
A1 - Andreas Arzt
A1 - Thassilo Gadermaier
A1 - Maarten Grachten
A1 - Gerhard Widmer
JF - Proceedings of the 16th International Society for Music Information Retrieval Conference, ISMIR 2015, Málaga, Spain, October 26-30, 2015
UR - http://ismir2015.uma.es/articles/123_Paper.pdf
ER -
TY - CONF
T1 - Comparative Analysis of Orchestral Performance Recordings: An Image-Based Approach
T2 - 16th International Society for Music Information Retrieval Conference
Y1 - 2015
A1 - Cynthia C. S. Liem
A1 - Alan Hanjalic
JF - 16th International Society for Music Information Retrieval Conference
CY - Málaga, Spain
ER -
TY - CHAP
T1 - Contextual set-class analysis
T2 - Computational Music Analysis
Y1 - 2015
A1 - Martorell, Agustín
A1 - Gómez, Emilia
ED - Meredith, David
JF - Computational Music Analysis
PB - Springer
CY - Heidelberg
ER -
TY - CONF
T1 - Correlations Between Musical Descriptors and Emotions Recognized in Beethoven’s Eroica
T2 - Ninth Triennial Conference of the European Society for the Cognitive Sciences of Music (ESCOM)
Y1 - 2015
A1 - Erika S. Trent
A1 - Emilia Gómez
KW - classical music
KW - emotion
KW - music description
KW - music information retrieval
KW - personalization
AB -

Investigations of music and emotion have identified broad musical elements that influence the emotions recognized by listeners, such as timbre, rhythm, melody, and harmony. Few studies have examined the correlation between quantifiable musical descriptors and their associated emotions; furthermore, only a few have focused on how listeners’ demographic and musical backgrounds influence the emotions they recognize. In this preliminary study, participants rated how strongly they recognized the six GEMS emotions (transcendence, peacefulness, power, joyful activation, tension, and sadness) while listening to excerpts from Beethoven’s Eroica. Musical descriptors (loudness, brightness, noisiness, tempo/rhythm, harmony, and timbre) were also extracted from each excerpt. Results indicate significant correlations between emotion ratings and musical descriptors, notably positive correlations between key clarity and peacefulness/joyful activation ratings, and negative correlations between key clarity and tension/sadness ratings. Key clarity refers to the key strength associated with the best key candidate; as such, these results suggest that listeners recognize positive emotions in music with a straightforward key, whereas they recognize negative emotions in music with a less clear sense of key. The second part of the study computed correlations between demographics and emotion ratings to determine whether people of similar demographic and musical backgrounds recognized similar emotions. The results indicate that naïve listeners (i.e. younger subjects, and subjects with less frequent exposure to classical music) experienced more similar emotions from the same musical excerpts than did other subjects. Our findings contribute to developing a quantitative understanding of how musical descriptors and listeners’ backgrounds correlate with the emotions recognized by listeners.

JF - Ninth Triennial Conference of the European Society for the Cognitive Sciences of Music (ESCOM)
CY - Manchester, UK
UR - http://phenicx.upf.edu/system/files/publications/0168TrentGomez-ESCOM2015.pdf
ER -
TY - CONF
T1 - The Complete Classical Music Companion V0.9
T2 - 53rd AES Conference on Semantic Audio
Y1 - 2014
A1 - Andreas Arzt
A1 - Sebastian Böck
A1 - Flossmann, Sebastian
A1 - Frostel, Harald
A1 - Gasser, Martin
A1 - Widmer, Gerhard
JF - 53rd AES Conference on Semantic Audio
CY - London, UK
ER -
TY - CONF
T1 - Context-Aware Gesture Recognition in Classical Music Conducting
T2 - ACM Multimedia
Y1 - 2013
A1 - Sarasua, Alvaro
AB -

Body movement has received increasing attention in music technology research in recent years. Some new musical interfaces make use of gestures to control music in a meaningful and intuitive way. A typical approach is the orchestra conducting paradigm, in which the computer that generates the music acts as a virtual orchestra conducted by the user. However, although conductors’ gestures are complex and their meaning can vary depending on the musical context, this context-dependency remains largely unexplored. We propose a method to study the context-dependency of conductors’ body and facial gestures in orchestral classical music, based on temporal clustering of gestures into actions, followed by an analysis of the evolution of audio features after action occurrences. To this end, multi-modal data (audio, video, motion capture) will be recorded in real live concert and rehearsal situations using unobtrusive techniques.

JF - ACM Multimedia
CY - Barcelona
ER -