%0 Conference Proceedings %B Proceedings of the 17th International Society for Music Information Retrieval Conference (ISMIR 2016) %D 2016 %T An Analysis of Agreement in Classical Music Perception and Its Relationship to Listener Characteristics %A Markus Schedl %A Hamid Eghbal-zadeh %A Emilia Gómez %A Marko Tkalčič %B Proceedings of the 17th International Society for Music Information Retrieval Conference (ISMIR 2016) %C New York, USA %8 08/2016 %G eng %0 Conference Paper %B 2nd International Conference on New Music Concepts (ICNMC 2016) %D 2016 %T Modeling Loudness Variations in Ensemble Performance %A Gadermaier, Thassilo %A Grachten, Maarten %A Cancino-Chacon, Carlos Eduardo %B 2nd International Conference on New Music Concepts (ICNMC 2016) %I ABEditore %C Treviso, Italy %G eng %0 Conference Paper %B Proceedings of the International Joint Conference on Artificial Intelligence %D 2015 %T Artificial Intelligence in the Concertgebouw %A Andreas Arzt %A H. Frostel %A Th. Gadermaier %A M. Gasser %A G. Widmer %A M. Grachten %B Proceedings of the International Joint Conference on Artificial Intelligence %C Buenos Aires, Argentina %G eng %0 Conference Paper %B Proceedings of the 16th International Society for Music Information Retrieval Conference, {ISMIR} 2015, Málaga, Spain, October 26-30, 2015 %D 2015 %T Classical Music on the Web - User Interfaces and Data Representations %A Martin Gasser %A Andreas Arzt %A Thassilo Gadermaier %A Maarten Grachten %A Gerhard Widmer %B Proceedings of the 16th International Society for Music Information Retrieval Conference, {ISMIR} 2015, Málaga, Spain, October 26-30, 2015 %G eng %U http://ismir2015.uma.es/articles/123_Paper.pdf %0 Book Section %B Computational Music Analysis %D 2015 %T Contextual set-class analysis %A Martorell, Agustín %A Gómez, Emilia %E Meredith, David %B Computational Music Analysis %I Springer %C Heidelberg %P 81-110 %G eng %& 4 %R 10.1007/978-3-319-25931-4 %0 Conference Paper %B Ninth Triennial Conference of the European Society for the Cognitive Sciences of Music (ESCOM) %D 2015 %T Correlations Between Musical Descriptors and Emotions Recognized in Beethoven’s Eroica %A Erika S. Trent %A Emilia Gómez %K classical music %K emotion %K music description %K music information retrieval %K personalization %X

Investigations of music and emotion have identified broad musical elements that influence the emotions recognized by listeners, such as timbre, rhythm, melody, and harmony. Few studies have examined the correlation between quantifiable musical descriptors and their associated emotions, and only a few have focused on how listeners’ demographic and musical backgrounds influence the emotions they recognize. In this preliminary study, participants rated how strongly they recognized the six GEMS emotions (transcendence, peacefulness, power, joyful activation, tension, and sadness) while listening to excerpts from Beethoven’s Eroica. Musical descriptors (loudness, brightness, noisiness, tempo/rhythm, harmony, and timbre) were also extracted from each excerpt. Results indicate significant correlations between emotion ratings and musical descriptors, notably positive correlations between key clarity and peacefulness/joyful activation ratings, and negative correlations between key clarity and tension/sadness ratings. Key clarity refers to the key strength associated with the best key candidate; as such, these results suggest that listeners recognize positive emotions in music with a clear key, whereas they recognize negative emotions in music with a less clear sense of key. The second part of the study computed correlations between demographics and emotion ratings, to determine whether people of similar demographic and musical backgrounds recognized similar emotions. The results indicate that naïve listeners (i.e., younger subjects and subjects with less frequent exposure to classical music) experienced more similar emotions from the same musical excerpts than did other subjects. Our findings contribute to a quantitative understanding of how musical descriptors and listeners’ backgrounds correlate with the emotions recognized by listeners.

%B Ninth Triennial Conference of the European Society for the Cognitive Sciences of Music (ESCOM) %C Manchester, UK %8 17/08/2015 %G eng %U http://phenicx.upf.edu/system/files/publications/0168TrentGomez-ESCOM2015.pdf %0 Conference Paper %B Proceedings of the 18th International Conference on Discovery Science (DS 2015) %D 2015 %T An evaluation of score descriptors combined with non-linear models of expressive dynamics in music %A Cancino Chacón, C. E. %A M. Grachten %B Proceedings of the 18th International Conference on Discovery Science (DS 2015) %I Springer %C Banff, Canada %G eng %0 Conference Paper %B Proceedings of the Vienna Talk on Music Acoustics %D 2015 %T Flexible Score Following: The Piano Music Companion and Beyond %A Andreas Arzt %A Goebl, W. %A Widmer, G. %B Proceedings of the Vienna Talk on Music Acoustics %G eng %0 Conference Paper %B Proceedings of the 21st International Conference on MultiMedia Modeling (MMM 2015) %D 2015 %T Iron Maiden while jogging, Debussy for dinner? - An analysis of music listening behavior in context %A Michael Gillhofer %A Markus Schedl %B Proceedings of the 21st International Conference on MultiMedia Modeling (MMM 2015) %C Sydney, Australia %8 January %G eng %0 Conference Paper %B Music Information Retrieval Evaluation eXchange (MIREX) %D 2015 %T Melody extraction by means of a source-filter model and pitch contour characterization (MIREX 2015) %A Bosch, J. %A Gómez, E. %B Music Information Retrieval Evaluation eXchange (MIREX) %G eng %0 Conference Paper %B ISMIR (Late Breaking Demo) %D 2015 %T Melovizz: A Web-based tool for Score-Informed Melody Extraction Visualization %A Bosch, J. %A Mayor, O. %A Gómez, E. %B ISMIR (Late Breaking Demo) %G eng %0 Conference Paper %B Proceedings of the IEEE International Conference on Multimedia and Expo (ICME 2015) %D 2015 %T PHENICX: Innovating the Classical Music Experience %A Cynthia C. S. Liem %A Emilia Gómez %A Markus Schedl %B Proceedings of the IEEE International Conference on Multimedia and Expo (ICME 2015) %C Torino, Italy %8 June–July %G eng %0 Conference Paper %B 1st international workshop on computer and robotic Systems for Automatic Music Performance (SAMP14) %D 2014 %T Analysis and prediction of expressive dynamics using Bayesian linear models %A Grachten, M %A Cancino Chacón, C. E. %A Widmer, G. %B 1st international workshop on computer and robotic Systems for Automatic Music Performance (SAMP14) %C Venice, Italy %8 July %G eng %0 Journal Article %J {IEEE} Transactions on Multimedia %D 2014 %T An Assessment of Learned Score Features for Modeling Expressive Dynamics in Music %A M. Grachten %A F. Krebs %B {IEEE} Transactions on Multimedia %V 16 %P 1211–1218 %G eng %U http://dx.doi.org/10.1109/TMM.2014.2311013 %R 10.1109/TMM.2014.2311013 %0 Conference Paper %B Proceedings of the 2014 International Workshop on Movement and Computing %D 2014 %T Beat Tracking from Conducting Gestural Data: A Multi-Subject Study %A Sarasua, Alvaro %A Guaus, Enric %K beat tracking %K classical music %K conducting %K expressive performance %K motion capture %X

The musical conductor metaphor has been widely used in the design of musical interfaces in which users control the expressive aspects of a performance by imitating the movements of conductors. Most of the time, the interaction follows predefined rules to which users have to adapt. Other works have focused on studying the relation between conductors' gestures and the resulting performance of the orchestra. Here, we study how different subjects move when asked to conduct along to classical music excerpts, with a focus on how the beat of the performance influences their movements. Twenty-five subjects were asked to conduct along to three classical music fragments and were recorded with a commercial depth-sense camera. We evaluated the predicted beats against ground-truth annotations obtained from score-performance alignment by an expert musicologist, using a modified F-measure that accounts for different tendencies in beat anticipation across subjects. The results show that these tendencies can be exploited to improve the design of conducting musical interfaces in terms of user adaptation.

%B Proceedings of the 2014 International Workshop on Movement and Computing %I ACM %C Paris, France %@ 978-1-4503-2814-2 %G eng %U http://doi.acm.org/10.1145/2617995.2618016 %R 10.1145/2617995.2618016 %0 Conference Paper %B 53rd AES Conference on Semantic Audio %D 2014 %T The Complete Classical Music Companion V0.9 %A Andreas Arzt %A Sebastian Böck %A Flossmann, Sebastian %A Frostel, Harald %A Gasser, Martin %A Widmer, Gerhard %B 53rd AES Conference on Semantic Audio %C London, UK %8 01/2014 %G eng %0 Conference Paper %B Proceedings of the International Conference on New Interfaces for Musical Expression %D 2014 %T Dynamics in Music Conducting: A Computational Comparative Study Among Subjects %A Álvaro Sarasúa %A Enric Guaus %X

Many musical interfaces have used the musical conductor metaphor, allowing users to control the expressive aspects of a performance by imitating the gestures of conductors. In most of them, the rules controlling these expressive aspects are predefined and users have to adapt to them. Other works have studied conductors' gestures in relation to the performance of the orchestra. Following this latter line of work, the goal of this study is to analyze how simple motion capture descriptors can explain the relationship between the loudness of a given performance and the way different subjects move when asked to impersonate the conductor of that performance. Twenty-five subjects were asked to impersonate the conductor of three classical music fragments while listening to them. The results of different linear regression models with motion capture descriptors as explanatory variables show that, by studying how descriptors correlate with loudness differently among subjects, distinct tendencies can be found and exploited to design models that better match users' expectations.

%B Proceedings of the International Conference on New Interfaces for Musical Expression %I Goldsmiths, University of London %C London, United Kingdom %8 06/2014 %G eng %U http://nime2014.org/proceedings/papers/464_paper.pdf %0 Journal Article %J Journal of Mathematics and Music %D 2014 %T Hierarchical multi-scale set-class analysis %A Martorell, Agustín %A Gómez, Emilia %B Journal of Mathematics and Music %P 1-14 %8 05/2014 %G eng %U http://dx.doi.org/10.1080/17459737.2014.906072 %R 10.1080/17459737.2014.906072 %0 Conference Proceedings %B 9th Conference on Interdisciplinary Musicology – CIM14 %D 2014 %T Melody extraction in symphonic classical music: a comparative study of mutual agreement between humans and algorithms %A Bosch, J. %A Gómez, E. %B 9th Conference on Interdisciplinary Musicology – CIM14 %C Berlin %8 12/2014 %G eng %0 Journal Article %J {Foundations and Trends in Information Retrieval} %D 2014 %T Music Information Retrieval: Recent Developments and Applications %A Markus Schedl %A Emilia Gómez %A Julián Urbano %B {Foundations and Trends in Information Retrieval} %V 8 %P 127–261 %G eng %R http://dx.doi.org/10.1561/1500000042 %0 Conference Paper %B Proceedings of the Conference on Prestigious Applications of Intelligent Systems (PAIS) %D 2014 %T The Piano Music Companion %A Andreas Arzt %A Sebastian Böck %A Flossmann, S. %A Frostel, H. %A Gasser, M. %A Cynthia C. S. Liem %A Widmer, G. %B Proceedings of the Conference on Prestigious Applications of Intelligent Systems (PAIS) %G eng %0 Conference Paper %B Proceedings of the 15th International Conference on Music Information Retrieval %D 2014 %T Predicting expressive dynamics in piano performances using neural networks %A Van Herwaarden, S %A Grachten, M %A De Haas, W. B. %B Proceedings of the 15th International Conference on Music Information Retrieval %C Taipei, Taiwan %8 October %G eng %0 Conference Proceedings %B 15th International Society for Music Information Retrieval Conference, Taipei, Taiwan %D 2014 %T Systematic multi-scale set-class analysis %A Martorell, Agustín %A Gómez, Emilia %B 15th International Society for Music Information Retrieval Conference, Taipei, Taiwan %C Taipei (Taiwan) %G eng %0 Conference Paper %B Proceedings of the 14th International Society for Music Information Retrieval Conference %D 2013 %T Automatic alignment of music performances with structural differences %A Grachten, Maarten %A Gasser, Martin %A Andreas Arzt %A Widmer, Gerhard %X

Both in interactive music listening and in music performance research, there is a need for automatic alignment of different recordings of the same musical piece. This task is challenging because musical pieces often contain parts that may or may not be repeated by the performer, possibly leading to structural differences between performances (or between performance and score). The most common alignment method, dynamic time warping (DTW), cannot handle structural differences adequately, and existing approaches that deal with structural differences explicitly rely on the annotation of “break points” in one of the sequences. We propose a simple extension of the Needleman-Wunsch algorithm that deals effectively with structural differences without relying on annotations. We evaluate several audio features for alignment and show how an optimal value can be found for the cost parameter of the alignment algorithm. A single cost value is shown to be valid across different types of music. We demonstrate that our approach yields alignment accuracy roughly equal to DTW in the absence of structural differences, and superior accuracy when structural differences occur.

%B Proceedings of the 14th International Society for Music Information Retrieval Conference %C Curitiba, Brazil %8 November %G eng %0 Conference Paper %B 1st International Workshop on Interactive Content Consumption (WSICC) at EuroITV 2013 %D 2013 %T Innovating the Classical Music Experience in the PHENICX Project: Use Cases and Initial User Feedback %A Cynthia C. S. Liem %A Ron van der Sterren %A Marcel van Tilburg %A Álvaro Sarasúa %A Juan J. Bosch %A Jordi Janer %A Mark S. Melenhorst %A Emilia Gómez %A Alan Hanjalic %K interactivity %K multimedia information systems %K multimodality %K music information retrieval %K performing arts %K social networks %K user studies %X

The FP7 PHENICX project focuses on creating a new digital classical concert experience, improving the accessibility of classical music concert performances by enhancing and enriching them in novel digital ways. In this paper, we present the project’s foreseen use cases. Subsequently, we summarize initial use case feedback from two different user groups. Despite the early stage of the project, this feedback already gives important insight into the real-world considerations involved in designing interactive music content consumption solutions.

%B 1st International Workshop on Interactive Content Consumption (WSICC) at EuroITV 2013 %C Como, Italy %8 06/2013 %G eng %0 Conference Paper %B Proceedings of the 21st ACM International Conference on Multimedia %D 2013 %T Multimedia Information Retrieval: Music and Audio %A Markus Schedl %A Emilia Gómez %A Masataka Goto %B Proceedings of the 21st ACM International Conference on Multimedia %C Barcelona, Spain %8 October %G eng %0 Conference Paper %B SMAC Stockholm Music Acoustics Conference 2013 and SMC Sound and Music Computing Conference 2013 %D 2013 %T PHENICX: Performances as Highly Enriched aNd Interactive Concert Experiences %A Gómez, E. %A Grachten, M. %A Hanjalic, A. %A Janer, J. %A Jordà, S. %A Julià, C. F. %A Cynthia C. S. Liem %A Martorell, A. %A Schedl, M. %A Widmer, G. %X

Modern digital multimedia and internet technology have radically changed the ways people find entertainment and discover new interests online, seemingly without any physical or social barriers. Such new access paradigms are in sharp contrast with the traditional means of entertainment. An illustrative example of this is live music concert performances, which are largely attended by dedicated audiences only.


This paper introduces the PHENICX project, which aims at enriching traditional concert experiences by using state-of-the-art multimedia and internet technologies. The project focuses on classical music, and its main goal is twofold: (a) to make live concerts appealing to potential new audiences and (b) to maximize the quality of the concert experience for everyone. Concerts will then become multimodal, multi-perspective and multilayer digital artifacts that can be easily explored, customized, personalized, (re)enjoyed and shared among users. The paper presents the main scientific objectives of the project, provides a review of the state of the art in related research, and outlines the main challenges to be addressed.

%B SMAC Stockholm Music Acoustics Conference 2013 and SMC Sound and Music Computing Conference 2013 %C Stockholm, Sweden %8 08/2013 %G eng