TY - GEN
T1 - Machine Learning of Personal Gesture Variation in Music Conducting
T2 - CHI - Human Factors in Computing Systems
Y1 - 2016
A1 - Sarasua, Alvaro
A1 - Caramiaux, Baptiste
A1 - Tanaka, Atau
AB - This note presents a system that learns expressive and idiosyncratic gesture variations for gesture-based interaction. The system is used as an interaction technique in a music conducting scenario where gesture variations drive music articulation. A simple model based on Gaussian Mixture Modeling allows the user to configure the system by providing examples of variations. The system's performance and the influence of user musical expertise are evaluated in a user study, which shows that the model is able to learn idiosyncratic variations that allow users to control articulation, with better performance for users with musical expertise.

JF - CHI - Human Factors in Computing Systems
PB - ACM Press
CY - San Jose, CA
ER -

TY - CONF
T1 - Beat Tracking from Conducting Gestural Data: A Multi-Subject Study
T2 - Proceedings of the 2014 International Workshop on Movement and Computing
Y1 - 2014
A1 - Sarasua, Alvaro
A1 - Guaus, Enric
KW - beat tracking
KW - classical music
KW - conducting
KW - expressive performance
KW - motion capture
AB - The musical conductor metaphor has been broadly used in the design of musical interfaces where users control the expressive aspects of a performance by imitating the movements of conductors. Most of the time, the interaction follows predefined rules to which users have to adapt. Other works have focused on studying the relation between conductors' gestures and the resulting performance of the orchestra. Here, we study how different subjects move when asked to conduct along with classical music excerpts, with a focus on the influence of the beat of the performance. Twenty-five subjects were asked to conduct along with three classical music fragments and were recorded with a commercial depth-sense camera. We evaluated predicted beats against ground-truth annotations, obtained from score-performance alignment by an expert musicologist, using a modified F-measure that accounts for different tendencies in beat anticipation across subjects. The results show that these tendencies can be used to improve the design of conducting musical interfaces in terms of user adaptation.

JF - Proceedings of the 2014 International Workshop on Movement and Computing
PB - ACM
CY - Paris, France
SN - 978-1-4503-2814-2
UR - http://doi.acm.org/10.1145/2617995.2618016
ER -

TY - CONF
T1 - Context-Aware Gesture Recognition in Classical Music Conducting
T2 - ACM Multimedia
Y1 - 2013
A1 - Sarasua, Alvaro
AB - Body movement has received increasing attention in music technology research in recent years. Some new musical interfaces make use of gestures to control music in a meaningful and intuitive way. A typical approach is to use the orchestra conducting paradigm, in which the computer that generates the music acts as a virtual orchestra conducted by the user. However, although conductors' gestures are complex and their meaning can vary depending on the musical context, this context-dependency remains to be explored. We propose a method to study the context-dependency of conductors' body and facial gestures in orchestral classical music, based on temporal clustering of gestures into actions, followed by an analysis of the evolution of audio features after action occurrences. For this, multi-modal data (audio, video, motion capture) will be recorded in live concert and rehearsal situations using unobtrusive techniques.

JF - ACM Multimedia
CY - Barcelona
ER -