Demo overview of the PHENICX exhibit at ICT 2013

This page gives detailed information on the demos we will show at ICT 2013. The demos can be seen at booth 5J6. In addition, we will present an overview of all our demos on the main performance stage on Wednesday, November 6 at 15.30 h and Thursday, November 7 at 16.30 h.


Enriched concert e-magazines: RCO Editions

RCO Editions is a bimonthly video magazine for the iPad, featuring concert recordings of the Royal Concertgebouw Orchestra together with supporting commentary, documentaries and articles. Drawing on earlier experience with the app “RCO meets Fink”, we have developed an innovative way of enriching the musical presentation, offering written text or video as an extra layer at specific moments in the music.

For more info, see http://www.concertgebouworkest.nl/en/rco-editions/


Handling multimodal, time-synchronous data: repoVizz

repoVizz is an integrated online system for the structured formatting, remote storage, browsing, exchange, annotation, and visualization of synchronous, multimodal, time-aligned data. Motivated by a growing need for data-driven collaborative research, repoVizz aims to resolve commonly encountered difficulties in sharing or browsing large collections of multimodal data. In its current state, repoVizz is designed to hold time-aligned streams of heterogeneous data: audio, video, motion capture, physiological signals, extracted descriptors, annotations, et cetera. The most popular audio and video formats are supported, while CSV formats are adopted for other streams (e.g., motion capture or physiological signals). Datasets are stored in an online database, allowing the user to interact with the data remotely through a powerful HTML5 visual interface accessible from any standard web browser. Within PHENICX, repoVizz is the data repository where the multimodal data gathered from classical music concerts is stored and shared between the different partners. Customized visualizations are being created within repoVizz to enrich and improve the user experience when exploring the multimodal concert data.

For more info, see http://repovizz.upf.edu
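
As an illustration of how such CSV-encoded streams can be consumed, here is a minimal Python sketch that aligns a motion-capture stream with an audio analysis timeline. The file name, column layout (one timestamp column followed by descriptor columns) and frame rates are assumptions made for illustration; this is not the actual repoVizz API.

    import numpy as np

    def load_csv_stream(path):
        """Load a time-aligned stream: column 0 = time in seconds."""
        data = np.genfromtxt(path, delimiter=",", skip_header=1)
        return data[:, 0], data[:, 1:]  # timestamps, descriptor values

    def resample_to(timestamps, values, target_times):
        """Linearly interpolate each descriptor column onto a common clock."""
        return np.column_stack(
            [np.interp(target_times, timestamps, values[:, i])
             for i in range(values.shape[1])])

    # Align a motion-capture stream to 100 Hz audio analysis frames
    # ("conductor_mocap.csv" is a hypothetical example file).
    t_mocap, mocap = load_csv_stream("conductor_mocap.csv")
    frames = np.arange(0.0, t_mocap[-1], 0.01)  # 100 frames per second
    mocap_aligned = resample_to(t_mocap, mocap, frames)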


Tracking musical gestures

The conductor plays a key role in orchestral classical music: the conductor's gestures convey important information to the orchestra and are closely related to the expressive aspects of the performance. In PHENICX, we use depth-sense cameras to unobtrusively track the movements of the conductor. From the positions of the different parts of the body, we can automatically compute descriptors that capture the characteristics of the movements that are relevant to the conductor and the orchestra. The system makes it possible to explore and visualize, in real time, the descriptors computed for each tracked body part, as well as descriptors that describe the movement of the body as a whole. Because all of this information is computed in real time, we can use it not only to visualize and analyze movement during a performance, but also to let users "impersonate" the conductor and control the performance of a virtual orchestra with their own movements.
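
To give a flavor of such descriptors, the following Python sketch computes a simple "quantity of motion" measure from two consecutive frames of tracked joint positions. The joint names, coordinates and 30 Hz frame rate are hypothetical; the descriptors used in PHENICX are more elaborate.

    import numpy as np

    FRAME_RATE = 30.0  # assumed depth-camera frame rate (frames per second)

    def quantity_of_motion(prev_joints, curr_joints):
        """Sum of per-joint speeds between two consecutive frames.

        Both arguments map joint names (e.g. 'right_hand') to xyz arrays.
        """
        speeds = [np.linalg.norm(curr_joints[j] - prev_joints[j]) * FRAME_RATE
                  for j in curr_joints]
        return float(np.sum(speeds))

    # Example: two consecutive frames of a (hypothetical) skeleton stream.
    frame_a = {"right_hand": np.array([0.10, 1.20, 2.00]),
               "left_hand":  np.array([-0.12, 1.18, 2.01])}
    frame_b = {"right_hand": np.array([0.14, 1.25, 2.00]),
               "left_hand":  np.array([-0.12, 1.19, 2.01])}
    print(quantity_of_motion(frame_a, frame_b))  # overall movement energy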


Informing the performance from audio/score analysis

As repoVizz handles a variety of time-aligned streams, different sources of information can feed the system. Given the richness and complexity of the symphonic format, we analyze both the audio tracks from the concerts and the scores of the compositions, which are precisely time-aligned to their audio counterparts. Performance-related information, such as loudness or tempo, is extracted directly from the audio, while other musically meaningful descriptions are computed from the score. This provides information layers at several degrees of sophistication, suitable for informing users with varied musical backgrounds: from the localization of the main themes of the symphony, through a precise account of the notes played by each instrument, to structure analyses based on orchestration or tonality. Some of these descriptions can be adapted, in combination with appropriate visualizations, for navigating the piece interactively. Thus, we can choose to visualize just the melodic lines of the clarinets, jump back and forth among the appearances of a recurrent motif, or localize the tutti sections. Content providers can take advantage of these extraction and description methods to speed up the editing process, as many of them can be automated to a great extent.
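
As a rough illustration of the audio side of this analysis, the sketch below extracts a loudness proxy and a tempo estimate from a recording using the open-source librosa library. This is one possible toolchain, not necessarily the one used in PHENICX, and the audio file name is a placeholder.

    import librosa
    import numpy as np

    y, sr = librosa.load("concert_excerpt.wav")  # hypothetical recording

    # Loudness proxy: frame-wise RMS energy, converted to decibels.
    rms = librosa.feature.rms(y=y)[0]
    loudness_db = librosa.amplitude_to_db(rms, ref=np.max)

    # Tempo estimate and beat positions, usable as a time grid for alignment.
    tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
    beat_times = librosa.frames_to_time(beat_frames, sr=sr)
    print("estimated tempo (BPM):", tempo)
    print("beats detected:", len(beat_times))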


Tracking performances through live score-following

One of the goals of PHENICX is to enhance the listening experience of live music concerts. To present multimodal information in sync with live music, algorithms are needed that continuously compute the performers' current position within a piece. The prototype we present listens to a live performance of classical piano music and almost instantly identifies the piece being played and the position within the piece. It then tracks the performance via an accurate and robust score-following algorithm and displays the sheet music accordingly. Furthermore, the system continuously re-evaluates its current position hypotheses within a database of scores (roughly 1,000,000 notes!) and is capable of detecting arbitrary 'jumps' by the performer (e.g., leaving out repetitions, re-starting at any position, etc.), not only within a piece, but within the complete database of classical piano scores.
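
The toy Python sketch below illustrates the core idea of position tracking with jump detection, reduced to symbolic note sequences. The real system operates on audio features against a large score database, so this shows the principle rather than the algorithm itself.

    def follow(score_pitches, performed_pitches, window=8):
        """Track the most plausible score position for each incoming note.

        If the expected note is not found nearby, search a local window and,
        failing that, the whole score, which amounts to detecting a 'jump'.
        """
        pos = 0
        trace = []
        for pitch in performed_pitches:
            # 1. Try to advance within a small local window (normal tracking).
            local = [i for i in range(pos, min(pos + window, len(score_pitches)))
                     if score_pitches[i] == pitch]
            if local:
                pos = local[0] + 1
            else:
                # 2. Fall back to a global search: the performer jumped
                #    or restarted (position unchanged if no match anywhere).
                everywhere = [i for i, p in enumerate(score_pitches) if p == pitch]
                if everywhere:
                    pos = everywhere[0] + 1
            trace.append(pos - 1)
        return trace

    score = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI pitches of the score
    performance = [60, 62, 64, 60, 62]        # the performer restarts
    print(follow(score, performance))         # [0, 1, 2, 0, 1]: jump detected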