Augmented Reality Music Ensemble (ARME)
The ARME project is funded by the EPSRC and will run from September 2021 to August 2024. The project is a collaboration between the University of Birmingham (Max Di Luca, Alan Wing, Maria Witek), Birmingham City University (Ryan Stables), and the University of Warwick (Mark Elliott).
In musical ensembles such as piano duets, jazz trios, string quartets, rock groups, samba bands, and drum circles, there is no single stable reference for performance timing: musicians must time their playing to each other. This collective timing requires practice, but group rehearsals are not always feasible, leaving musicians to practise alone for much of the time. One established alternative to group rehearsal is to play along with a recording of a group from which one part has been omitted, but this form of practice lacks the interactive, mutually adaptive elements of a group rehearsal. We will develop a system that creates virtual musicians from audio-visual recordings of professional players, with timing patterns that are individually modulated in real time and that dynamically synchronise to each other to maintain ensemble, as happens in a real performance. Users will learn by playing their own instrument in interaction with the virtual musicians, progressively improving their musical skills. Such a system not only makes it possible to practise ensemble performance when partners are not available; it also lets the real musician individually control the timing adjustability of each virtual musician and measure their own adjustment characteristics.
Our point of departure is classical string duos, trios, and quartets, since these are relatively standard formats and they pose the essential challenges that must be overcome for a more generalised model of ensemble timing in larger groups. However, the outcomes will likely be applicable to a wide range of instrumental ensembles and musical genres. Based on an analysis of string players' synchronisation and coordination under different musical conditions, we will develop a near-real-time model of individual players' timing characteristics, enabling a system that adapts note timing to the dynamic evolution of the performance. Through one of our industrial partners, PartPlay, we will have access to recordings of world-leading quartet performances, giving users the experience of playing their own string instrument alongside virtual versions of some of the most skilled professional musicians.
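The adaptive timing model itself is a research output of the project and is not specified here. As an illustration of the general idea, the following is a minimal sketch (not the project's actual model) of linear phase correction, a standard account of ensemble synchronisation in which each player shifts their next note by a fraction of their current asynchrony from the other players; the function name and all parameter values are hypothetical:

```python
import random

def simulate_ensemble(n_players=3, n_beats=32, period=0.5,
                      alpha=0.25, motor_noise=0.01, seed=1):
    """Toy simulation of ensemble timing with linear phase correction.

    Each simulated player schedules its next note one period ahead,
    then corrects by a fraction `alpha` of its asynchrony from the mean
    onset of the other players, plus Gaussian motor noise. Returns the
    list of onset times per beat. All values are illustrative.
    """
    rng = random.Random(seed)
    # Slightly staggered initial onsets (seconds).
    onsets = [rng.gauss(0.0, 0.02) for _ in range(n_players)]
    history = [list(onsets)]
    for _ in range(n_beats - 1):
        corrected = []
        for i, t in enumerate(onsets):
            others = [onsets[j] for j in range(n_players) if j != i]
            asynchrony = t - sum(others) / len(others)
            corrected.append(t + period - alpha * asynchrony
                             + rng.gauss(0.0, motor_noise))
        onsets = corrected
        history.append(list(onsets))
    return history

history = simulate_ensemble()
print(f"onset spread at beat 1:  {max(history[0]) - min(history[0]):.4f} s")
print(f"onset spread at beat 32: {max(history[-1]) - min(history[-1]):.4f} s")
```

For `alpha` in a stable range, the spread of onsets shrinks over successive beats, which is the behaviour a system of mutually adapting virtual musicians needs; setting `alpha` per player is one way to picture the user-controlled "timing adjustability" described above.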
The virtual component of the experience will include both audio and video recordings, offering a multisensory, immersive, and realistic rehearsal in which every performance is unique and adapted to the skills of the individual. In particular, the system is expected to make a significant contribution to distance-learning musical tuition. Covid-19 makes the e-learning implications especially relevant, and the system will also give music students who are home-bound due to disability greater access to ensemble practice. Furthermore, extensions of the technologies developed in the project may impact the entertainment industry, for example in mixed-reality performance.
See the project website for more details.