SMCClab projects on gesture, collaboration and intelligence
26 Aug '23
I’ve recently been thinking about projects my lab has been working on and how to focus our work in future to take advantage of collaboration and knowledge-sharing within my group.
All of my group’s work is related to sound and music computing, but we broadly work in three themes: gesture, collaboration, and (computational) intelligence. Projects we have been working on tend to sit within one, or across two, of these themes, for example:
Spatial Interaction (gesture x collaboration): Creating authentic spatial musical apps that support interaction on AR/VR devices and with novel sensors. We have experimented with new freehand gestures in AR and created new kinds of artistic performances. This work has now led to collaborative music systems in AR.
Generating Creative Gestures (gesture x intelligence): My work with the EMPI and IMPS systems focussed on generating creative gestural data. Benedikte Wallace’s work on generative dance expanded these ideas and engaged with full-body motion. Xinlei Niu has focussed on directly generating digital audio.
Guiding Collaborative Performance (collaboration x intelligence): This research aims to create new ways for groups to make music together. We have created touchscreen musical instruments that communicate interactions over a network, along with ways to adjust these interfaces remotely. We plan to create AI models of performance and find ways to guide improvisors towards new musical states. This work follows the research projects I did with Ensemble Metatone and Ensemble Evolution on collaborative touchscreen performance.
Intelligent Instruments (gesture x collaboration x intelligence): Intelligent instruments predict human musical interactions to help performers create music. This work involves encapsulating machine learning models of creative interaction into a playable instrument. Our prototypes include EMPI, a portable musical robot that responds to performances with a 1-dimensional musical interface, self-playing iPads, and AI laptop ensembles. The collaboration here can be between the performer and instrument as well as with other musicians or artists. This project is hard because we have to understand models of gesture and apply them in different interactive systems. One particularly interesting approach has been to create physical hardware instruments with Raspberry Pi or Bela. Another way might be to create spatial systems in AR/VR.
The point of writing the above is that, starting from now (2023) and at least for a year or two, I’m only going to support projects that fit within the above four meta-projects or are very clearly aligned with the themes of gesture, collaboration and intelligence. These are hard problems, and we need to create instruments that can be deployed and creatively applied in performance to make an impact. Keeping a focus on a fairly small number of themes will help build capacity and a culture of sharing within my lab.