PhaseRings: Music for ML-Connected Touchscreen Ensemble
01 Aug '16
PhaseRings is a touchscreen instrument that works within an ML-connected ensemble. A server tracks the four performers’ improvisations and adjusts their user interface during the performance to give them access to different sounds on their screens.
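The server's role described above — observing the ensemble's playing and pushing interface changes back to each screen — could be sketched roughly as follows. This is a hypothetical illustration, not PhaseRings' actual code: the gesture labels, palette names, and decision rule are all invented for the example.

```python
from collections import Counter

# Hypothetical sound palettes the server could push to performers' screens.
PALETTES = {
    "calm": ["bell", "pad"],
    "active": ["noise", "pluck"],
}

def choose_palette(recent_gestures):
    """Pick a sound palette from the ensemble's recent gesture labels.

    A real agent would use a trained classifier over touch data; here a
    simple tally of fast vs. slow gestures stands in for that decision.
    """
    counts = Counter(recent_gestures)
    fast = counts["swipe"] + counts["tap"]
    slow = counts["swirl"] + counts["hold"]
    key = "active" if fast > slow else "calm"
    return PALETTES[key]
```

In a running system, the result of `choose_palette` would be broadcast to all four devices, which would then rebuild their on-screen interfaces with the new sounds.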

PhaseRings is written in Objective-C for iOS. The source code is available on GitHub.
This performance was part of the ICAD conference in Canberra in July 2016 and reflects the “final” version of PhaseRings produced during my PhD research. I used PhaseRings in research studies (reported at CHI 2016) to understand how ML-connected decision making could feed information back to an ensemble of musicians. Simultaneously, I was using PhaseRings as my “main instrument” for duo performances with Alec Hunter, solo improvisation projects, and group improvisation with the ANU Experimental Studio. This peer-reviewed performance at ICAD put the ML-agent research back into a cultural context as part of the conference’s artist program.
More detail about this performance can be found in the short paper that accompanied it. Program notes from the concert are here.