Neural Network Ensembles in London and Representing Collaborative Interaction
I recently had the chance to present a paper about my “Neural iPad Ensemble” at the Audio Mostly conference in London. The paper discusses how machine learning can help to model and create free-improvised music on new interfaces, where the rules of music theory may not apply. I described the Recurrent Neural Network (RNN) design that I used to produce an AI iPad ensemble that responds to a “lead” human performer. In the demonstration session, I set up the iPads and RNN and had lots of fun jamming with the conference attendees.
MicroJam at Boost
We presented MicroJam this week at the Boost Technology and Equality in Music Conference at Sentralen, Oslo. The conference arranged a Tech Showcase session in Hvelvet, Sentralen’s old bank vault, with developers of music apps, synthesisers, robots, and education software.
Music Tech at IFI
We recently hosted a music technology event at the Department of Informatics to gather together researchers and students from the University of Oslo to see performances and demonstrations of current research.
Performing with a Neural Touch-Screen Ensemble
Since about 2011, I’ve been performing music with various kinds of touch-screen devices in percussion ensembles, new music groups, improvisation workshops, and installations, as well as my dedicated iPad group, Ensemble Metatone. Most of these events were recorded; detailed touch and gestural information was collected, including a classification of each ensemble member’s gesture every second during each performance. Since moving to Oslo, however, I don’t have an iPad band! This leads to the question: given all this performance data, can I make an artificial touch-screen ensemble using deep neural networks?
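The per-second gesture classifications described above form sequences that a recurrent model can learn from. As a minimal sketch (the gesture vocabulary, layer sizes, and weights here are illustrative assumptions, not the design from the paper), a simple RNN can consume a performer's gesture sequence and output a probability distribution over the next gesture:

```python
import numpy as np

# Hypothetical sketch: a simple RNN that reads a sequence of per-second
# gesture class labels and predicts the next gesture. The number of
# gesture classes and hidden size are assumed for illustration.
N_GESTURES = 9   # e.g. tap, swipe, swirl, ... (assumed vocabulary size)
HIDDEN = 32

rng = np.random.default_rng(0)
Wxh = rng.normal(0, 0.1, (HIDDEN, N_GESTURES))  # input-to-hidden weights
Whh = rng.normal(0, 0.1, (HIDDEN, HIDDEN))      # hidden-to-hidden weights
Why = rng.normal(0, 0.1, (N_GESTURES, HIDDEN))  # hidden-to-output weights

def one_hot(g):
    v = np.zeros(N_GESTURES)
    v[g] = 1.0
    return v

def predict_next(gesture_sequence):
    """Run the RNN over a gesture sequence; return next-gesture probabilities."""
    h = np.zeros(HIDDEN)
    for g in gesture_sequence:
        h = np.tanh(Wxh @ one_hot(g) + Whh @ h)  # recurrent state update
    logits = Why @ h
    p = np.exp(logits - logits.max())            # softmax over gesture classes
    return p / p.sum()

probs = predict_next([0, 3, 3, 5, 1])
print(probs.shape, round(float(probs.sum()), 6))
```

In practice such a network would be trained on the recorded ensemble data; sampling from the predicted distribution for each simulated player is one way to drive an artificial ensemble's behaviour.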

Interactive Music at Oslo Konserthus
We recently had the chance to present some experimental musical instruments developed in the EPEC project at Oslo Konserthus as part of their “Score” video game music concert.
PhaseRings: Music for ML-Connected Touchscreen Ensemble
PhaseRings is a touchscreen instrument that works within an ML-connected ensemble. A server tracks the four performers’ improvisations and adjusts their user interface during the performance to give them access to different sounds on their screens.
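The server-side loop described above can be sketched as follows. This is a toy illustration under assumed names (`Performer`, `classify`, the "ring" labels); the real classifier and messaging layer are not shown in this post:

```python
from dataclasses import dataclass, field

@dataclass
class Performer:
    """One ensemble member tracked by the server (illustrative structure)."""
    name: str
    recent_gestures: list = field(default_factory=list)
    available_rings: list = field(default_factory=lambda: ["base"])

def classify(gestures):
    """Toy stand-in for the ML gesture classifier (assumed interface)."""
    return "active" if len(gestures) > 3 else "calm"

def update_interfaces(ensemble):
    """Adjust each performer's interface based on their classified activity:
    busier performers gain access to an extra set of sound rings."""
    updates = {}
    for p in ensemble:
        if classify(p.recent_gestures) == "active" and "extended" not in p.available_rings:
            p.available_rings.append("extended")
        updates[p.name] = list(p.available_rings)
    return updates

ensemble = [Performer("p1", [1, 2, 3, 4, 5]), Performer("p2", [1])]
print(update_interfaces(ensemble))
# → {'p1': ['base', 'extended'], 'p2': ['base']}
```

The design point is that the server, not the app, decides which sounds each screen offers, so the ensemble's collective playing shapes every individual interface during the performance.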