Blog - page 4
Performing with a Neural Touch-Screen Ensemble
Since about 2011, I’ve been performing music with various kinds of touch-screen devices in percussion ensembles, new music groups, improvisation workshops, and installations, as well as with my dedicated iPad group, Ensemble Metatone. Most of these events were recorded, and detailed touch and gestural data was collected, including a classification of each ensemble member’s gesture every second of each performance. Since moving to Oslo, however, I don’t have an iPad band! This raises the question: given all this performance data, can I make an artificial touch-screen ensemble using deep neural networks?
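As a rough sketch of what that could look like (a toy model, not the actual system described in the post): given a window of per-second gesture classifications, a small recurrent network could predict the next gesture for a simulated ensemble member. The class count, model shape, and names below are placeholders.

```python
import torch
import torch.nn as nn

# Hypothetical setup: a small vocabulary of gesture classes,
# with one classification per second per performer.
NUM_GESTURES = 9

class GestureRNN(nn.Module):
    """Toy sequence model: given a window of past gesture classes,
    predict the next gesture class for an 'artificial' ensemble member."""
    def __init__(self, hidden_size=64):
        super().__init__()
        self.embed = nn.Embedding(NUM_GESTURES, 16)
        self.rnn = nn.LSTM(16, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, NUM_GESTURES)

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))
        return self.out(h[:, -1, :])  # logits for the next gesture

# Fake batch: 8 sequences of 30 seconds of gesture labels.
x = torch.randint(0, NUM_GESTURES, (8, 30))
logits = GestureRNN()(x)
print(logits.argmax(dim=-1))  # predicted next gesture for each sequence
```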
Interactive Music at Oslo Konserthus
We recently had the chance to present some experimental musical instruments, developed as part of the EPEC project, at Oslo Konserthus during their “Score” video game music concert.
Concert at ICAD2016
I joined the ANU Experimental Music Studio to perform several works at the International Conference on Auditory Display at the ANU School of Music last week - here are some photos!
NIME2015: Tracking an iPad Ensemble with Gesture Classification and Transition Matrices
I’ve just gotten back from NIME2015 in Baton Rouge, where I presented a poster about my performance tracking system for iPad ensembles. Great to catch up with new and old NIMErs! The paper is available here, and here’s the text of the poster:
Audio Cables for iPad Performance
I’ve been using iPads and iPhones in performances since 2011 and other performers often ask how I connect the iPads to the PA system. The easy answer is “just use the headphone output”, but getting audio cables with a 3.5mm jack that are rugged enough to work on stage can be a bit of a problem.
Electronic Music Workshop - January 2015
In January 2015, I ran a four-day workshop in computer music, hardware hacking, and new interfaces at the ANU School of Music.