Research

Publications, history, and the ideas behind IMPSY.

IMPSY sits in a research thread that began with mixture density recurrent networks for musical interaction and has grown into a general toolkit for intelligent instruments. The papers below cover both core IMPSY development and work by students and collaborators that builds on the platform.

Core IMPSY publications

  • 2026
    Opening the Design Space: Two Years of Performance with Intelligent Musical Instruments

    Charles Martin. International Conference on New Interfaces for Musical Expression (NIME 2026), London. Introduces the Raspberry-Pi-based IMPSY platform and reflects on a two-year first-person artistic research process with five prototype instruments (Intelligent Volca, MicroFreak, S-1, DAW, Setup). Argues that remapping can substitute for retraining, that fast input interleaving is a viable co-creative strategy, and that small-data AI models are a portable design resource.

    arXiv →
  • 2022
    Performing with a Generative Electronic Music Controller

    Charles Martin. Joint Proceedings of the ACM IUI Workshops. A reflection on using the IMPS prediction system in live electronic music performance, prefiguring the design directions later formalised in IMPSY.

    PDF →
  • 2020
    Understanding Musical Predictions with an Embodied Interface for Musical Machine Learning

    Charles Martin, Kyrre Glette, Tønnes Nygaard, and Jim Torresen. Frontiers in Artificial Intelligence. A study of embodied musical prediction using the EMPI hardware controller: how performers experience, interpret, and play with predictive output from an MDRNN.

    DOI →
  • 2019
    An Interactive Musical Prediction System with Mixture Density Recurrent Neural Networks

    Charles Martin and Jim Torresen. Proceedings of NIME 2019. The original IMPS paper. Introduces the prediction system, focuses on OSC connectivity, and lays out the machine learning approach that later became IMPSY. (A minimal OSC sketch appears after this publication list.)

    DOI →
  • 2018
    RoboJam: A Musical Mixture Density Network for Collaborative Touchscreen Interaction

    Charles Martin and Jim Torresen. Proceedings of EvoMUSArt 2018. An earlier collaborative-performance system that established the MDRNN principle on which IMPS and IMPSY were later built.

    DOI → · arXiv →
  • 2017
    Deep Models for Ensemble Touch-Screen Improvisation

    Charles Martin, Kai Olav Ellefsen, and Jim Torresen. Proceedings of Audio Mostly 2017. Early work applying deep sequence models to musical ensemble interaction; part of the thread leading into RoboJam and IMPS.

    DOI →
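
All of the systems above share one interaction pattern over OSC: a controller streams input values to the predictive model, and the model streams predicted values back. As a rough illustration of that loop, here is a minimal Python sketch using the python-osc library. The ports (5000/5001) and the /interface and /prediction addresses are illustrative assumptions, not guaranteed IMPSY defaults; consult your own IMPSY configuration for the real values.

    # A minimal sketch of one side of an IMPS/IMPSY-style OSC loop.
    # Assumptions: the ports and the /interface and /prediction
    # addresses below are placeholders chosen for illustration.
    from pythonosc.udp_client import SimpleUDPClient
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    # Send one frame of controller input (values normalised to [0, 1]).
    client = SimpleUDPClient("127.0.0.1", 5000)  # assumed model input port
    client.send_message("/interface", [0.42])    # assumed input address

    # Handle predicted values coming back from the model.
    def on_prediction(address, *values):
        print(f"{address}: {values}")  # e.g. map values to synth parameters

    dispatcher = Dispatcher()
    dispatcher.map("/prediction", on_prediction)  # assumed output address
    server = BlockingOSCUDPServer(("127.0.0.1", 5001), dispatcher)
    server.serve_forever()  # blocks; run in a thread in a real instrument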

Work by students and collaborators

These papers extend IMPSY or the MDRNN approach into new instruments, interfaces, and performance contexts.

  • 2026
    A Web Interface for Real-Time Interaction with Machine Learning in Musical Performance

    Hongdi Zhu and Charles Martin. NIME 2026 (to appear). Describes the web interface for IMPSY — configuration, data capture, and model management without specialist tooling.

  • 2025
    Touching Wires: Tactility and a Quilted Musical Interface for Human–AI Musical Co-Creation

    Sandy Ma and Charles Martin. NIME 2025. A soft, quilted controller for co-creative musical interaction with an IMPSY-based predictive model.

  • 2025
    AI See, You See: Human–AI Musical Collaboration in Augmented Reality

    Yichen Wang and Charles Martin. CHI EA '25. Uses IMPSY as the predictive model behind a human–AI musical collaboration in head-mounted AR.

    DOI →
  • 2024
    Off-the-shelf: Improvising with a Minimal Intelligent Musical Instrument in Mixed Reality

    Yichen Wang and Charles Martin. AI Music Creativity (AIMC) 2024. An improvisation study using IMPSY as a minimal predictive partner in a mixed-reality musical setting.

    Paper → · Zenodo →

Project timeline

  • 2017–18 Began the musical MDRNN work with the RoboJam project: collaborative touchscreen performance driven by a mixture density network.
  • 2019 Released IMPS, generalising the MDRNN approach to arbitrary musical interaction over OSC.
  • 2020 Studied IMPS in performance with the EMPI embodied controller; published findings on predictive interaction in Frontiers in AI.
  • 2024 Rebuilt IMPS as IMPSY with broader I/O, easier configuration, and a focus on Raspberry Pi deployment for new intelligent instruments.
  • 2024 Organised the Building NIMEs with Embedded AI workshop at NIME 2024.
  • 2025 SMCC Lab students published the first IMPSY-based papers: Wang (AI See, You See, CHI EA '25) and Ma (Touching Wires, NIME 2025).
  • 2026 Published the IMPSY design-space paper and web-interface paper at NIME 2026; established connections with the Mishmash Centre for AI and Creativity.

Music featuring IMPSY

This section collects recorded music made with IMPSY-based instruments.

Wider context

The IMPSY project is developed at the Sound, Music, & Creative Computing Lab by Charles Martin and collaborators. For ongoing news, talks, and related work, the lab's site is the best place to look. There is intentionally no blog or news feed here.