Intelligent Musical Instrument Platform

Instruments that listen, predict, and play along.

IMPSY is a research toolkit for building intelligent musical instruments using mixture density recurrent neural networks. Train it on your own performance data, run it on a Raspberry Pi, and connect it to almost anything that speaks OSC, MIDI, serial, or the web.

An IMPSY-driven Korg S-1 synthesizer in performance, with a Raspberry Pi controller running the predictive model.

What IMPSY does

IMPSY captures a stream of musical gestures from a controller, sensor, or touchscreen, learns the temporal shape of a performer's choices, and generates plausible continuations in real time. It can run embedded inside an instrument, or as a separate module connected to one.

01 / SENSE

Capture gestures

Stream multi-dimensional control data into IMPSY over OSC, MIDI, serial, or WebSockets, whichever transport your instrument speaks.
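As a concrete sketch of the SENSE step, the snippet below encodes one gesture frame as a raw OSC message (null-padded address, type tags, big-endian float32 arguments) and sends it over UDP. The address `/interface` and port 5000 are assumptions for illustration, not documented IMPSY defaults, so check your own configuration; in practice a library such as python-osc does this encoding for you.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate an OSC string and pad it to a 4-byte boundary."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, floats) -> bytes:
    """Encode a minimal OSC message carrying float32 arguments."""
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))
    for x in floats:
        msg += struct.pack(">f", x)  # OSC floats are big-endian
    return msg

# One frame of a 3-dimensional gesture, e.g. two knobs and a fader,
# normalised to [0, 1] before sending.
frame = [0.50, 0.63, 0.12]
packet = osc_message("/interface", frame)  # assumed OSC address

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 5000))  # assumed host and port
```

Sending one message per control-rate tick keeps each frame's dimensions aligned, which matters later because the model learns the joint distribution across all dimensions.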

02 / LEARN

Train an MDRNN

A mixture density recurrent neural network (MDRNN) learns the joint distribution of your gestures over time; training runs on a laptop or directly on a Pi.
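To make the mixture-density idea concrete, here is a minimal NumPy sketch of how a network's raw output vector can parameterize a diagonal-covariance Gaussian mixture over the next gesture frame, and how the training loss (negative log-likelihood) is computed. This is an illustration of the general technique, not IMPSY's actual implementation; the function names and parameter layout are assumptions.

```python
import numpy as np

def split_mdn_params(raw, n_mixes, dim):
    """Split a raw output vector into mixture weights, means, and scales.

    For n_mixes Gaussian components over a dim-dimensional gesture, the
    network emits n_mixes + 2 * n_mixes * dim values per time step."""
    logits = raw[:n_mixes]
    mu = raw[n_mixes:n_mixes + n_mixes * dim].reshape(n_mixes, dim)
    # exp keeps the scales strictly positive
    sigma = np.exp(raw[n_mixes + n_mixes * dim:].reshape(n_mixes, dim))
    pi = np.exp(logits - logits.max())
    pi /= pi.sum()  # softmax -> mixture weights
    return pi, mu, sigma

def mdn_nll(raw, target, n_mixes, dim):
    """Negative log-likelihood of one target frame under the mixture."""
    pi, mu, sigma = split_mdn_params(raw, n_mixes, dim)
    # log N(target | mu_k, diag(sigma_k^2)) for each component k
    log_norm = -0.5 * np.sum(
        np.log(2 * np.pi * sigma**2) + ((target - mu) / sigma) ** 2, axis=1
    )
    # numerically stable log-sum-exp over the weighted components
    m = log_norm.max()
    return -(m + np.log(np.sum(pi * np.exp(log_norm - m))))

# A single component centred at 0 with unit scale, scored against target 0:
loss = mdn_nll(np.zeros(3), np.array([0.0]), n_mixes=1, dim=1)  # ~0.919
```

In the real model an RNN produces a `raw` vector like this at every time step, so the mixture it predicts is conditioned on the whole gesture history.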

03 / PREDICT

Play together

IMPSY generates continuations or accompaniments at performance latency, routed back into your instrument's sound engine.
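Prediction then amounts to drawing the next gesture frame from the mixture the network outputs. The sketch below uses illustrative parameters rather than output from a trained model, and shows the common temperature trick for MDN sampling: lower temperatures sharpen the mixture weights and narrow the Gaussians, trading variety for predictability in performance.

```python
import numpy as np

def sample_gesture(pi, mu, sigma, temperature=1.0, rng=None):
    """Draw one gesture frame from a diagonal Gaussian mixture.

    temperature < 1.0 makes sampling more conservative; 1.0 samples the
    distribution as predicted."""
    rng = np.random.default_rng() if rng is None else rng
    w = pi ** (1.0 / temperature)
    w /= w.sum()  # re-normalise the sharpened weights
    k = rng.choice(len(w), p=w)  # pick a mixture component
    return rng.normal(mu[k], sigma[k] * np.sqrt(temperature))

# Two components over a 2-dimensional gesture (illustrative values only):
pi = np.array([0.7, 0.3])
mu = np.array([[0.2, 0.8], [0.6, 0.1]])
sigma = np.array([[0.05, 0.05], [0.05, 0.05]])
frame = sample_gesture(pi, mu, sigma, temperature=0.5)
```

Feeding each sampled frame back in as the next input lets the model free-run as an accompanist; routing the frames to a sound engine closes the loop with the performer.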

Block diagram of the IMPSY system: input adapters, mixture density RNN, output routing.

A small system, designed to be opened

IMPSY is a Python package with a small footprint and clear configuration. Inputs and outputs are decoupled from the model, so the same trained network can drive a synthesizer, an iPad app, or a hardware controller without re-training. A web interface exposes recording, training, and inference.

The companion IMPSYpi distribution packages the toolkit for Raspberry Pi Zero 2 W, 3, 4, and 5 so that an intelligent instrument can run untethered on stage. The IMPSYpi workshop walks through the whole process end to end.

Instruments built with IMPSY

A short tour of instruments we've built with IMPSY. Most of these images come from the 2026 design-space paper.

Read the research

IMPSY grew out of a sequence of research papers on mixture density networks for musical interaction, embodied prediction, and the design space of intelligent instruments.

Publications →

Get involved

IMPSY is open source and developed in the open. The core repository, the Pi distribution, and this site all live on GitHub — issues and pull requests are welcome.

Star on GitHub →