Embodied Predictive Musical Instrument (EMPI)
01 Apr '20
The EMPI is a minimal electronic musical instrument for experimenting with predictive interaction techniques. It has a single physical input (a lever), a matching physical output, a built-in speaker, and a Raspberry Pi that handles sound synthesis and machine learning computations.

The instrument is designed to be as simple as possible while still enabling meaningful exploration of how predictive ML models can interact with human performers. The matched input and output design allows direct comparison between what a performer plays and what the instrument predicts.
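The matched input/output loop above can be sketched in a few lines. This is a hypothetical illustration, not the EMPI's actual code: the class names are invented, and a simple exponential-smoothing stub stands in for the instrument's trained ML model so the example stays self-contained.

```python
# Sketch of a predictive interaction loop in the style of the EMPI.
# LeverPredictor is a hypothetical stand-in for the instrument's ML model.

class LeverPredictor:
    """Stub predictor: exponential smoothing over recent lever positions.

    The real instrument uses a trained predictive model; this stub only
    illustrates the matched input/output idea.
    """

    def __init__(self, alpha=0.5):
        self.alpha = alpha
        self.estimate = 0.5  # lever position, normalised to [0, 1]

    def update(self, position):
        # Blend the newest lever reading into the running estimate.
        self.estimate = self.alpha * position + (1 - self.alpha) * self.estimate
        return self.estimate


def interaction_step(predictor, lever_in):
    """One loop iteration: read the input lever, predict the next position.

    On hardware, the predicted value would drive the matching output lever
    and the synthesiser, letting the performer compare their own gesture
    with the instrument's prediction; here we simply return it.
    """
    return predictor.update(lever_in)


if __name__ == "__main__":
    predictor = LeverPredictor()
    for reading in [0.0, 0.2, 0.8, 1.0]:
        print(round(interaction_step(predictor, reading), 4))
```

The key design point this illustrates is that the prediction lives in the same one-dimensional space as the performer's gesture, which is what makes direct comparison between played and predicted movement possible.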
The EMPI was described in a paper in Frontiers in Artificial Intelligence:
@article{Martin2020,
  author  = {Martin, Charles Patrick and Torresen, Jim},
  title   = {Understanding Musical Predictions with an Embodied Interface for Musical Machine Learning},
  journal = {Frontiers in Artificial Intelligence},
  year    = {2020},
  doi     = {10.3389/frai.2020.00006},
  url     = {https://doi.org/10.3389/frai.2020.00006}
}
Edit: I later spoke about the EMPI at a NIME 2022 workshop; the talk slides are here.