Herakoi - the sound of the Universe


Herakoi is a motion-sensing sonification experiment, created by Luca Di Mascolo and me.

It leverages a machine-learning model to track, in real time, the position of your hands in the scene observed by a webcam connected to your computer. The hand landmark coordinates returned by the model are re-projected onto the pixel coordinates of an image of your choice. The visual properties of the "touched" pixels (at the moment, color and saturation) are then converted into the sound properties of your favorite instrument, which you can select from any virtual MIDI keyboard.
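The color-to-sound step can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not herakoi's actual code: the function name `pixel_to_midi`, the note range, and the hue-to-pitch / saturation-to-velocity mapping are all hypothetical choices made here for clarity.

```python
import colorsys

def pixel_to_midi(r, g, b, note_lo=36, note_hi=96):
    """Map an RGB pixel to a (note, velocity) MIDI pair.

    Illustrative mapping only: hue picks the pitch within
    [note_lo, note_hi], and saturation scales the key velocity.
    """
    # Normalize 8-bit channels to [0, 1] and convert to HSV.
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    note = round(note_lo + h * (note_hi - note_lo))
    velocity = round(s * 127)  # MIDI velocity is 0-127
    return note, velocity

# A fully saturated red pixel (hue 0) lands at the bottom of the
# note range with maximum velocity.
print(pixel_to_midi(255, 0, 0))  # → (36, 127)
```

In a sketch like this, the resulting (note, velocity) pair would be sent as a note-on message to whatever MIDI instrument you have selected.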

Just `pip install herakoi` and have fun!

Links to the documentation and the GitHub page:

https://herakoi.readthedocs.io/

https://github.com/herakoi