This project lets users paint a picture and compose a song through body movement in real time, driven by multiple machine learning models working together.
A camera captures the movement of the user's hand, which is used as stroke data for the SPADE-COCO model. A projector displays the model's real-time output window, a colored landscape view, so the user can generate pictures as they move. The same camera also feeds PoseNet data into Tone.js to generate music in real time. Once the user defines a start and end point, the program sends the completed piece to an RNN model, which blends it with a chosen reference piece to make the result more musical. The final output of the whole system is a song paired with an album cover. The video demonstrates the full project: the user paints without a mouse, and the stroke toggles off when the left hand gets close to the right hand.
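The hand-proximity toggle and the pose-to-sound mapping described above could be sketched roughly as follows. This is a minimal, hypothetical illustration, not the project's actual code: it assumes PoseNet-style keypoints of the form `{ part, position: { x, y }, score }`, and the threshold value and helper names (`strokeEnabled`, `yToMidiNote`) are made up for the example.

```javascript
// Hypothetical sketch of the gesture and sound-mapping logic.
// Assumes PoseNet-style keypoints: { part, position: { x, y }, score }.
// The 50px threshold is an illustrative guess, not a value from the project.
const STROKE_TOGGLE_DIST = 50;

function distance(a, b) {
  return Math.hypot(
    a.position.x - b.position.x,
    a.position.y - b.position.y
  );
}

// The stroke draws only while the wrists are far apart; bringing the
// left hand close to the right hand turns drawing off.
function strokeEnabled(keypoints) {
  const left = keypoints.find(k => k.part === 'leftWrist');
  const right = keypoints.find(k => k.part === 'rightWrist');
  if (!left || !right) return false;
  return distance(left, right) > STROKE_TOGGLE_DIST;
}

// One plausible way to map a keypoint's vertical position to a MIDI
// note for Tone.js: higher on screen (smaller y) means higher pitch.
function yToMidiNote(y, canvasHeight, low = 48, high = 84) {
  const t = 1 - Math.min(Math.max(y / canvasHeight, 0), 1);
  return Math.round(low + t * (high - low));
}
```

In a real Tone.js setup, the returned MIDI number would be converted to a frequency and scheduled on a synth each animation frame; the sketch above only shows the mapping itself.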
Introduction to Machine Learning for the Arts (UG)