Arduino ML Gesture Sensing
Date: June 20, 2023 7-9pm
Format: Hybrid (In-person with online access)
Tags: #ml • #edge • #arduino • #ble • #midi
This session will walk participants through using the Arduino Nano 33 BLE Sense board to build a BLE MIDI instrument, or a WIO Terminal (with a display) that responds to motion gestures recognized using machine learning. We'll cover acquiring training and testing data with our target devices, choosing training parameters, training on that data, and finally assembling all the pieces into a MIDI sketch for the Arduino.
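To give a flavor of the final step, here is a minimal plain-C++ sketch of how a recognized gesture label might be turned into a MIDI note-on message. The gesture names and note choices are illustrative assumptions, not the workshop's actual model labels; the byte layout (status `0x90` for note-on on channel 1, then 7-bit note and velocity) follows the MIDI 1.0 spec.

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Map a hypothetical gesture label to a 3-byte MIDI note-on message.
// 0x90 = note-on, channel 1; note and velocity are 7-bit values.
std::vector<uint8_t> gestureToNoteOn(const std::string& gesture) {
    uint8_t note = 60;                  // default: middle C
    if (gesture == "shake")  note = 64; // hypothetical gesture names
    else if (gesture == "circle") note = 67;
    const uint8_t velocity = 100;
    return {0x90, note, velocity};
}
```

In the actual Arduino sketch these bytes would be written to a BLE MIDI characteristic rather than returned; this standalone version just shows the message construction.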
This session uses the Edge Impulse platform to acquire data and train the model, and participants are encouraged to check out the Arduino Nano 33 BLE Sense (not the IoT) board for this workshop. Edge Impulse is free for developers to use, and once the model is built it can run on any supported device. Alternatively, remote participants can follow along on a phone through Edge Impulse's web-based data acquisition and training interface.
We'll go over training, how to use Edge Impulse, and integrating the generated ML model into your Arduino sketch. I'll also provide further tutorials on other ML approaches, such as TinyML with Colab, for training your devices.
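Once the model runs on the board, a typical sketch reads one score per gesture class from the classifier result, picks the highest-scoring class, and only acts on it above a confidence threshold. The following is a standalone mock of that decision step, assuming hypothetical labels and threshold; it is not the Edge Impulse SDK's own types.

```cpp
#include <cstddef>
#include <string>

// One score per gesture class, as reported by an on-device classifier.
struct Prediction {
    std::string label;
    float score;
};

// Return the index of the highest-scoring class, or -1 if no class
// reaches the confidence threshold (so noisy frames are ignored).
int bestClass(const Prediction* preds, std::size_t n, float threshold) {
    int best = -1;
    float bestScore = threshold;
    for (std::size_t i = 0; i < n; ++i) {
        if (preds[i].score >= bestScore) {
            bestScore = preds[i].score;
            best = static_cast<int>(i);
        }
    }
    return best;
}
```

In the workshop sketch, a non-negative result would trigger the corresponding MIDI message (or display update on the WIO Terminal), while -1 means "do nothing this frame."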
Recommended Preparation:
Check out an Arduino Nano 33 BLE Sense (not the IoT) or a Seeed Studio WIO Terminal from the ER for the day (either works; I personally like the WIO Terminal), and remember the cables required for your laptop. If you cannot obtain hardware, you can use an ordinary mobile phone for the first half of the session.
Download and install the Arduino IDE.
Create an Edge Impulse account (you can use your NYU Gmail) – we're using the free developer plan: sign up for Edge Impulse