In this workshop we will teach an Arduino board to recognize gestures! We will capture motion data from the Arduino Nano 33 BLE board¹, import it into TensorFlow to train a model, and deploy the resulting classifier onto the board using TensorFlow Lite for Microcontrollers.
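As a preview of the capture step, the sketch below reads raw accelerometer samples and streams them over serial. It assumes the on-board LSM9DS1 IMU is accessed through the Arduino_LSM9DS1 library (installable from the Arduino Library Manager); the exercises later in the workshop walk through this in detail.

```cpp
#include <Arduino_LSM9DS1.h>  // IMU library for the Nano 33 BLE / BLE Sense

void setup() {
  Serial.begin(9600);
  while (!Serial);            // wait for the serial monitor to connect

  if (!IMU.begin()) {
    Serial.println("Failed to initialize IMU!");
    while (1);                // halt if the IMU is not available
  }
}

void loop() {
  float x, y, z;

  // Print one accelerometer sample per line as comma-separated values,
  // a format that is easy to log and import for training
  if (IMU.accelerationAvailable()) {
    IMU.readAcceleration(x, y, z);
    Serial.print(x);
    Serial.print(',');
    Serial.print(y);
    Serial.print(',');
    Serial.println(z);
  }
}
```

Logging output in a plain CSV-like format like this makes it straightforward to collect labeled gesture recordings for training.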
The hardware for this workshop has been provided by Arduino.
- Introduction
- Exercise 1: Development Environment
- Exercise 2: Source Code
- Exercise 3: Hardware
- Exercise 4: Visualizing the IMU Data
- Exercise 5: Gather the Training Data
- Exercise 6: Machine Learning
- Exercise 7: Classifying IMU Data
- Exercise 8: Emojis
- Exercise 9: Gesture Controlled USB Emoji Keyboard
- Exercise 10: Next Steps
This workshop material was developed by Sandeep Mistry and Don Coleman.
Previous versions
- https://github.com/sandeepmistry/aimldevfest-workshop-2019
- https://github.com/arduino/AIoT-Dev-Summit-2019
¹: You can also use the Arduino Nano 33 BLE Sense for this workshop.