Week 5 - Gesture Recognition with TinyML

For this week's lab, we collected accelerometer data from the Xiao nRF52840 Sense and trained a machine learning (ML) model to detect two gestures: punch and flex.

Flowchart

Before starting the lab, we created a flowchart to visualize the elements of a simple activity tracking system. The flowchart below can also be viewed here.

Activity tracker flowchart

Data collection

We first flashed a sketch to the board that reads the accelerometer and prints the XYZ values to the serial monitor. We then used CoolTerm to capture the serial output and write it to CSV files (click here to download the datasets).

Accessing accelerometer data

Recording data using CoolTerm
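CoolTerm handled the capture for us, but for reference, a short Python script using the pyserial library could do the same job. This is a hypothetical sketch, not what we ran in the lab; the port name, baud rate, and output filename are all assumptions.

```python
# Hypothetical alternative to CoolTerm: log serial output to a CSV file.
# Assumes the sketch prints one "x,y,z" reading per line; the port name,
# baud rate, and filename below are placeholders, not our actual setup.
import serial  # pyserial

PORT = "/dev/cu.usbmodem1101"  # assumption; e.g. "COM3" on Windows
BAUD = 9600                    # assumption; must match Serial.begin()

with serial.Serial(PORT, BAUD, timeout=1) as ser, open("punch.csv", "w") as f:
    f.write("aX,aY,aZ\n")  # CSV header
    while True:  # stop with Ctrl-C once enough gestures are recorded
        line = ser.readline().decode("utf-8", errors="ignore").strip()
        if line:  # readline() returns b"" on timeout
            f.write(line + "\n")
```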

Capturing punch gesture data:

Capturing flex gesture data:

ML Model Training

Next, we trained a simple ML model with TensorFlow that outputs a prediction score for each gesture. We achieved low mean absolute error (MAE) and loss scores by training for 40 epochs and using the other hyperparameters shown below. You can view the full Google Colab notebook here.

Hyperparameters used

Mean absolute error (left) and loss scores (right) during model training and validation
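For reference, a minimal sketch of this kind of training pipeline is below. The architecture (a small dense network like the one in the classic Arduino TinyML gesture tutorial), the optimizer, the loss, and the window length are assumptions; only the 40 epochs come from our run, and the random arrays stand in for the parsed CSV data.

```python
# A minimal training sketch. The architecture, optimizer, loss, and window
# length are assumptions based on the classic TinyML gesture tutorial; only
# the 40 epochs match our run. Random arrays stand in for the parsed CSVs.
import numpy as np
import tensorflow as tf

NUM_GESTURES = 2           # punch, flex
SAMPLES_PER_GESTURE = 119  # assumption: accelerometer readings per gesture

# placeholder data: 100 flattened x,y,z windows and one-hot gesture labels
inputs = np.random.rand(100, SAMPLES_PER_GESTURE * 3).astype("float32")
labels = np.random.randint(0, NUM_GESTURES, 100)
outputs = tf.keras.utils.to_categorical(labels, num_classes=NUM_GESTURES)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(SAMPLES_PER_GESTURE * 3,)),
    tf.keras.layers.Dense(50, activation="relu"),
    tf.keras.layers.Dense(15, activation="relu"),
    tf.keras.layers.Dense(NUM_GESTURES, activation="softmax"),
])
model.compile(optimizer="rmsprop", loss="mse", metrics=["mae"])
model.fit(inputs, outputs, epochs=40, batch_size=1, validation_split=0.2)
```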

To run the trained model on the microcontroller, we converted it to TensorFlow Lite and exported the converted model as a byte array saved in a file named model.h. We then replaced the model.h file in the IMU_Classifier example sketch from the Seeed_Arduino_LSM6DS3 library and flashed the microcontroller again.

Exported TensorFlow Lite model as a byte array
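Continuing from the training sketch above, the conversion step could look like the following. tf.lite.TFLiteConverter is the standard API for this; the header-writing loop is one common way to produce the byte array (the xxd -i command-line tool is another), and the array name is an assumption that would need to match what the IMU_Classifier sketch expects.

```python
# Convert the trained Keras model to a TensorFlow Lite flatbuffer and dump
# it as a C byte array ("model" is the trained model from the sketch above).
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# xxd-style dump; the array name must match what the sketch expects
with open("model.h", "w") as f:
    f.write("const unsigned char model[] = {\n")
    f.write(", ".join(f"0x{b:02x}" for b in tflite_model))
    f.write("\n};\n")
    f.write(f"const unsigned int model_len = {len(tflite_model)};\n")
```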

Model Testing

Our model was able to distinguish between the two gestures fairly well, although it seems sensitive to the rotation of the hand: if the hand holding the microcontroller is rotated slightly during a flex gesture, the model tends to predict a punch. Future work could involve fine-tuning the model so it distinguishes the gestures more reliably under these variations.

Testing the model's ability to predict the gestures:

