Window-Based Gesture Recognition on a uController
As part of my final project, I intend to build a gesture recognizer on a glove instrumented with a 3D accelerometer and flex sensors. As an added challenge, I intend to run the gesture recognition software on the microcontroller itself, instead of offloading the processing to a computer. But why do the recognition on the glove? By doing away with the computer, I have one less wireless link to deal with, as well as fewer components overall.
The gesture recognizer is being built as a window/frame-based system: samples are collected every VEC_FRAME_WIDTH milliseconds and stuffed into a sliding window. The gesture recognition engine then uses whatever data is in the window to determine the gesture being made. A smoothing algorithm is also applied to the data prior to recognition.
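To make the idea concrete, here is a minimal sketch of what such a sliding window could look like in C. Only VEC_FRAME_WIDTH (the sampling period) comes from the description above; the window length, channel count, and struct layout are my assumptions:

```c
#include <stdint.h>
#include <string.h>

#define WINDOW_LEN   32  /* assumed window length, in frames */
#define NUM_CHANNELS 5   /* assumed: 4 flex sensors + accel magnitude */

typedef struct {
    int16_t samples[WINDOW_LEN][NUM_CHANNELS];
    uint8_t head;   /* index of the oldest frame */
    uint8_t count;  /* valid frames, saturates at WINDOW_LEN */
} window_t;

/* Push one frame of sensor readings; once the window is full, the
   oldest frame is overwritten. Called every VEC_FRAME_WIDTH ms. */
void window_push(window_t *w, const int16_t frame[NUM_CHANNELS]) {
    uint8_t slot = (uint8_t)((w->head + w->count) % WINDOW_LEN);
    if (w->count == WINDOW_LEN) {
        /* full: overwrite the oldest frame and advance head */
        slot = w->head;
        w->head = (uint8_t)((w->head + 1) % WINDOW_LEN);
    } else {
        w->count++;
    }
    memcpy(w->samples[slot], frame, sizeof(w->samples[slot]));
}
```

A ring buffer like this avoids shifting the whole window on every sample, which matters on an 8-bit MCU.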
To have everything fit on the microcontroller (and make life a little bit easier), I opted to have specific states in which data for gesture recognition is captured. This eases the computational burden on the microcontroller, which no longer needs to continuously analyze the window to determine the gesture being made.
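One plausible shape for this state-gated capture is a small state machine stepped from the sampling loop. The state names, the trigger (a button press here, but a flex-sensor threshold would also work), and the frames-per-gesture count are all my assumptions:

```c
#include <stdint.h>

typedef enum { IDLE, CAPTURING, RECOGNIZE } capture_state_t;

#define FRAMES_PER_GESTURE 32  /* assumed: one full window per gesture */

static capture_state_t state = IDLE;
static uint8_t frames_captured = 0;

/* Called once every VEC_FRAME_WIDTH ms from the sampling loop.
   Recognition work only happens in the RECOGNIZE state, so the MCU
   idles cheaply the rest of the time. */
capture_state_t capture_step(uint8_t trigger_pressed) {
    switch (state) {
    case IDLE:
        if (trigger_pressed) { state = CAPTURING; frames_captured = 0; }
        break;
    case CAPTURING:
        if (++frames_captured >= FRAMES_PER_GESTURE) state = RECOGNIZE;
        break;
    case RECOGNIZE:
        /* run smoothing + peak detection here, then rearm */
        state = IDLE;
        break;
    }
    return state;
}
```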
While the gesture recognition engine is not complete as of the time of writing, here are the items that have been completed so far:
- Interfacing of flex sensors + accelerometer with ATMega328 (Using a previously built data capture board)
- Implementation of sliding window
- Implementation of gesture capture
- Implementation of smoothing function
- Implementation of peak detection using vectors
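For the last two completed items, here is one way the smoothing and peak detection stages could look. The post does not name its smoothing algorithm or peak criterion, so the moving average, the neighbour comparison, and the noise threshold below are all assumptions:

```c
#include <stdint.h>

#define SMOOTH_N       4   /* assumed moving-average width */
#define PEAK_THRESHOLD 50  /* assumed noise floor, tune per sensor */

/* Moving average of the most recent SMOOTH_N entries of one channel
   (or fewer, if the history is still short). */
int16_t smooth_sample(const int16_t *history, uint8_t len) {
    uint8_t n = (len < SMOOTH_N) ? len : SMOOTH_N;
    int32_t acc = 0;
    for (uint8_t i = 0; i < n; i++)
        acc += history[len - 1 - i];
    return (int16_t)(acc / n);
}

/* A sample is a peak if it clears the threshold and is strictly
   greater than both neighbours. Writes peak indices into out[] and
   returns how many were found. */
uint8_t find_peaks(const int16_t *data, uint8_t len,
                   uint8_t *out, uint8_t max_out) {
    uint8_t n = 0;
    for (uint8_t i = 1; i + 1 < len && n < max_out; i++) {
        if (data[i] > PEAK_THRESHOLD &&
            data[i] > data[i - 1] && data[i] > data[i + 1]) {
            out[n++] = i;
        }
    }
    return n;
}
```

Smoothing first matters: without it, ADC jitter produces spurious one-sample "peaks" that the neighbour test would otherwise accept.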
Items left to do:
- Peaks -> gesture conversion
Again, to make life simpler, I have opted for a small library of 10–20 gestures, using only 4 fingers. This gives a sufficient range of gestures for controlling a robot. Once a gesture is recognized, it will be converted into a gestureID and sent wirelessly to the robot, which will then execute the programmed action for that gesture.
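The gestureID link could be as simple as a three-byte frame. The framing below (start byte, ID, XOR checksum) is purely my sketch of one way to do it; the post does not specify a wire format or radio module:

```c
#include <stdint.h>

#define PKT_START 0xA5  /* assumed frame marker */

typedef struct {
    uint8_t start;
    uint8_t gesture_id;
    uint8_t checksum;   /* XOR of start and id */
} gesture_pkt_t;

/* Glove side: wrap a recognized gestureID for transmission. */
gesture_pkt_t make_packet(uint8_t gesture_id) {
    gesture_pkt_t p;
    p.start = PKT_START;
    p.gesture_id = gesture_id;
    p.checksum = (uint8_t)(p.start ^ p.gesture_id);
    return p;
}

/* Robot side: returns 1 if the frame arrived intact. */
uint8_t packet_ok(const gesture_pkt_t *p) {
    return p->start == PKT_START &&
           p->checksum == (uint8_t)(p->start ^ p->gesture_id);
}
```

Even a one-byte checksum is worthwhile here, since a corrupted gestureID would otherwise make the robot execute the wrong action.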