Low-Power Gesture Recognition Using Convolutional Neural Networks and Ambient Lighting

Bachelor Thesis (2023)
Author(s)

A.R. de Beer (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

Q. Wang – Mentor (TU Delft - Embedded Systems)

M. Yang – Mentor (TU Delft - Embedded Systems)

R. Zhu – Mentor (TU Delft - Embedded Systems)

R. Venkatesha Prasad – Graduation committee member (TU Delft - Networked Systems)

Faculty
Electrical Engineering, Mathematics and Computer Science
Copyright
© 2023 Arne de Beer
Publication Year
2023
Language
English
Graduation Date
03-07-2023
Awarding Institution
Delft University of Technology
Project
CSE3000 Research Project
Programme
Computer Science and Engineering
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

This paper presents a study on developing an efficient signal processing pipeline and identifying suitable machine learning models for real-time gesture recognition, using a testbed consisting of an Arduino Nano 33 BLE and three OPT101 photodiodes. Our research addresses the challenge of performing inference with limited computational power whilst maintaining high accuracy.

Experiments were conducted to optimise the signal processing and to explore various machine learning model architectures, centred on convolutional neural networks. The data for these experiments was gathered by recording a dataset of gestures from left- and right-handed participants. Ethical considerations regarding participant recruitment and data security were taken into account, and the dataset was balanced between left- and right-handed participants as far as possible.
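
The abstract does not detail how the raw photodiode signals are turned into the 2D inputs mentioned below. The following is a minimal sketch in Python of one plausible pre-processing step, assuming a fixed-length window of samples from the three OPT101 channels and per-channel standardisation; the window length, sampling rate, and normalisation scheme are illustrative assumptions rather than the parameters used in the thesis.

import numpy as np

# Illustrative assumptions (not taken from the thesis):
SAMPLE_RATE_HZ = 100      # photodiode sampling rate
WINDOW_SECONDS = 1.0      # duration of one gesture window
NUM_CHANNELS = 3          # three OPT101 photodiodes
WINDOW_SAMPLES = int(SAMPLE_RATE_HZ * WINDOW_SECONDS)

def window_to_2d_input(raw_window: np.ndarray) -> np.ndarray:
    """Turn a (NUM_CHANNELS, WINDOW_SAMPLES) block of raw ADC readings
    into a standardised 2D array suitable as CNN input."""
    assert raw_window.shape == (NUM_CHANNELS, WINDOW_SAMPLES)
    # Standardise each channel independently to remove the ambient-light offset.
    mean = raw_window.mean(axis=1, keepdims=True)
    std = raw_window.std(axis=1, keepdims=True) + 1e-8
    standardised = (raw_window - mean) / std
    # Add a trailing channel axis: (height, width, 1), as expected by a 2D CNN.
    return standardised[..., np.newaxis].astype(np.float32)

# Example usage with synthetic data:
fake_window = np.random.randint(0, 1024, size=(NUM_CHANNELS, WINDOW_SAMPLES))
model_input = window_to_2d_input(fake_window)
print(model_input.shape)  # (3, 100, 1)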

We obtained accurate gesture recognition results, surpassing the goal of a 75% success rate. Our machine learning models, trained on pre-processed 2D data, achieved near real-time inference times while running on the resource-constrained Arduino Nano 33 BLE.
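
The exact network architecture and deployment framework are not specified in the abstract. As an illustration only, the Keras sketch below defines a small 2D convolutional network over a (channels x time) input like the one produced by the pre-processing sketch above, and converts it to a TensorFlow Lite flat buffer, a common route for running models on the Arduino Nano 33 BLE; the layer sizes, gesture count, and use of TensorFlow Lite are assumptions, not the configuration reported in the thesis.

import tensorflow as tf

NUM_CHANNELS = 3       # three OPT101 photodiodes (input height)
WINDOW_SAMPLES = 100   # samples per gesture window (input width, assumed)
NUM_GESTURES = 4       # number of gesture classes (assumed)

# Small 2D CNN: convolve mainly along the time axis and keep the parameter count low.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_CHANNELS, WINDOW_SAMPLES, 1)),
    tf.keras.layers.Conv2D(8, (3, 5), padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(pool_size=(1, 4)),
    tf.keras.layers.Conv2D(16, (3, 3), padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(pool_size=(1, 4)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(NUM_GESTURES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# After training, convert to a TensorFlow Lite flat buffer that can be embedded
# as a C array and executed with TensorFlow Lite for Microcontrollers.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
with open("gesture_model.tflite", "wb") as f:
    f.write(tflite_model)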

The findings of this study contribute to the field of gesture recognition by providing insights into efficient signal processing techniques and identifying suitable machine learning models for resource-constrained devices. The developed system can be applied in domains ranging from games to healthcare. Furthermore, we contribute a dataset that can be used for further research.
