Using Gesture Detection as a User Interface for Customized Earphones

Dopple's current wireless earphones use touch buttons as a user interface,
but these occupy considerable physical space. Since earphones are becoming
smaller over the years, there is a need for a new, smaller, touchless user
interface. To create that new user interface, research is done on remote
gesture sensing with sensors small enough to fit onto wireless earphones.
When the system recognizes a gesture, a corresponding action can be taken,
for example pausing the music.
Small infra-red imaging sensors are chosen as a solution to the problem.
Their images are analysed by a trained image-recognition neural network
created with Python and Keras; the network takes an image as input and
outputs a gesture. Each gesture is to be linked to an action in the new user
interface. This report focuses on the retrieval of low-resolution infra-red
images and neural-network training/machine learning. The APDS-9500, an
existing moving-gesture sensor, is used as a baseline for comparison with the
new neural-network technique; it achieves an accuracy of 92.3% with
5 different gestures. The AMG Grid Eye is an 8 by 8 pixel infra-red camera
for which 5 gestures are trained. On raw images, the 5 gestures are recognized
with an accuracy of 79.2%. With pre-processing in the form of contrast
increasing and linear extrapolation, the accuracy is increased up to
92.4%. The FLUKE 279 FC is a high-resolution camera mounted on a mul-
timeter; its images are downscaled to 30 by 30 pixels. It is found that
accuracy increases with model size, up to 97.2% for 5 gestures. When the
FLUKE is tested with 9 different gestures, while also optimizing for size,
an accuracy of 98.2% is achieved with a model of 203 kB. The study shows
that 30 by 30 infra-red images contain enough information for gesture
recognition with a small neural network.
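The pre-processing described above (increasing the contrast of the 8 by 8 Grid Eye frames and linearly scaling them up) could be sketched as follows. This is a minimal illustration, not the report's exact implementation: the function names, the use of bilinear interpolation, and the 30 by 30 target size are assumptions made here for clarity.

```python
import numpy as np

def stretch_contrast(frame):
    """Linearly stretch pixel values to the full [0, 1] range."""
    lo, hi = frame.min(), frame.max()
    if hi == lo:  # flat frame: nothing to stretch
        return np.zeros_like(frame, dtype=float)
    return (frame - lo) / (hi - lo)

def upscale_linear(frame, size=30):
    """Upscale a small square frame to size x size by linear interpolation,
    first along rows, then along columns."""
    n = frame.shape[0]
    old = np.linspace(0.0, 1.0, n)
    new = np.linspace(0.0, 1.0, size)
    rows = np.array([np.interp(new, old, frame[i]) for i in range(n)])
    return np.array([np.interp(new, old, rows[:, j]) for j in range(size)]).T

# Stand-in for one raw 8x8 thermal reading (not real sensor data).
raw = np.random.rand(8, 8).astype(float)
img = upscale_linear(stretch_contrast(raw))  # 30x30 input for the classifier
```

A 30 by 30 frame produced this way could then be fed to a small Keras classifier with one output per gesture, in line with the network described in the report.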