Real-time Ball-touch Classification on an Insole Sensor using Neural Networks
Abstract
Monitoring and analyzing human movement has applications in many fields, ranging from healthcare and industrial settings to sports analytics. To provide a football player or their coach with insight into their performance during a game, or their technical development over time, many methods are available, such as camera setups and smart vests. To provide a more direct and biomechanical approach, however, this thesis utilizes sensors placed in the insoles of a player's shoes. These sensors are equipped with an Inertial Measurement Unit (IMU), providing direct insight into the accelerations and rotations of the player's feet. Additionally, external impacts on the feet can be detected, such as those from a jump, a tackle by an opponent, or a ball touch. To provide the player with information about distance covered, speed, and ball touches, among other possible metrics, the raw data from the sensor needs to be processed. Various methods can be used to compute these metrics, ranging from custom algorithms to Machine Learning approaches.
This thesis presents an approach using neural networks to detect ball touches in real-time on these insole sensors. It combines the challenge of classifying ball touches correctly with the additional challenge of doing so on a constrained embedded device, an Arm Cortex-M4 microcontroller. A development flow is set up using the TensorFlow framework: recorded data is used to develop and train a neural network model, after which the model is converted and optimized for deployment on the sensor.
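The train-convert-deploy flow described above can be sketched with TensorFlow's standard Lite converter. The model architecture, input window shape, and quantization settings below are illustrative assumptions, not the thesis's exact configuration:

```python
# Sketch of the TensorFlow -> TensorFlow Lite development flow, assuming a
# Keras model and the standard TFLiteConverter; the actual thesis pipeline
# may differ in architecture and optimization settings.
import tensorflow as tf

# Placeholder model standing in for the trained ball-touch classifier.
# Input shape (64 samples x 6 IMU channels) is an assumption.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 6)),
    tf.keras.layers.Conv1D(8, 5, activation="relu"),
    tf.keras.layers.GlobalMaxPooling1D(),
    tf.keras.layers.Dense(2, activation="softmax"),  # touch / no-touch
])

# Convert to a TensorFlow Lite flatbuffer with default optimizations,
# suitable for deployment via TensorFlow Lite for Microcontrollers.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# On the device side, the flatbuffer is typically embedded as a C array
# and executed by the TFLite Micro interpreter on the Cortex-M4.
with open("ball_touch_model.tflite", "wb") as f:
    f.write(tflite_model)
```

The default optimization flag enables post-training quantization, which shrinks the flatbuffer and speeds up inference on microcontrollers without a floating-point-heavy pipeline.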
Several neural network models and settings were benchmarked on both classification performance and resource usage. The final approach samples the data from the IMU at 500 Hz, detects ball-touch candidates, and creates a window of samples, which is then fed to a one-layer Convolutional Neural Network (CNN) model. This model achieves an accuracy of 95.8%, while the implementation on the sensor uses 64.9 KB of Flash and 5.0 KB of RAM, with an execution time of 8 ms per classification.
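The candidate-detection and windowing step can be sketched as follows. The abstract does not specify the trigger mechanism, so the acceleration-magnitude threshold and window length here are illustrative assumptions; only the 500 Hz sample rate comes from the text:

```python
# Minimal sketch of ball-touch candidate detection and windowing, assuming
# a simple acceleration-magnitude threshold as the trigger. Threshold and
# window length are hypothetical; the thesis may use a different scheme.
import numpy as np

FS = 500          # IMU sample rate in Hz, as stated in the abstract
WINDOW = 64       # samples per classification window (assumed)
THRESHOLD = 4.0   # acceleration magnitude in g that flags a candidate (assumed)

def detect_candidates(accel: np.ndarray) -> list:
    """Return windows of samples starting at threshold crossings.

    accel: (N, 3) array of accelerometer readings in g.
    """
    mag = np.linalg.norm(accel, axis=1)
    windows = []
    i = 0
    while i < len(mag):
        if mag[i] > THRESHOLD and i + WINDOW <= len(accel):
            windows.append(accel[i:i + WINDOW])  # window begins at the trigger
            i += WINDOW                          # skip past this event
        else:
            i += 1
    return windows

# Example: a quiet signal with one synthetic impact yields one candidate
# window, which would then be passed to the CNN for classification.
signal = np.zeros((FS, 3))
signal[100] = [5.0, 0.0, 0.0]  # impact-like spike at sample 100
print(len(detect_candidates(signal)))  # → 1
```

Gating the CNN behind a cheap threshold check like this keeps the 8 ms inference cost limited to candidate events rather than running on every 500 Hz sample, which matters on a constrained Cortex-M4.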