Researching hand gestures in real-world social interactions requires careful, fine-grained analysis. While gesture coding schemes were created with that purpose in mind, they are not widely utilised in research. Moreover, studies on gesture classification rarely focus on the physical nature of the movements involved in gesturing, even though being able to quantify the motion could reveal useful patterns and correlations. To address these points, this research proposes the following approach: using machine learning models to automatically classify the physical features of hand gestures according to a coding scheme. Two such classifiers were created, one for the left hand and one for the right. Overall, the results are promising: despite a small, imbalanced training set and complex features, both models achieved an accuracy of roughly 60%. The results also indicate that avoiding some of the simplifications made in this research, and using more balanced training data, could significantly increase the accuracy. This is concrete evidence that machine learning models can indeed be used to classify the physical aspects of hand gestures, as defined by a coding scheme, in social interactions in the wild.
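To make the two-classifier setup concrete, the sketch below illustrates one possible realisation of it: a separate model per hand, trained on tabular motion features and labels drawn from a coding scheme. It is not the pipeline used in this research; the feature dimensions, label categories, and the choice of scikit-learn's RandomForestClassifier (with class weighting as a simple way to partly compensate for label imbalance) are all illustrative assumptions.

```python
# Minimal sketch, not the actual pipeline of this research: one classifier
# per hand, trained on hypothetical gesture motion features with labels
# from a (hypothetical) 4-category coding scheme.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score


def train_hand_classifier(features, labels, seed=0):
    """Fit a classifier for one hand and report held-out accuracy.

    class_weight='balanced' partly compensates for an imbalanced
    label distribution, as discussed in the abstract.
    """
    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.2, stratify=labels, random_state=seed
    )
    clf = RandomForestClassifier(
        n_estimators=200, class_weight="balanced", random_state=seed
    )
    clf.fit(X_train, y_train)
    return clf, accuracy_score(y_test, clf.predict(X_test))


# Hypothetical data: 300 gesture samples per hand, 20 motion features each.
rng = np.random.default_rng(0)
X_left, X_right = rng.normal(size=(300, 20)), rng.normal(size=(300, 20))
y_left, y_right = rng.integers(0, 4, 300), rng.integers(0, 4, 300)

left_clf, left_acc = train_hand_classifier(X_left, y_left)
right_clf, right_acc = train_hand_classifier(X_right, y_right)
print(f"left-hand accuracy: {left_acc:.2f}, right-hand accuracy: {right_acc:.2f}")
```

Training the two hands separately, as in this sketch, keeps each model's feature space specific to one hand; any other classifier family could be substituted for the random forest without changing the overall structure.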