Learning to Feel Textures

Predicting Perceptual Similarities From Unconstrained Finger-Surface Interactions

Journal Article (2022)
Author(s)

Benjamin A. Richardson (Max Planck Institute for Intelligent Systems, University of Stuttgart)

Y. Vardar (TU Delft - Human-Robot Interaction)

Christian Wallraven (Korea University)

Katherine J. Kuchenbecker (Max Planck Institute for Intelligent Systems)

Research Group
Human-Robot Interaction
Copyright
© 2022 Benjamin A. Richardson, Y. Vardar, Christian Wallraven, Katherine J. Kuchenbecker
DOI
https://doi.org/10.1109/TOH.2022.3212701
Publication Year
2022
Language
English
Issue number
4
Volume number
15
Pages (from-to)
705-717
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Whenever we touch a surface with our fingers, we perceive distinct tactile properties that are based on the underlying dynamics of the interaction. However, little is known about how the brain aggregates the sensory information from these dynamics to form abstract representations of textures. Earlier studies in surface perception all used general surface descriptors measured in controlled conditions instead of considering the unique dynamics of specific interactions, reducing the comprehensiveness and interpretability of the results. Here, we present an interpretable modeling method that predicts the perceptual similarity of surfaces by comparing probability distributions of features calculated from short time windows of specific physical signals (finger motion, contact force, fingernail acceleration) elicited during unconstrained finger-surface interactions. The results show that our method can predict the similarity judgments of individual participants with a maximum Spearman's correlation of 0.7. Furthermore, we found evidence that different participants weight interaction features differently when judging surface similarity. Our findings provide new perspectives on human texture perception during active touch, and our approach could benefit haptic surface assessment, robotic tactile perception, and haptic rendering.
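The modeling approach described above can be illustrated with a minimal sketch: compute a feature (here, RMS amplitude) over short time windows of a recorded signal, compare the resulting feature distributions between surface pairs with a distributional distance, and correlate those model dissimilarities with human similarity judgments via Spearman's correlation. All signals, surface names, the single RMS feature, and the human ratings below are hypothetical placeholders, not the authors' actual data or feature set; the paper uses richer features from finger motion, contact force, and fingernail acceleration.

```python
import numpy as np
from scipy.stats import spearmanr, wasserstein_distance

rng = np.random.default_rng(0)

def window_features(signal, win=100):
    """Split a 1-D signal into non-overlapping windows and compute
    one feature (RMS amplitude) per window."""
    n = len(signal) // win
    windows = signal[: n * win].reshape(n, win)
    return np.sqrt((windows ** 2).mean(axis=1))

# Hypothetical recordings for three surfaces: "rougher" surfaces are
# simulated with larger-amplitude fingernail-acceleration noise.
surfaces = {name: amp * rng.standard_normal(5000)
            for name, amp in [("smooth", 0.5), ("medium", 1.0), ("rough", 2.0)]}

feats = {name: window_features(sig) for name, sig in surfaces.items()}

# Model dissimilarity per surface pair: distance between the two
# per-window feature distributions.
pairs = [("smooth", "medium"), ("smooth", "rough"), ("medium", "rough")]
model_dissim = [wasserstein_distance(feats[a], feats[b]) for a, b in pairs]

# Hypothetical human dissimilarity ratings for the same pairs.
human_dissim = [1.0, 3.0, 2.0]

# Rank-order agreement between model and human judgments.
rho, _ = spearmanr(model_dissim, human_dissim)
print(round(rho, 2))  # → 1.0 (ranks agree perfectly in this toy example)
```

In this toy setup the distributional distances preserve the same rank order as the invented ratings, so the correlation is perfect; on real unconstrained interactions the paper reports a maximum Spearman's correlation of 0.7.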
