Conceptualizing the Integration of Cognitive and Physical Models to Enable Continuous Human-Robot Interaction
C. Hao (TU Delft - Pattern Recognition and Bioinformatics)
Nele Russwinkel (University of Lübeck)
Daniel F.B. Haeufle (Eberhard-Karls Universität Tübingen)
Philipp Beckerle (Friedrich-Alexander-Universität Erlangen-Nürnberg)
Abstract
Research in human-robot interaction (HRI) often emphasizes either the cognitive level or the physical level. In a scenario where a robot physically guides a person through a complex series of tasks (e.g., a patient making tea), information is continuously exchanged on the cognitive level and forces/torques on the physical level. Such continuous co-adaptive interaction between both agents and the environment requires the robot to anticipate, act proactively, and react flexibly to the user's intentions and the situational context. Unifying sequential cognitive situation modeling with continuous robotic movement control is a challenge that currently lacks a conceptual framework. We conceptualize strategies for connecting models of physical HRI with models of cognitive HRI, depending on the level of assistance the robot system provides, ranging from mere warnings of dangerous situations (level 1) to on-body continuous movement guidance (level 4). In doing so, we consider the requirements for the robot to be aware of the interaction environment and to maintain a dynamic representation of the individual user. Our conceptual framework is intended to spark discussion and formalize assistance approaches, with the aim of integrating cognitive and physical human-robot interaction approaches for anticipatory assistance in continuous dynamic tasks.
Files
File under embargo until 15-01-2026