Generating precise motions with continuum soft robots calls for ways of closing the loop through nonconventional sensors, such as cameras. This paper considers, for the first time, model-based visual servoing control of tendon-driven continuum soft robots. We address both regulation and trajectory tracking. We focus in particular on a system inspired by the human neck, which employs an eye-in-hand camera configuration. The considered control architecture maps visual feedback to motor commands through the overall system's Jacobian, aiming for accurate positioning of the end effector (i.e., the robot's head). This is made possible by blending classic results in visual servoing with reduced-order models of continuum soft robots. We place significant focus on evaluating the controller's effectiveness on the physical prototype, for which we develop a tailored testing setup, which is a novel contribution of this work. Central to this setup is a tendon-driven soft robotic neck. We extensively characterize the control algorithm's performance across several control gains and operational scenarios, and we show that incorporating feedforward velocity estimation into the controller consistently improves performance in trajectory tracking tasks.
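The Jacobian-based mapping from visual feedback to motor commands described above follows the classic visual servoing update, in which the feature error is driven to zero through the pseudo-inverse of the system Jacobian, optionally augmented with a feedforward velocity term for tracking. The sketch below is purely illustrative: the function name, gain `lam`, and the feedforward argument are assumptions for exposition, not the paper's actual implementation.

```python
import numpy as np

def visual_servoing_step(J, e, dt, lam=1.0, feat_vel_ff=None):
    """One illustrative step of a Jacobian-based visual servoing law.

    J           -- Jacobian mapping actuator velocities to feature velocities
    e           -- current visual feature error (desired minus measured)
    dt          -- control period [s]
    lam         -- proportional gain of the servoing law
    feat_vel_ff -- optional feedforward estimate of the desired feature
                   velocity, used to improve trajectory tracking
    """
    # Desired feature-space velocity: exponential decay of the error ...
    v = -lam * e
    # ... plus, if available, the feedforward term for a moving reference.
    if feat_vel_ff is not None:
        v = v - feat_vel_ff
    # Map to actuator space with the (pseudo-)inverse of the Jacobian
    # and integrate over one control period.
    dq = np.linalg.pinv(J) @ v
    return dq * dt
```

With a well-conditioned Jacobian, repeated application of this step drives the feature error toward zero; the feedforward term removes the steady-state lag that a purely proportional law exhibits on moving references.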