Direct sensor integration for optimization fabrics

Abstract

Autonomous robots hold great potential for positive societal impact when applied to tasks that are hazardous, repetitive, or otherwise difficult for humans to perform. To perform such tasks, autonomous robots must perceive changes in their environment and create corresponding motion plans, which requires a combination of perception and motion planning techniques. Building a perception pipeline that can detect all relevant details in a dynamic environment is challenging and computationally expensive. To address this issue, the raw data output of a distance-measuring sensor can be used directly in place of a perception pipeline, transferring most of the computational load to the motion planner. However, most motion planning techniques cannot handle this high computational load. The motion planning technique optimization fabrics offers a promising solution: it combines differential equations to design robot behavior at very low computational cost.

This thesis proposes a method for local motion planning that combines the low computational complexity of optimization fabrics with direct sensor integration. Our goal is to develop a method that enables autonomous robots to perceive and respond to changes in their environment quickly and efficiently. Our method, called direct sensor integrated (DSI) optimization fabrics, generates a collision-avoidance differential equation for each raw data point collected from a distance-measuring sensor. We adapted the regular optimization fabrics method to incorporate sensor data directly, and our analysis shows that direct sensor integration does not require scaling adjustments to the regular collision-avoidance differential equations. By combining optimization fabrics with direct sensor integration, DSI optimization fabrics offers a promising approach to local motion planning. It can enable autonomous robots to handle tasks that are challenging for humans, without requiring a complex perception pipeline.
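
To make the construction more concrete, the sketch below shows one simple way per-ray collision-avoidance terms could be formed for a planar point robot. The repulsive geometry (lam / x) * xdot^2, the robot radius, and the uniform summation over rays are illustrative assumptions, not the exact fabric formulation developed in the thesis.

```python
import numpy as np

def per_ray_avoidance(ranges, angles, vel, r_robot=0.2, lam=0.5):
    """Illustrative per-ray collision avoidance for a planar point robot.

    Every LiDAR return (range, angle) is treated as a point obstacle
    contributing a repulsive term (lam / x) * xdot**2 along the
    obstacle-to-robot direction, where x is the distance to the obstacle
    surface and xdot the closing speed. The per-ray terms are summed
    into a single avoidance acceleration.
    """
    acc = np.zeros(2)
    for rng, ang in zip(ranges, angles):
        obst = rng * np.array([np.cos(ang), np.sin(ang)])  # obstacle point in the robot frame
        n = -obst / (np.linalg.norm(obst) + 1e-9)          # unit vector obstacle -> robot
        x = max(rng - r_robot, 1e-3)                        # distance to obstacle surface
        xdot = float(vel @ (-n))                            # closing speed towards the obstacle
        if xdot > 0.0:                                      # repel only when approaching
            acc += (lam / x) * xdot**2 * n
    return acc

# Example: 128 rays, one close return straight ahead of a robot moving forward.
angles = np.linspace(-np.pi, np.pi, 128, endpoint=False)
ranges = np.full(128, 4.0)
ranges[64] = 0.8                                  # obstacle 0.8 m ahead (angle 0 is index 64)
vel = np.array([1.0, 0.0])
print(per_ray_avoidance(ranges, angles, vel))     # net acceleration points away from the obstacle
```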

We conducted several experiments to assess our method, using a simulated LiDAR-equipped point robot. First, we show empirically that an adjustment is required to properly scale the collision-avoidance differential equations, resulting in similar collision-avoidance behavior across sensor resolutions. Second, we demonstrate that DSI optimization fabrics is computationally feasible, achieving a frequency of 23 Hz even at the maximum sensor resolution of 2048 LiDAR rays. Third, we examine the effect of varying the sensor resolution on performance in multiple goal-reaching scenarios, measuring time-to-goal, total path length, minimum clearance from obstacles, and success rate. Fourth, we compare our method to regular optimization fabrics with a simulated perception pipeline in one scenario with static obstacles and one with dynamic obstacles; in both, our method achieves comparable performance on all four metrics. Finally, we showcase a real-world application of our method.
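
The abstract does not state the form of this scaling adjustment. One plausible reading, shown purely as an assumption below, is to weight the summed per-ray term inversely with the number of rays so that increasing the LiDAR resolution does not proportionally inflate the repulsion. The sketch reuses per_ray_avoidance from above, and the reference resolution n_reference is a hypothetical parameter, not a value from the thesis.

```python
def resolution_scaled_avoidance(ranges, angles, vel, n_reference=64, **kwargs):
    """Hypothetical scaling across sensor resolutions (an assumption, not the
    thesis' empirically derived adjustment): divide by the ray count so the
    summed avoidance term stays comparable when the LiDAR resolution changes."""
    n_rays = len(ranges)
    return (n_reference / n_rays) * per_ray_avoidance(ranges, angles, vel, **kwargs)
```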