Many robotic tasks require high-dimensional sensors such as cameras and Lidar to navigate complex environments, but developing certifiably safe feedback controllers around these sensors remains a challenging open problem, particularly when learning is involved.
Previous works have proven the safety of perception-feedback controllers by separating the perception and control subsystems and making strong assumptions about the capabilities of the perception subsystem. In this work, we introduce a novel learning-enabled perception-feedback hybrid controller, where we use Control Barrier Functions (CBFs) and Control Lyapunov Functions (CLFs) to show the safety and liveness of a full-stack perception-feedback controller. We use neural networks to learn a CBF and CLF for the full-stack system directly in the observation space of the robot, without the need to assume a separate perception-based state estimator. Our hybrid controller, called LOCUS (Learning-enabled Observation-feedback Control Using Switching), can safely navigate unknown environments, consistently reach its goal, and generalize safely to environments outside of the training dataset. We demonstrate LOCUS in experiments both in simulation and in hardware, where it successfully navigates a changing environment using feedback from a Lidar sensor.
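To make the idea concrete, here is a minimal, hypothetical sketch (not the authors' released implementation) of learning certificate functions directly in observation space: a small network maps a Lidar-style scan and a goal offset to a barrier value h and a Lyapunov-like value V, and a simple switching rule falls back to a safety mode when h is small. The network sizes, loss terms, and switching margin below are illustrative assumptions, and the paper's full certificate conditions (e.g., enforcing decrease of V and invariance of the safe set along closed-loop trajectories) are omitted.

import torch
import torch.nn as nn

class ObservationCertificate(nn.Module):
    """Maps a flattened Lidar scan plus a goal offset to (h, V).

    h(o) is a learned barrier value (nonnegative on safe observations);
    V(o) is a learned Lyapunov-like value (small near the goal).
    """
    def __init__(self, n_rays: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_rays + 2, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
        )
        self.barrier_head = nn.Linear(64, 1)
        self.lyapunov_head = nn.Linear(64, 1)

    def forward(self, scan: torch.Tensor, goal_offset: torch.Tensor):
        z = self.encoder(torch.cat([scan, goal_offset], dim=-1))
        return self.barrier_head(z).squeeze(-1), self.lyapunov_head(z).squeeze(-1)

def certificate_loss(h, V, is_safe, at_goal, eps: float = 0.1):
    """Illustrative boundary conditions only: h >= eps on safe samples,
    h <= -eps on unsafe samples, and |V| small at the goal."""
    safe = is_safe.float()
    loss_h = (safe * torch.relu(eps - h) + (1 - safe) * torch.relu(eps + h)).mean()
    loss_v = (at_goal.float() * V.abs()).mean()
    return loss_h + loss_v

def select_mode(h: float, margin: float = 0.2) -> str:
    """Assumed hybrid switching rule: pursue the goal while the learned barrier
    value is comfortably positive; otherwise switch to a safety maneuver."""
    return "goal_seeking" if h > margin else "safety"

if __name__ == "__main__":
    net = ObservationCertificate(n_rays=32)
    scan = torch.rand(1, 32)            # fake normalized Lidar ranges
    goal = torch.tensor([[1.0, 0.5]])   # goal offset in the robot frame
    h, V = net(scan, goal)
    is_safe = torch.tensor([1.0])
    at_goal = torch.tensor([0.0])
    print("h =", h.item(), "V =", V.item(), "mode =", select_mode(h.item()))
    print("loss =", certificate_loss(h, V, is_safe, at_goal).item())

The key design choice this sketch tries to illustrate is that both certificate functions take the raw observation (the scan) as input rather than an estimated state, which is what removes the need for a separate perception-based state estimator in the analysis.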
This work is part of a broader research thread on learning certificate functions for safe control.
Other works on learned certificates from our lab include:
@article{dawson2022locus,
  author  = {Dawson, Charles and Lowenkamp, Bethany and Goff, Dylan and Fan, Chuchu},
  journal = {IEEE Robotics and Automation Letters},
  title   = {Learning Safe, Generalizable Perception-Based Hybrid Control With Certificates},
  year    = {2022},
  volume  = {7},
  number  = {2},
  pages   = {1904--1911},
  doi     = {10.1109/LRA.2022.3141657}
}