Conference Publication

J. L. Sanchez-Lopez, M. Castillo, H. Voos. Semantic situation awareness of ellipse shapes via deep learning for multirotor aerial robots with a 2D LIDAR. 2020 International Conference on Unmanned Aircraft Systems (ICUAS). IEEE. e-ISSN: 2575-7296. pp. 1014-1023. Sep. 2020. (Online: Oct. 2020). DOI: 10.1109/ICUAS48674.2020.9214063.

Abstract:
In this work, we present a semantic situation awareness system for multirotor aerial robots equipped with a 2D LIDAR sensor, focusing on the understanding of the environment, provided a drift-free, precise localization of the robot is available (e.g., from GNSS/INS or a motion capture system). Our algorithm generates, in real time, a semantic map of the objects in the environment as a list of ellipses, each represented by its radii, pose, and velocity in world coordinates. Two different Convolutional Neural Network (CNN) architectures are proposed and trained on an artificially generated dataset with a custom loss function to detect ellipses in a segmented LIDAR measurement (i.e., one containing a single object). In cascade, a specifically designed indirect EKF estimates the ellipse-based semantic map in world coordinates, as well as the velocity of each ellipse. We have quantitatively and qualitatively evaluated the performance of our proposed situation awareness system. Two sets of Software-In-The-Loop simulations in CoppeliaSim, with one and with multiple static and moving cylindrical objects, are used to evaluate the accuracy and performance of our algorithm. In addition, real laboratory experiments with a static non-cylindrical object (i.e., a barrel) and moving persons demonstrate the robustness of our algorithm in real environments.
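The paper detects ellipses in segmented LIDAR scans with trained CNNs; as a purely illustrative geometric baseline (not the paper's method), an ellipse's center, radii, and orientation can be recovered from the second moments of the scan points. This sketch assumes the points cover the full boundary roughly uniformly in angle, in which case the covariance eigenvalues equal half the squared semi-axes; real LIDAR scans see only part of the boundary, which biases such a fit and motivates the learned detector.

```python
import numpy as np

def fit_ellipse_moments(points):
    """Estimate an ellipse (center, radii, orientation) from 2D boundary points.

    Second-moment fit: for points sampled uniformly in angle over the full
    boundary, the covariance eigenvalues are a^2/2 and b^2/2, and the
    eigenvectors give the axis directions. Partial scans bias this estimate.
    """
    center = points.mean(axis=0)
    centered = points - center
    cov = centered.T @ centered / len(points)
    eigvals, eigvecs = np.linalg.eigh(cov)            # eigenvalues ascending
    radii = np.sqrt(2.0 * eigvals)[::-1]              # (major, minor) semi-axes
    major_axis = eigvecs[:, 1]                        # direction of major axis
    angle = np.arctan2(major_axis[1], major_axis[0])  # orientation, world frame
    return center, radii, angle

if __name__ == "__main__":
    # Synthetic "scan": full ellipse boundary, a=2, b=1, rotated and translated.
    t = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
    a, b, phi, cx, cy = 2.0, 1.0, 0.5, 3.0, 4.0
    x = cx + a * np.cos(t) * np.cos(phi) - b * np.sin(t) * np.sin(phi)
    y = cy + a * np.cos(t) * np.sin(phi) + b * np.sin(t) * np.cos(phi)
    c, radii, ang = fit_ellipse_moments(np.column_stack([x, y]))
    print(c, radii, ang)
```

On this synthetic full-coverage scan the fit recovers the center (3, 4), radii (2, 1), and orientation 0.5 rad (up to a 180-degree axis ambiguity).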
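The paper's estimation stage is a specifically designed indirect EKF over the full ellipse state. As a much-simplified illustration of the filtering idea only, the sketch below tracks a single ellipse center with a linear constant-velocity Kalman filter; the state layout, noise values, and class name are assumptions for this example, not the paper's design.

```python
import numpy as np

class ConstantVelocityKF:
    """Minimal linear Kalman filter for a 2D position-and-velocity state.

    Hypothetical simplification of the paper's indirect EKF: the state is
    [x, y, vx, vy] and the measurements are noisy ellipse-center detections.
    """
    def __init__(self, dt, q=1e-2, r=1e-2):
        self.x = np.zeros(4)                 # state estimate [x, y, vx, vy]
        self.P = np.eye(4)                   # state covariance
        self.F = np.eye(4)                   # constant-velocity transition
        self.F[0, 2] = self.F[1, 3] = dt
        self.Q = q * np.eye(4)               # process noise (assumed)
        self.R = r * np.eye(2)               # measurement noise (assumed)
        self.H = np.zeros((2, 4))            # measure position only
        self.H[0, 0] = self.H[1, 1] = 1.0

    def step(self, z):
        # Predict with the constant-velocity model.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the detected ellipse center z = [x, y].
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x

if __name__ == "__main__":
    # Object moving at 1 m/s along x; the velocity estimate converges
    # even though only positions are measured.
    kf, dt = ConstantVelocityKF(dt=0.1), 0.1
    for k in range(200):
        state = kf.step(np.array([k * dt, 0.0]))
    print(state)  # vx estimate approaches 1.0
```

This is the standard predict/update cycle; the paper's indirect (error-state) EKF additionally estimates the radii and orientation of each ellipse and handles multiple tracked objects.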