Conference Publication

R. A. Suarez Fernández, J. L. Sanchez-Lopez, C. Sampedro, H. Bavle, M. Molina, P. Campoy. Natural user interfaces for human-drone multi-modal interaction. 2016 International Conference on Unmanned Aircraft Systems (ICUAS). IEEE. e-ISSN: 2575-7296. pp. 1013-1022. June 2016. DOI: 10.1109/ICUAS.2016.7502665.

Abstract:
Personal drones are becoming part of everyday life. To fully integrate them into society, it is crucial to design safe and intuitive ways to interact with these aerial systems. Recent advances in User-Centered Design (UCD) applied to Natural User Interfaces (NUIs) aim to exploit innate human abilities, such as speech, gestures and vision, to interact with technology the way humans interact with one another. In this paper, a Graphical User Interface (GUI) and several NUI methods are studied and implemented, along with computer vision techniques, in a single software framework for aerial robotics called Aerostack, which enables intuitive and natural human-quadrotor interaction in indoor GPS-denied environments. These strategies include speech, body position, hand gesture and visual marker interactions used to command tasks directly to the drone. The NUIs presented are based on devices such as the Leap Motion Controller, microphones and small monocular on-board cameras, which are unobtrusive to the user. Thanks to this UCD perspective, users can choose the most intuitive and effective type of interaction for their application. Additionally, the proposed strategies allow multi-modal interaction between multiple users and the drone, since several of these interfaces can be integrated into a single application, as shown in various real flight experiments performed with non-expert users.