Conference Publication
Resources:
- Paper link: https://doi.org/10.1109/ACC.2014.6858831
- BibTeX citation: BibTeX
Abstract:
The motivation of this research is to show that vision-based object tracking and following are reliable on an inexpensive, GPS-denied multirotor platform such as the AR Drone 2.0. Our architecture allows the user to specify an object in the image that the robot then follows at an approximately constant distance. At the current stage of development, if image tracking is lost the system hovers and waits for the tracker to recover or to re-detect the target, which requires odometry measurements for self-stabilization. During the following task, our software uses the forward-facing camera images and part of the IMU data to compute the references for the four on-board low-level control loops. To achieve stronger wind-disturbance rejection and improved navigation performance, our control algorithm internally keeps and updates a yaw heading reference based on the IMU data. We validate the architecture using an AR Drone 2.0 and the OpenTLD tracker in outdoor suburban areas. The experimental tests have shown robustness against wind perturbations, target occlusion, and illumination changes, as well as the system's capability to track a wide variety of objects found in suburban areas, for instance walking or running people, windows, AC units, static and moving cars, and plants.