Autonomous UAV Landing on a Moving Vehicle and Real-Time Ellipse Detection


The autonomous landing of an Unmanned Aerial Vehicle (UAV) on moving platforms has been an active area of research for several years. Applications include safer landing of aircraft on ships, more efficient delivery networks, and entertainment. Key challenges include dealing with environmental conditions, such as changes in light and wind, and robust detection of the landing zone. The subsequent landing maneuver also needs to account for potential ground effects in the proximity of the landing surface.

We have developed a method to autonomously land a UAV on a moving vehicle with a circular (or elliptical) pattern on top. A visual servoing controller approaches the ground vehicle using velocity commands calculated directly in image space. The control laws generate velocity commands in all three dimensions, eliminating the need for a separate height controller. Our method has demonstrated the ability to approach and land on the moving deck in simulation, indoor, and outdoor environments, and, compared to the other available methods, it provides the fastest landing approach. It does not rely on any additional external setup, such as RTK, a motion capture system, a ground station, offboard processing, or communication with the ground vehicle, and it requires only a minimal set of hardware and localization sensors.
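
To illustrate the idea of generating velocity commands directly in image space, the sketch below shows a simple proportional image-based servoing law in Python. The gains, the axis mapping, and the use of the ellipse's apparent size as a distance proxy are illustrative assumptions only; they are not the control laws derived in the publication.

import numpy as np

def ibvs_velocity_command(ellipse_center, ellipse_major_axis,
                          image_size=(640, 480),
                          target_axis_px=300.0,
                          k_xy=1.5, k_z=0.6):
    # Illustrative proportional image-based servoing law (not the published controller).
    # ellipse_center: (u, v) pixel coordinates of the detected pattern center.
    # ellipse_major_axis: apparent major-axis length in pixels, used here as a
    # rough proxy for distance to the pattern (larger means closer).
    # Returns a (vx, vy, vz) velocity command; sign conventions depend on the
    # camera mounting and are assumptions in this sketch.
    w, h = image_size

    # Normalized image-space error of the pattern center from the image center.
    ex = (ellipse_center[0] - w / 2.0) / (w / 2.0)
    ey = (ellipse_center[1] - h / 2.0) / (h / 2.0)

    # Lateral velocities drive the pattern toward the image center.
    vx = -k_xy * ey
    vy = -k_xy * ex

    # Descend faster when the pattern is centered and still appears small;
    # slow the descent when the pattern drifts off-center.
    centering = max(0.0, 1.0 - np.hypot(ex, ey))
    size_error = max(0.0, 1.0 - ellipse_major_axis / target_axis_px)
    vz = -k_z * centering * (0.3 + size_error)   # negative = descend

    return np.array([vx, vy, vz])

# Example: pattern slightly right of center and still far (small apparent size).
print(ibvs_velocity_command((400.0, 250.0), ellipse_major_axis=120.0))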

Additionally, we have proposed a novel algorithm for the detection and tracking of elliptic patterns in real time. The ellipse detection algorithm fits an ellipse to each contour in the input image and rejects ellipses that do not yield a good fit. It can detect complete, as well as partial and imperfect, ellipses in extreme weather and lighting conditions. The method is suitable for real-time robotics applications where a circular or elliptical pattern needs to be detected and tracked using an onboard camera. A comparison with other well-known ellipse detection methods shows that our proposed algorithm outperforms them.
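
A minimal OpenCV-based sketch of this fit-and-reject idea is shown below. The edge detector, thresholds, and fit-error metric are illustrative choices and do not reproduce the published algorithm, which additionally handles partial and imperfect ellipses and tracks the pattern across frames.

import cv2
import numpy as np

def detect_ellipses(gray, min_points=20, max_mean_err=2.0):
    # Fit an ellipse to every contour and keep only the good fits.
    # Thresholds and the error metric are illustrative assumptions.
    edges = cv2.Canny(gray, 50, 150)
    # OpenCV 4.x signature: returns (contours, hierarchy).
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)

    accepted = []
    for c in contours:
        if len(c) < max(5, min_points):   # fitEllipse needs at least 5 points
            continue
        (cx, cy), (w, h), angle = cv2.fitEllipse(c)
        if w < 1e-3 or h < 1e-3:
            continue

        # Measure how well the contour points lie on the fitted ellipse:
        # rotate the points into the ellipse-aligned frame and evaluate the
        # implicit equation (x/a)^2 + (y/b)^2 = 1.
        a, b = w / 2.0, h / 2.0
        theta = np.deg2rad(angle)
        pts = c.reshape(-1, 2).astype(np.float64) - np.array([cx, cy])
        rot = np.array([[np.cos(theta), np.sin(theta)],
                        [-np.sin(theta), np.cos(theta)]])
        p = pts @ rot.T
        # Approximate point-to-ellipse distance in pixels (valid near the boundary).
        residual = np.abs(np.hypot(p[:, 0] / a, p[:, 1] / b) - 1.0) * min(a, b)
        if residual.mean() < max_mean_err:
            accepted.append(((cx, cy), (w, h), angle))
    return accepted

# Usage example on a synthetic image containing one drawn ellipse.
img = np.zeros((480, 640), np.uint8)
cv2.ellipse(img, (320, 240), (120, 80), 30, 0, 360, 255, 2)
print(len(detect_ellipses(img)))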

For more details about the methods, dataset and the project, please refer to the publication section below.

Source Code and Dataset

The project code base and the collected datasets can be accessed from the AirLab’s BitBucket: https://bitbucket.org/account/user/castacks/projects/MBZIRC

A good starting point is the documents repository.

More specifically, the code base for the ellipse detection can be found here and the code showing how to use the ellipse detector is available here.

The collected dataset for the elliptic pattern is available here.

Videos

The following video shows the autonomous UAV landing approach on the vehicle moving at 15 km/h in a simulated environment.

The following videos show the autonomous UAV landing on the vehicle moving at 15 km/h in outdoor settings, with and without the magnets on the legs.

The following video shows the autonomous UAV landing on the platform moving at ~8 km/h indoors.

The following video shows 11 consecutive trials to test the robustness of the autonomous UAV landing on the moving platform indoors. One of the trials fails due to a perception failure (the landing zone is not detected).

The following videos show the proposed elliptic pattern detection method in real scenarios. The videos show only the detection results on the camera frames, without additional information from the flight or the vehicle movement.

Publication

The publication on this project includes details about the following:

  1. The visual servoing-based method for autonomous UAV landing on the moving vehicle

  2. The novel real-time ellipse detection method for robotics applications

  3. The dataset created from an elliptic pattern on top of a vehicle moving at 15 km/h for testing real-time ellipse detection methods

Please cite the publication if any part of this project is used for research purposes.

BibTeX:

@article{keipour:autonomouslanding:2019,
  author  = {Azarakhsh Keipour and Guilherme A.S. Pereira and Rogerio Bonatti and Rohit Garg and Puru Rastogi and Geetesh Dubey and Sebastian Scherer},
  title   = {Visual Servoing Approach for Autonomous UAV Landing on a Moving Vehicle using Robust Real-Time Elliptic Pattern Detection},
  journal = {Journal of Field Robotics},
  year    = {In press},
}

IEEE Style:

A. Keipour, G. A. S. Pereira, R. Bonatti, R. Garg, P. Rastogi, G. Dubey, and S. Scherer, “Visual Servoing Approach for Autonomous UAV Landing on a Moving Vehicle using Robust Real-Time Elliptic Pattern Detection,” Journal of Field Robotics, in press.

Contact

Azarakhsh Keipour - (keipour [at] cmu [dot] edu)

Guilherme A.S. Pereira - (guilherme.pereira [at] mail [dot] wvu [dot] edu)

Sebastian Scherer - (basti [at] cmu [dot] edu)

Acknowledgments

This work was supported by the Carnegie Mellon University Robotics Institute and the Mohamed Bin Zayed International Robotics Challenge (MBZIRC).