Autonomous UAV Landing on a Moving Vehicle and Real-Time Ellipse Detection


The autonomous landing of an Unmanned Aerial Vehicle (UAV) on moving platforms has been an active area of research for several years. Applications include safer landing of aircraft on ships, more efficient delivery networks, and entertainment. Key challenges include dealing with environmental conditions, such as changes in light and wind, and robust detection of the landing zone. The subsequent landing maneuver must also account for potential ground effects in the proximity of the landing surface.

We have developed a method to autonomously land an Unmanned Aerial Vehicle on a moving vehicle marked with a circular (or elliptical) pattern on top. A visual servoing controller approaches the ground vehicle using velocity commands calculated directly in image space. The control laws generate velocity commands in all three dimensions, eliminating the need for a separate height controller. Our method has demonstrated the ability to approach and land on the moving deck in simulation and in indoor and outdoor environments, and it provides the fastest landing approach among the methods we compared against. It does not rely on any additional external setup, such as RTK, a motion capture system, a ground station, offboard processing, or communication with the ground vehicle, and it requires only a minimal set of hardware and localization sensors.
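For intuition, the following is a minimal sketch of an image-based visual-servoing law in the spirit described above: the velocity command is proportional to the error between the detected pattern's image-space position and size and their desired values at touchdown. The gains, the frame conventions, the `landing_velocity_command` helper, and the use of apparent size as a depth proxy are illustrative assumptions, not the exact control laws from the paper.

```python
# Minimal sketch of an image-based visual-servoing velocity command,
# assuming a downward-facing camera. Gains and frame conventions are
# illustrative assumptions, not the paper's exact control laws.
import numpy as np

def landing_velocity_command(center_px, radius_px, image_size,
                             target_radius_px=200.0, k_xy=1.5, k_z=0.5):
    """Map the detected pattern (center and mean radius in pixels)
    to a 3D velocity command, computed entirely in image space."""
    width, height = image_size
    # Normalized offset of the pattern center from the image center;
    # this drives the lateral motion toward the landing pad.
    ex = (center_px[0] - width / 2.0) / width
    ey = (center_px[1] - height / 2.0) / height
    # Apparent-size error as a depth proxy: descend until the pattern
    # reaches the desired size in the frame.
    ez = (target_radius_px - radius_px) / target_radius_px
    # Assumed mapping for a downward-facing camera: image x aligns with
    # body y (right), image y with body -x (backward).
    vx = -k_xy * ey
    vy = k_xy * ex
    vz = k_z * ez   # positive vz descends in the assumed NED convention
    return np.array([vx, vy, vz])
```

In this sketch the lateral channels center the pattern in the image while the vertical channel descends until the pattern reaches its target apparent size, which is why no separate height controller is needed.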

Additionally, we propose a new algorithm for real-time detection and tracking of elliptic patterns suitable for real-world robotics applications. The method fits an ellipse to each contour in the image frame and rejects ellipses that do not yield a good fit. It can detect complete, partial, and imperfect ellipses in extreme weather and lighting conditions and is lightweight enough to run on robots' resource-limited onboard computers. The method is used in the autonomous UAV landing described above to demonstrate its performance indoors, outdoors, and in simulation on a real-world robotics task. A comparison with other well-known ellipse detection methods shows that our proposed algorithm outperforms them with an F1 score of 0.981 on a dataset of over 1500 frames. The videos of the experiments, the source code, and the collected dataset are provided with the paper.
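As a rough illustration of this contour-based pipeline, the sketch below uses OpenCV to fit an ellipse to every sufficiently long contour and keeps only the fits whose contour points lie close to the fitted ellipse. The edge detector, thresholds, and the mean-residual rejection test are illustrative assumptions, stand-ins for the paper's actual goodness-of-fit criteria.

```python
# Minimal sketch of a contour-based ellipse detector (assumes OpenCV 4).
# The rejection test below (mean algebraic residual of contour points
# against the ellipse equation) is an illustrative stand-in for the
# paper's actual goodness-of-fit criteria.
import cv2
import numpy as np

def detect_ellipses(frame, max_mean_residual=0.1, min_points=10):
    """Return ellipses ((cx, cy), (width, height), angle_deg) fitted
    to the contours of an input BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST,
                                   cv2.CHAIN_APPROX_NONE)
    ellipses = []
    for contour in contours:
        if len(contour) < min_points:  # cv2.fitEllipse needs >= 5 points
            continue
        (cx, cy), (w, h), angle_deg = cv2.fitEllipse(contour)
        if w < 1 or h < 1:
            continue
        # Rotate contour points into the ellipse-aligned frame, then
        # measure how far each lies from the ellipse (0 = perfect fit).
        theta = np.deg2rad(angle_deg)
        pts = contour.reshape(-1, 2).astype(np.float64)
        dx, dy = pts[:, 0] - cx, pts[:, 1] - cy
        u = dx * np.cos(theta) + dy * np.sin(theta)
        v = -dx * np.sin(theta) + dy * np.cos(theta)
        residual = np.abs((u / (w / 2.0)) ** 2 + (v / (h / 2.0)) ** 2 - 1.0)
        if residual.mean() < max_mean_residual:
            ellipses.append(((cx, cy), (w, h), angle_deg))
    return ellipses
```

A real deployment would add the temporal tracking described in the paper, for example by restricting the search to the neighborhood of the ellipse detected in the previous frame.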

For more details about the methods, the dataset, and the project, please refer to the Publication section below.

Source Code and Dataset

The project code base and the collected datasets can be accessed from the AirLab’s BitBucket: https://bitbucket.org/account/user/castacks/projects/MBZIRC

A good starting point is the documents repository.

More specifically, the code base for the ellipse detection can be found here, and the code showing how to use the ellipse detector is available here.

The collected dataset for the elliptic pattern is available here.

Please refer to the publication section below for more details and the proper citation.

Videos

The following videos show the proposed elliptic pattern detection method in real scenarios. The videos show only the detection method running on the camera frames, without overlaid information from the flight or the vehicle's movement.

The following video shows the autonomous UAV landing approach on the vehicle moving at 15 km/h in a simulated environment.

The following videos show the autonomous UAV landing on the vehicle moving at 15 km/h in outdoor settings, with and without the magnets on the legs.

The following video shows the autonomous UAV landing on the platform moving at ~8 km/h indoors.

The following video shows 11 consecutive trials to test the robustness of the autonomous UAV landing on the moving platform indoors. One trial fails due to a perception error (the landing zone is not detected).

Publication

The real-time ellipse detection and tracking method was published in the IEEE Robotics and Automation Letters (RA-L) journal and presented at the IROS 2021 conference. The paper includes the details of our novel real-time ellipse detection method for robotics applications and releases the dataset, created from an elliptic pattern on top of a vehicle moving at 15 km/h, for testing real-time ellipse detection methods.

Please cite this publication if our dataset or ellipse detection method is used in your research (available on arXiv and on IEEE Xplore):

BibTeX:

@article{keipour:ral:2021,
  author  = {Azarakhsh Keipour and Guilherme A. S. Pereira and Sebastian Scherer},
  title   = {Real-Time Ellipse Detection for Robotics Applications},
  journal = {IEEE Robotics and Automation Letters},
  year    = {2021},
  month   = {Oct},
  volume  = {6},
  number  = {4},
  pages   = {7009--7016},
  doi     = {10.1109/LRA.2021.3097057},
  url     = {https://ieeexplore.ieee.org/document/9484730},
}

IEEE Style:

A. Keipour, G.A.S. Pereira, and S. Scherer, “Real-Time Ellipse Detection for Robotics Applications,” in IEEE Robotics and Automation Letters, vol. 6, no. 4, pp. 7009-7016, Oct. 2021, doi: 10.1109/LRA.2021.3097057.

The publication describing the complete approach is available on arXiv. It includes the details of our visual servoing approach for autonomous UAV landing on a moving vehicle.

Please cite the publication if any part of this project is used in your research (access on arXiv):

BibTeX:

@article{keipour:landing:2021,
  author = {Azarakhsh Keipour and Guilherme A. S. Pereira and Rogerio Bonatti and Rohit Garg and Puru Rastogi and Geetesh Dubey and Sebastian Scherer},
  title  = {Visual Servoing Approach for Autonomous UAV Landing on a Moving Vehicle},
  journal = {arXiv},
  year   = {2021},
  pages  = {1--24},
  url    = {https://arxiv.org/abs/2104.01272},
}

IEEE Style:

A. Keipour, G.A.S. Pereira, R. Bonatti, R. Garg, P. Rastogi, G. Dubey, and S. Scherer, “Visual Servoing Approach for Autonomous UAV Landing on a Moving Vehicle,” arXiv, eprint 2104.01272, 2021. 

Contact

Azarakhsh Keipour - (keipour [at] cmu [dot] edu)

Guilherme A.S. Pereira - (guilherme.pereira [at] mail [dot] wvu [dot] edu)

Sebastian Scherer - (basti [at] cmu [dot] edu)

Acknowledgments

This work was supported by the Carnegie Mellon University Robotics Institute and the Mohamed Bin Zayed International Robotics Challenge (MBZIRC).

