LiDRON: Team LiDRON's LiDAR-Based Autonomous Landing System for UAVs

Introduction

Lockheed Martin and Raytheon have funded Team LiDRON to develop an autonomous landing system for UAVs using LiDAR sensors. Additionally, we have included the ability for autonomous takeoff and flight. As a requirement, we must fully design and implement both the UAV and UGV hardware interconnections, as well as the software interconnections and algorithms that enable this feature set.

Lockheed Martin logo

Description

Research

  •  LiDRON is dedicated to designing and implementing an autonomous drone with a primary focus on autonomous landing capabilities using LiDAR sensors. To achieve this goal, LiDRON combines Intel RealSense cameras for precise obstacle detection with a LiDAR sensor for mapping the landing area. The UAV is built on the Tarot 680 Pro airframe and equipped with a Pixhawk 4 flight controller, and the LiDAR sensors include the Velodyne Puck Lite. LiDRON uses ROS (Robot Operating System) as its framework and Gazebo Classic for simulations that test the object detection, obstacle avoidance, and path planning algorithms, verifying the reliability and performance of the autonomous drone system.
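
Because the stack is ROS based, most of these components communicate as nodes publishing and subscribing to topics. The sketch below shows what a minimal listener for the Velodyne point cloud could look like; the node and topic names are illustrative assumptions rather than names taken from our actual codebase.

```python
#!/usr/bin/env python
"""Minimal ROS listener for the Velodyne point cloud (illustrative only)."""
import rospy
from sensor_msgs.msg import PointCloud2
import sensor_msgs.point_cloud2 as pc2


def cloud_callback(msg):
    # Pull the (x, y, z) fields out of the incoming cloud and log how many points arrived.
    points = list(pc2.read_points(msg, field_names=("x", "y", "z"), skip_nans=True))
    rospy.loginfo("Received %d LiDAR points", len(points))


if __name__ == "__main__":
    rospy.init_node("lidar_listener")                       # node name is a placeholder
    rospy.Subscriber("/velodyne_points", PointCloud2, cloud_callback)
    rospy.spin()
```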
 
Competition
  •  Team LiDRON is also developing an autonomous unmanned aerial vehicle (UAV) capable of object recognition through camera vision, identifying enemy unmanned ground vehicles (UGVs) via ArUco markers. This UAV must be able to deploy water to “eliminate” enemy UGVs. Additionally, the team is developing a UGV that deactivates upon exposure to water while evading the enemy UAV.
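
ArUco markers can be detected with OpenCV's contrib aruco module. The snippet below is only a sketch of that detection step; the marker dictionary, image source, and file name are assumptions, not the competition's actual specification.

```python
"""Illustrative ArUco detection step; dictionary and image source are assumptions."""
import cv2

# The 4x4_50 dictionary is a placeholder; the competition markers may use another set.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

frame = cv2.imread("ugv_frame.png")                 # stand-in for a live camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Classic detectMarkers call (opencv-contrib-python); newer OpenCV also offers cv2.aruco.ArucoDetector.
corners, ids, _rejected = cv2.aruco.detectMarkers(gray, dictionary)

if ids is not None:
    print("Detected enemy UGV marker IDs:", ids.flatten().tolist())
else:
    print("No markers in this frame")
```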

Problem & Hypothesis

Our problem is that aerial missions carry risks caused by human pilot error and incorrect judgement.

 

Our hypothesis is that autonomous landing and path planning will reduce mission risk by removing incorrect pilot judgement from the process.

To do this, a number of sensors, including LiDAR, and path planning algorithms will be used. This will allow landing without the need for human interaction.
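
As a stand-in for the path planning side of that hypothesis, the sketch below runs a small A* search over a 2-D occupancy grid. It is purely illustrative; our actual planner, grid resolution, and cost model are not shown here.

```python
"""Illustrative A* planner on a 2-D occupancy grid (0 = free, 1 = obstacle)."""
import heapq


def astar(grid, start, goal):
    # start/goal are (row, col) tuples; cost of each step is 1.
    rows, cols = len(grid), len(grid[0])
    frontier = [(0, start)]
    came_from = {start: None}
    cost = {start: 0}
    while frontier:
        _, current = heapq.heappop(frontier)
        if current == goal:
            break
        r, c = current
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                new_cost = cost[current] + 1
                if (nr, nc) not in cost or new_cost < cost[(nr, nc)]:
                    cost[(nr, nc)] = new_cost
                    # Manhattan distance heuristic toward the goal.
                    priority = new_cost + abs(goal[0] - nr) + abs(goal[1] - nc)
                    heapq.heappush(frontier, (priority, (nr, nc)))
                    came_from[(nr, nc)] = current
    # Walk back from the goal to reconstruct the path.
    path, node = [], goal
    while node is not None:
        path.append(node)
        node = came_from.get(node)
    return path[::-1] if path[-1] == start else []


if __name__ == "__main__":
    grid = [[0, 0, 0],
            [1, 1, 0],
            [0, 0, 0]]
    print(astar(grid, (0, 0), (2, 0)))
```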

Objectives

1. Design and implement an autonomous landing system capable of navigating to a destination and determining the safest possible landing zone in any given area, known or unknown, using LiDAR sensors, in order to complete flight plans without human intervention.
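
One simple way to frame "safest possible landing zone" is to bin the LiDAR returns into a horizontal grid and prefer the cell with the least height variation. The sketch below illustrates that idea only; the cell size, threshold, and data format are assumptions, not our tuned parameters.

```python
"""Hypothetical landing-zone scoring: pick the flattest cell of a horizontal grid."""
import numpy as np


def flattest_cell(points, cell_size=1.0, max_height_std=0.05):
    # points: (N, 3) array of x, y, z LiDAR returns in the map frame.
    cells = {}
    for x, y, z in points:
        key = (int(x // cell_size), int(y // cell_size))
        cells.setdefault(key, []).append(z)
    best, best_std = None, float("inf")
    for key, heights in cells.items():
        std = float(np.std(heights))
        if std < best_std and std < max_height_std:
            best, best_std = key, std
    return best  # (cell_x, cell_y) index of the flattest cell, or None if none qualify


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic, mostly flat ground plane standing in for a real scan.
    cloud = rng.uniform([-5, -5, 0], [5, 5, 0.02], size=(2000, 3))
    print(flattest_cell(cloud))
```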

2. Develop a complete ROS tree that includes all of the components and their responsibilities as publishers or subscribers.
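
Each node in that tree acts as a publisher, a subscriber, or both. As a hypothetical example of the publisher side, the node below publishes a selected landing target that a downstream path planning node would subscribe to; all node, topic, and frame names are placeholders.

```python
#!/usr/bin/env python
"""Illustrative publisher node; names are assumptions, not the actual LiDRON ROS tree."""
import rospy
from geometry_msgs.msg import PointStamped


def publish_landing_target():
    rospy.init_node("landing_zone_selector")                      # one node in the tree
    pub = rospy.Publisher("/lidron/landing_target", PointStamped, queue_size=1)
    rate = rospy.Rate(1)                                          # publish at 1 Hz
    while not rospy.is_shutdown():
        msg = PointStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = "map"
        msg.point.x, msg.point.y, msg.point.z = 3.0, -1.0, 0.0    # placeholder target
        pub.publish(msg)                                          # a planner node would subscribe here
        rate.sleep()


if __name__ == "__main__":
    publish_landing_target()
```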


3. Better integrate the three LiDAR sensors that will carry out environmental data collection, so that the collected data can later be used to test the ALA.
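
One way to integrate several LiDAR streams in ROS is to synchronize their messages by time stamp before fusing them. The sketch below uses message_filters for that; the three topic names and the slop value are assumptions, not our actual configuration.

```python
#!/usr/bin/env python
"""Sketch of time-synchronizing three LiDAR streams; topic names are placeholders."""
import rospy
import message_filters
from sensor_msgs.msg import PointCloud2


def synced_callback(front, left, right):
    # All three clouds arrived with time stamps within the allowed slop.
    rospy.loginfo("Synced clouds: %d / %d / %d bytes",
                  len(front.data), len(left.data), len(right.data))


if __name__ == "__main__":
    rospy.init_node("lidar_sync")
    subs = [message_filters.Subscriber(topic, PointCloud2)
            for topic in ("/lidar_front/points", "/lidar_left/points", "/lidar_right/points")]
    sync = message_filters.ApproximateTimeSynchronizer(subs, queue_size=10, slop=0.1)
    sync.registerCallback(synced_callback)
    rospy.spin()
```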

 

4. Render LiDAR data to gain a better understanding of the sensors' visualization capabilities.
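
RViz is the usual ROS tool for this, but a quick offline rendering can also be done with matplotlib once a scan has been exported as a NumPy array. The snippet below assumes such an exported file exists, which is an assumption about the pipeline rather than our actual workflow.

```python
"""Quick-look 3-D rendering of an exported LiDAR scan (placeholder file name)."""
import numpy as np
import matplotlib.pyplot as plt

points = np.load("scan.npy")                     # (N, 3) array of x, y, z returns

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
# Color by height so ground versus obstacles stand out.
ax.scatter(points[:, 0], points[:, 1], points[:, 2], c=points[:, 2], s=1)
ax.set_xlabel("x [m]")
ax.set_ylabel("y [m]")
ax.set_zlabel("z [m]")
plt.show()
```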

 

5. Completely rewrite our data acquisition algorithm to work better alongside the different sensors, as well as the software interconnections of each of the system components.
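
A simple form such a data acquisition node could take is to subscribe to each sensor stream and write the messages into a rosbag for offline testing. The sketch below shows that pattern; the topic names and output file are placeholders, not our actual acquisition design.

```python
#!/usr/bin/env python
"""Hypothetical acquisition node: records sensor streams to a rosbag for offline testing."""
import threading

import rospy
import rosbag
from sensor_msgs.msg import PointCloud2, Image

bag = rosbag.Bag("acquisition.bag", "w")
bag_lock = threading.Lock()          # rospy callbacks run concurrently, so guard the bag


def make_writer(topic):
    # Returns a callback that appends messages from `topic` to the shared bag.
    def write(msg):
        with bag_lock:
            bag.write(topic, msg, rospy.Time.now())
    return write


if __name__ == "__main__":
    rospy.init_node("data_acquisition")
    rospy.Subscriber("/velodyne_points", PointCloud2, make_writer("/velodyne_points"))
    rospy.Subscriber("/camera/color/image_raw", Image, make_writer("/camera/color/image_raw"))
    rospy.on_shutdown(bag.close)
    rospy.spin()
```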

 

Members

Research

Competition

Presentation Video

Research Poster