Citation
Mohamed Yaghoobi, Yousef Jassim Ismail Abdulla
(2019)
Avian-inspired feature-based relative positioning strategy for formation control of multiple unmanned aerial vehicles.
Masters thesis, Universiti Putra Malaysia.
Abstract
A new era of flying machines, at the intersection of autonomous and semi-autonomous systems, has resulted in the birth of Unmanned Aerial Vehicles (UAVs). UAVs have changed the way human beings travel, transport objects, carry out surveillance, and respond to emergencies, with further applications that time will reveal. Among the various types of UAVs, Multi-rotor UAVs have attracted the most attention due to their advantages, such as ease of use, Vertical Take-Off and Landing capability, hover flight, and the ability to operate in confined areas. However, small payload capacity is one of their most significant disadvantages. Due to the limited capability and performance of a single Multi-rotor UAV, interest in overcoming these limits through flight formation and formation control of UAVs has grown significantly over the past few years. One of the key aspects of flight formation is spatial coordination, or relative positioning, between UAVs flying in close proximity in order to avoid collisions and achieve collective operation.
For the spatial coordination of UAVs flying in a swarm, two main vision-based approaches for on-board computation have been used: Color-based (artificial marker detection) and Motion-based (Optical Flow). The performance of the Color-based approach is highly affected by misdetection in indoor applications and by light-intensity variation in outdoor applications. The Motion-based, or Optical Flow, approach suffers from a lack of precision and high sensitivity to noise in both indoor and outdoor applications. To the best of our knowledge at the time of writing this thesis, almost no studies have focused on the use of a feature-detection approach running in real time and on board UAVs for collision-free flight formation.
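For illustration only, the sketch below shows generic versions of these two baseline techniques using OpenCV; the HSV thresholds, corner-detection parameters, and helper names are assumed placeholders and do not reproduce any implementation evaluated in this thesis.

# Illustrative sketch only: a generic Color-based marker detector and a
# Motion-based (sparse Optical Flow) estimator. All thresholds and parameters
# are assumed values, not configurations taken from this thesis.
import cv2
import numpy as np

def detect_color_marker(frame_bgr, lower_hsv=(100, 120, 70), upper_hsv=(130, 255, 255)):
    # Color-based: segment an artificial marker by HSV thresholding.
    # Sensitive to light-intensity variation outdoors and misdetection indoors.
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return (x + w // 2, y + h // 2)  # marker centre in pixels

def estimate_motion(prev_gray, curr_gray):
    # Motion-based: track sparse corners with Lucas-Kanade Optical Flow.
    # Noisy per-corner displacements propagate into the relative-motion estimate.
    corners = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                      qualityLevel=0.01, minDistance=7)
    if corners is None:
        return None
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, corners, None)
    good = status.ravel() == 1
    flow = next_pts[good] - corners[good]
    return flow.reshape(-1, 2).mean(axis=0)  # average pixel displacement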
Inspired by birds flying in flocks, for which vision is one of the most critical means of responding to a neighbor's motion, this thesis introduces a novel approach: a Vision System developed as the primary sensor for relative positioning in a Leader-Follower flight formation. The developed Vision System is based on Feature-Detection and stereo vision, and it utilizes an On-Line Machine Learning approach for tracking the Leader in the Leader-Follower flight formation. An NVIDIA JETSON TX1 is used as the computing platform for processing the Vision System data in real time and on board a DJI MATRICE 100 quadcopter.
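As a rough illustration of this kind of pipeline (not the implementation developed in this thesis), the sketch below pairs an off-the-shelf online-learning tracker with stereo block matching to estimate a tracked Leader's relative position; the focal length, stereo baseline, camera indices, and initial bounding box are assumed placeholder values.

# Rough illustration only: an online-learning tracker (OpenCV's MIL tracker)
# plus stereo block matching to recover a tracked Leader's relative position.
# Focal length, baseline, capture indices and the initial box are assumptions.
import cv2
import numpy as np

FOCAL_PX = 700.0      # assumed focal length in pixels
BASELINE_M = 0.12     # assumed stereo baseline in metres

left_cap = cv2.VideoCapture(0)     # placeholder left camera
right_cap = cv2.VideoCapture(1)    # placeholder right camera

tracker = cv2.TrackerMIL_create()  # online (Multiple Instance Learning) tracker
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)

ok, left = left_cap.read()
tracker.init(left, (200, 150, 80, 60))  # assumed initial Leader bounding box

while True:
    ok_l, left = left_cap.read()
    ok_r, right = right_cap.read()
    if not (ok_l and ok_r):
        break

    found, box = tracker.update(left)   # appearance model is updated online
    if not found:
        continue
    x, y, w, h = (int(v) for v in box)

    gray_l = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)
    disparity = stereo.compute(gray_l, gray_r).astype(np.float32) / 16.0

    # Median disparity inside the tracked box -> depth via Z = f * B / d
    d = np.median(disparity[y:y + h, x:x + w])
    if d > 0:
        z = FOCAL_PX * BASELINE_M / d                        # forward distance (m)
        cx, cy = x + w / 2.0, y + h / 2.0
        x_rel = (cx - left.shape[1] / 2.0) * z / FOCAL_PX    # lateral offset (m)
        y_rel = (cy - left.shape[0] / 2.0) * z / FOCAL_PX    # vertical offset (m)
        print(f"Leader relative position (m): x={x_rel:.2f}, y={y_rel:.2f}, z={z:.2f}")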
To evaluate the Vision System's performance, three flight-formation scenarios were introduced: tracking the Leader in motion, following the Leader in motion, and tracking the Leader in the presence of an obstacle. The results of the first scenario show a 99% success rate in tracking the Leader in motion when approximately 3600 training sample photos of the Leader are provided, and 83% accuracy in calculating the Leader's location. The results of the second scenario show that the Follower is able to follow the Leader with a delay of roughly 2 seconds when utilizing the Vision System. The results of the third scenario show an 85% success rate in tracking the Leader under 30% occlusion, 75% under 50% occlusion, and poor performance under 100% occlusion. Overall, the obtained results show that the developed Feature-Detection-based Vision System is a better alternative to the existing Color-based and Motion-based approaches.