Team Foxtrot is a team of undergraduate students from the Ghulam Ishaq Khan Institute, Pakistan, that develops unmanned aerial vehicles. The team is participating in the SUAS 2024 Competition, an international drone competition that challenges teams to design and build an autonomous drone capable of performing various tasks, including waypoint navigation, object detection, and payload delivery. The complete details of the competition can be found here.
At the start of the mission, the team is given 4 bottles, each marked with a label describing a target marker. The drone must perform an area search and autonomously drop the relevant payload on each target marker. Here are the details of the ground markers:
- It is a colored shape containing a colored alphanumeric character.
- It is printed on A4-sized paper.
- The drone must fly at a minimum altitude of 25 meters.
The project uses a YOLOv8 model for real-time object detection and classification. The drone's onboard computer, a Jetson Orin Nano, processes the video feed, detects the target objects, and communicates with the flight controller through the DroneKit-Python library to execute the airdrop. A Tarot Peeper camera is used for its 10x optical zoom, so that detection can be performed from an altitude of 25 meters.
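As a rough illustration, the onboard detection loop can be sketched with the Ultralytics YOLOv8 API as below; the weight file name and camera source are placeholders, not the team's actual configuration.

```python
# Minimal sketch of the onboard detection loop, assuming Ultralytics YOLOv8
# and an OpenCV video source; "shapes_yolov8m.pt" is a hypothetical weights file.
import cv2
from ultralytics import YOLO

shape_detector = YOLO("shapes_yolov8m.pt")

cap = cv2.VideoCapture(0)  # actual capture source depends on the Jetson setup
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Run shape detection on the current frame
    results = shape_detector(frame, verbose=False)
    for box in results[0].boxes:
        x1, y1, x2, y2 = map(int, box.xyxy[0])
        crop = frame[y1:y2, x1:x2]  # cropped shape, passed on to the letter classifier
        # ... classify the letter and extract colors from `crop` ...
cap.release()
```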
Here's a breakdown of the detection pipeline:
- Web scraping was used to gather hundreds of background images from the internet.
- OpenCV (Python) was used to generate a synthetic dataset of the target markers, producing up to 20,000 images within 2 minutes (see the sketch after this list).
- YOLOv8-medium detects and localizes shapes, and outputs a cropped image of each detected shape.
- YOLOv8-nano classifies the letters. Since every shape is guaranteed to contain a letter, classification alone (without a second localization stage) is sufficient.
- The HSV values of each cropped shape are used to determine its most frequent colors: the most frequent color gives the shape color and the second most frequent gives the letter color (a minimal sketch follows this list).
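The synthetic dataset generation mentioned above can be sketched roughly as follows; the shapes, colors, fonts, and file paths are illustrative assumptions, not the team's exact generator.

```python
# A minimal sketch of the synthetic-marker generator, assuming scraped
# backgrounds in ./backgrounds that are larger than the marker itself.
import glob
import random
import cv2
import numpy as np

COLORS = {"red": (0, 0, 255), "blue": (255, 0, 0), "green": (0, 255, 0)}  # BGR
LETTERS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"

def make_marker(size=96):
    """Render one colored shape containing a colored alphanumeric."""
    shape_color, letter_color = random.sample(list(COLORS.values()), 2)
    marker = np.zeros((size, size, 3), dtype=np.uint8)
    cv2.circle(marker, (size // 2, size // 2), size // 2 - 2, shape_color, -1)
    cv2.putText(marker, random.choice(LETTERS), (size // 3, 2 * size // 3),
                cv2.FONT_HERSHEY_SIMPLEX, 1.2, letter_color, 3)
    return marker

def compose(background_path, out_path):
    """Paste a marker at a random location on a scraped background."""
    bg = cv2.imread(background_path)
    marker = make_marker()
    h, w = marker.shape[:2]
    y = random.randint(0, bg.shape[0] - h)
    x = random.randint(0, bg.shape[1] - w)
    bg[y:y + h, x:x + w] = marker  # the box (x, y, w, h) doubles as the YOLO label
    cv2.imwrite(out_path, bg)

for i, path in enumerate(glob.glob("backgrounds/*.jpg")):
    compose(path, f"dataset/img_{i}.jpg")
```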
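The HSV color step could look something like the sketch below; it simplifies to a plain hue histogram and assumes the two dominant hues are well separated, which the real pipeline would need to handle more carefully.

```python
# A minimal sketch of the HSV color extraction; the hue-histogram approach
# is a simplification, not the team's tuned method.
import cv2
import numpy as np

def top_two_hues(crop_bgr):
    """Return the two most frequent hue buckets in a cropped shape:
    the first is taken as the shape color, the second as the letter color."""
    hsv = cv2.cvtColor(crop_bgr, cv2.COLOR_BGR2HSV)
    hues = hsv[..., 0].ravel()
    hist = np.bincount(hues, minlength=180)  # OpenCV hue range is 0-179
    first, second = np.argsort(hist)[::-1][:2]
    return int(first), int(second)
```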
Here's how the flow of information occurs:
- Detection Model → Runs on the onboard computer and processes the video feed. When a target marker is detected, it sends the detection information to the DroneKit script via sockets (see the socket sketch below).
- DroneKit-Python → Receives the detection results, calculates a score for each detection to identify the most likely target for each bottle, and stores GPS coordinates for each marker; after the area search, it sends commands to navigate to each marker and drop the payloads (see the command sketch below).
- MAVProxy → Acts as a relay between Dronekit and the flight controller.
- Flight Controller → Executes commands to control the drone.
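To make the socket handoff between the detection model and the DroneKit script concrete, here is a minimal sketch; the port number and JSON message fields are assumptions for illustration.

```python
# A minimal sketch of the detection-to-DroneKit handoff over a local TCP
# socket; the port and message schema are placeholders.
import json
import socket

def send_detection(label, lat, lon, confidence, host="127.0.0.1", port=5005):
    """Called by the detection process whenever a marker is found."""
    msg = json.dumps({"label": label, "lat": lat, "lon": lon,
                      "conf": confidence}).encode()
    with socket.create_connection((host, port)) as sock:
        sock.sendall(msg)

def recv_detections(port=5005):
    """Generator used by the DroneKit script to consume detections."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", port))
    server.listen(1)
    while True:
        conn, _ = server.accept()
        with conn:
            data = conn.recv(4096)
            if data:
                yield json.loads(data)
```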
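And here is a minimal sketch of the DroneKit side of the airdrop, using the standard DroneKit-Python and pymavlink APIs; the connection string, servo channel, PWM value, and fixed wait are placeholders rather than the team's actual parameters.

```python
# A minimal sketch of the airdrop command path, assuming a MAVProxy endpoint
# on udp:127.0.0.1:14551 and a payload release servo on channel 9.
import time
from dronekit import connect, LocationGlobalRelative, VehicleMode
from pymavlink import mavutil

vehicle = connect("udp:127.0.0.1:14551", wait_ready=True)

def drop_payload(lat, lon, alt=25, servo_channel=9, release_pwm=1900):
    """Fly to the stored marker coordinates and trigger the release servo."""
    vehicle.mode = VehicleMode("GUIDED")
    vehicle.simple_goto(LocationGlobalRelative(lat, lon, alt))
    time.sleep(20)  # crude wait; a real script would monitor distance to target
    msg = vehicle.message_factory.command_long_encode(
        0, 0,                                  # target system, target component
        mavutil.mavlink.MAV_CMD_DO_SET_SERVO,  # command: actuate a servo output
        0,                                     # confirmation
        servo_channel, release_pwm,            # servo number, PWM value
        0, 0, 0, 0, 0)                         # unused parameters
    vehicle.send_mavlink(msg)
```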