Project Zenith (DRDO SASE UAV Challenge - 8th Inter IIT Tech Meet)
- Guining Pertin
- Dec 31, 2019
- 4 min read
Introduction
This was my second Inter IIT experience, at IIT Roorkee in 2019 during my junior year.
The project dealt with a multi-drone system designed to perform search and rescue operations for the Defence Research and Development Organisation's Snow and Avalanche Study Establishment (DRDO SASE) in India.
This project was handled by the Aeromodelling Club, IITG, and I got the opportunity to work with a new group of people, coming myself from a more control-oriented, purely robotics background. The months-long work and sleepless nights finally came to fruition as we won SILVER in the event.
PS: We also formed a great network with the clubs and groups from other IITs and had a great time exploring the different ideas and methods implemented by the other teams.

Problem Statement
The teams were supposed to develop a UAV fleet that could fly outdoors over a grassy field, with autonomous take-off and landing, using custom-built drones (using commercially available drones led to disqualification).
The drones had to spot a set of target green boxes amongst a clutter of different objects spread randomly over the field and then communicate the target locations wirelessly to ground control and to the other drones. The number of manual overrides was limited to 3 per team.
Note: The specific case of green boxes over a green field made it extremely difficult to detect the target boxes.
Our Solution
I mainly worked on the multi-UAV coverage flight trajectories and their implementation in ROS. I also worked partly on the target detection system.
Hardware:
We used the Pixhawk 4 FCU (a beauty) running the PX4 firmware, with the MAVLink communication protocol bridged into ROS through MAVROS. The detection pipeline ran on an NVIDIA Jetson Nano companion board. We used the default PX4 flight controller stack, which estimates the UAV state with an Extended Kalman Filter, along with its PID controllers for the velocity and position control loops. Communication was over Wi-Fi for the companion boards and over a telemetry radio for the FCU.
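To give a feel for how this was wired up, here is a minimal offboard-control sketch over the standard MAVROS topics and services; the setpoint values and node structure are illustrative placeholders, not our actual flight code.

```python
#!/usr/bin/env python
# Minimal MAVROS offboard sketch: stream a position setpoint, then switch
# the FCU to OFFBOARD mode and arm. Illustrative stand-in, not flight code.
import rospy
from geometry_msgs.msg import PoseStamped
from mavros_msgs.srv import CommandBool, SetMode

rospy.init_node("offboard_sketch")
setpoint_pub = rospy.Publisher("mavros/setpoint_position/local",
                               PoseStamped, queue_size=10)
arm = rospy.ServiceProxy("mavros/cmd/arming", CommandBool)
set_mode = rospy.ServiceProxy("mavros/set_mode", SetMode)

target = PoseStamped()
target.pose.position.z = 5.0    # hover 5 m above the local origin

rate = rospy.Rate(20)           # PX4 expects setpoints faster than 2 Hz
for _ in range(100):            # pre-stream before requesting OFFBOARD
    setpoint_pub.publish(target)
    rate.sleep()

set_mode(custom_mode="OFFBOARD")
arm(True)
while not rospy.is_shutdown():  # keep streaming the hover setpoint
    setpoint_pub.publish(target)
    rate.sleep()
```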
Multi-UAV Coverage:
This specific part deals with how the different UAVs should cover the search space to speed up detection. Given a bounded rectangular region with completely randomly placed boxes, we started with a linear (lawnmower) sweep search and a Voronoi-cell based distribution of the area among the UAVs.

However, since misdetections and missed detections could easily occur, we could also switch to a flight pattern with more overlap (e.g., the Zamboni pattern).

![Other patterns from [3]](https://static.wixstatic.com/media/c38e82_ca7de8907de44874b8dd8c1a7d2817d3~mv2.png/v1/fill/w_980,h_297,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/c38e82_ca7de8907de44874b8dd8c1a7d2817d3~mv2.png)
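For intuition, a back-of-the-envelope version of the linear (lawnmower) decomposition could look like the sketch below: the field is split into equal-width strips, one per UAV, and each UAV flies a back-and-forth pass over its strip. The field size, track spacing, and altitude here are made-up numbers, and this is a simplified stand-in for our actual planner.

```python
import numpy as np

def lawnmower_waypoints(x_min, x_max, y_min, y_max, spacing, alt):
    """Back-and-forth (boustrophedon) pass over one rectangular strip."""
    waypoints = []
    for i, y in enumerate(np.arange(y_min, y_max + 1e-6, spacing)):
        xs = (x_min, x_max) if i % 2 == 0 else (x_max, x_min)
        waypoints.append((xs[0], y, alt))
        waypoints.append((xs[1], y, alt))
    return waypoints

def split_field(x_min, x_max, y_min, y_max, n_uavs, spacing=5.0, alt=15.0):
    """Split the field into equal-width strips along x, one strip per UAV."""
    edges = np.linspace(x_min, x_max, n_uavs + 1)
    return [lawnmower_waypoints(edges[k], edges[k + 1], y_min, y_max, spacing, alt)
            for k in range(n_uavs)]

# Example: 100 m x 60 m field, 3 UAVs, 5 m track spacing, 15 m altitude
plans = split_field(0, 100, 0, 60, n_uavs=3)
```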
Target Detection and Localization:
Object classification and localization (in the camera frame) was performed using the YOLOv3 algorithm. Every time one of the UAVs detected a box, the GPS coordinates of the box (after transformation from the camera frame to the drone frame and then to the world frame) were tracked on a single map running on the remote ground control computer.
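The frame chain can be sketched roughly as below: project the detection's pixel centre onto the ground plane with a pinhole model, rotate into the local world (ENU) frame using the UAV pose estimate, then convert the metric offset to latitude/longitude. The flat-ground assumption and the function itself are simplified illustrations, not our calibrated pipeline.

```python
import numpy as np

def detection_to_gps(u, v, K, R_wc, t_wc, lat0, lon0):
    """Project a pixel detection (u, v) to GPS, assuming flat ground at z = 0.
    K: 3x3 camera intrinsics; R_wc, t_wc: camera pose in the local ENU world
    frame (from the FCU state estimate); (lat0, lon0): GPS origin of that frame."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # ray in the camera frame
    ray_world = R_wc @ ray_cam                          # ray rotated into the world frame
    s = -t_wc[2] / ray_world[2]                         # scale to hit the ground plane z = 0
    east, north, _ = t_wc + s * ray_world               # intersection point in metres (ENU)
    lat = lat0 + np.degrees(north / 6378137.0)          # small-offset spherical conversion
    lon = lon0 + np.degrees(east / (6378137.0 * np.cos(np.radians(lat0))))
    return lat, lon
```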
Since we were not given the exact number of boxes during the run, we decided to use the DBSCAN (Density-Based Spatial Clustering of Applications with Noise) algorithm to provide the final GPS coordinate estimates of the boxes. The algorithm is robust to outliers and rejects the low-density clusters that typically arise from noise (false positives). We could also set a minimum density required for a cluster to form, further reducing the number of false positives.
![DBSCAN [wikipedia]](https://static.wixstatic.com/media/c38e82_f8d27a83413e450eaa61155b697364e5~mv2.png/v1/fill/w_400,h_290,al_c,q_85,enc_avif,quality_auto/c38e82_f8d27a83413e450eaa61155b697364e5~mv2.png)
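A tiny example of the clustering step with scikit-learn's DBSCAN; the eps/min_samples values and the toy detections below are placeholders, not our tuned parameters.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Accumulated (x, y) detection positions in the local metric frame,
# gathered at the ground station from all UAVs (toy data).
detections = np.array([
    [10.1, 20.2], [10.3, 19.9], [9.8, 20.1],    # repeated hits on one box
    [55.0, 12.4], [54.8, 12.6], [55.1, 12.5],   # repeated hits on another box
    [80.0, 40.0],                               # isolated false positive
])

# min_samples acts as the minimum-density threshold: sparse false
# positives are labelled -1 (noise) and dropped.
labels = DBSCAN(eps=2.0, min_samples=3).fit_predict(detections)
box_estimates = [detections[labels == k].mean(axis=0)
                 for k in set(labels) if k != -1]
print(box_estimates)   # -> two cluster centres, false positive discarded
```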
However, that was not all; we inserted another small feature into the detection pipeline, specifically to improve recall. Our fine-tuned YOLOv3 was somewhat biased towards low false positives but also had a higher false-negative rate. These missed detections often failed DBSCAN's minimum-density threshold (we realized this on the last day before the event, when testing on the actual field).
As a trade-off, in cases where YOLO gave no detection but its prediction probability was very close to the threshold, we passed the same image through another filter. We knew that grass produces very noisy image gradients due to its random shapes and orientations across the ground, whereas a structured shape like a cube has well-defined image gradients. Hence, we used HOG (Histogram of Oriented Gradients) to first obtain the image gradient features. Then, we used a KPLS-mRMR (Kernel Partial Least Squares) based feature selection algorithm [5] to select the top 30% of the gradient features. The selected features were then passed through a lightweight logistic regression classifier to give the final prediction.
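A sketch of that fallback path is below, using skimage's HOG descriptor and a plain logistic regression. The KPLS-mRMR selection step [5] is approximated here by a generic top-k selector purely for illustration, and the confidence gate values are placeholders; treat it as the shape of the idea rather than our exact pipeline.

```python
import numpy as np
from skimage.feature import hog
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def hog_features(img_gray):
    """Image gradients summarised as a HOG descriptor: cube edges give
    structured gradients, grass gives noisy ones (fixed-size crops assumed)."""
    return hog(img_gray, orientations=9, pixels_per_cell=(16, 16),
               cells_per_block=(2, 2))

def train_fallback(crops, labels):
    """crops: fixed-size grayscale image crops; labels: 1 = box, 0 = background."""
    X = np.array([hog_features(im) for im in crops])
    k = max(1, int(0.3 * X.shape[1]))        # keep roughly the top 30% of features
    clf = make_pipeline(SelectKBest(mutual_info_classif, k=k),
                        LogisticRegression(max_iter=1000))
    clf.fit(X, labels)
    return clf

def rescue_detection(clf, img_gray, yolo_conf, gate=0.35, thresh=0.40):
    """Invoke the fallback only when YOLO's confidence is just below its
    detection threshold (gate/thresh are placeholder values)."""
    if gate <= yolo_conf < thresh:
        return clf.predict_proba([hog_features(img_gray)])[0, 1] > 0.5
    return yolo_conf >= thresh
```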
After some (hours of) trial and error and a few duct-tape approaches, we managed to improve our recall by ~9% with only a minimal (~3%) decrease in precision.
Results?
Well, we ended up doing pretty well compared to the other teams, earning us the silver medal. One of our drones crashed during the run, but we still got 3 of the 4 box coordinates quite close to the true values. Our detection and localization approach was specifically appreciated by the judges. But more than that, it was a fun journey in a completely new area that I started working on (maybe the real results were the friends we made along the way).
The Storm Troopers:
Manik Mittal (Darth Vader of course!)
Nitin Chauhan (Netrux)
Akash Sharma
Ankit Saini (Chappie)
Guining Pertin (Otoshuki)
Ajinkya Bhandare
Sankeerth Reddy
Prashamsa Talla


References:
[1] Di Franco, C. & Buttazzo, G. (2016). Coverage Path Planning for UAVs Photogrammetry with Energy and Resolution Constraints. Journal of Intelligent & Robotic Systems, 83. doi:10.1007/s10846-016-0348-x
[2] Araújo, J. F., Sujit, P. B. & Sousa, J. B. (2013). Multiple UAV area decomposition and coverage. 2013 IEEE Symposium on Computational Intelligence for Security and Defense Applications (CISDA), Singapore, pp. 30-37.
[3] Cabreira, T. M., Di Franco, C., Ferreira, P. R. & Buttazzo, G. C. (2018). Energy-Aware Spiral Coverage Path Planning for UAV Photogrammetric Applications. IEEE Robotics and Automation Letters, 3(4), pp. 3662-3668.
[4] Cabreira, T. M., Brisolara, L. B. & Ferreira Jr., P. R. (2019). Survey on Coverage Path Planning with Unmanned Aerial Vehicles. Drones, 3(1), 4.
[5] Talukdar, U., Hazarika, S. & Gan, J. (2018). A Kernel Partial Least Square Based Feature Selection Method. Pattern Recognition, 83. doi:10.1016/j.patcog.2018.05.012


