
Project Overview

The rapid advancement of drone technology raises the challenge of achieving precise and efficient object tracking. This project addresses that challenge by combining advanced algorithms, including YOLOv2 for object detection and MOSSE (Minimum Output Sum of Squared Error) for real-time tracking. These methodologies are integrated using co-design principles to enhance tracking accuracy, responsiveness, and energy efficiency.

Core Methodologies

  • YOLOv2 Object Detection: YOLOv2 performed the initial detection of objects, enabling rapid target localization with minimal computational overhead. Trained on a dataset of 2,700 images, it provided the bounding boxes used to initialize the MOSSE tracker.
  • MOSSE Tracking Algorithm: MOSSE is a lightweight, adaptive tracking algorithm that operates in the frequency domain. It continuously updated its correlation filters via a tunable learning rate, maintaining robust performance under dynamic conditions. A minimal sketch of the resulting detect-then-track loop follows.
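The Python/OpenCV sketch below shows how such a detect-then-track loop can be wired together. The file names, confidence threshold, and webcam source are placeholders, and the tracker shown is OpenCV's MOSSE implementation (`cv2.legacy.TrackerMOSSE_create`, from opencv-contrib-python), which may differ in detail from the project's own; it assumes the target is visible in the first frame.

```python
import cv2

# Placeholder paths: any YOLOv2 Darknet config/weights pair trained for the
# target class will do; the project's 2,700-image training set is not
# reproduced here.
net = cv2.dnn.readNetFromDarknet("yolov2.cfg", "yolov2.weights")

def detect_target(frame, conf_threshold=0.5):
    """Run one YOLOv2 pass; return the highest-scoring box as (x, y, w, h)."""
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    # Each row of the region-layer output: [cx, cy, bw, bh, objectness, scores...]
    best = None
    for row in net.forward():
        score = row[4] * row[5:].max()
        if score > conf_threshold and (best is None or score > best[0]):
            cx, cy = row[0] * w, row[1] * h
            bw, bh = row[2] * w, row[3] * h
            best = (score, (int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)))
    return best[1] if best else None

cap = cv2.VideoCapture(0)                    # webcam stands in for the UAV feed
ok, frame = cap.read()
bbox = detect_target(frame)                  # YOLOv2 supplies the initial box
tracker = cv2.legacy.TrackerMOSSE_create()   # requires opencv-contrib-python
tracker.init(frame, bbox)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, bbox = tracker.update(frame)      # cheap frequency-domain update
    if not found:                            # track lost: fall back to YOLOv2
        bbox = detect_target(frame)
        if bbox is not None:
            tracker = cv2.legacy.TrackerMOSSE_create()
            tracker.init(frame, bbox)
```

The division of labor is the point: the expensive detector runs once to acquire the target (and again only on track loss), while the cheap correlation tracker carries the frame-to-frame load.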

Design Optimization

The project incorporated co-design principles to align hardware capabilities with algorithmic efficiencies. Key design metrics included image resolution, frames per second (FPS), cruising speed, and flight range. These parameters were optimized to achieve superior tracking accuracy while ensuring energy-efficient operation.
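One way to read this trade study is as a sweep over candidate design points scored by a weighted objective. The sketch below is purely illustrative: the candidate grids, weights, and both estimator stubs are assumptions, not project data; in practice the stubs would be backed by the simulation environment described later.

```python
from itertools import product

# Illustrative candidate design points (not the project's actual grid).
resolutions = [(640, 480), (1280, 720), (1920, 1080)]
frame_rates = [15, 30, 60]

def est_accuracy(res, fps):
    return 0.0   # placeholder: would come from the tracking simulation

def est_flight_minutes(res, fps):
    return 0.0   # placeholder: would come from the power/flight model

def codesign_score(acc, flight_min, w_acc=0.6, w_energy=0.4):
    """Weighted objective balancing tracking accuracy against endurance;
    the weights here are placeholders."""
    return w_acc * acc + w_energy * (flight_min / 30.0)

# Exhaustively score every (resolution, FPS) pair and keep the best.
best = max(
    product(resolutions, frame_rates),
    key=lambda p: codesign_score(est_accuracy(*p), est_flight_minutes(*p)),
)
```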

Hardware-Software Integration

Hardware Considerations

  • Onboard computing resources
  • Camera specifications and mounting
  • Battery capacity and power management
  • Flight control systems

Software Optimizations

  • Algorithm efficiency improvements
  • Real-time processing pipelines
  • Adaptive parameter tuning (see the filter-update sketch after this list)
  • Failure recovery strategies
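As a concrete instance of adaptive parameter tuning, MOSSE's online update is governed by a learning rate η. Below is a minimal NumPy sketch of the standard update from Bolme et al. (2010); the preprocessing details and any per-frame adaptation policy are assumptions rather than the project's exact scheme.

```python
import numpy as np

def mosse_update(A, B, patch, G, eta=0.125):
    """One frequency-domain MOSSE filter update (after Bolme et al., 2010).

    A, B  : running numerator/denominator of the filter H = A / B
    patch : preprocessed (log-scaled, windowed, normalized) target patch
    G     : FFT of the desired Gaussian response peaked on the target
    eta   : learning rate -- raising it adapts faster to appearance change,
            lowering it resists occlusion-induced drift
    """
    F = np.fft.fft2(patch)
    A = eta * (G * np.conj(F)) + (1.0 - eta) * A
    B = eta * (F * np.conj(F)) + (1.0 - eta) * B
    return A, B, A / (B + 1e-8)   # epsilon guards against division by zero
```

Tuning η at runtime, for example lowering it when the correlation peak weakens, is one way a tracker can stay stable through occlusions while still adapting to lighting changes.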

Performance Metrics

The co-design approach focused on optimizing several key performance metrics:

  • Tracking Accuracy: Measured as the intersection over union (IoU) between predicted and ground-truth bounding boxes (see the sketch after this list)
  • Processing Speed: Frame rate achievable with the full detection and tracking pipeline
  • Energy Efficiency: Flight time achievable while continuously running tracking algorithms
  • Robustness: Performance under challenging conditions like occlusions, lighting changes, and fast movements
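For reference, IoU for the (x, y, w, h) box convention used above reduces to a few lines; this is the standard definition, not project-specific code.

```python
def iou(box_a, box_b):
    """Intersection over union for axis-aligned (x, y, w, h) boxes."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    iw = min(ax + aw, bx + bw) - max(ax, bx)   # overlap width
    ih = min(ay + ah, by + bh) - max(ay, by)   # overlap height
    if iw <= 0 or ih <= 0:
        return 0.0                             # boxes do not overlap
    inter = iw * ih
    return inter / float(aw * ah + bw * bh - inter)

# Example: two 100x100 boxes offset by 50 px in x overlap with IoU = 1/3.
print(iou((0, 0, 100, 100), (50, 0, 100, 100)))   # 0.333...
```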

Simulation and Results

A simulation environment was developed in MATLAB and Simulink. It allowed various scenarios to be tested by varying resolution, FPS, and algorithmic hyperparameters. Depth cameras were used to estimate the distance between the UAV and the target drone, serving as a lower-cost alternative to LiDAR.
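A depth camera makes the range estimate almost trivial once the tracker supplies a box. The sketch below is one plausible approach, not the project's exact method: it assumes depth in meters with zero-valued pixels marking invalid returns (a common depth-camera convention), and uses the median to stay robust to background pixels inside the box.

```python
import numpy as np

def target_range(depth_frame, bbox):
    """Estimate UAV-to-target distance from a depth image.

    depth_frame : HxW array of per-pixel depth in meters (depth-camera output)
    bbox        : (x, y, w, h) tracker box around the target drone
    """
    x, y, w, h = bbox
    roi = depth_frame[y:y + h, x:x + w]   # depth pixels inside the track box
    valid = roi[roi > 0]                  # zeros typically mark invalid returns
    return float(np.median(valid)) if valid.size else None
```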

Simulation Framework

The simulation environment incorporated:

  • Realistic UAV flight dynamics
  • Virtual camera sensors with configurable parameters
  • Simulated environmental conditions (lighting, obstacles)
  • Various target movement patterns (sketched below)
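The movement patterns can be thought of as scripted trajectories fed to the target model. The patterns and constants below are illustrative placeholders; the actual simulation drove the target through comparable scripted maneuvers.

```python
import numpy as np

def target_position(t, pattern="sine"):
    """Illustrative target motion patterns: (x, y, z) in meters at time t."""
    if pattern == "sine":      # lateral weave at constant forward speed
        return np.array([4.0 * t, 10.0 * np.sin(0.5 * t), 20.0])
    if pattern == "circle":    # constant-radius loiter
        return np.array([15.0 * np.cos(0.3 * t), 15.0 * np.sin(0.3 * t), 20.0])
    return np.array([4.0 * t, 0.0, 20.0])   # straight-line baseline
```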

Key Results

Metric                       Baseline System   Optimized Co-Design   Improvement
Tracking Accuracy (IoU)      0.72              0.86                  19.4%
Processing Time (ms/frame)   48                23                    52.1%
Flight Time (minutes)        18                27                    50.0%
Target Recovery Time (s)     3.8               1.2                   68.4%

Applications

The drone tracking technology developed in this project has numerous potential applications:

Search and Rescue

Tracking and following missing persons or vehicles in emergency situations, providing real-time visual data to rescue teams.

Wildlife Monitoring

Tracking animal movements in natural habitats for conservation research without human intervention that might disturb natural behaviors.

Cinematography

Autonomous filming of moving subjects for documentaries, sports events, and other dynamic filming scenarios requiring consistent framing.

Security Surveillance

Tracking suspicious vehicles or individuals in security applications, providing continuous monitoring with minimal human intervention.

Technologies Used

Python
MATLAB
Simulink
OpenCV
YOLOv2
PX4

Conclusion

This project showcases a robust approach to improving real-time drone tracking by integrating cutting-edge algorithms and co-design principles. The optimization of hardware and software parameters resulted in enhanced tracking accuracy, faster response times, and improved energy efficiency. These advancements have broad implications for applications such as search-and-rescue missions, wildlife monitoring, and live-event broadcasting.

Project Information

  • Category: Computer Vision, UAV Systems
  • Duration: 4 months
  • Completed: 2023
  • Institution: University at Buffalo

Interested in this project?

If you're interested in learning more about our concurrent UAV design for real-time tracking or exploring potential collaborations in drone technology, please get in touch.
