Data-driven classification approaches for radar bird-drone surveillance

White, Daniel ORCID: 0000-0002-1032-4100 (2024). Data-driven classification approaches for radar bird-drone surveillance. University of Birmingham. Ph.D.

White2024PhD.pdf (Text, Accepted Version, 110MB)
Available under licence: All rights reserved.

Abstract

This thesis addresses the topic of drone classification using radar spectrograms to provide warning of unauthorised drone intrusions. Radar is an excellent detection sensor, but the similar size, speed and flying height of drones and birds necessitates the development of robust methods for target classification. Deep learning approaches may allow classification of drone targets in benign and challenging circumstances, but in a security-critical scenario where lives may be at stake, several advances in their training and validation are required. This thesis seeks to advance the reliability of deep learning approaches, and to substantiate such developments, a collection of 564 drone flights and 932 bird tracks was gathered over ranges of up to 5 km by two L-band staring radar systems in a high-clutter, densely urban environment at the heart of Birmingham, England. Raw-data spectrograms crafted from these flights were used to train and test a convolutional neural network for the classification of real targets under challenging operational conditions, where we show that drones exhibiting rotor micro-Doppler can be classified to >98% accuracy, while drones without any micro-Doppler can still be classified to ∼90% accuracy.
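The abstract does not specify the network architecture used, so the following is only a minimal sketch of a binary bird/drone spectrogram classifier in PyTorch; the layer sizes, the 128 x 128 input resolution and the class count are illustrative assumptions, not values taken from the thesis.

import torch
import torch.nn as nn

class SpectrogramCNN(nn.Module):
    """Small convolutional classifier for single-channel radar spectrograms."""
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),   # pool to a 64-dimensional feature vector
            nn.Flatten(),
            nn.Linear(64, n_classes),  # logits for {bird, drone}
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, time_bins, doppler_bins) spectrogram tensor
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = SpectrogramCNN()
    dummy = torch.randn(4, 1, 128, 128)  # four synthetic spectrograms
    print(model(dummy).shape)            # torch.Size([4, 2])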

As radar data is usually a scarce resource, this thesis also proposes approaches for synthetically increasing the quantity of available data, together with a method to use such limited data more effectively in the training of classifiers. First, this work demonstrates the production of highly realistic synthetic micro-Doppler spectrograms using a novel approach that draws on motor speed recordings extracted from drone flight logs. This approach vastly increases the ease of creation and visual fidelity of simulated drone returns. Resampling the recorded motor speed signal to fit the radar's temporal reference was shown to have a strong effect on the results achieved with synthetic training. Finally, unsupervised deep learning implemented with an elementary convolutional autoencoder, trained with a modest amount of synthetic and real radar data, was demonstrated to have both performance and interpretability advantages over an optically trained, transfer-learned classifier.
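The resampling step described above can be illustrated with a short sketch: logged motor speed samples are interpolated onto the radar's slow-time (pulse) grid. The function name, the flight-log sample rate and the radar PRF below are illustrative assumptions only, not parameters reported in the thesis.

import numpy as np

def resample_motor_speed(log_times_s, motor_rpm, radar_prf_hz, dwell_s):
    """Interpolate logged motor speed onto radar pulse (slow-time) instants."""
    radar_times = np.arange(0.0, dwell_s, 1.0 / radar_prf_hz)  # one sample per pulse
    return radar_times, np.interp(radar_times, log_times_s, motor_rpm)

if __name__ == "__main__":
    # Hypothetical 10 Hz flight log covering a 2 s dwell; radar PRF assumed 750 Hz.
    log_t = np.arange(0.0, 2.0, 0.1)
    rpm = 6000 + 500 * np.sin(2 * np.pi * 0.5 * log_t)  # slowly varying motor speed
    t_radar, rpm_radar = resample_motor_speed(log_t, rpm, radar_prf_hz=750.0, dwell_s=2.0)
    print(t_radar.shape, rpm_radar.shape)                # (1500,) (1500,)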

Type of Work: Thesis (Doctorates > Ph.D.)
Award Type: Doctorates > Ph.D.
Supervisor(s):
Jahangir, Mohammed (ORCID: 0000-0002-5847-380X)
Antoniou, Michail (ORCID: 0000-0003-2977-2031)
Baker, Christopher J (ORCID: 0000-0003-2572-151X)
Licence: All rights reserved
College/Faculty: Colleges > College of Engineering & Physical Sciences
School or Department: School of Engineering, Department of Electronic, Electrical and Systems Engineering
Funders: Other
Other Funders: Department for Transport, UK Gov., Aveillant Ltd., Rutherford Appleton Laboratory, Future Aviation Security Solutions programme
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
T Technology > TK Electrical engineering. Electronics. Nuclear engineering
URI: http://etheses.bham.ac.uk/id/eprint/15101
