Deep learning for radar classification of drones

Dale, Holly (2023). Deep learning for radar classification of drones. University of Birmingham. Ph.D.

Dale2023PhD.pdf — Text, Accepted Version (54MB). Available under licence: All rights reserved.

Abstract

The proliferation of drones requires the development of surveillance systems for monitoring low-altitude airspace occupancy and ensuring safety in the skies. Radar offers 24-hour surveillance in all weather conditions, making it an appropriate solution to this challenge. To be effective, a radar solution must be able to discriminate between drones and birds in order to remove false alarms.

This thesis investigates the use of convolutional neural networks (CNNs) for drone classification. A CNN is trained for drone-bird discrimination on radar spectrograms obtained using an L-band staring radar, and performance is assessed and compared against machine learning benchmarks. The CNN is validated on an extensive dataset, and the classifier's ability to generalise to new drone models and unseen clutter environments is investigated. Dynamic classifier selection is used to make an intelligent choice of CNN classifier based on features extracted from the tracker and the spectrogram, leading to a reduction in false positives whilst maintaining a high true positive rate at low signal-to-noise ratio (SNR).
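
As a rough illustration of the kind of model involved, the sketch below defines a small spectrogram CNN for binary drone/bird discrimination in PyTorch. The architecture, input size and layer widths are illustrative assumptions, not the network described in the thesis.

```python
# Minimal sketch of a binary drone/bird spectrogram CNN (PyTorch).
# Layer sizes and input dimensions are assumptions for illustration only.
import torch
import torch.nn as nn

class SpectrogramCNN(nn.Module):
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # tolerates varying spectrogram sizes
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, doppler_bins, time_bins) log-magnitude spectrogram
        z = self.features(x).flatten(1)
        return self.classifier(z)

model = SpectrogramCNN()
logits = model(torch.randn(8, 1, 128, 128))   # dummy batch of spectrograms
print(logits.shape)                           # torch.Size([8, 2])
```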

It is desirable to classify drones at longer ranges (therefore with lower SNR) to maximise the time available to deploy drone countermeasures. CNN classification performance is therefore investigated as a function of SNR, firstly through the addition of Gaussian noise to an experimentally obtained dataset of radar spectrograms, and secondly through data collected in increasingly demanding environments. Whilst classification accuracy falls with decreasing SNR, the augmentation of training data is shown to enhance performance at low SNR by 14%. Bayesian optimisation is used to maximise performance through the optimisation of augmentation hyperparameters, leading to an improvement in performance of more than 10%.
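
The sketch below illustrates, under stated assumptions, the two techniques mentioned here: Gaussian noise is added to a linear-power spectrogram to reach a target SNR, and the SNR range used for augmentation is treated as a pair of hyperparameters tuned with Bayesian optimisation via scikit-optimize's gp_minimize. The noise model, parameter ranges and stand-in validation routine are illustrative, not the thesis's actual pipeline.

```python
# Sketch: SNR-controlled noise augmentation plus Bayesian optimisation of the
# augmentation SNR range. All names, ranges and the validation stand-in are assumptions.
import numpy as np
from skopt import gp_minimize
from skopt.space import Real

def add_noise_at_snr(spectrogram, snr_db, rng):
    """Add Gaussian noise scaled so the (linear-power) spectrogram reaches roughly
    the requested SNR. A fuller pipeline would add complex noise to the raw I/Q
    returns before forming the spectrogram."""
    signal_power = spectrogram.mean()
    noise_power = signal_power / (10.0 ** (snr_db / 10.0))
    return spectrogram + rng.normal(0.0, np.sqrt(noise_power), size=spectrogram.shape)

rng = np.random.default_rng(0)
spectrogram = np.abs(rng.normal(size=(128, 128)))   # placeholder linear-power spectrogram
augmented = add_noise_at_snr(spectrogram, snr_db=5.0, rng=rng)

def validation_accuracy(min_snr_db, max_snr_db):
    """Hypothetical stand-in: in practice this would retrain the CNN with noise
    drawn from [min_snr_db, max_snr_db] and score it on a low-SNR validation set."""
    return 0.9 - 0.005 * abs(min_snr_db - 0.0) - 0.005 * abs(max_snr_db - 20.0)

# Bayesian optimisation of the two augmentation hyperparameters.
result = gp_minimize(
    lambda p: -validation_accuracy(p[0], p[1]),     # minimise negative accuracy
    dimensions=[Real(-5.0, 10.0, name="min_snr_db"),
                Real(10.0, 30.0, name="max_snr_db")],
    n_calls=20,
    random_state=0,
)
print("best augmentation SNR range (dB):", result.x)
```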

Finally, CNNs are used to perform drone-to-drone classification. Performance is established for drone group (fixed wing vs rotary wing, F1 score 0.97), drone subgroup (F1 score 0.95) and drone model (F1 score 0.67) classification using CNNs, across nine different drone models measured at Ku-band. Multiple classification strategies are considered (single-stage and multi-stage), and the relationship between dwell time and performance is explored, forming a comprehensive investigation of CNN use for drone-to-drone classification.
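
A multi-stage strategy of the kind described might be organised as in the sketch below: a first classifier predicts the drone group, and a group-specific second classifier then predicts the drone model. Scikit-learn decision trees stand in for the CNNs here, and the features, label sets and two-stage split are placeholder assumptions.

```python
# Sketch of a multi-stage (hierarchical) drone classification strategy.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 16))                      # placeholder spectrogram features
groups = rng.choice(["fixed_wing", "rotary_wing"], size=300)
models = np.array([g + f"_model_{rng.integers(3)}" for g in groups])   # placeholder model labels

# Stage 1: group classifier trained on all data.
group_clf = DecisionTreeClassifier(random_state=0).fit(X, groups)

# Stage 2: one model classifier per group, trained only on that group's data.
model_clfs = {g: DecisionTreeClassifier(random_state=0).fit(X[groups == g], models[groups == g])
              for g in ["fixed_wing", "rotary_wing"]}

def classify_multistage(x):
    group = group_clf.predict(x.reshape(1, -1))[0]
    return group, model_clfs[group].predict(x.reshape(1, -1))[0]

print(classify_multistage(X[0]))
```

A single-stage alternative would simply train one classifier over all drone-model labels; comparing the two is the kind of trade-off the abstract refers to.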

The feature space of drone classification is investigated, and decision tree classifiers are used to explore the relationship between processing length, coherency, and performance. Training parallel classifiers on various processing lengths and fusing the results leads to an improvement in performance of 26%.
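
The parallel-classifier fusion described here could be sketched as follows: one decision tree is trained per processing length and their predicted class probabilities are averaged (soft voting). The processing lengths, features and fusion rule are illustrative assumptions.

```python
# Sketch: parallel decision trees, one per processing length, fused by averaging probabilities.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
processing_lengths = [0.25, 0.5, 1.0]               # assumed dwell lengths in seconds
X_by_len = {L: rng.normal(size=(200, 10)) for L in processing_lengths}   # placeholder features
y = rng.integers(0, 2, size=200)                    # 0 = bird, 1 = drone (placeholder labels)

# Train one tree per processing length.
trees = {L: DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_by_len[L], y)
         for L in processing_lengths}

# Fuse by averaging predicted class probabilities across the parallel classifiers.
probs = np.mean([trees[L].predict_proba(X_by_len[L]) for L in processing_lengths], axis=0)
fused_prediction = probs.argmax(axis=1)
```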

Type of Work: Thesis (Doctorates > Ph.D.)
Award Type: Doctorates > Ph.D.
Supervisor(s): Antoniou, Michail; Baker, Christopher
Licence: All rights reserved
College/Faculty: Colleges (2008 onwards) > College of Engineering & Physical Sciences
School or Department: School of Engineering, Department of Electronic, Electrical and Systems Engineering
Funders: Engineering and Physical Sciences Research Council
Subjects: T Technology > TK Electrical engineering. Electronics. Nuclear engineering
URI: http://etheses.bham.ac.uk/id/eprint/13901
