Target detection for very high-frequency synthetic aperture radar ground surveillance

W. Ye, C. Paulson, D. Wu

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

14 Citations (Scopus)

Abstract

A target detection algorithm is developed based on a supervised learning technique that maximises the margin between two classes: the target class and the non-target class. Specifically, the proposed algorithm consists of (i) image differencing, (ii) a maximum-margin classifier, and (iii) diversity combining. Image differencing enhances and highlights the targets so that they are more distinguishable from the background. The maximum-margin classifier is built on a recently developed feature weighting technique called Iterative RELIEF; its objective is to achieve robustness against uncertainties and clutter. Diversity combining utilises multiple images to further improve detection performance, and hence the algorithm is a form of multi-pass change detection. The authors evaluate the proposed detection algorithm on the CARABAS-II synthetic aperture radar (SAR) image data, and the experimental results demonstrate its superior performance compared to the benchmark algorithm. © 2012 The Institution of Engineering and Technology.
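The three-stage pipeline described in the abstract can be sketched in Python. This is a minimal illustration, not the authors' implementation: the classifier stage uses a single pass of classic RELIEF feature weighting (the paper uses Iterative RELIEF), the thresholded weighted-score decision and the averaging rule for diversity combining are assumptions, and all function names are hypothetical.

```python
import numpy as np

def image_difference(test_img, reference_img):
    """Stage (i): difference against a target-free reference pass so that
    inserted targets stand out from the stationary background."""
    return np.abs(test_img.astype(float) - reference_img.astype(float))

def relief_weights(X, y):
    """Stage (ii), simplified: one pass of RELIEF-style feature weighting.
    Each feature is rewarded for separating a sample from its nearest miss
    (other class) more than from its nearest hit (same class).
    The paper's Iterative RELIEF refines such weights over several passes."""
    n, d = X.shape
    w = np.zeros(d)
    for i in range(n):
        dists = np.abs(X - X[i]).sum(axis=1)  # L1 distances to all samples
        dists[i] = np.inf                     # exclude the sample itself
        same = (y == y[i])
        hit = np.argmin(np.where(same, dists, np.inf))
        miss = np.argmin(np.where(~same, dists, np.inf))
        w += np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])
    w = np.maximum(w, 0.0)                    # keep only discriminative features
    return w / (w.sum() + 1e-12)              # normalise to unit sum

def detect(features, weights, threshold):
    """Margin-style decision sketch: threshold the weighted feature score."""
    return features @ weights > threshold

def diversity_combine(score_maps):
    """Stage (iii): combine detection scores from multiple passes.
    Averaging is one simple combining rule (assumed here)."""
    return np.mean(score_maps, axis=0)
```

As a quick check, two well-separated classes along the first feature (with a constant second feature) yield a weight vector concentrated on the first feature, which is the behaviour the margin-maximising stage relies on.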
Original language: English
Pages (from-to): 101-110
Journal: IET Computer Vision
Volume: 6
Issue number: 2
DOIs
Publication status: Published - Mar 2012
Externally published: Yes

Bibliographical note

Publication details (e.g. title, author(s), publication statuses and dates) are captured on an “AS IS” and “AS AVAILABLE” basis at the time of record harvesting from the data source. Suggestions for further amendments or supplementary information can be sent to [email protected].
