
Burg matrix divergence based multi-metric learning

    Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review

    Abstract

    The basic idea of most distance metric learning methods is to find a space that optimally classifies data points belonging to different categories. However, current methods learn only one Mahalanobis distance per data set, which fails to separate the categories well in most real-world applications. To improve the classification accuracy of the k-nearest-neighbour algorithm, this paper proposes a multi-metric learning method that classifies the different categories by sequentially learning sub-metrics. The proposed algorithm is based on minimizing the Burg matrix divergence between metrics. Experiments on five UCI data sets demonstrate the improved performance of multi-metric learning compared with state-of-the-art methods.
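    The Burg (LogDet) matrix divergence mentioned in the abstract has the standard closed form D(A, B) = tr(AB⁻¹) − log det(AB⁻¹) − n for positive-definite n×n matrices A and B. The sketch below is a minimal, hedged illustration of that quantity only; the function name and test matrices are illustrative and not taken from the paper, which is not reproduced here.

    ```python
    import numpy as np

    def burg_matrix_divergence(A, B):
        """Burg (LogDet) divergence between positive-definite matrices:
        tr(A @ inv(B)) - log det(A @ inv(B)) - n.
        Zero iff A == B; grows as the two metrics diverge."""
        n = A.shape[0]
        M = A @ np.linalg.inv(B)
        # slogdet avoids overflow/underflow in the determinant.
        sign, logdet = np.linalg.slogdet(M)
        return np.trace(M) - logdet - n

    # Identical metrics give zero divergence; distinct ones give a positive value.
    I = np.eye(3)
    print(burg_matrix_divergence(I, I))          # → 0.0
    print(burg_matrix_divergence(2.0 * I, I) > 0)  # → True
    ```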
    Original language: English
    Title of host publication: ECAI 2016
    Publisher: IOS Press
    Pages: 1553-1554
    Volume: 285
    ISBN (Print): 9781614996712
    DOIs
    Publication status: Published - 2016
    Event: 22nd European Conference on Artificial Intelligence, ECAI 2016 - The Hague, Netherlands
    Duration: 29 Aug 2016 → 2 Sept 2016

    Publication series

    Name: Frontiers in Artificial Intelligence and Applications
    Volume: 285
    ISSN (Print): 0922-6389

    Conference

    Conference: 22nd European Conference on Artificial Intelligence, ECAI 2016
    Place: Netherlands
    City: The Hague
    Period: 29/08/16 → 2/09/16
