
Divergence-based classification in learning vector quantization

Authors :
Mwebaze, E.
Schneider, P.
Schleif, F.-M.
Aduwo, J.R.
Quinn, J.A.
Haase, S.
Villmann, T.
Biehl, M.
Source :
Neurocomputing. Apr 2011, Vol. 74, Issue 9, p1429-1435. 7p.
Publication Year :
2011

Abstract

We discuss the use of divergences in dissimilarity-based classification. Divergences can be employed whenever vectorial data consist of non-negative, potentially normalized features. This is, for instance, the case for spectral data or histograms. In particular, we introduce and study divergence-based learning vector quantization (DLVQ). We derive cost-function-based DLVQ schemes for the family of γ-divergences, which includes the well-known Kullback–Leibler divergence and the so-called Cauchy–Schwarz divergence as special cases. The corresponding training schemes are applied to two different real-world data sets. The first one, a benchmark data set (Wisconsin Breast Cancer), is available in the public domain. In the second problem, color histograms of leaf images are used to detect the presence of cassava mosaic disease in cassava plants. We compare the use of standard Euclidean distances with DLVQ for different parameter settings. We show that DLVQ can yield superior classification accuracies and Receiver Operating Characteristics. [Copyright © Elsevier]
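The record contains no implementation details beyond the abstract. As a rough illustration only, the sketch below shows a simplified, LVQ1-style variant of divergence-based prototype training using the Cauchy–Schwarz divergence, D_CS(p, q) = -log( p·q / (‖p‖ ‖q‖) ), as the dissimilarity measure. The function names, learning rate, epoch count, and winner-take-all update rule are assumptions made for illustration; the paper itself derives cost-function-based training schemes for the full γ-divergence family, which this sketch does not reproduce.

```python
import numpy as np

def cauchy_schwarz_divergence(p, q):
    """Cauchy-Schwarz divergence between non-negative vectors p and q."""
    return -np.log(np.dot(p, q) / (np.linalg.norm(p) * np.linalg.norm(q)))

def cs_gradient_wrt_prototype(p, w):
    """Gradient of D_CS(p, w) with respect to the prototype w."""
    return -p / np.dot(p, w) + w / np.dot(w, w)

def dlvq1_train(X, y, prototypes, proto_labels, lr=0.01, epochs=10, rng=None):
    """Hypothetical LVQ1-style training with the CS divergence.

    The winning prototype is attracted to same-class samples and
    repelled from differently labeled ones; prototype entries are
    clipped to stay non-negative, as divergences require.
    """
    rng = rng or np.random.default_rng(0)
    W = prototypes.copy()
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            x = X[i]
            d = [cauchy_schwarz_divergence(x, w) for w in W]
            k = int(np.argmin(d))                    # winning prototype
            sign = 1.0 if proto_labels[k] == y[i] else -1.0
            W[k] -= sign * lr * cs_gradient_wrt_prototype(x, W[k])
            W[k] = np.clip(W[k], 1e-12, None)        # keep entries non-negative
    return W
```

To mimic the paper's setting, X would hold non-negative feature vectors such as normalized color histograms, with one or more prototypes initialized per class; swapping in the Euclidean distance and its gradient would give the baseline the authors compare against.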

Details

Language :
English
ISSN :
0925-2312
Volume :
74
Issue :
9
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession Number :
59641124
Full Text :
https://doi.org/10.1016/j.neucom.2010.10.016