
Nayyar Abbas Zaidi, David McG. Squire and David Suter,
A Gradient-based Metric Learning Algorithm for k-NN Classifiers,
In Proceedings of the 23rd Australasian Joint Conference on Artificial Intelligence,
Adelaide, Australia, No. 6464 in Lecture Notes in Computer Science, pp. 194-203, Springer-Verlag, December 7-10, 2010.
Nearest Neighbor (NN) classification and regression techniques are, besides their simplicity, amongst the most widely applied and well-studied techniques for pattern recognition in machine learning. A drawback, however, is the assumption that a suitable metric for measuring distances to the k nearest neighbors is available. It has been shown that k-NN classifiers with a suitable distance metric can outperform other, more sophisticated alternatives such as Support Vector Machines and Gaussian Process classifiers. For this reason, much recent research on k-NN methods has focused on metric learning, i.e. finding an optimized metric. In this paper we propose a simple gradient-based algorithm for metric learning. We discuss in detail the motivations behind metric learning, i.e. error minimization and margin maximization. Our formulation differs from the prevalent metric learning techniques, where the goal is to maximize the classifier's margin. Instead, our proposed technique (MEGM) finds an optimal metric by directly minimizing the mean square error. Our technique not only greatly improves k-NN performance, but also outperforms competing metric learning techniques. Promising results are reported on major UCI-ML databases.
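The core idea of the abstract (learning a metric for k-NN by minimizing mean square error with a gradient method, rather than maximizing margin) can be illustrated with a minimal sketch. This is not the paper's MEGM algorithm: the diagonal metric, the soft kernel-weighted NN predictor, the toy data, and the finite-difference gradient (where the paper would derive analytic gradients) are all simplifying assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class data: the classes differ only in feature 0;
# feature 1 is pure noise that a plain Euclidean NN would still use.
X0 = rng.normal([0.0, 0.0], [0.3, 2.0], size=(30, 2))
X1 = rng.normal([1.0, 0.0], [0.3, 2.0], size=(30, 2))
X = np.vstack([X0, X1])
y = np.array([0.0] * 30 + [1.0] * 30)

def soft_knn_mse(w):
    """Leave-one-out mean square error of a soft (kernel-weighted) NN
    predictor under the diagonal metric d(a, b) = sum_k w_k^2 (a_k - b_k)^2."""
    D = np.sum((w ** 2) * (X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-D)
    np.fill_diagonal(K, 0.0)                  # exclude each point from its own prediction
    p = K @ y / (K.sum(axis=1) + 1e-12)       # soft class-1 probability per point
    return np.mean((p - y) ** 2)

# Gradient descent on the metric weights, here via central finite
# differences for brevity (an analytic gradient would replace this).
w, lr, eps = np.ones(2), 0.2, 1e-5
for _ in range(200):
    g = np.array([(soft_knn_mse(w + eps * e) - soft_knn_mse(w - eps * e)) / (2 * eps)
                  for e in np.eye(2)])
    w -= lr * g

print("learned weights:", w)  # the weight on the noisy feature shrinks
```

The point of the sketch is the objective: instead of a margin criterion, the metric weights are driven directly by the prediction error of the NN rule itself, so the noisy feature is downweighted because doing so reduces the leave-one-out MSE.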
