Compressed Kernel Perceptrons
- Source :
- DCC
- Publication Year :
- 2009
- Publisher :
- IEEE, 2009.
Abstract
- Kernel machines are a popular class of machine learning algorithms that achieve state-of-the-art accuracy on many real-life classification problems. Kernel perceptrons are among the most popular online kernel machines and are known to achieve high-quality classification despite their simplicity. They are represented by a set of B prototype examples, called support vectors, and their associated weights. To obtain a classification, a new example is compared to the support vectors. Both the space to store a prediction model and the time to provide a single classification scale as O(B). A problem with kernel perceptrons is that, on noisy data, the number of support vectors tends to grow without bound with the number of training examples. To reduce the strain on computational resources, budget kernel perceptrons have been developed that upper bound the number of support vectors. In this work, we propose a new budget algorithm that upper bounds the number of bits needed to store a kernel perceptron. Setting a bit-length constraint could facilitate the development of hardware and software implementations of kernel perceptrons on resource-limited devices such as microcontrollers. The proposed compressed kernel perceptron algorithm decides on the optimal tradeoff between the number of support vectors and their bit precision. The algorithm was evaluated on several benchmark data sets, and the results indicate that it can train highly accurate classifiers even when the available memory budget is below 1 Kbit. This promising result points to the possibility of implementing powerful learning algorithms even on the most resource-constrained computational devices.
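The abstract's two ingredients, an O(B) kernel perceptron with a cap on the number of support vectors and reduced bit precision for the weights, can be illustrated with a minimal sketch. This is not the paper's algorithm: the RBF kernel, the drop-the-oldest eviction rule, and the simple fixed-point weight quantization are all illustrative assumptions standing in for the paper's optimal tradeoff.

```python
import math

def rbf_kernel(x, z, gamma=1.0):
    # Gaussian (RBF) kernel between two examples (assumed choice of kernel)
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

class BudgetKernelPerceptron:
    """Online kernel perceptron with at most `budget` support vectors.

    To hint at the bit-budget idea, each weight is quantized to `bits`
    fractional bits; eviction of the oldest support vector when the
    budget is exceeded is a simplifying assumption, not the paper's rule.
    """

    def __init__(self, budget=10, bits=4, gamma=1.0):
        self.budget = budget
        self.bits = bits
        self.gamma = gamma
        self.sv = []      # prototype examples (support vectors)
        self.alpha = []   # associated (quantized) weights

    def _quantize(self, w):
        # Round a weight to `bits` bits of fractional precision
        scale = 2 ** self.bits
        return round(w * scale) / scale

    def score(self, x):
        # Prediction costs O(B): one kernel evaluation per support vector
        return sum(a * rbf_kernel(x, z, self.gamma)
                   for a, z in zip(self.alpha, self.sv))

    def predict(self, x):
        return 1 if self.score(x) >= 0 else -1

    def fit_one(self, x, y):
        # Classical perceptron rule: update only on a mistake
        if self.predict(x) != y:
            self.sv.append(list(x))
            self.alpha.append(self._quantize(float(y)))
            if len(self.sv) > self.budget:
                # Enforce the budget by discarding the oldest prototype
                self.sv.pop(0)
                self.alpha.pop(0)
```

With a budget of B support vectors in d dimensions and `bits`-bit weights, the model footprint is roughly B·(d·w + bits) bits for some coordinate width w, which is the quantity the paper's compressed variant trades off against accuracy.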
Details
- Database :
- OpenAIRE
- Journal :
- 2009 Data Compression Conference
- Accession number :
- edsair.doi...........5829be9d007dc24571fca5b27bfce71c