28 results for "Kernel perceptron"
Search Results
2. Arm Motion Capture and Recognition Algorithm Based on MEMS Sensor Networks and KPA
- Author
-
ZeYu Wang, Guoxing Yi, Lei Hu, and Zhihui Cao
- Subjects
Kernel perceptron ,business.industry ,Computer science ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Perceptron ,Motion capture ,Motion (physics) ,Support vector machine ,Units of measurement ,Inertial measurement unit ,Computer Science::Computer Vision and Pattern Recognition ,Computer vision ,Artificial intelligence ,business ,Wireless sensor network - Abstract
With the development of artificial intelligence, human-computer interaction technology has become increasingly popular. Motion capture and recognition technology based on sensor networks and machine learning algorithms has gradually drawn wide attention, and extensive work has been done. This article introduces an arm motion capture system based on MEMS sensor networks that takes the micro inertial measurement unit as its core, and uses the Kernel Perceptron Algorithm (KPA) to realize motion classification and recognition. This algorithm combines the advantages of the Support Vector Machine (SVM) and the perceptron algorithm to gain a faster model training speed while maintaining high recognition accuracy. Extensive experiments show that arm motions can be accurately captured by MEMS sensor networks, and that the KPA recognizes motions well while training faster than SVM.
- Published
- 2021
3. Machine learning based smart steering for wireless mesh networks
- Author
-
Ozgur Gurbuz, Bulut Kuskonmaz, and Huseyin Ozkan
- Subjects
Kernel perceptron ,Wireless mesh network ,Computer Networks and Communications ,Computer science ,business.industry ,Mesh networking ,Online machine learning ,Machine learning ,computer.software_genre ,Support vector machine ,Hardware and Architecture ,Artificial intelligence ,Online algorithm ,business ,computer ,Software - Abstract
Steering actions in wireless mesh networks refer to requesting clients to change their access points (APs) to better exploit the mesh network and achieve higher-quality connections. However, steering actions, especially for sticky clients, do not always produce the intended outcome. In this work, we address this issue from a machine learning perspective: we formulate a classification problem in both batch (SVM) and online (kernel perceptron) settings based on various network features. We train classifiers to learn the nonlinear regions of correct decisions to maximize the overall success probability of steering actions. In particular, the presented online kernel perceptron classifier (1) learns sequentially at the cloud from the entire data of multiple mesh networks and (2) operates at APs for steering; both are executed in real time. The presented algorithm is completely data driven, adaptive, optimal in its steering and real-time, hence the name Online Machine Learning for Smart Steering. In our experiments, the batch algorithm achieves at least 95% classification accuracy in identifying the conditions for successful steering. The online algorithm, on the other hand, approximates the baseline accuracy to within a small margin with relatively negligible space and computational complexity, allowing real-time steering.
- Published
- 2019
4. Autonomous Navigation in Unknown Environments using Sparse Kernel-based Occupancy Mapping
- Author
-
Nikolay Atanasov, Michael Yip, Thai P. Duong, and Nikhil Das
- Subjects
FOS: Computer and information sciences ,Computer Science - Machine Learning ,Occupancy ,Computer science ,Systems and Control (eess.SY) ,Electrical Engineering and Systems Science - Systems and Control ,Machine Learning (cs.LG) ,Computer Science::Robotics ,Computer Science - Robotics ,FOS: Electrical engineering, electronic engineering, information engineering ,Computer vision ,Kernel perceptron ,business.industry ,Autonomous robot ,Obstacle ,Decision boundary ,Robot ,Configuration space ,Artificial intelligence ,business ,Classifier (UML) ,Robotics (cs.RO) - Abstract
This paper focuses on real-time occupancy mapping and collision checking onboard an autonomous robot navigating in an unknown environment. We propose a new map representation, in which occupied and free space are separated by the decision boundary of a kernel perceptron classifier. We develop an online training algorithm that maintains a very sparse set of support vectors to represent obstacle boundaries in configuration space. We also derive conditions that allow complete (without sampling) collision checking for piecewise-linear and piecewise-polynomial robot trajectories. We demonstrate the effectiveness of our mapping and collision checking algorithms for autonomous navigation of an Ackermann-drive robot in unknown environments. (Comment: Accepted to ICRA 2020)
- Published
- 2020
- Full Text
- View/download PDF
5. Kernel perceptron algorithm for sinusitis classification
- Author
-
Z. Rustam, J. Pandelaki, and Sri Hartini
- Subjects
History ,Kernel perceptron ,Computer science ,business.industry ,medicine ,Pattern recognition ,Artificial intelligence ,Sinusitis ,medicine.disease ,business ,Computer Science Applications ,Education - Abstract
Sinusitis is one of the most commonly diagnosed diseases in the world. Its diagnosis is usually based on clinical signs and symptoms, which has led to the development and use of many machine learning methods to provide a better diagnosis. This research therefore applied a kernel perceptron method to a sinusitis dataset, consisting of 102 acute and 98 chronic samples, obtained from Cipto Mangunkusumo Hospital in Indonesia. The RBF and polynomial kernel functions were used for several k values in k-fold cross-validation, and the results were compared in terms of accuracy, sensitivity, precision, specificity, and F1-score. From the experiments, it was concluded that the kernel parameter σ = 0.0001 achieved excellent performance in every k-fold, with the best performance obtained using 10-fold cross-validation. Meanwhile, the polynomial degree did not affect the kernel perceptron performance, although 7-fold cross-validation can be considered to obtain better performance from the polynomial-kernel perceptron.
- Published
- 2020
6. Experiments with Adabag in Biology Classification Tasks
- Author
-
María Pérez-Ortiz, E. Cernadas, and Manuel Fernández-Delgado
- Subjects
Kernel perceptron ,biology ,business.industry ,Pattern recognition ,Merluccius merluccius ,biology.organism_classification ,Fecundity ,Expression (mathematics) ,Software ,Simple (abstract algebra) ,Kernel (statistics) ,Classifier (linguistics) ,Artificial intelligence ,business - Abstract
The assessment of fecundity is fundamental in the study of biology and in defining the management of sustainable fisheries. Stereometry is an accurate method for estimating fecundity from histological images. This chapter shows some histological images of the fish species Merluccius merluccius. The direct kernel perceptron (DKP) is a very simple and fast kernel-based classifier whose trainable parameters are calculated directly, without any iterative training, using an analytical closed-form expression that involves only the training patterns and the classes to which they belong. An accurate fish fecundity estimation must only consider mature oocytes, which must be reliably classified, according to their stage of development, by experienced personnel using histological images. The fish oocytes were manually drawn and labelled with the development stage by expert technicians of the Institute of Marine Research CSIC using the Govocitos software. Adaboost.M1 in Weka (ABW) performs much worse than the Adabag version in all the species and experiments.
- Published
- 2018
7. Extending instance-based and linear models
- Author
-
Eibe Frank, Christopher J. Pal, Mark Hall, and Ian H. Witten
- Subjects
Support vector machine ,Kernel method ,Kernel perceptron ,business.industry ,Polynomial kernel ,Kernel embedding of distributions ,Linear model ,Principal component regression ,Artificial intelligence ,Perceptron ,business ,Mathematics - Abstract
We begin by revisiting the basic instance-based learning method of nearest-neighbor classification and considering how it can be made more robust and storage efficient by generalizing both exemplars and distance functions. We then discuss two well-known approaches for generalizing linear models that go beyond modeling linear relationships between the inputs and the outputs. The first is based on the so-called kernel trick, which implicitly creates a high-dimensional feature space and models linear relationships in this extended space. We discuss support vector machines for classification and regression, kernel ridge regression, and kernel perceptrons. The second approach is based on applying simple linear models in a network structure that includes nonlinear transformations. This yields neural networks, and we discuss the classical multilayer perceptron. The final part of the chapter discusses an alternative method for tackling learning problems with complex relationships: building linear models that are local in the sense that they only apply to a small part of the input space. We consider model trees, which are decision trees with linear regression models at the leaf nodes, and locally weighted linear regression, which combines instance-based learning and linear regression.
- Published
- 2017
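The dual-form ("kernelized") perceptron that this chapter covers can be sketched in a few lines of Python. This is a generic illustration of the kernel trick applied to Rosenblatt's perceptron, not code from the book; the RBF kernel, its gamma value and the XOR-style toy data are assumptions chosen for the demo.

```python
import numpy as np

def rbf_kernel(x, z, gamma=1.0):
    # Gaussian (RBF) kernel: exp(-gamma * ||x - z||^2)
    return np.exp(-gamma * np.sum((x - z) ** 2))

def train_kernel_perceptron(X, y, kernel, epochs=20):
    # Dual form: alpha[i] counts the mistakes made on example i, so the
    # weight vector is implicitly sum_i alpha[i] * y[i] * phi(X[i]).
    n = len(X)
    alpha = np.zeros(n)
    for _ in range(epochs):
        for i in range(n):
            f = sum(alpha[j] * y[j] * kernel(X[j], X[i]) for j in range(n))
            if y[i] * f <= 0:        # mistake-driven update
                alpha[i] += 1
    return alpha

def predict(X_train, y_train, alpha, kernel, x):
    f = sum(alpha[j] * y_train[j] * kernel(X_train[j], x)
            for j in range(len(X_train)))
    return 1 if f >= 0 else -1

# XOR-style data: not linearly separable in the input space,
# but separable in the RBF feature space.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1, 1, 1, -1])
alpha = train_kernel_perceptron(X, y, rbf_kernel)
preds = [predict(X, y, alpha, rbf_kernel, x) for x in X]
```

With the RBF kernel the perceptron fits the XOR pattern, which no linear perceptron in the raw input space can do.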
8. A study of visual behavior of multidimensional scaling for kernel perceptron algorithm
- Author
-
Kuo-Shong Wang, Shih-Hsing Chang, Che-Chang Hsu, and Hung-Yuan Chung
- Subjects
Kernel perceptron ,business.industry ,Feature vector ,Hilbert space ,Multimodal distribution ,Pattern recognition ,Machine learning ,computer.software_genre ,Visualization ,symbols.namesake ,Artificial Intelligence ,symbols ,Multidimensional scaling ,Artificial intelligence ,Sources of error ,business ,Classifier (UML) ,computer ,Software ,Mathematics - Abstract
The class imbalance problem occurs when the classifier must detect a rare but important class. The purpose of this paper is to study whether the sources of error are not only the imbalance but also other factors acting in combination, which lead to these misclassifications. The theoretical difficulties in purely predictive settings arise from the lack of visualization. Therefore, for kernel classifiers we propose a link with a kernel version of multidimensional scaling in high-dimensional feature space. The transformed version of the features specifically discloses the intrinsic structure of the Hilbert space and is then used as input into a learning system; in the example, this prediction method is based on the SVMs-rebalance methodology. The graphical representations indicate the effects of masking, skewed, and multimodal distributions, which are also responsible for the poor performance. By studying the properties of the misclassifications, we can further develop ways to improve them.
- Published
- 2014
9. Formalized Generalization Bounds for Perceptron-Like Algorithms
- Author
-
Kelby, Robin J.
- Subjects
- Computer Science, Artificial Intelligence, Kernel Perceptron, Budget Kernel Perceptron, Software Verification, Coq, Machine learning, Generalization error
- Abstract
Machine learning algorithms are integrated into many aspects of daily life. However, research into the correctness and security of these important algorithms has lagged behind experimental results and improvements. My research seeks to add to our theoretical understanding of the Perceptron family of algorithms, which includes the Kernel Perceptron, Budget Kernel Perceptron, and Description Kernel Perceptron algorithms. In this thesis, I describe three variants of the Kernel Perceptron algorithm and provide both proof and performance results for verified implementations of these algorithms written in the Coq Proof Assistant. This research employs generalization error, which bounds how poorly a model may perform on unseen testing data, as a guarantee of performance, with proofs verified in Coq. These implementations are also extracted to the functional language Haskell to evaluate their generalization error and performance on real and synthetic data sets.
- Published
- 2020
10. Scalable classification for large dynamic networks
- Author
-
Yibo Yao and Lawrence B. Holder
- Subjects
Graph kernel ,Kernel perceptron ,business.industry ,Computer science ,Feature extraction ,Pattern recognition ,Graph ,Support vector machine ,Kernel (linear algebra) ,ComputingMethodologies_PATTERNRECOGNITION ,Kernel method ,Discriminative model ,Kernel embedding of distributions ,Polynomial kernel ,String kernel ,Radial basis function kernel ,Entropy (information theory) ,Artificial intelligence ,Tree kernel ,business ,MathematicsofComputing_DISCRETEMATHEMATICS - Abstract
We examine the problem of node classification in large-scale and dynamically changing graphs. An entropy-based subgraph extraction method has been developed for extracting subgraphs surrounding the nodes to be classified. We introduce an online version of an existing graph kernel to incrementally compute the kernel matrix for an unbounded stream of these extracted subgraphs. After obtaining the kernel values, we adopt a kernel perceptron to learn a discriminative classifier and predict the class labels of the target nodes from their corresponding subgraphs. We demonstrate the advantages of our learning techniques by conducting empirical evaluations on two real-world graph datasets.
- Published
- 2015
11. Fuzzy kernel perceptron
- Author
-
Chu-Song Chen and Jiun-Hung Chen
- Subjects
Kernel perceptron ,Computer Networks and Communications ,business.industry ,Feature vector ,Pattern recognition ,General Medicine ,Perceptron ,Fuzzy logic ,Computer Science Applications ,Support vector machine ,ComputingMethodologies_PATTERNRECOGNITION ,Kernel method ,Hyperplane ,Artificial Intelligence ,Kernel (statistics) ,Artificial intelligence ,business ,Software ,Mathematics - Abstract
A new learning method, the fuzzy kernel perceptron (FKP), in which the fuzzy perceptron (FP) and Mercer kernels are incorporated, is proposed in this paper. The proposed method first maps the input data into a high-dimensional feature space using an implicit mapping function. Then, the FP is adopted to find a linear separating hyperplane in the high-dimensional feature space. Compared with the FP, the FKP is more suitable for solving linearly nonseparable problems. In addition, it is also more efficient than the kernel perceptron (KP). Experimental results show that the FKP has better classification performance than the FP, the KP, and the support vector machine.
- Published
- 2002
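The core idea behind the fuzzy kernel perceptron, scaling mistake-driven updates by a fuzzy membership degree so that atypical samples pull the decision boundary less, can be sketched as follows. This is a schematic illustration under stated assumptions, not Chen and Chen's exact formulation: the distance-based membership function and the toy data are invented for the demo.

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    return np.exp(-gamma * np.sum((x - z) ** 2))

def class_memberships(X, y):
    # Illustrative membership assignment: samples closer to their class
    # mean get a degree nearer 1.0 (Keller and Hunt's scheme is richer).
    m = np.zeros(len(X))
    for c in (-1, 1):
        idx = np.where(y == c)[0]
        d = np.linalg.norm(X[idx] - X[idx].mean(axis=0), axis=1)
        m[idx] = 0.5 + 0.5 / (1.0 + d)
    return m

def train_fuzzy_kernel_perceptron(X, y, m, kernel=rbf, epochs=20):
    # Dual-form perceptron whose updates are weighted by membership,
    # so low-membership (atypical) samples influence the boundary less.
    alpha = np.zeros(len(X))
    for _ in range(epochs):
        for i in range(len(X)):
            f = sum(alpha[j] * y[j] * kernel(X[j], X[i])
                    for j in range(len(X)))
            if y[i] * f <= 0:
                alpha[i] += m[i]    # fuzzy-weighted update
    return alpha

def decision(X_train, y_train, alpha, kernel, x):
    return sum(alpha[j] * y_train[j] * kernel(X_train[j], x)
               for j in range(len(X_train)))

# Two toy clusters; the second point of each cluster is slightly atypical.
X = np.array([[0., 0.], [0., 1.], [3., 3.], [3., 4.]])
y = np.array([-1, -1, 1, 1])
m = class_memberships(X, y)
alpha = train_fuzzy_kernel_perceptron(X, y, m)
preds = [1 if decision(X, y, alpha, rbf, x) >= 0 else -1 for x in X]
```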
12. Kernel ridge regression for supervised classification
- Author
-
S. Y. Kung
- Subjects
Multi-label classification ,Kernel perceptron ,business.industry ,Pattern recognition ,Machine learning ,computer.software_genre ,Multiclass classification ,Kernel method ,Lasso (statistics) ,Polynomial kernel ,Radial basis function kernel ,Artificial intelligence ,Kernel Fisher discriminant analysis ,business ,computer ,Mathematics - Published
- 2014
13. Offline Signature Verification Using Support Vector Machine
- Author
-
Deepika C. Shet and C. Kruthi
- Subjects
Support vector machine ,Kernel perceptron ,Digital signature ,Computer science ,business.industry ,Histogram ,Feature extraction ,Centroid ,Pattern recognition ,Artificial intelligence ,business ,Grayscale ,Edge detection - Abstract
This paper aims at developing a support vector machine for identity verification of offline signatures based on feature values in a database. A set of signature samples is collected from individuals, and these samples are scanned with a grayscale scanner. The scanned signature images are then subjected to a number of image enhancement operations such as binarization, complementation, filtering, thinning and edge detection. From these pre-processed signatures, features such as the centroid, centre of gravity, number of loops, horizontal and vertical profiles and normalized area are extracted and stored in a database. The values from the database are fed to the support vector machine, which draws a hyperplane and classifies each signature as original or forged based on a particular feature value. The developed SVM was successfully tested against 336 signature samples, and the classification error rate is less than 7.16%, which is found to be convincing.
- Published
- 2014
14. An Empirical Evaluation of the Fuzzy Kernel Perceptron
- Author
-
Gavin C. Cawley
- Subjects
Kernel perceptron ,biology ,Artificial neural network ,Computer Networks and Communications ,business.industry ,Fuzzy set ,Pattern recognition ,General Medicine ,biology.organism_classification ,Perceptron ,Fuzzy logic ,GeneralLiterature_MISCELLANEOUS ,Computer Science Applications ,Support vector machine ,ComputingMethodologies_PATTERNRECOGNITION ,Chen ,Kernel method ,Artificial Intelligence ,Artificial intelligence ,business ,Software ,Mathematics - Abstract
J.-H. Chen and C.-S. Chen have recently proposed a nonlinear variant of Keller and Hunt's fuzzy perceptron algorithm, based on the now familiar "kernel trick." In this letter, we demonstrate experimentally that J.-H. Chen and C.-S. Chen's assertion that the fuzzy kernel perceptron (FKP) outperforms the support vector machine (SVM) cannot be sustained. A more thorough model comparison exercise, based on a much wider range of benchmark data sets, shows that the FKP algorithm is not competitive with the SVM.
- Published
- 2007
15. Direct Kernel Perceptron (DKP): ultra-fast kernel ELM-based classification with non-iterative closed-form weight calculation
- Author
-
E. Cernadas, José Neves, Jorge Ribeiro, Manuel Fernández-Delgado, and Senén Barro
- Subjects
Kernel perceptron ,Support Vector Machine ,business.industry ,Cognitive Neuroscience ,Feature vector ,Discriminant Analysis ,Pattern recognition ,Linear classifier ,Linear discriminant analysis ,Perceptron ,Classification ,Support vector machine ,ComputingMethodologies_PATTERNRECOGNITION ,Artificial Intelligence ,Linear Models ,Humans ,Computer Simulation ,Artificial intelligence ,AdaBoost ,Neural Networks, Computer ,business ,Algorithms ,Mathematics ,Extreme learning machine - Abstract
The Direct Kernel Perceptron (DKP) (Fernandez-Delgado et al., 2010) is a very simple and fast kernel-based classifier, related to the Support Vector Machine (SVM) and to the Extreme Learning Machine (ELM) (Huang, Wang, & Lan, 2011), whose α-coefficients are calculated directly, without any iterative training, using an analytical closed-form expression which involves only the training patterns. The DKP, which is inspired by the Direct Parallel Perceptron (Auer et al., 2008), uses a Gaussian kernel and a linear classifier (perceptron). The weight vector of this classifier in the feature space minimizes an error measure which combines the training error and the hyperplane margin, without any tunable regularization parameter. This weight vector can be translated, using a variable change, to the α-coefficients, and both are determined without iterative calculations. We calculate solutions using several error functions, achieving the best trade-off between accuracy and efficiency with the linear function. These solutions for the α-coefficients can be considered alternatives to the ELM with a new physical meaning in terms of error and margin: in fact, the linear and quadratic DKP are special cases of the two-class ELM when the regularization parameter C takes the values C = 0 and C = ∞. The linear DKP is extremely efficient and much faster (over a vast collection of 42 benchmark and real-life data sets) than 12 very popular and accurate classifiers, including SVM, Multi-Layer Perceptron, Adaboost, Random Forest and Bagging of RPART decision trees, Linear Discriminant Analysis, K-Nearest Neighbors, ELM, Probabilistic Neural Networks, Radial Basis Function neural networks and Generalized ART. Besides, despite its simplicity and extreme efficiency, DKP achieves higher accuracies than 7 out of 12 classifiers, exhibiting small differences with respect to the best ones (SVM, ELM, Adaboost and Random Forest), which are much slower.
Thus, the DKP provides an easy and fast way to achieve classification accuracies which are not too far from the best one for a given problem. The C and Matlab code of DKP are freely available.
- Published
- 2013
16. Incremental Learning on a Budget and Its Application to Power Electronics
- Author
-
Akinari Maeda, Kiyotaka Nakano, Yusuke Kondo, Koichiro Yamauchi, and Akihisa Kato
- Subjects
Kernel perceptron ,business.industry ,Computer science ,Machine learning ,computer.software_genre ,Upper and lower bounds ,Set (abstract data type) ,Kernel (linear algebra) ,Kernel method ,Kernel (statistics) ,Bounded function ,Power electronics ,Incremental learning ,Artificial intelligence ,business ,computer - Abstract
In this paper, we present an incremental learning method on a budget for embedded systems. We discuss its application to two power systems: a micro-converter for photovoltaics and a step-down DC-DC converter. This learning method is a variation of the general regression neural network, but it is able to continue incremental learning on a bounded support set. The method basically learns new instances by adding new kernels. However, when the number of kernels reaches a predefined upper bound, the method selects the most effective learning option from several alternatives: replacing the most ineffective kernel with the new kernel, modifying the parameters of existing kernels, or ignoring the new instance. The proposed method is compared with other similar learning methods on a budget, which are based on the kernel perceptron. Two applications of the proposed method are demonstrated in power electronics. In these two examples, we show that the proposed system learns the properties of the control objects during service and realizes quick control.
- Published
- 2013
17. Fast weight calculation for kernel-based perceptron in two-class classification problems
- Author
-
Jorge Ribeiro, Senén Barro, Manuel Fernández-Delgado, and E. Cernadas
- Subjects
Kernel perceptron ,Artificial neural network ,business.industry ,Pattern recognition ,Perceptron ,Linear discriminant analysis ,Support vector machine ,symbols.namesake ,ComputingMethodologies_PATTERNRECOGNITION ,Dimension (vector space) ,Kernel (statistics) ,symbols ,Artificial intelligence ,business ,Gaussian process ,Mathematics - Abstract
We propose a method, called the Direct Kernel Perceptron (DKP), to directly calculate the weights of a single perceptron using a closed-form expression which does not require any training stage. The weights minimize a performance measure which simultaneously takes into account the training error and the classification margin of the perceptron. The ability to learn non-linearly separable problems is provided by a kernel mapping between the input and the hidden space. Using Gaussian kernels, DKP achieves better results than the standard Support Vector Machine (SVM) and Linear Discriminant Analysis (LDA) for a wide variety of benchmark two-class data sets. The computational cost of DKP increases linearly with the dimension of the input space and is much lower than that of SVM.
- Published
- 2010
18. Learning to rank with a novel kernel perceptron method
- Author
-
Xiaotong Lin, Haixun Wang, and Xue-wen Chen
- Subjects
Active learning (machine learning) ,Computer science ,Feature vector ,Stability (learning theory) ,Semi-supervised learning ,computer.software_genre ,Machine learning ,Ranking SVM ,Instance-based learning ,Kernel perceptron ,business.industry ,Supervised learning ,Online machine learning ,Perceptron ,Ensemble learning ,Generalization error ,Ranking ,Kernel (statistics) ,Outlier ,Decision boundary ,Unsupervised learning ,Learning to rank ,Artificial intelligence ,Data mining ,business ,computer - Abstract
While conventional ranking algorithms, such as PageRank, rely on the web structure to decide the relevancy of a web page, learning to rank seeks a function capable of ordering a set of instances using a supervised learning approach. Learning to rank has gained increasing popularity in the information retrieval and machine learning communities. In this paper, we propose a novel nonlinear perceptron method for rank learning. The proposed method is an online algorithm and simple to implement. It introduces a kernel function to map the original feature space into a nonlinear space and employs a perceptron method to minimize the ranking error by avoiding convergence to a solution near the decision boundary and alleviating the effect of outliers in the training dataset. Furthermore, unlike existing approaches such as RankSVM and RankBoost, the proposed method is scalable to large datasets for online learning. Experimental results on benchmark corpora show that our approach is more efficient and achieves higher or comparable accuracies in instance ranking than state-of-the-art methods such as FRank, RankSVM and RankBoost.
- Published
- 2009
19. Compressed Kernel Perceptrons
- Author
-
Vladimir Coric, Slobodan Vucetic, and Zhuang Wang
- Subjects
Kernel perceptron ,business.industry ,Computer science ,Machine learning ,computer.software_genre ,Kernel method ,String kernel ,Polynomial kernel ,Kernel embedding of distributions ,Kernel (statistics) ,Radial basis function kernel ,Artificial intelligence ,Tree kernel ,business ,computer - Abstract
Kernel machines are a popular class of machine learning algorithms that achieve state-of-the-art accuracies on many real-life classification problems. Kernel perceptrons are among the most popular online kernel machines and are known to achieve high-quality classification despite their simplicity. They are represented by a set of B prototype examples, called support vectors, and their associated weights. To obtain a classification, a new example is compared to the support vectors. Both the space to store a prediction model and the time to provide a single classification scale as O(B). A problem with kernel perceptrons is that on noisy data the number of support vectors tends to grow without bound with the number of training examples. To reduce the strain on computational resources, budget kernel perceptrons have been developed that upper-bound the number of support vectors. In this work, we propose a new budget algorithm that upper-bounds the number of bits needed to store a kernel perceptron. Setting a bitlength constraint could facilitate hardware and software implementations of kernel perceptrons on resource-limited devices such as microcontrollers. The proposed compressed kernel perceptron algorithm decides on the optimal tradeoff between the number of support vectors and their bit precision. The algorithm was evaluated on several benchmark data sets, and the results indicate that it can train highly accurate classifiers even when the available memory budget is below 1 Kbit. This promising result points to the possibility of implementing powerful learning algorithms even on the most resource-constrained computational devices.
- Published
- 2009
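The budget mechanism that this compression work builds on can be sketched simply: cap the number of support vectors and evict one whenever the cap is exceeded. The eviction policy below (drop the oldest support vector) is a deliberate simplification for illustration; published budget algorithms use more careful removal criteria, and the compressed kernel perceptron additionally optimizes the bit precision of the stored model, which is not modeled here.

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    return np.exp(-gamma * np.sum((x - z) ** 2))

class BudgetKernelPerceptron:
    """Online kernel perceptron keeping at most `budget` support vectors,
    so memory and prediction time stay O(budget) instead of growing
    without bound on noisy streams."""

    def __init__(self, budget, kernel=rbf):
        self.budget = budget
        self.kernel = kernel
        self.sv = []                        # list of (x, y) support vectors

    def decision(self, x):
        return sum(y * self.kernel(xs, x) for xs, y in self.sv)

    def partial_fit(self, x, y):
        if y * self.decision(x) <= 0:       # mistake-driven update
            self.sv.append((x, y))
            if len(self.sv) > self.budget:  # enforce the budget
                self.sv.pop(0)              # simplistic policy: drop oldest

    def predict(self, x):
        return 1 if self.decision(x) >= 0 else -1

# Tiny streaming example: the label is the sign of the first coordinate.
model = BudgetKernelPerceptron(budget=5)
for x, y in [(np.array([2., 0.]), 1), (np.array([-2., 0.]), -1)] * 3:
    model.partial_fit(x, y)
pred_pos = model.predict(np.array([3., 0.]))   # expected +1
pred_neg = model.predict(np.array([-3., 0.]))  # expected -1
```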
20. SIRMs connected fuzzy inference method using kernel method
- Author
-
F. Mizuguchi, Hiroaki Ishii, Masaharu Mizumoto, Soichiro Watanabe, and Hirosato Seki
- Subjects
Kernel perceptron ,Kernel method ,business.industry ,Kernel (statistics) ,Fuzzy set ,Method of steepest descent ,Exclusive or ,Pattern recognition ,Function (mathematics) ,Artificial intelligence ,business ,Fuzzy logic ,Mathematics - Abstract
The single input rule modules connected fuzzy inference method (SIRMs method, for short) by Yubazaki can decrease the number of fuzzy rules drastically in comparison with conventional fuzzy inference methods. Seki et al. have proposed the functional-type single input rule modules connected fuzzy inference method (functional-type SIRMs method, for short), which generalizes the consequent part of the SIRMs method to a function. However, these SIRMs methods cannot be applied to XOR (exclusive OR). In this paper, we propose a "kernel-type single input rule modules connected fuzzy inference method", which applies the kernel trick to the SIRMs method, and show that this method can treat XOR. Further, a learning algorithm for the proposed SIRMs method is derived using the steepest descent method, and is shown to be superior to those of the conventional SIRMs method and the kernel perceptron when applied to the identification of nonlinear functions.
- Published
- 2008
21. Accelerating Kernel Perceptron Learning
- Author
-
José R. Dorronsoro, Ana González, and Daniel Jaque García
- Subjects
Kernel perceptron ,business.industry ,Training (meteorology) ,Acceleration (differential geometry) ,Pattern recognition ,Perceptron ,Support vector machine ,Hyperplane ,Sample size determination ,Margin (machine learning) ,Artificial intelligence ,business ,Algorithm ,Mathematics - Abstract
Recently it has been shown that appropriate perceptron training methods, such as the Schlesinger-Kozinec (SK) algorithm, can provide maximal margin hyperplanes with training cost O(N × T), with N denoting the sample size and T the number of training iterations. In this work we relate SK training to the classical Rosenblatt rule and show that, when the hyperplane vector is written in dual form, the support vector (SV) coefficients determine their appearance frequency in training; in particular, large-coefficient SVs penalize training costs. In this light we explore a training acceleration procedure in which large-coefficient and, hence, large-cost SVs are removed from training, which allows for a further stable large-sample shrinking. As we shall see, this results in much faster training while not penalizing test classification.
- Published
- 2007
22. A Multiclass Kernel Perceptron Algorithm
- Author
-
Jianhua Xu and Xuegong Zhang
- Subjects
Computer Science::Machine Learning ,Kernel perceptron ,Structured support vector machine ,business.industry ,Computer science ,Computer Science::Neural and Evolutionary Computation ,Pattern recognition ,Perceptron ,Multiclass classification ,Support vector machine ,ComputingMethodologies_PATTERNRECOGNITION ,Kernel method ,Computer Science::Computer Vision and Pattern Recognition ,Radial basis function kernel ,Artificial intelligence ,Kernel Fisher discriminant analysis ,business ,Algorithm - Abstract
Original kernel machines (e.g., the support vector machine, least squares support vector machine, kernel Fisher discriminant analysis, kernel perceptron algorithm, etc.) were mainly designed for binary classification. How to effectively extend them to multiclass classification is still an ongoing research issue. Rosenblatt's linear perceptron algorithm for binary classification and its corresponding multiclass linear version are the simplest learning machines in terms of their algorithmic routines. The kernel perceptron algorithm for binary classification was constructed by extending the linear perceptron algorithm with a Mercer kernel. In this paper, a multiclass kernel perceptron algorithm is proposed by combining the multiclass linear perceptron algorithm with the binary kernel perceptron algorithm; it can deal with multiclass classification problems directly and nonlinearly in a simple iterative procedure. Two artificial examples and four benchmark datasets are used to evaluate the performance of our multiclass method. The experimental results show that our algorithm achieves good classification performance.
- Published
- 2006
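A direct multiclass kernel perceptron of the kind the abstract above describes can be sketched as follows (a minimal illustration, not the authors' algorithm; the polynomial kernel and its degree are assumptions). One dual coefficient vector is kept per class, and a mistake reinforces the true class while penalizing the predicted one:

```python
import numpy as np

def poly_kernel(X, Y, degree=2):
    # inhomogeneous polynomial (Mercer) kernel
    return (X @ Y.T + 1.0) ** degree

def multiclass_kernel_perceptron(X, y, n_classes, T=50, degree=2):
    """One dual coefficient row per class; on a mistake the true
    class is pulled up and the predicted class pushed down, the
    multiclass analogue of the binary kernel perceptron update."""
    N = len(y)
    A = np.zeros((n_classes, N))   # dual coefficients, one row per class
    K = poly_kernel(X, X, degree)
    for _ in range(T):
        mistakes = 0
        for i in range(N):
            scores = A @ K[:, i]           # discriminant value per class
            pred = int(np.argmax(scores))
            if pred != y[i]:
                A[y[i], i] += 1            # reinforce the true class
                A[pred, i] -= 1            # penalize the wrong class
                mistakes += 1
        if mistakes == 0:
            break
    return A

# three well-separated clusters, labels 0..2
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.3, (15, 2)) for c in ((-3, 0), (0, 3), (3, 0))])
y = np.repeat(np.arange(3), 15)
A = multiclass_kernel_perceptron(X, y, n_classes=3)
preds = np.argmax(A @ poly_kernel(X, X), axis=0)
```

The iteration handles all classes in a single loop, matching the abstract's claim that the method deals with multiclass problems directly rather than via one-vs-rest reductions.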
23. Selecting the kernel type for a web-based adaptive image retrieval system (AIRS)
- Author
-
Anca Doloc-Mihu and Vijay V. Raghavan
- Subjects
Kernel perceptron ,business.industry ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Pattern recognition ,Machine learning ,computer.software_genre ,Kernel principal component analysis ,Kernel embedding of distributions ,Variable kernel density estimation ,String kernel ,Polynomial kernel ,Computer Science::Computer Vision and Pattern Recognition ,Radial basis function kernel ,Artificial intelligence ,Tree kernel ,business ,computer ,Mathematics - Abstract
The goal of this paper is to investigate kernel selection for a Web-based AIRS. Using the kernel perceptron learning method, several kernels of polynomial and Gaussian Radial Basis Function (RBF) form (six polynomial and six RBF kernels) are applied to general images represented by color histograms in the RGB and HSV color spaces. Experimental results on these collections show that performance varies significantly across kernel types and that choosing an appropriate kernel is important.
- Published
- 2006
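The two kernel families compared in the record above can be sketched on histogram inputs (a minimal illustration; the bin count, degree, and gamma values are illustrative assumptions, not the paper's settings):

```python
import numpy as np

def poly_kernel(h1, h2, degree=3):
    # polynomial-form kernel on histogram vectors
    return (h1 @ h2 + 1.0) ** degree

def rbf_kernel(h1, h2, gamma=5.0):
    # Gaussian RBF-form kernel on histogram vectors
    return np.exp(-gamma * ((h1 - h2) ** 2).sum())

# two hypothetical 8-bin color histograms, normalized to sum to 1
rng = np.random.default_rng(2)
h_a = rng.random(8); h_a /= h_a.sum()
h_b = rng.random(8); h_b /= h_b.sum()

k_poly = poly_kernel(h_a, h_b)   # unbounded similarity score
k_rbf = rbf_kernel(h_a, h_b)     # bounded in (0, 1]
```

The RBF kernel's bounded, distance-based response versus the polynomial kernel's unbounded inner-product response is one reason the choice between the two families can change retrieval performance significantly.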
24. A soft Bayes perceptron
- Author
-
M. Bruckner and W. Dilger
- Subjects
Kernel perceptron ,business.industry ,Computer science ,Computer Science::Neural and Evolutionary Computation ,Pattern recognition ,Machine learning ,computer.software_genre ,Perceptron ,Bayes' theorem ,Kernel (linear algebra) ,ComputingMethodologies_PATTERNRECOGNITION ,Kernel method ,Kernel (statistics) ,Benchmark (computing) ,Point (geometry) ,Artificial intelligence ,business ,computer - Abstract
The kernel perceptron is one of the simplest and fastest kernel machines; its performance, however, is inferior to that of other well-known kernel machines. We introduce an algorithm that combines several approaches, mainly Herbrich's large-scale Bayes point machine and the soft perceptron, in order to improve the kernel perceptron. Our experiments on standard benchmark datasets show that the performance of the perceptron can be improved significantly with similar computational effort.
- Published
- 2006
25. Learning on Graphs in the Game of Go
- Author
-
Marco Krüger, Thore Graepel, Ralf Herbrich, and Mike Goutrié
- Subjects
Support vector machine ,Kernel (linear algebra) ,Kernel perceptron ,Artificial neural network ,Computer science ,business.industry ,Feature vector ,Graph theory ,Artificial intelligence ,Game tree ,Perceptron ,business ,Graph - Abstract
We consider the game of Go from the point of view of machine learning and as a well-defined domain for learning on graph representations. We discuss the representation of both board positions and candidate moves and introduce the common fate graph (CFG) as an adequate representation of board positions for learning. Single candidate moves are represented as feature vectors with features given by subgraphs relative to the given move in the CFG. Using this representation we train a support vector machine (SVM) and a kernel perceptron to discriminate good moves from bad moves on a collection of life-and-death problems and on 9 × 9 game records. We thus obtain kernel machines that solve Go problems and play 9 × 9 Go.
- Published
- 2001
26. Learning Kernel Classifiers
- Author
-
Ralf Herbrich
- Subjects
Graph kernel ,Theoretical computer science ,Kernel perceptron ,Computer science ,business.industry ,Machine learning ,computer.software_genre ,Kernel method ,Computational learning theory ,Kernel embedding of distributions ,Polynomial kernel ,Radial basis function kernel ,Artificial intelligence ,Tree kernel ,business ,Algorithm ,computer - Abstract
From the Publisher: Linear classifiers in kernel spaces have emerged as a major topic within the field of machine learning. The kernel technique takes the linear classifier--a limited, but well-established and comprehensively studied model--and extends its applicability to a wide range of nonlinear pattern-recognition tasks such as natural language processing, machine vision, and biological sequence analysis. This book provides the first comprehensive overview of both the theory and algorithms of kernel classifiers, including the most recent developments. It begins by describing the major algorithmic advances: kernel perceptron learning, kernel Fisher discriminants, support vector machines, relevance vector machines, Gaussian processes, and Bayes point machines. Then follows a detailed introduction to learning theory, including VC and PAC-Bayesian theory, data-dependent structural risk minimization, and compression bounds. Throughout, the book emphasizes the interaction between theory and algorithms: how learning algorithms work and why. The book includes many examples, complete pseudo code of the algorithms presented, and an extensive source code library.
- Published
- 2001
27. The perceptron: A probabilistic model for information storage and organization in the brain
- Author
-
Frank Rosenblatt
- Subjects
Winnow ,Models, Statistical ,Kernel perceptron ,Artificial neural network ,business.industry ,Computer science ,media_common.quotation_subject ,Brain ,Information Storage and Retrieval ,Cognition ,Statistical model ,Machine learning ,computer.software_genre ,Perceptron ,Perception ,Harmonic Grammar ,Humans ,Neural Networks, Computer ,Artificial intelligence ,business ,computer ,General Psychology ,media_common
- Published
- 1958
28. Cryptographically private support vector machines
- Author
-
Taneli Mielikäinen, Helger Lipmaa, and Sven Laur
- Subjects
Computer Science::Computer Science and Game Theory ,Kernel perceptron ,Computer science ,business.industry ,Computation ,02 engineering and technology ,16. Peace & justice ,computer.software_genre ,Machine learning ,Encryption ,Support vector machine ,Kernel (linear algebra) ,ComputingMethodologies_PATTERNRECOGNITION ,Kernel method ,Polynomial kernel ,020204 information systems ,Computer Science::Networking and Internet Architecture ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,Data mining ,Artificial intelligence ,business ,computer ,Classifier (UML) ,Computer Science::Cryptography and Security - Abstract
We propose private protocols implementing the Kernel Adatron and Kernel Perceptron learning algorithms, as well as private classification protocols and private polynomial kernel computation protocols. The new protocols return their outputs, whether the kernel value, the classifier, or the classifications, in encrypted form, so that they can be decrypted only by common agreement of the protocol participants. We show how to use the encrypted classifications to privately estimate many properties of the data and the classifier. The new SVM classifiers are the first to be proven private according to the standard cryptographic definitions.