10 results for "Shu-Xia Lu"
Search Results
2. Condensed Nearest Neighbor Decision Rule Input Weight Sequential Feed-Forward Neural Networks
- Author
-
Yang Fan Zhou, Shu Xia Lu, and Bin Liu
- Subjects
Support vector machine ,Artificial neural network ,business.industry ,Generalization ,Computer science ,Feed forward neural ,Pattern recognition ,General Medicine ,Decision rule ,Artificial intelligence ,business ,Extreme learning machine ,k-nearest neighbors algorithm - Abstract
This paper proposes a new approach, referred to as condensed nearest neighbor decision rule (CNN) input-weight sequential feed-forward neural networks (CIW-SFFNS). First, the difference in optimization constraints between the extreme learning machine (ELM) and the constrained-optimization-based extreme learning machine is shown. Second, a method is proposed that uses CNN to select the hidden-layer weights from the training examples. Moreover, error-minimized extreme learning machines (EM-ELM), support vector sequential feed-forward neural networks (SV-SFFNS), and CIW-SFFNS are compared in two respects: test accuracy and the number of hidden nodes. Results of an experimental study on 10 classification data sets are presented. The CIW-SFFNS algorithm achieves a statistically significant improvement in generalization performance over EM-ELM and SV-SFFNS.
- Published
- 2014
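The CNN selection step described in the abstract above can be sketched in a few lines. This is a minimal stand-in, not the authors' CIW-SFFNS implementation: it implements Hart's classic condensed nearest neighbor rule with plain Euclidean 1-NN, and the function name `condensed_subset` is invented for illustration.

```python
import numpy as np

def condensed_subset(X, y):
    """Hart's condensed nearest neighbor (CNN): grow a subset S until
    every training point is correctly classified by 1-NN over S.
    In the CIW idea, the selected samples would then serve as
    hidden-layer input weights instead of random ELM weights."""
    S_idx = [0]                        # seed the condensed set with sample 0
    changed = True
    while changed:
        changed = False
        for i in range(len(X)):
            d = np.linalg.norm(X[S_idx] - X[i], axis=1)
            nearest = S_idx[int(np.argmin(d))]
            if y[nearest] != y[i]:     # misclassified -> absorb into S
                S_idx.append(i)
                changed = True
    return np.array(S_idx)

# Two well-separated clusters: CNN keeps one prototype per cluster.
X = np.array([[0.0], [0.1], [5.0], [5.1]])
y = np.array([0, 0, 1, 1])
print(condensed_subset(X, y))  # -> [0 2]
```

The condensed set is typically far smaller than the training set, which is what makes it attractive as a source of hidden-node parameters.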
3. A new and informative active learning approach for support vector machine
- Author
-
Shu-Xia Lu, Xizhao Wang, and Lisha Hu
- Subjects
Information Systems and Management ,Active learning (machine learning) ,Computer science ,Relevance feedback ,Semi-supervised learning ,Machine learning ,computer.software_genre ,Theoretical Computer Science ,Relevance vector machine ,Artificial Intelligence ,Instance-based learning ,Training set ,Structured support vector machine ,business.industry ,Online machine learning ,Computer Science Applications ,Support vector machine ,ComputingMethodologies_PATTERNRECOGNITION ,Hyperplane ,Computational learning theory ,Control and Systems Engineering ,Active learning ,Margin classifier ,Unsupervised learning ,Artificial intelligence ,Data mining ,business ,computer ,Software - Abstract
The active learning approach has been integrated with support vector machines and other machine-learning techniques in many areas. The challenge is that unlabeled instances are often abundant or easy to obtain, while their labels are generally expensive and time-consuming to acquire. Despite this, most existing methods cannot guarantee the usefulness of each query in learning a new classifier. In this paper, we propose a new active learning approach that selects the most informative query for annotation. The unlabeled instance nearest to the hyperplane of a support vector machine learned from both the unlabeled instance itself and all labeled instances is selected as the query for annotation. The merit of each query in learning a new optimal hyperplane is thus assured before it is annotated and added to the training set. Experimental results on several UCI data sets show the efficiency of our approach.
- Published
- 2013
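The query-selection rule in the abstract above, pick the unlabeled instance nearest the current hyperplane, can be sketched for a linear decision function f(x) = w·x + b. This is a simplified illustration that assumes the hyperplane is already trained; the function name `select_query` is invented here, and the paper's full scheme (retraining with the candidate included) is not reproduced.

```python
import numpy as np

def select_query(w, b, X_unlabeled):
    """Pick the unlabeled point closest to the hyperplane w.x + b = 0.

    The distance to the hyperplane is |w.x + b| / ||w||; since ||w|| is
    identical for every candidate, ranking by |w.x + b| suffices.
    """
    margins = np.abs(X_unlabeled @ w + b)
    return int(np.argmin(margins))

# Toy example: hyperplane x1 + x2 - 1 = 0.
w = np.array([1.0, 1.0])
b = -1.0
X_pool = np.array([[0.0, 0.0],    # |f| = 1.0
                   [0.6, 0.5],    # |f| = 0.1  <- most informative
                   [2.0, 2.0]])   # |f| = 3.0
print(select_query(w, b, X_pool))  # -> 1
```

Points near the hyperplane are the ones the current classifier is least certain about, so labeling them tends to move the decision boundary the most.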
4. FAST FUZZY MULTICATEGORY SVM BASED ON SUPPORT VECTOR DOMAIN DESCRIPTION
- Author
-
Xizhao Wang, Junhai Zhai, and Shu-Xia Lu
- Subjects
Fuzzy classification ,Structured support vector machine ,business.industry ,Pattern recognition ,Fuzzy logic ,Multicategory ,Support vector machine ,Relevance vector machine ,Artificial Intelligence ,Sequential minimal optimization ,Computer Vision and Pattern Recognition ,Artificial intelligence ,Quadratic programming ,business ,Software ,Mathematics - Abstract
This paper proposes a fast fuzzy multicategory support vector machine classifier (FMSVM) based on support vector domain description (SVDD). The main idea is that the proposed FMSVM is obtained by considering all data in a single optimization formulation, assigning a fuzzy membership to each input point. The fuzzy membership is determined by SVDD. To make the support vector machine (SVM) more practical, we use an implementation of the modified sequential minimal optimization (SMO) algorithm that can quickly solve SVM quadratic programming (QP) problems without extra matrix storage or numerical QP optimization steps. Compared with existing SVMs, the newly proposed FMSVM, which uses the L2-norm in the objective function, improves classification accuracy and reduces the effects of noise and outliers. The experiments also show the efficiency of the modified SMO in speeding up SVM training.
- Published
- 2008
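A hedged sketch of how an SVDD-based fuzzy membership might be assigned: membership decays with a point's distance from the sphere center, so outliers receive low weight in the SVM objective. The linear decay formula, the clipping floor, and the name `svdd_style_membership` are assumptions for illustration; in the paper the SVDD center and radius come from solving a quadratic program, whereas here they are simply passed in.

```python
import numpy as np

def svdd_style_membership(X, center, radius, eps=1e-3):
    """Fuzzy membership decreasing with distance from an SVDD-style
    sphere center: s_i = 1 - d_i / (R + eps), clipped to a small
    positive floor so outliers keep a tiny but non-zero influence."""
    d = np.linalg.norm(X - center, axis=1)
    s = 1.0 - d / (radius + eps)
    return np.clip(s, eps, 1.0)

# The point at distance 5 (outside the radius-2 sphere) is treated
# as an outlier and gets the floor membership.
X = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 0.0]])
s = svdd_style_membership(X, center=np.zeros(2), radius=2.0)
```

Down-weighting distant points in this way is what gives fuzzy SVMs their robustness to noise and outliers.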
5. Imbalanced extreme support vector machine
- Author
-
Shu-Xia Lu, Meng Zhang, Li-Sha Hu, and Xu Zhou
- Subjects
Structured support vector machine ,Geometric analysis ,business.industry ,Generalization ,Pattern recognition ,Machine learning ,computer.software_genre ,Support vector machine ,Relevance vector machine ,ComputingMethodologies_PATTERNRECOGNITION ,Hyperplane ,Margin classifier ,Artificial intelligence ,business ,Normal ,computer ,Mathematics - Abstract
For the problem of imbalanced data classification, which is not addressed by the standard extreme support vector machine (ESVM), an imbalanced extreme support vector machine (IESVM) is proposed. First, a preliminary normal vector of the separating hyperplane is obtained directly by geometric analysis. Second, penalty factors are derived from the information provided by projecting the data sets onto this preliminary normal vector. Finally, the final separating hyperplane is obtained through the improved ESVM training. IESVM overcomes the disadvantage of traditional designs that consider only the imbalance in sample sizes, and improves the generalization ability of ESVM. Experimental results show that the method effectively enhances classification performance on imbalanced data sets.
- Published
- 2012
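The first two steps above can be sketched under simplifying assumptions. The geometric "preliminary normal" is taken here as the vector joining the two class means, and the penalty factors use a simple size-based rule rather than the paper's projection-based derivation; both function names are invented for illustration.

```python
import numpy as np

def preliminary_normal(X_pos, X_neg):
    """Geometric guess for the separating hyperplane's normal:
    the vector joining the two class means."""
    return X_pos.mean(axis=0) - X_neg.mean(axis=0)

def penalty_factors(X_pos, X_neg, C=1.0):
    """Class-wise penalties inversely proportional to class size, so the
    minority class is penalized more heavily for margin violations.
    (A simplified stand-in for the paper's projection-based factors.)"""
    n_pos, n_neg = len(X_pos), len(X_neg)
    n = n_pos + n_neg
    return C * n / (2 * n_pos), C * n / (2 * n_neg)

# 2 minority vs. 8 majority samples: the minority penalty is larger.
C_pos, C_neg = penalty_factors(np.zeros((2, 2)), np.zeros((8, 2)))
print(C_pos, C_neg)  # -> 2.5 0.625
```

Weighting the minority class more strongly keeps the learned hyperplane from collapsing toward the majority class, which is the failure mode of the unweighted ESVM on imbalanced data.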
6. Potential support vector machine based on the reduced samples
- Author
-
Shu-xia Lu, Gui-en Cao, Jie Meng, and Hua-chao Wang
- Subjects
Computer science ,business.industry ,Process (computing) ,Pattern recognition ,Machine learning ,computer.software_genre ,Support vector machine ,Reduction (complexity) ,Statistical classification ,Acceleration ,Sequential minimal optimization ,Algorithm design ,Artificial intelligence ,business ,computer ,Computer memory - Abstract
When the training data set is very large, the learning process of the potential support vector machine (PSVM) consumes so much memory that training becomes very slow. To accelerate PSVM training on large-scale data sets, a new method is proposed that applies PSVM to a reduced sample set. The method removes most non-support vectors and keeps the samples on and near the class boundary, which are the likely support vectors, as the new training samples. This makes the method well suited to large-scale data sets. Experimental results show that the proposed method substantially reduces memory consumption and accelerates PSVM training.
- Published
- 2010
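The reduction idea above, keep only boundary samples that are likely support vectors, can be sketched with a simple proximity rule: retain the points of each class that lie nearest to the opposite class. This is a stand-in for the paper's reduction rule, and the name `reduce_samples` and the `keep_ratio` parameter are assumptions for illustration.

```python
import numpy as np

def reduce_samples(X_pos, X_neg, keep_ratio=0.5):
    """Keep only the samples nearest to the opposite class; these are
    the likeliest support-vector candidates."""
    def nearest_opposite(X, X_other):
        # distance from each row of X to its nearest neighbour in X_other
        d = np.linalg.norm(X[:, None, :] - X_other[None, :, :], axis=2)
        return d.min(axis=1)
    k_pos = max(1, int(keep_ratio * len(X_pos)))
    k_neg = max(1, int(keep_ratio * len(X_neg)))
    keep_pos = np.argsort(nearest_opposite(X_pos, X_neg))[:k_pos]
    keep_neg = np.argsort(nearest_opposite(X_neg, X_pos))[:k_neg]
    return X_pos[keep_pos], X_neg[keep_neg]

# The positive point far from the negative class ([0, 10]) is dropped.
X_pos = np.array([[0.0, 0.0], [0.0, 1.0], [0.0, 10.0]])
X_neg = np.array([[5.0, 0.0]])
Xp, Xn = reduce_samples(X_pos, X_neg, keep_ratio=2 / 3)
```

Training the SVM (or PSVM) only on the retained boundary samples is what yields the memory and speed savings the abstract reports.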
7. Support vector machine based on a new reduced samples method
- Author
-
Shu-Xia Lu, Gui-en Cao, and Jie Meng
- Subjects
Support vector machine ,Kernel (linear algebra) ,Statistical classification ,Data point ,business.industry ,Training time ,Pattern recognition ,Artificial intelligence ,Enhanced Data Rates for GSM Evolution ,business ,Domain (software engineering) ,Support vector machine classification ,Mathematics - Abstract
The support vectors play an important role in finding the optimal hyperplane during training. To address the problem that SVM classification involves many non-support vectors and only a few support vectors, this paper proposes a method to remove samples that are unlikely to be support vectors. First, support vector domain description (SVDD) is used to find the smallest sphere containing most of the data points, and objects outside the sphere are removed. Second, edge points are removed based on the distance of each pattern to the centers of the other classes. Compared with the standard SVM, experimental results show that the new algorithm reduces both the number of samples and the training time while maintaining high accuracy.
- Published
- 2010
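The first step of the two-step reduction above can be sketched with a crude approximation: instead of solving the SVDD quadratic program for the minimal enclosing sphere, use the centroid plus a radius covering a given fraction of the points. The `coverage` parameter and the function name `sphere_filter` are assumptions for illustration only.

```python
import numpy as np

def sphere_filter(X, coverage=0.95):
    """Step-1 stand-in: approximate the SVDD minimal enclosing sphere
    with the centroid plus a radius covering `coverage` of the points,
    then drop everything outside it."""
    center = X.mean(axis=0)
    d = np.linalg.norm(X - center, axis=1)
    radius = np.quantile(d, coverage)
    return X[d <= radius]

# Ten clustered points plus one far outlier: the outlier is dropped.
X = np.vstack([np.zeros((10, 2)), [[100.0, 100.0]]])
kept = sphere_filter(X, coverage=0.9)
```

The true SVDD sphere is more robust than this centroid-based version because the QP solution is not pulled toward outliers, but the filtering principle, discard points outside the sphere before training, is the same.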
8. A New Fuzzy Multicategory Support Vector Machines Classifier
- Author
-
Shu-Xia Lu, Junhai Zhai, and Xian-Hao Liu
- Subjects
Fuzzy classification ,Structured support vector machine ,business.industry ,Fuzzy set ,Pattern recognition ,Machine learning ,computer.software_genre ,Multicategory ,Relevance vector machine ,Support vector machine ,ComputingMethodologies_PATTERNRECOGNITION ,Margin classifier ,Fuzzy set operations ,Artificial intelligence ,business ,computer ,Mathematics - Abstract
This paper proposes a new fuzzy multicategory support vector machine (FMSVM) classifier. The main idea is that the proposed FMSVM uses knowledge of the ambiguity associated with the class membership of samples and of the relative location of samples to the origin. Compared with existing SVMs, the newly proposed FMSVM, which uses the L2-norm in the objective function, improves classification accuracy and reduces the effects of noise and outliers.
- Published
- 2007
9. Improved Fuzzy Multicategory Support Vector Machines Classifier
- Author
-
Shu-Xia Lu and Xizhao Wang
- Subjects
Computational complexity theory ,Structured support vector machine ,business.industry ,Fuzzy set ,Pattern recognition ,Machine learning ,computer.software_genre ,Fuzzy logic ,Multicategory ,Support vector machine ,Statistical classification ,ComputingMethodologies_PATTERNRECOGNITION ,Margin classifier ,Artificial intelligence ,business ,computer ,Mathematics - Abstract
This paper investigates an improved fuzzy multicategory support vector machine classifier (IFMSVM). It uses knowledge of the ambiguity associated with the class membership of data samples and of their relative location to the origin to improve classification performance with high generalization capability. In several respects, the classification accuracy of the new algorithm is better than that of classical support vector classification algorithms. Numerical simulations show the feasibility and effectiveness of the algorithm.
- Published
- 2006
10. A comparison among four SVM classification methods: LSVM, NLSVM, SSVM and NSVM
- Author
-
Shu-Xia Lu and Xizhao Wang
- Subjects
Structured support vector machine ,business.industry ,Pattern recognition ,Linear classifier ,Machine learning ,computer.software_genre ,Support vector machine ,Relevance vector machine ,ComputingMethodologies_PATTERNRECOGNITION ,Kernel method ,Margin classifier ,Radial basis function kernel ,Least squares support vector machine ,Artificial intelligence ,business ,computer ,Mathematics - Abstract
Support vector machines (SVMs) are powerful tools for solving classification and function approximation problems. This paper compares four SVM classification methods: the Lagrangian support vector machine (LSVM), the finite Newton Lagrangian support vector machine (NLSVM), the smooth support vector machine (SSVM), and the finite Newton support vector machine (NSVM). The methods are compared on how each generates a linear or nonlinear kernel classifier, as well as on accuracy and computational complexity. The study provides guidelines for choosing an appropriate method among the four for a given classification problem.
- Published
- 2005
Discovery Service for Jio Institute Digital Library