
A Maximally Split and Relaxed ADMM for Regularized Extreme Learning Machines.

Authors:
Lai, Xiaoping
Cao, Jiuwen
Huang, Xiaofeng
Wang, Tianlei
Lin, Zhiping
Source:
IEEE Transactions on Neural Networks & Learning Systems. Jun 2020, Vol. 31, Issue 6, p1899-1913. 15p.
Publication Year:
2020

Abstract

One of the salient features of the extreme learning machine (ELM) is its fast learning speed. However, in a big-data environment, the ELM still suffers from a heavy computational load due to the high dimensionality and large volume of the data. The alternating direction method of multipliers (ADMM) can split a convex model-fitting problem into a set of concurrently executable subproblems, each involving only a subset of the model coefficients. By splitting maximally across the coefficients and incorporating a novel relaxation technique, a maximally split and relaxed ADMM (MS-RADMM), along with a scalarwise implementation, is developed for the regularized ELM (RELM). The convergence conditions and convergence rate of the MS-RADMM are established; the algorithm converges linearly, with a smaller convergence ratio than the unrelaxed maximally split ADMM. The optimal parameter values of the MS-RADMM are derived, and a fast parameter-selection scheme is provided. Experiments on ten benchmark classification data sets demonstrate the fast convergence and parallelism of the MS-RADMM. For performance evaluation, complexity comparisons with the matrix-inversion-based method are provided in terms of the numbers of multiplication and addition operations, the computation time, and the number of memory cells.
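For orientation, the sketch below shows the classic over-relaxed two-block ADMM (as in Boyd et al.) applied to the ridge-regularized least-squares problem that RELM output-weight training reduces to. It only illustrates the relaxation step that the MS-RADMM builds on; the paper's algorithm splits much further, down to one scalar subproblem per coefficient, and avoids the matrix inversion used here. The function name `relaxed_admm_ridge` and all parameter values are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch, not the paper's MS-RADMM: classic over-relaxed two-block
# ADMM for the ridge problem min_b 0.5*||H b - t||^2 + 0.5*lam*||b||^2
# that RELM output-weight training reduces to.
import numpy as np

def relaxed_admm_ridge(H, t, lam=1.0, rho=1.0, alpha=1.5, n_iter=200):
    """Relaxed ADMM with splitting f(x) = 0.5*||Hx - t||^2,
    g(z) = 0.5*lam*||z||^2, constraint x = z.
    alpha in (0, 2) is the relaxation parameter; alpha > 1 over-relaxes."""
    n = H.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)                 # scaled dual variable
    A = H.T @ H + rho * np.eye(n)   # the inversion a scalarwise split avoids
    Ht = H.T @ t
    for _ in range(n_iter):
        x = np.linalg.solve(A, Ht + rho * (z - u))  # x-update (least squares)
        x_hat = alpha * x + (1.0 - alpha) * z       # relaxation step
        z = rho * (x_hat + u) / (lam + rho)         # z-update (ridge prox)
        u = u + x_hat - z                           # dual update
    return z

# Usage: the ADMM iterate should match the closed-form ridge solution
# (H^T H + lam*I)^{-1} H^T t that a matrix-inversion-based method computes.
rng = np.random.default_rng(0)
H = rng.standard_normal((50, 8))   # hidden-layer output matrix (illustrative)
t = rng.standard_normal(50)        # training targets
b_admm = relaxed_admm_ridge(H, t, lam=0.5)
b_exact = np.linalg.solve(H.T @ H + 0.5 * np.eye(8), H.T @ t)
print(np.allclose(b_admm, b_exact, atol=1e-6))  # True
```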

Details

Language:
English
ISSN:
2162-237X
Volume:
31
Issue:
6
Database:
Academic Search Index
Journal:
IEEE Transactions on Neural Networks & Learning Systems
Publication Type:
Periodical
Accession number:
143613637
Full Text:
https://doi.org/10.1109/TNNLS.2019.2927385