
Improving the performance of an incremental algorithm driven by error margins.

Authors :
Del Campo-Ávila, José
Ramos-Jiménez, Gonzalo
Gama, João
Morales-Bueno, Rafael
Source :
Intelligent Data Analysis. 2008, Vol. 12 Issue 3, p305-318. 14p. 2 Diagrams, 1 Chart, 1 Graph.
Publication Year :
2008

Abstract

Classification is a highly relevant task within the field of data analysis. It is not a trivial task, and different difficulties can arise depending on the nature of the problem. All of these difficulties can become worse when the datasets are very large or when new information can arrive at any time. Incremental learning is an approach that can be used to deal with the classification task in these cases. It must alleviate, or solve, the problem of limited time and memory resources. One emergent approach uses concentration bounds to ensure that decisions are made only when enough information supports them. IADEM is one of the most recent algorithms that uses this approach. The aim of this paper is to improve the performance of this algorithm in different ways: simplifying the complexity of the induced models, adding the ability to deal with continuous data, improving the detection of noise, selecting new criteria for evolving the model, including the use of more powerful prediction techniques, etc. Besides these new properties, the new system, IADEM-2, preserves the ability to obtain performance similar to that of standard learning algorithms independently of the dataset size, and it can incorporate new information as the basic algorithm does: using a short time per example. [ABSTRACT FROM AUTHOR]
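For context, decision-making driven by concentration bounds typically works along the following lines: a model update (for example, committing to a split) is made only once the observed advantage of the best candidate exceeds a bound-derived error margin. The sketch below is a generic illustration using the Hoeffding bound; the function names, parameters, and example figures are illustrative assumptions and do not reproduce IADEM's or IADEM-2's actual criteria.

```python
import math

def hoeffding_bound(value_range, delta, n):
    """Hoeffding error margin: with probability at least 1 - delta, the true
    mean of a variable with the given range lies within epsilon of the
    observed mean after n independent observations."""
    return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

def enough_evidence(best_gain, second_gain, value_range, delta, n):
    """Decide whether the observed gap between the two best candidates
    exceeds the error margin, i.e. whether enough examples have been seen
    to commit to the best candidate."""
    epsilon = hoeffding_bound(value_range, delta, n)
    return (best_gain - second_gain) > epsilon

# Example: after 400 examples, observed gains 0.32 vs 0.25 on a [0, 1] scale
# with delta = 1e-3. Epsilon is about 0.093, so the 0.07 gap is not yet
# large enough and the decision is deferred (prints False).
print(enough_evidence(0.32, 0.25, 1.0, 1e-3, 400))
```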

Details

Language :
English
ISSN :
1088-467X
Volume :
12
Issue :
3
Database :
Academic Search Index
Journal :
Intelligent Data Analysis
Publication Type :
Academic Journal
Accession number :
32548799
Full Text :
https://doi.org/10.3233/IDA-2008-12305