A Theory of Over-Learning in the Presence of Noise.
- Source :
- Systems & Computers in Japan; 11/15/94, Vol. 25 Issue 13, p62-72, 11p
- Publication Year :
- 1994
Abstract
- Over-learning is a drawback of the error-back-propagation (BP) method for a multilayer feed-forward neural network. The authors have already discussed this problem for the case in which pure training data are available. It was shown that over-learning is caused by the use of a memorization criterion as a substitute for some true criterion that determines generalization ability. A relation between a true criterion and a substitute criterion was used to define mathematically the concept of over-learning, and the concepts of four kinds of admissibility of the substitute criterion in place of the true criterion were introduced. In this paper, we show that the aforementioned general framework can also be applied to the case in which training data are noisy. First, the memorization criterion is extended to the rote-memorization criterion, which requires the same responses as those given by the training data even if they include noise. Next, the framework is applied to the case in which the rote-memorization criterion is used as a substitute for the Wiener criterion. Necessary and sufficient conditions for the four kinds of admissibility are obtained. These conditions lead to methods for choosing a training set that prevents over-learning. [ABSTRACT FROM AUTHOR]
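The phenomenon the abstract describes, minimizing a rote-memorization criterion on noisy data degrades performance under the true (generalization) criterion, can be illustrated with a minimal sketch. This is not the authors' framework or the BP network from the paper; it is an assumed stand-in using polynomial regression, where training RMSE plays the role of the rote-memorization criterion and RMSE against the clean target function approximates the true criterion:

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy training data: y = sin(2*pi*x) plus Gaussian noise
# (the "rote" targets include the noise).
x_train = np.linspace(0.0, 1.0, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0.0, 0.2, x_train.size)

# Clean, dense test grid standing in for the true criterion.
x_test = np.linspace(0.0, 1.0, 200)
y_test = np.sin(2 * np.pi * x_test)

def errors(degree):
    """Fit a polynomial of the given degree; return (train RMSE, test RMSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_rmse = np.sqrt(np.mean((np.polyval(coeffs, x_train) - y_train) ** 2))
    test_rmse = np.sqrt(np.mean((np.polyval(coeffs, x_test) - y_test) ** 2))
    return train_rmse, test_rmse

for d in (3, 9):
    tr, te = errors(d)
    print(f"degree {d}: train RMSE {tr:.3f}, test RMSE {te:.3f}")
```

With 10 training points, the degree-9 polynomial can reproduce the noisy responses almost exactly (near-zero training error) while its error against the clean target grows: driving the substitute criterion to zero harms the true criterion, which is the over-learning behavior the paper formalizes.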
Details
- Language :
- English
- ISSN :
- 0882-1666
- Volume :
- 25
- Issue :
- 13
- Database :
- Supplemental Index
- Journal :
- Systems & Computers in Japan
- Publication Type :
- Academic Journal
- Accession number :
- 13945443