Associative Learning on a Continuum in Evolved Dynamical Neural Networks.
- Source :
- Adaptive Behavior. 2008, Vol. 16 Issue 6, p361-384. 24p. 1 Chart, 1 Graph.
- Publication Year :
- 2008
Abstract
- This article extends previous work on evolving learning without synaptic plasticity from discrete tasks to continuous tasks. Continuous-time recurrent neural networks without synaptic plasticity are artificially evolved on an associative learning task. The task consists of associating paired stimuli: temperature and food. The temperature to be associated can be either drawn from a discrete set or allowed to range over a continuum of values. We address two questions: Can the learning-without-synaptic-plasticity approach be extended to continuous tasks? And if so, how does learning without synaptic plasticity work in the evolved circuits? Analysis of the circuits most successful at learning discrete stimuli reveals finite state machine (FSM)-like internal dynamics. However, when the task is modified to require learning stimuli over the full continuous range, it is not possible to extract an FSM from the internal dynamics. In this case, a continuous state machine is extracted instead. [ABSTRACT FROM AUTHOR]
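
For reference, the continuous-time recurrent neural network (CTRNN) model named in the abstract is conventionally defined by tau_i * dy_i/dt = -y_i + sum_j w_ji * sigma(y_j + theta_j) + I_i, with fixed weights (no synaptic plasticity). The sketch below is a minimal Euler-integration implementation of that standard equation only; the network size, parameter ranges, and input wiring are illustrative assumptions, not the authors' evolved circuits or evolutionary setup.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ctrnn_step(y, W, theta, tau, I, dt=0.01):
    """One Euler step of standard CTRNN dynamics:
    tau_i * dy_i/dt = -y_i + sum_j W[j, i] * sigmoid(y_j + theta_j) + I_i
    """
    dydt = (-y + W.T @ sigmoid(y + theta) + I) / tau
    return y + dt * dydt

# Hypothetical usage: a 5-neuron circuit with random fixed weights.
# Any "learning" must arise from the internal state dynamics,
# since the weights never change during the circuit's lifetime.
rng = np.random.default_rng(0)
n = 5
W = rng.uniform(-5, 5, size=(n, n))   # fixed weight matrix (no plasticity)
theta = rng.uniform(-2, 2, size=n)    # biases
tau = rng.uniform(0.5, 2.0, size=n)   # time constants
y = np.zeros(n)                       # neuron states
I = np.zeros(n)
I[0] = 1.0                            # e.g., a temperature stimulus on neuron 0
for _ in range(1000):
    y = ctrnn_step(y, W, theta, tau, I)
```

In practice, the weights, biases, and time constants would be set by an evolutionary search against the associative learning task rather than drawn at random as above.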
Details
- Language :
- English
- ISSN :
- 10597123
- Volume :
- 16
- Issue :
- 6
- Database :
- Academic Search Index
- Journal :
- Adaptive Behavior
- Publication Type :
- Academic Journal
- Accession Number :
- 35214487
- Full Text :
- https://doi.org/10.1177/1059712308097316