Abstract
Most known learning algorithms for dynamic neural networks in non-stationary environments need global computations to perform credit assignment. These algorithms either are not local in time or not local in space. Those algorithms which are local in both time and space usually cannot deal sensibly with ‘hidden units’. In contrast, as far as we can judge, learning rules in biological systems with many ‘hidden units’ are local in both space and time. In this paper we propose a parallel on-line learning algorithm which performs local computations only, yet still is designed to deal with hidden units and with units whose past activations are ‘hidden in time’. The approach is inspired by Holland's idea of the bucket brigade for classifier systems, which is transformed to run on a neural network with fixed topology. The result is a feedforward or recurrent ‘neural’ dissipative system which is consuming ‘weight-substance’ and permanently trying to distribute this substance onto its connections in an appropriate way. Simple experiments demonstrating the feasibility of the algorithm are reported. © 1989, Taylor & Francis Group, LLC. All rights reserved.
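The following is a minimal, hypothetical sketch of what such a bucket-brigade-style exchange of ‘weight-substance’ between connections might look like; it is not the paper's exact algorithm, and the names `lam`, `payoff`, and `out` are illustrative assumptions. Each used connection pays a fraction of its substance to the connections that activated its source unit one time step earlier, so all bookkeeping is local in space and time.

```python
import numpy as np

# Illustrative sketch only: a bucket-brigade-like local update on a small
# fully recurrent net. W[i, j] holds the 'weight-substance' on connection j -> i.
rng = np.random.default_rng(0)
n, steps = 4, 30
W = rng.uniform(0.1, 0.3, size=(n, n))
a_prev = rng.uniform(size=n)          # activations at the previous time step
usage_prev = np.zeros((n, n))         # how strongly each connection was used last step

lam = 0.1        # fraction of substance a used connection pays backwards (assumption)
payoff = 0.05    # external substance for connections into the output unit (assumption)
out = n - 1      # designated output unit (assumption)

for t in range(steps):
    a = 1.0 / (1.0 + np.exp(-(W @ a_prev)))   # logistic activations
    usage = np.outer(a, a_prev)               # usage of connection (j -> i)

    # Each used connection gives up part of its substance ...
    payment = lam * W * usage
    W = W - payment
    bucket = payment.sum(axis=0)              # substance flowing back toward each unit j

    # ... and the connections that activated unit j one step earlier share that
    # bucket in proportion to how strongly they were used then.
    share = usage_prev / np.maximum(usage_prev.sum(axis=1, keepdims=True), 1e-12)
    W = W + share * bucket[:, None]

    # Connections into the output unit receive external payoff when the output
    # is 'correct' (here crudely modelled as exceeding a fixed threshold).
    if a[out] > 0.5:
        W[out] += payoff * usage[out] / max(usage[out].sum(), 1e-12)

    a_prev, usage_prev = a, usage

print(np.round(W, 3))
```

In this toy version the total substance is only conserved up to the external payoff, which plays the role of reinforcement entering the dissipative system from outside.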
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 403-412 |
| Number of pages | 10 |
| Journal | Connection Science |
| Volume | 1 |
| Issue number | 4 |
| DOIs | |
| State | Published - Jan 1 1989 |
| Externally published | Yes |
Bibliographical note
Generated from Scopus record by KAUST IRTS on 2022-09-14

ASJC Scopus subject areas
- Artificial Intelligence
- Computational Theory and Mathematics
- Theoretical Computer Science
- Control and Systems Engineering