Abstract
Back-propagation with the gradient method is the most popular learning algorithm for feed-forward neural networks. However, the algorithm's performance depends critically on choosing a proper fixed learning rate. In this paper, an optimized recursive algorithm for online learning is derived analytically from matrix operations and optimization methods, which avoids the need to select a learning rate for the gradient method. A proof of weak convergence of the proposed algorithm is also given. Although the approach is formulated for three-layer feed-forward neural networks, it can be extended to feed-forward networks with more layers. Simulation experiments demonstrate the effectiveness of the proposed algorithm when applied to identifying the behavior of a two-input, two-output non-linear dynamic system.
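To make the learning-rate sensitivity concrete, the following is a minimal sketch of the baseline the paper improves upon: a three-layer (one hidden layer) feed-forward network trained by back-propagation with a fixed learning rate. The network sizes, toy data, and learning rate are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative baseline: three-layer feed-forward network trained by
# back-propagation with a *fixed* learning rate eta, which must be
# hand-tuned. All sizes and data below are assumptions for demonstration.

rng = np.random.default_rng(0)

# Toy regression target: y = sin(x) on [-pi, pi].
X = rng.uniform(-np.pi, np.pi, size=(200, 1))
Y = np.sin(X)

n_in, n_hidden, n_out = 1, 10, 1
W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
b2 = np.zeros(n_out)

eta = 0.05  # fixed learning rate -- the quantity the paper's method avoids tuning

def forward(X):
    H = np.tanh(X @ W1 + b1)   # hidden-layer activations
    return H, H @ W2 + b2      # linear output layer

_, Yhat0 = forward(X)
mse0 = float(np.mean((Yhat0 - Y) ** 2))  # error before training

for epoch in range(2000):
    H, Yhat = forward(X)
    err = Yhat - Y                      # dE/dYhat for squared error
    # Back-propagate: output layer first, then hidden layer.
    gW2 = H.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H ** 2)    # tanh'(z) = 1 - tanh(z)^2
    gW1 = X.T @ dH / len(X)
    gb1 = dH.mean(axis=0)
    # Fixed-step gradient-descent update on all weights and biases.
    W1 -= eta * gW1; b1 -= eta * gb1
    W2 -= eta * gW2; b2 -= eta * gb2

mse = float(np.mean((forward(X)[1] - Y) ** 2))
print(f"MSE before: {mse0:.4f}, after: {mse:.4f}")
```

If `eta` is set too large the loss diverges, and if too small training stalls; the recursive algorithm in the paper sidesteps this selection problem analytically.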
Field | Value
---|---
Original language | English (US)
Pages (from-to) | 133-147
Number of pages | 15
Journal | Intelligent Automation and Soft Computing
Volume | 17
Issue number | 2
DOIs |
State | Published - Jan 2011
Keywords
- Back Propagation
- Gradient Descent Method
- Learning Algorithms
- Neural Networks
ASJC Scopus subject areas
- Software
- Theoretical Computer Science
- Computational Theory and Mathematics
- Artificial Intelligence