Abstract
Almost all traditional learning methods for artificial neural networks assume a stationary training set. This assumption is biologically implausible and so restrictive that it rules out many interesting applications. In this paper, we minimize a combined cost of weight sensitivity and training squared error using gradient descent optimization, and obtain a new supervised back-propagation learning algorithm for a biased two-layer perceptron. In addition to illustrating that the conflict between a newly inserted training instance and previous training data is local, we show that this adaptive learning method yields a network with measurable generalization ability. The work can also be extended to an incremental network in which no training instances need to be remembered.
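The abstract does not spell out the exact form of the weight-sensitivity cost, so the following is only a minimal sketch of the idea it describes: gradient descent on a squared-error term plus a penalty that discourages disturbing previously learned weights when a new instance is inserted. The quadratic anchor `lam * ||W - W_old||^2` is an assumed stand-in for the paper's sensitivity cost, and the names `forward`, `adapt`, `lam`, `lr`, and `steps` are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(W1, b1, W2, b2, x):
    """Biased two-layer perceptron: sigmoid hidden layer, linear output."""
    h = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))
    y = W2 @ h + b2
    return h, y

def adapt(W1, b1, W2, b2, x, t, lam=0.1, lr=0.5, steps=200):
    """Gradient descent on squared error plus a weight-sensitivity penalty.

    The penalty (lam/2) * ||W - W_old||^2 keeps the weights close to their
    values before the new instance arrived, so responses learned from
    previous data are disturbed as little as possible (an assumption, not
    the paper's exact cost).
    """
    old = [W1.copy(), b1.copy(), W2.copy(), b2.copy()]
    for _ in range(steps):
        h, y = forward(W1, b1, W2, b2, x)
        e = y - t                                   # output error
        dW2 = np.outer(e, h) + lam * (W2 - old[2])
        db2 = e + lam * (b2 - old[3])
        dh = (W2.T @ e) * h * (1.0 - h)             # back-propagated delta
        dW1 = np.outer(dh, x) + lam * (W1 - old[0])
        db1 = dh + lam * (b1 - old[1])
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1, W2, b2

# Example: adapt an existing network to one newly inserted instance.
D, H, O = 3, 5, 1
W1 = rng.normal(scale=0.5, size=(H, D)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.5, size=(O, H)); b2 = np.zeros(O)
x_new, t_new = rng.normal(size=D), np.array([1.0])
W1, b1, W2, b2 = adapt(W1, b1, W2, b2, x_new, t_new)
```

Because the penalty is local to the weights most changed by the new instance, a small `lam` trades off fitting the inserted example against preserving earlier behavior, which matches the incremental setting where no past training instances are stored.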
| Original language | English |
|---|---|
| Title of host publication | Proceedings - 1992 International Joint Conference on Neural Networks, IJCNN 1992 |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 713-718 |
| Number of pages | 6 |
| ISBN (Electronic) | 0780305590 |
| DOIs | |
| State | Published - 1992 |
| Externally published | Yes |
| Event | 1992 International Joint Conference on Neural Networks, IJCNN 1992 - Baltimore, United States. Duration: 07/06/1992 → 11/06/1992 |
Publication series
| Name | Proceedings of the International Joint Conference on Neural Networks |
|---|---|
| Volume | 1 |
Conference
| Conference | 1992 International Joint Conference on Neural Networks, IJCNN 1992 |
|---|---|
| Country/Territory | United States |
| City | Baltimore |
| Period | 07/06/92 → 11/06/92 |
Bibliographical note
Publisher Copyright: © 1992 IEEE.