The plasticity of feedforward neural networks in assimilating a training instance based on non-batch learning

  • Hown Wen Chen
  • Von Wun Soo*

*Corresponding author for this work

Research output: Contribution to journal › Journal Article › peer-review

Abstract

In non-batch learning systems, an index called plasticity is needed to indicate how easily an instance can be assimilated. Plasticity should capture three essential elements: what degree of modification is allowed in a learning system, how closely the learning system actually adapts, and how much learning effort is expended in response to an incoming instance. Taking these three notions into consideration, we proposed a new formula to evaluate the plasticity of feedforward neural networks trained with non-batch learning. The formula was investigated against on-line backpropagation [1], adaptive learning [2,3] and Incremental Feedforward Networks (IFFN) [3] in handling both consistent and inconsistent instances on a function approximation problem. Experiments showed that the plasticities of networks using the three learning schemes ranked from highest to lowest as on-line backpropagation, IFFN and adaptive learning, respectively. The effects of initial weights and bandwidths on plasticity were also empirically measured and reported.
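The paper's plasticity formula is not reproduced here, but the non-batch setting it studies can be illustrated with a minimal sketch of on-line backpropagation: a tiny feedforward network receives training instances one at a time (here for approximating sin(x), a stand-in for the paper's function approximation task). The network size, learning rate, and the returned weight-change magnitude (a crude proxy for how strongly the network adapts to one instance) are illustrative assumptions, not the authors' measure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 1-4-1 feedforward network for function approximation.
# Small random initial weights; the abstract notes that initial
# weights influence the measured plasticity.
W1 = rng.normal(scale=0.5, size=(4, 1)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(1, 4)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(W1 @ x + b1)        # hidden activations
    y = W2 @ h + b2                 # linear output
    return h, y

def online_step(x, t, lr=0.1):
    """One on-line backpropagation update for a single instance (x, t)."""
    global W1, b1, W2, b2
    h, y = forward(x)
    e = y - t                        # output error
    dW2 = np.outer(e, h)
    db2 = e
    dh = (W2.T @ e) * (1 - h**2)     # backpropagate through tanh
    dW1 = np.outer(dh, x)
    db1 = dh
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1
    # Total weight change: one crude, illustrative proxy for how
    # strongly the network adapted to this single instance.
    return lr * (np.abs(dW1).sum() + np.abs(dW2).sum())

# Present instances one at a time (non-batch learning).
for _ in range(2000):
    x = rng.uniform(-np.pi, np.pi, size=1)
    online_step(x, np.sin(x))
```

In batch learning the gradients over all instances would be accumulated before a single update; in the on-line scheme above each instance immediately modifies the weights, which is why a per-instance notion of plasticity is meaningful.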

Original language: English
Pages (from-to): 299-309
Number of pages: 11
Journal: Neurocomputing
Volume: 7
Issue number: 3
DOIs
State: Published - April 1995
Externally published: Yes

Keywords

  • Adaptive learning
  • Feedforward neural networks
  • Incremental learning
  • Instance consistency
  • Plasticity
