Abstract
Neural networks are parameterized by a set of synaptic weights. The task of an optimization scheme for a neural network is to find a set of synaptic weights that makes the network perform the desired function. The backpropagation learning method, the quasi-Newton method, a non-derivative quasi-Newton method, the Gauss-Newton method, the secant method, and a simulated Cauchy annealing method are investigated. These learning methods are compared in terms of computation time, convergence speed, and mean-squared error between the network outputs and the desired results. For learning a sine function, the quasi-Newton method yields the best performance, and the Gauss-Newton method also gives promising results.
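As a concrete illustration of the quasi-Newton approach the abstract reports as best-performing, the sketch below trains a small one-hidden-layer network to approximate a sine function by minimizing the mean-squared error with BFGS. This is not the paper's original code: the network size, sample count, and use of SciPy's BFGS implementation are assumptions made for illustration.

```python
# Minimal sketch (assumed setup, not the paper's code): fit sin(x) with a
# one-hidden-layer network whose weights are found by BFGS, a quasi-Newton
# method. Hidden-layer size and training-set size are illustrative choices.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 50)[:, None]   # training inputs
t = np.sin(x)                                 # desired outputs

H = 8                                         # hidden units (assumed)

def unpack(w):
    """Split the flat parameter vector into layer weights and biases."""
    W1 = w[:H].reshape(1, H)
    b1 = w[H:2*H]
    W2 = w[2*H:3*H].reshape(H, 1)
    b2 = w[3*H:3*H+1]
    return W1, b1, W2, b2

def forward(w, x):
    """Forward pass: tanh hidden layer followed by a linear output."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

def mse(w):
    """Mean-squared error between network outputs and desired results."""
    return np.mean((forward(w, x) - t) ** 2)

w0 = rng.normal(scale=0.5, size=3*H + 1)      # random initial weights
res = minimize(mse, w0, method="BFGS")        # quasi-Newton weight search
print(f"final MSE: {res.fun:.2e} after {res.nit} iterations")
```

The same objective could be handed to other optimizers (e.g. gradient descent for plain backpropagation) to reproduce the kind of method comparison the abstract describes.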
Original language | English
---|---
Pages | 817-822
Number of pages | 6
State | Published - 1994
Externally published | Yes
Event | Proceedings of the 1994 IEEE International Conference on Neural Networks. Part 1 (of 7) - Orlando, FL, USA
Duration | 27/06/94 → 29/06/94
Conference
Conference | Proceedings of the 1994 IEEE International Conference on Neural Networks. Part 1 (of 7)
---|---
City | Orlando, FL, USA
Period | 27/06/94 → 29/06/94