Optimization schemes for neural network training

Oscal T.C. Chen*, Bing J. Sheu

*Corresponding author for this work

Research output: Contribution to conference (Conference paper, peer-reviewed)

Abstract

Neural networks are parameterized by a set of synaptic weights. The task of an optimization scheme for a neural network is to find a set of synaptic weights that makes the network perform the desired function. The backpropagation learning method, the quasi-Newton method, a non-derivative quasi-Newton method, the Gauss-Newton method, the secant method, and the simulated Cauchy annealing method have been investigated. These learning methods have been compared in terms of computation time, convergence speed, and mean-squared error between the network outputs and the desired results. For learning a sine function, the quasi-Newton method yields the best performance, and the Gauss-Newton method also provides promising results.
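The sine-fitting comparison is straightforward to reproduce in outline. Below is a minimal sketch of quasi-Newton training on that task, using PyTorch's L-BFGS optimizer as a stand-in; the library, the 1-16-1 tanh network, and all hyperparameters are assumptions for illustration, not the authors' 1994 setup.

import torch

# Sine-fitting task: approximate sin(x) on [-pi, pi] with a small MLP.
torch.manual_seed(0)
x = torch.linspace(-torch.pi, torch.pi, 100).unsqueeze(1)
y = torch.sin(x)

# Assumed architecture: one hidden layer of 16 tanh units.
model = torch.nn.Sequential(
    torch.nn.Linear(1, 16),
    torch.nn.Tanh(),
    torch.nn.Linear(16, 1),
)

# L-BFGS is a limited-memory quasi-Newton method: it approximates
# curvature from recent gradient differences instead of forming the
# full Hessian, which is what makes quasi-Newton training practical.
optimizer = torch.optim.LBFGS(model.parameters(), max_iter=200)

def closure():
    # L-BFGS may re-evaluate the objective during its line search,
    # so the loss and gradients are computed inside a closure.
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    return loss

optimizer.step(closure)
print(f"final MSE: {closure().item():.6f}")

Swapping the optimizer for plain gradient descent (backpropagation) or a Gauss-Newton solver while keeping the same network and data gives the kind of side-by-side comparison the abstract describes.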

Original language: English
Pages: 817-822
Number of pages: 6
State: Published - 1994
Externally published: Yes
Event: Proceedings of the 1994 IEEE International Conference on Neural Networks. Part 1 (of 7) - Orlando, FL, USA
Duration: 27 June 1994 - 29 June 1994

Conference

Conference: Proceedings of the 1994 IEEE International Conference on Neural Networks. Part 1 (of 7)
City: Orlando, FL, USA
Period: 27/06/94 - 29/06/94
