Paralleled Hardware Annealing in Multilevel Hopfield Neural Networks for Optimal Solutions

Sa Hyun Bang, Oscal T.C. Chen, Josephine C.F. Chang, Bing J. Sheu

Research output: Contribution to journal › Article › peer-review


Abstract

In a multilevel neural network, each neuron produces a multi-bit output. The total network size can therefore be significantly smaller than that of a conventional network, a highly desirable feature in large-scale applications. A procedure for applying hardware annealing, in which the neuron gain is continuously increased from a low value to a sufficiently high value so that the network reaches the globally optimal solution, is described. Several simulation results are also presented. Because hardware annealing is applied to all neurons in parallel, it is much faster than the simulated annealing method on digital computers.
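The gain-ramping idea in the abstract can be illustrated with a small sketch. This is not the authors' circuit; it is a minimal software analogue, assuming a continuous Hopfield network with sigmoid (tanh) neurons whose shared gain is swept from low to high while all neuron states update simultaneously. The weight matrix `W`, bias `b`, and the gain schedule are illustrative choices, not values from the paper.

```python
import numpy as np

def anneal_hopfield(W, b, gains, steps_per_gain=200, dt=0.01, seed=0):
    """Sketch of parallel hardware annealing on a continuous Hopfield network.

    The neuron gain g is ramped from a low to a high value; at each gain the
    network relaxes toward a minimum of the energy E = -0.5 v^T W v - b^T v.
    All neuron states are updated simultaneously, mirroring the parallel
    analog hardware described in the abstract.
    """
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    u = rng.normal(scale=0.1, size=n)        # internal neuron states
    for g in gains:                           # gain schedule: low -> high
        for _ in range(steps_per_gain):
            v = np.tanh(g * u)                # sigmoid neuron output, gain g
            # leaky gradient dynamics: du/dt = W v + b - u
            u += dt * (W @ v + b - u)
    # at high gain the outputs saturate toward multilevel/binary values
    return np.sign(np.tanh(gains[-1] * u))

# toy example: 2-neuron network whose energy favors agreement (v1 == v2)
W = np.array([[0.0, 1.0], [1.0, 0.0]])
b = np.zeros(2)
v = anneal_hopfield(W, b, gains=np.linspace(0.5, 20.0, 10))
```

At low gain the network has a single shallow minimum; as the gain rises, the landscape sharpens and the state settles into a deep minimum, which is the mechanism the annealing schedule exploits.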

Original language: English
Pages (from-to): 46-49
Number of pages: 4
Journal: IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing
Volume: 42
Issue number: 1
DOIs
State: Published - 01 1995
Externally published: Yes
