Comparative study of recurrent neural network architectures on learning temporal sequences

  • Tung Bo Chen*
  • Von Wun Soo

*Corresponding author for this work

Research output: Contribution to conference › Conference Paper › peer-review

20 Scopus citations

Abstract

A recurrent neural network with context units that can handle temporal sequences is proposed. In this paper, we show an architecture whose performance is better than the architectures proposed by Jordan and by Elman, each trained with the error backpropagation learning algorithm. Three learning experiments were carried out. In the first experiment, we used the recurrent neural network to simulate a finite state machine. In the second experiment, we used the recurrent network to handle a combination retrieving problem. In the third experiment, we trained the neural network to recognize periodicity in temporal sequence data. The results of the three experiments showed that our architecture performed better.
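The abstract compares architectures built on context units, as in the Jordan and Elman networks: the hidden layer's activations are copied into context units and fed back as extra inputs on the next time step, giving the network a memory of the sequence so far. A minimal sketch of an Elman-style forward pass is shown below; the layer sizes, weight initialization, and class name are illustrative assumptions, not the architecture evaluated in the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ElmanRNN:
    """Sketch of an Elman-style network: context units hold the
    previous hidden state and feed it back at the next step.
    (Hypothetical sizes/weights, not the paper's actual model.)"""

    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(scale=0.1, size=(n_hidden, n_in))
        self.W_ctx = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
        self.W_out = rng.normal(scale=0.1, size=(n_out, n_hidden))
        self.context = np.zeros(n_hidden)  # context units start at zero

    def step(self, x):
        # Hidden activation depends on the current input and on the
        # context units (a copy of the previous hidden state).
        h = sigmoid(self.W_in @ x + self.W_ctx @ self.context)
        self.context = h.copy()  # copy hidden state back into context units
        return sigmoid(self.W_out @ h)

# Process a short temporal sequence, one input vector per time step.
net = ElmanRNN(n_in=2, n_hidden=4, n_out=1)
sequence = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
outputs = [net.step(x) for x in sequence]
```

A Jordan network differs only in what the context units store: the previous *output* activations rather than the previous hidden state.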

Original language: English
Pages: 1945-1950
Number of pages: 6
State: Published - 1996
Externally published: Yes
Event: Proceedings of the 1996 IEEE International Conference on Neural Networks, ICNN. Part 1 (of 4) - Washington, DC, USA
Duration: 03/06/1996 to 06/06/1996

Conference

Conference: Proceedings of the 1996 IEEE International Conference on Neural Networks, ICNN. Part 1 (of 4)
City: Washington, DC, USA
Period: 03/06/96 to 06/06/96
