Abstract
A recurrent neural network with context units that can handle temporal sequences is proposed. In this paper, we show an architecture whose performance, using the error backpropagation learning algorithm, is better than that of the architectures proposed by Jordan and Elman. Three learning experiments were carried out. In the first experiment, we used the recurrent network to simulate a finite state machine. In the second, we used it to handle a combination-retrieving problem. In the third, we trained the network to recognize periodicity in temporal sequence data. The results of all three experiments showed that our system performed better.
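The abstract does not give implementation details, but the compared architectures are well known: in an Elman network the context units hold a copy of the previous hidden state, while in a Jordan network they hold the previous output. The following is a minimal sketch of the Elman-style forward pass, assuming NumPy; all class, variable, and dimension names here are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

class ElmanRNN:
    """Minimal Elman-style recurrent network: context units store the
    previous hidden state and feed back into the hidden layer."""

    def __init__(self, n_in, n_hidden, n_out):
        self.W_in = rng.normal(0, 0.1, (n_hidden, n_in))    # input -> hidden
        self.W_ctx = rng.normal(0, 0.1, (n_hidden, n_hidden))  # context -> hidden
        self.W_out = rng.normal(0, 0.1, (n_out, n_hidden))  # hidden -> output
        self.context = np.zeros(n_hidden)                   # context units

    def step(self, x):
        # Hidden activation combines the current input with the context units.
        h = np.tanh(self.W_in @ x + self.W_ctx @ self.context)
        self.context = h.copy()  # context <- current hidden state (Elman rule)
        return self.W_out @ h    # linear output layer

net = ElmanRNN(n_in=2, n_hidden=4, n_out=1)
sequence = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
outputs = [net.step(x) for x in sequence]
```

A Jordan variant would differ only in the feedback line: the context would be set from the output vector rather than the hidden state. In both cases the weights would be trained with error backpropagation, as in the experiments the abstract describes.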
| Original language | English |
|---|---|
| Pages | 1945-1950 |
| Number of pages | 6 |
| State | Published - 1996 |
| Externally published | Yes |
| Event | Proceedings of the 1996 IEEE International Conference on Neural Networks, ICNN. Part 1 (of 4) - Washington, DC, USA. Duration: 03/06/96 → 06/06/96 |
Conference
| Conference | Proceedings of the 1996 IEEE International Conference on Neural Networks, ICNN. Part 1 (of 4) |
|---|---|
| City | Washington, DC, USA |
| Period | 03/06/96 → 06/06/96 |
| Title | Comparative study of recurrent neural network architectures on learning temporal sequences |
|---|---|