Abstract
A benchmark study of two self-organizing artificial neural network models, ART2 and DIGNET, is conducted. The architectural differences and learning procedures of the two models are compared. The performance of ART2 and DIGNET on data clustering and pattern recognition problems with noise or interference is investigated by computer simulations. It is shown that DIGNET generally has faster learning and better clustering performance on statistical pattern recognition problems. DIGNET has a simpler architecture, and its system parameters can be determined analytically from the self-organizing process. The threshold value used in DIGNET can be determined directly from a given lower bound on the desired signal-to-noise ratio (SNR). A modified model based on the features of ART2 and DIGNET is also derived and investigated. Its simpler architecture combines the ART2 structure with the advantages of the DIGNET model. The concepts of well depth and stage age, originally introduced in DIGNET, are applied in the modified model. The modified model preserves ART2's features of noise suppression, contrast enhancement, and self-organizing stable pattern recognition, yet provides a specific method for adjusting the network parameters. The network performs a variant of K-means learning, but without a priori knowledge of the actual number of clusters. The networks discussed in this paper are benchmarked on clustering and pattern recognition problems, and comparative simulation results are presented. An illustrative sketch of the kind of threshold-based clustering the abstract describes is given below.
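The abstract does not reproduce the DIGNET update equations, so the following Python snippet is only a minimal sketch of the general idea it describes: each input either deepens the nearest existing "attraction well" (cluster exemplar) or, if its similarity to every well falls below a fixed threshold, seeds a new well, so the number of clusters emerges without being specified in advance. The class and function names (`Well`, `dignet_like_cluster`), the cosine-similarity threshold, and the running-mean exemplar update are assumptions for illustration, not the paper's actual DIGNET formulation, in which the threshold is derived from an SNR lower bound.

```python
import numpy as np

class Well:
    """One attraction well: an exemplar vector plus a depth counter (illustrative)."""
    def __init__(self, pattern):
        self.center = np.asarray(pattern, dtype=float)
        self.depth = 1  # number of patterns captured by this well

def dignet_like_cluster(patterns, threshold):
    """Assign each pattern to the most similar well, or create a new well
    when the best cosine similarity falls below `threshold`."""
    wells = []
    for x in patterns:
        x = np.asarray(x, dtype=float)
        xn = x / (np.linalg.norm(x) + 1e-12)
        best, best_sim = None, -1.0
        for w in wells:
            c = w.center / (np.linalg.norm(w.center) + 1e-12)
            sim = float(np.dot(xn, c))
            if sim > best_sim:
                best, best_sim = w, sim
        if best is None or best_sim < threshold:
            wells.append(Well(x))      # seed a new well (new cluster)
        else:
            best.depth += 1            # deepen the winning well
            # move the exemplar toward the input (simple running mean)
            best.center += (x - best.center) / best.depth
    return wells

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = np.vstack([rng.normal(m, 0.1, size=(50, 2)) for m in ([1, 0], [0, 1])])
    rng.shuffle(data)
    wells = dignet_like_cluster(data, threshold=0.9)
    print(len(wells), [w.depth for w in wells])
```

With two well-separated Gaussian blobs, the sketch typically settles on two wells; lowering the threshold merges clusters and raising it splits them, which is the role the SNR-derived threshold plays in the network described by the abstract.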
| Original language | English |
| --- | --- |
| Pages (from-to) | 346-357 |
| Number of pages | 12 |
| Journal | Proceedings of SPIE - The International Society for Optical Engineering |
| Volume | 1965 |
| DOIs | |
| State | Published - 2 Sep 1993 |
| Externally published | Yes |
| Event | Applications of Artificial Neural Networks IV 1993 - Orlando, United States. Duration: 11 Apr 1993 → 16 Apr 1993 |
Bibliographical note
Publisher Copyright: © 1993 SPIE. All rights reserved.