A comparative study of self-organizing clustering algorithms dignet and ART 2

Chin Der Wann, Stelios C.A. Thomopoulos*

*Corresponding author for this work

Research output: Contribution to journal › Journal Article › peer-review

27 Scopus citations

Abstract

A comparative study of two self-organizing clustering neural network algorithms, Dignet and ART2, has been conducted. The differences in architecture and learning procedures between the two models are compared. Comparative computer simulations on data clustering and signal detection problems with Gaussian noise were used to investigate the performance of Dignet and "fast learning" ART2. The study shows that Dignet, with a simple architecture and straightforward dynamics, is more flexible in the choice of similarity metric. The system parameters in Dignet can be determined analytically through a self-adjusting process; moreover, the initial threshold value used in Dignet is derived directly from a lower bound on the desired operational signal-to-noise ratio. Simulations show that Dignet generally exhibits faster learning and better clustering performance on statistical pattern recognition problems. A simplified ART2 model (SART2) is derived by adopting structural concepts from Dignet. SART2 exhibits faster learning and eliminates a "false conviction" problem present in "fast learning" ART2. The comparative study is benchmarked on statistical data clustering and signal detection problems.
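To illustrate the kind of threshold-driven, self-organizing clustering the abstract describes, the following is a minimal sketch of a generic leader-follower clustering loop: a pattern joins the nearest existing cluster if it falls within a vigilance threshold, and otherwise seeds a new cluster. This is an assumption-laden illustration, not the paper's actual Dignet or ART2 equations; the parameter `theta` merely plays the role of the initial threshold that, in Dignet, the abstract says is set from a lower bound on the operational signal-to-noise ratio, and the Euclidean distance stands in for whatever similarity metric is chosen.

```python
import numpy as np

def threshold_clustering(patterns, theta):
    """Leader-follower style clustering sketch (illustrative, not the paper's Dignet dynamics).

    Each pattern is assigned to the nearest existing cluster center if its
    distance is within the vigilance threshold theta; otherwise a new cluster
    is created. Centers are updated as incremental means of their members."""
    centers, counts, labels = [], [], []
    for x in patterns:
        if centers:
            # Euclidean metric here; another similarity measure could be substituted.
            dists = [np.linalg.norm(x - c) for c in centers]
            j = int(np.argmin(dists))
        if not centers or dists[j] > theta:
            centers.append(np.array(x, dtype=float))  # start a new cluster
            counts.append(1)
            labels.append(len(centers) - 1)
        else:
            counts[j] += 1
            centers[j] += (x - centers[j]) / counts[j]  # incremental mean update
            labels.append(j)
    return centers, labels

# Example: two noisy Gaussian clusters in 2-D (synthetic data for illustration only)
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
                  rng.normal(3.0, 0.3, (50, 2))])
centers, labels = threshold_clustering(data, theta=1.5)
print(len(centers), "clusters found")
```

With the threshold chosen appropriately relative to the noise level, the loop recovers the two underlying clusters in a single pass, which is the flavor of fast, self-organizing behavior the abstract attributes to Dignet.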

Original language: English
Pages (from-to): 737-753
Number of pages: 17
Journal: Neural Networks
Volume: 10
Issue number: 4
DOIs
State: Published - 1997
Externally published: Yes

Keywords

  • ART2
  • Clustering
  • Dignet
  • Fast learning
  • Neural networks
  • Pattern recognition
  • Self-organization
  • Unsupervised learning
