Large memory capacity in chaotic artificial neural networks: A view of the anti-integrable limit

Wei Lin, Guanrong Chen

Research output: Journal Publications and Reviews › RGC 22 - Publication in policy or professional journal

49 Citations (Scopus)

Abstract

In the literature, it has been reported that the chaotic artificial neural network model with sinusoidal activation functions possesses a large memory capacity and a remarkable ability to retrieve stored patterns, outperforming conventional chaotic models equipped only with monotonic activation functions such as sigmoidal functions. This paper, from the viewpoint of the anti-integrable limit, elucidates the mechanism underlying the superiority of models with periodic activation functions, a class that includes sinusoidal functions. In particular, by virtue of the anti-integrable limit technique, this paper shows that any finite-dimensional neural network model with periodic activation functions and properly selected parameters exhibits far more abundant chaotic dynamics, and that these dynamics ultimately determine the model's memory capacity and pattern-retrieval ability. To some extent, this paper demonstrates, both mathematically and numerically, that an appropriate choice of activation functions and control scheme can yield a large memory capacity and better pattern-retrieval ability in artificial neural network models. © 2009 IEEE.
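The contrast the abstract describes can be illustrated with a minimal sketch: a single self-coupled neuron iterated as a one-dimensional map, once with a sinusoidal activation and once with a sigmoidal one, comparing a crude largest-Lyapunov-exponent estimate. This is an illustrative stand-in, not the paper's actual model; the specific map x → φ(w·x + b), the gain w (playing the role of the large anti-integrable-limit parameter), and the offset b are assumptions made here for demonstration.

```python
import math

def lyapunov_estimate(f, df, x0, n_transient=500, n_iter=5000):
    """Crude largest-Lyapunov-exponent estimate for a 1-D map x -> f(x)."""
    x = x0
    for _ in range(n_transient):             # discard transient behavior
        x = f(x)
    acc = 0.0
    for _ in range(n_iter):
        acc += math.log(abs(df(x)) + 1e-12)  # log of local stretching rate
        x = f(x)
    return acc / n_iter

# Hypothetical parameters; a large gain w mimics the anti-integrable-limit regime.
w, b = 6.0, 0.3

# Periodic (sinusoidal) activation: x -> sin(w*x + b).
sin_map = lambda x: math.sin(w * x + b)
sin_der = lambda x: w * math.cos(w * x + b)

# Monotonic (sigmoidal) activation with the same gain: x -> tanh(w*x + b).
tanh_map = lambda x: math.tanh(w * x + b)
tanh_der = lambda x: w * (1.0 - math.tanh(w * x + b) ** 2)

print("sinusoidal exponent:", lyapunov_estimate(sin_map, sin_der, 0.1))
print("sigmoidal  exponent:", lyapunov_estimate(tanh_map, tanh_der, 0.1))
```

With these parameters the sinusoidal map yields a positive exponent (chaos), while the saturating sigmoidal map settles onto a stable fixed point with a negative exponent, loosely echoing the paper's point that periodic activations admit far richer chaotic dynamics at large parameter values.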
Original language: English
Pages (from-to): 1340-1351
Journal: IEEE Transactions on Neural Networks
Volume: 20
Issue number: 8
Publication status: Published - 2009

Research Keywords

  • Anti-integrable limit
  • Artificial neural network
  • Chaos
  • Periodic activation function
