Dynamical behavior of autoassociative memory performing novelty filtering for signal enhancement

Hanseok Ko, Garry M. Jacyna

Research output: Contribution to journal › Article

7 Citations (Scopus)

Abstract

This paper concerns the dynamical behavior, in a probabilistic sense, of a simple perceptron network with sigmoidal output units performing autoassociation for novelty filtering. Networks of retinotopic topology, having a one-to-one correspondence between input and output units, can be readily trained with the delta learning rule to perform autoassociative mappings. A novelty filter is obtained by subtracting the network output from the input vector: the presentation of a `familiar' pattern then tends to evoke a null response, while any anomalous component is enhanced. This behavior is a promising feature for the enhancement of weak signals in additive noise. As an analysis of the novelty filtering, this paper shows that the probability density function of the weights converges to a Gaussian when the input time series is statistically characterized by nonsymmetrical probability density functions. After the output units are locally linearized, the recursive relation for updating the weights of the neural network is converted into a first-order random differential equation. Based on this equation, it is shown that the probability density function of the weights satisfies the Fokker-Planck equation. By solving the Fokker-Planck equation, it is found that the weights are Gaussian distributed with time-dependent mean and variance.
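The novelty-filtering idea described in the abstract can be sketched in a few lines: train a one-layer network with sigmoidal outputs to autoassociate "familiar" inputs via the delta rule, then form the filter output as the input minus the network output. The sketch below is a minimal illustration of that scheme, not the authors' experimental setup; the dimension, learning rate, noise level, and anomaly magnitude are all assumed for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 8        # input/output dimension (retinotopic: one output per input unit); assumed
eta = 0.05   # delta-rule learning rate; assumed

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Familiar" patterns: noisy repetitions of a fixed base pattern in (0, 1).
base = rng.uniform(0.2, 0.8, n)
W = np.zeros((n, n))

# Delta-rule training of the autoassociator: the target is the input itself,
# and the sigmoid's local derivative y * (1 - y) scales the error term.
for _ in range(5000):
    x = base + 0.05 * rng.standard_normal(n)
    y = sigmoid(W @ x)
    W += eta * np.outer((x - y) * y * (1.0 - y), x)

# Novelty filter: subtract the network output from the input vector.
def novelty(x):
    return x - sigmoid(W @ x)

familiar = base
anomalous = base.copy()
anomalous[0] += 0.5   # familiar pattern plus an anomalous component

r_f = np.linalg.norm(novelty(familiar))    # near-null response
r_a = np.linalg.norm(novelty(anomalous))   # anomaly is passed through / enhanced
```

A familiar input should evoke a small residual (`r_f` near zero after training), while the anomalous component survives the subtraction (`r_a` clearly larger), which is the weak-signal-enhancement behavior the paper analyzes.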

Original language: English
Pages (from-to): 1152-1161
Number of pages: 10
Journal: IEEE Transactions on Neural Networks
Volume: 11
Issue number: 5
DOIs
Publication status: Published - Sep 2000

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence

