Dynamical behavior of autoassociative memory performing novelty filtering for signal enhancement

Hanseok Ko, Garry M. Jacyna

Research output: Contribution to journal › Article

7 Citations (Scopus)

Abstract

This paper concerns the dynamical behavior, in a probabilistic sense, of a simple perceptron network with sigmoidal output units performing autoassociation for novelty filtering. Networks of retinotopic topology, having a one-to-one correspondence between input and output units, can be readily trained with the delta learning rule to perform autoassociative mappings. A novelty filter is obtained by subtracting the network output from the input vector: the presentation of a 'familiar' pattern then tends to evoke a null response, while any anomalous component is enhanced. This behavior is a promising basis for enhancing weak signals in additive noise. As an analysis of novelty filtering, this paper shows that the probability density function of the weight converges to a Gaussian when the input time series is statistically characterized by nonsymmetrical probability density functions. After the output units are locally linearized, the recursive relation for updating the weight of the neural network is converted into a first-order random differential equation. Based on this equation, it is shown that the probability density function of the weight satisfies the Fokker-Planck equation. Solving the Fokker-Planck equation shows that the weight is Gaussian distributed with time-dependent mean and variance.
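The filtering scheme the abstract describes can be sketched in a few lines of NumPy: a single-layer network with sigmoidal outputs is trained with the delta rule to reproduce its input, and the novelty response is the input minus the reconstruction. The network size, learning rate, iteration count, and the single training pattern below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n = 8                                     # input dim = output dim (one-to-one topology)
W = rng.normal(scale=0.1, size=(n, n))    # weight matrix of the autoassociator
eta = 0.5                                 # learning rate (assumed)

# One "familiar" pattern, kept inside (0, 1) so the sigmoid output
# can actually match the target, which is the input itself.
familiar = 0.2 + 0.6 * rng.random(n)

# Delta-rule training of the autoassociative mapping x -> x:
# the weight update is (error * sigmoid derivative) outer input.
for _ in range(5000):
    y = sigmoid(W @ familiar)
    delta = (familiar - y) * y * (1.0 - y)
    W += eta * np.outer(delta, familiar)

def novelty(x, W):
    """Novelty filter: input minus the network's reconstruction."""
    return x - sigmoid(W @ x)

# A familiar pattern evokes a near-null response...
r_familiar = np.linalg.norm(novelty(familiar, W))

# ...while an anomalous component largely passes through, enhanced
# relative to the null response of the familiar pattern.
anomaly = familiar.copy()
anomaly[3] += 0.3
r_anomalous = np.linalg.norm(novelty(anomaly, W))
```

Because the delta rule only shapes the weights along the trained (familiar) directions, the reconstruction of the perturbed input stays close to the familiar pattern, so the subtraction isolates the anomalous component.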

Original language: English
Pages (from-to): 1152-1161
Number of pages: 10
Journal: IEEE Transactions on Neural Networks
Volume: 11
Issue number: 5
DOIs: 10.1109/72.870046
Publication status: Published - 2000 Sep 1


ASJC Scopus subject areas

  • Artificial Intelligence
  • Computational Theory and Mathematics
  • Hardware and Architecture
  • Control and Systems Engineering
  • Electrical and Electronic Engineering
  • Theoretical Computer Science

Cite this

Dynamical behavior of autoassociative memory performing novelty filtering for signal enhancement. / Ko, Hanseok; Jacyna, Garry M.

In: IEEE Transactions on Neural Networks, Vol. 11, No. 5, 01.09.2000, p. 1152-1161.

Research output: Contribution to journal › Article

@article{350637720f32472fabbf032020b4f0f0,
title = "Dynamical behavior of autoassociative memory performing novelty filtering for signal enhancement",
abstract = "This paper concerns the dynamical behavior, in a probabilistic sense, of a simple perceptron network with sigmoidal output units performing autoassociation for novelty filtering. Networks of retinotopic topology, having a one-to-one correspondence between input and output units, can be readily trained with the delta learning rule to perform autoassociative mappings. A novelty filter is obtained by subtracting the network output from the input vector: the presentation of a `familiar' pattern then tends to evoke a null response, while any anomalous component is enhanced. This behavior is a promising basis for enhancing weak signals in additive noise. As an analysis of novelty filtering, this paper shows that the probability density function of the weight converges to a Gaussian when the input time series is statistically characterized by nonsymmetrical probability density functions. After the output units are locally linearized, the recursive relation for updating the weight of the neural network is converted into a first-order random differential equation. Based on this equation, it is shown that the probability density function of the weight satisfies the Fokker-Planck equation. Solving the Fokker-Planck equation shows that the weight is Gaussian distributed with time-dependent mean and variance.",
author = "Hanseok Ko and Jacyna, {Garry M.}",
year = "2000",
month = "9",
day = "1",
doi = "10.1109/72.870046",
language = "English",
volume = "11",
pages = "1152--1161",
journal = "IEEE Transactions on Neural Networks",
issn = "1045-9227",
publisher = "IEEE Computational Intelligence Society",
number = "5",

}

TY - JOUR

T1 - Dynamical behavior of autoassociative memory performing novelty filtering for signal enhancement

AU - Ko, Hanseok

AU - Jacyna, Garry M.

PY - 2000/9/1

Y1 - 2000/9/1

N2 - This paper concerns the dynamical behavior, in a probabilistic sense, of a simple perceptron network with sigmoidal output units performing autoassociation for novelty filtering. Networks of retinotopic topology, having a one-to-one correspondence between input and output units, can be readily trained with the delta learning rule to perform autoassociative mappings. A novelty filter is obtained by subtracting the network output from the input vector: the presentation of a 'familiar' pattern then tends to evoke a null response, while any anomalous component is enhanced. This behavior is a promising basis for enhancing weak signals in additive noise. As an analysis of novelty filtering, this paper shows that the probability density function of the weight converges to a Gaussian when the input time series is statistically characterized by nonsymmetrical probability density functions. After the output units are locally linearized, the recursive relation for updating the weight of the neural network is converted into a first-order random differential equation. Based on this equation, it is shown that the probability density function of the weight satisfies the Fokker-Planck equation. Solving the Fokker-Planck equation shows that the weight is Gaussian distributed with time-dependent mean and variance.

AB - This paper concerns the dynamical behavior, in a probabilistic sense, of a simple perceptron network with sigmoidal output units performing autoassociation for novelty filtering. Networks of retinotopic topology, having a one-to-one correspondence between input and output units, can be readily trained with the delta learning rule to perform autoassociative mappings. A novelty filter is obtained by subtracting the network output from the input vector: the presentation of a 'familiar' pattern then tends to evoke a null response, while any anomalous component is enhanced. This behavior is a promising basis for enhancing weak signals in additive noise. As an analysis of novelty filtering, this paper shows that the probability density function of the weight converges to a Gaussian when the input time series is statistically characterized by nonsymmetrical probability density functions. After the output units are locally linearized, the recursive relation for updating the weight of the neural network is converted into a first-order random differential equation. Based on this equation, it is shown that the probability density function of the weight satisfies the Fokker-Planck equation. Solving the Fokker-Planck equation shows that the weight is Gaussian distributed with time-dependent mean and variance.

UR - http://www.scopus.com/inward/record.url?scp=0034270237&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0034270237&partnerID=8YFLogxK

U2 - 10.1109/72.870046

DO - 10.1109/72.870046

M3 - Article

VL - 11

SP - 1152

EP - 1161

JO - IEEE Transactions on Neural Networks

JF - IEEE Transactions on Neural Networks

SN - 1045-9227

IS - 5

ER -