### Abstract

This paper concerns the dynamical behavior, in a probabilistic sense, of a simple perceptron network with sigmoidal output units performing autoassociation for novelty filtering. Networks of retinotopic topology, having a one-to-one correspondence between input and output units, can be readily trained using the delta learning rule to perform autoassociative mappings. A novelty filter is obtained by subtracting the network output from the input vector. The presentation of a 'familiar' pattern then tends to evoke a null response, while any anomalous component is enhanced. This behavior is a promising feature for the enhancement of weak signals in additive noise. As an analysis of the novelty filtering, this paper shows that the probability density function of the weight converges to a Gaussian when the input time series is statistically characterized by nonsymmetrical probability density functions. After the output units are locally linearized, the recursive relation for updating the weight of the neural network is converted into a first-order random differential equation. Based on this equation, it is shown that the probability density function of the weight satisfies the Fokker-Planck equation. By solving the Fokker-Planck equation, it is found that the weight is Gaussian distributed with time-dependent mean and variance.
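The novelty-filtering mechanism the abstract describes can be illustrated with a minimal sketch: a one-to-one autoassociator trained by the delta rule, whose output is subtracted from the input. This is not the paper's exact model (the paper uses sigmoidal output units and then linearizes them locally); the sketch below assumes a linear network, and the dimensions, learning rate, and pattern subspace are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 8, 2                            # 8 units; familiar patterns span a 2-D subspace
basis = rng.normal(size=(k, n))        # fixed basis for the "familiar" pattern class
W = np.zeros((n, n))                   # one-to-one (retinotopic) weight matrix
eta = 0.02                             # learning rate

# Delta-rule training on familiar patterns: target equals input (autoassociation).
for _ in range(5000):
    x = rng.normal(size=k) @ basis     # draw a familiar input
    y = W @ x                          # network output (locally linearized)
    W += eta * np.outer(x - y, x)      # delta rule drives output toward input

def novelty(x):
    """Novelty filter: input minus the network's reconstruction of it."""
    return x - W @ x

familiar = rng.normal(size=k) @ basis  # lies in the trained subspace
anomaly = rng.normal(size=n)           # generic vector, mostly outside it

# Familiar input -> near-null response; anomalous component passes through enhanced.
print(np.linalg.norm(novelty(familiar)) / np.linalg.norm(familiar))
print(np.linalg.norm(novelty(anomaly)) / np.linalg.norm(anomaly))
```

Because the training targets are exactly realizable on the familiar subspace, the weight matrix converges toward a projector onto that subspace, so the filter nulls familiar inputs and retains the component of an input orthogonal to the learned patterns.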

| Original language | English |
| --- | --- |
| Pages (from-to) | 1152-1161 |
| Number of pages | 10 |
| Journal | IEEE Transactions on Neural Networks |
| Volume | 11 |
| Issue number | 5 |
| DOIs | https://doi.org/10.1109/72.870046 |
| Publication status | Published - 2000 Sep 1 |

### ASJC Scopus subject areas

- Artificial Intelligence
- Computational Theory and Mathematics
- Hardware and Architecture
- Control and Systems Engineering
- Electrical and Electronic Engineering
- Theoretical Computer Science

### Cite this

Ko, H., & Jacyna, G. M. (2000). Dynamical behavior of autoassociative memory performing novelty filtering for signal enhancement. *IEEE Transactions on Neural Networks*, *11*(5), 1152-1161. https://doi.org/10.1109/72.870046

Research output: Contribution to journal › Article

TY - JOUR

T1 - Dynamical behavior of autoassociative memory performing novelty filtering for signal enhancement

AU - Ko, Hanseok

AU - Jacyna, Garry M.

PY - 2000/9/1

Y1 - 2000/9/1

UR - http://www.scopus.com/inward/record.url?scp=0034270237&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0034270237&partnerID=8YFLogxK

U2 - 10.1109/72.870046

DO - 10.1109/72.870046

M3 - Article

C2 - 18249841

AN - SCOPUS:0034270237

VL - 11

SP - 1152

EP - 1161

JO - IEEE Transactions on Neural Networks

JF - IEEE Transactions on Neural Networks

SN - 1045-9227

IS - 5

ER -