In this letter, we propose a new weight learning algorithm, called an H∞ learning law (HLL), for recurrent neural networks with time delays. Based on Lyapunov-Krasovskii stability theory, the HLL is designed not only to guarantee asymptotic stability but also to attenuate the effect of external disturbances to within a prescribed H∞-norm bound. An existence condition for the HLL is given in terms of a linear matrix inequality (LMI). An illustrative example demonstrates the effectiveness of the proposed HLL.
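The class of systems treated in the letter can be illustrated with a minimal simulation sketch. The code below assumes a standard delayed Hopfield-type recurrent network, ẋ(t) = -A x(t) + W tanh(x(t)) + W_d tanh(x(t - τ)), integrated by the Euler method; the matrices A, W, W_d, the delay τ, and the initial history are illustrative placeholders, not values from the paper, and the learning law itself is not reproduced here.

```python
import numpy as np

def simulate_delay_rnn(A, W, Wd, x0, tau_steps, dt, n_steps):
    """Euler integration of x'(t) = -A x + W tanh(x(t)) + Wd tanh(x(t - tau)).

    The initial history on [-tau, 0] is held constant at x0.
    """
    n = x0.shape[0]
    traj = np.zeros((n_steps + tau_steps, n))
    traj[:tau_steps] = x0  # constant initial history segment
    for k in range(tau_steps, n_steps + tau_steps):
        x = traj[k - 1]                 # current state
        x_delayed = traj[k - 1 - tau_steps]  # state tau seconds ago
        dx = -A @ x + W @ np.tanh(x) + Wd @ np.tanh(x_delayed)
        traj[k] = x + dt * dx
    return traj

# Illustrative (assumed) parameters: strong self-decay relative to the
# connection weights, so the origin is asymptotically stable.
A = np.diag([2.0, 2.0])
W = np.array([[0.1, -0.2],
              [0.3,  0.1]])
Wd = np.array([[0.1, 0.0],
               [0.0, 0.1]])
x0 = np.array([1.0, -0.5])

dt = 0.01
tau_steps = 20          # delay tau = 0.2 s
n_steps = 1000          # horizon T = 10 s

traj = simulate_delay_rnn(A, W, Wd, x0, tau_steps, dt, n_steps)
print("final state norm:", np.linalg.norm(traj[-1]))
```

With these placeholder matrices the decay term -A x dominates the bounded tanh couplings, so the trajectory converges toward the origin; the letter's contribution is a weight learning law guaranteeing such stability together with an H∞ disturbance-attenuation bound, with existence verified via an LMI.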
- H∞ learning law (HLL)
- Linear matrix inequality (LMI)
- Lyapunov-Krasovskii stability theory
- Recurrent neural networks
ASJC Scopus subject areas
- Statistical and Nonlinear Physics
- Condensed Matter Physics