Abstract: We investigate the problem of extended dissipativity analysis for a class of neural networks with time-varying delay. Extended dissipativity analysis generalizes several previously known results, encompassing the $H_{\infty}$, passivity, dissipativity, and $\ell_{2}$-$\ell_{\infty}$ performance criteria in a unified framework. By introducing a suitable augmented Lyapunov-Krasovskii functional, fully exploiting the information on the neuron activation functions, and employing a new bounding inequality, we derive sufficient conditions in terms of linear matrix inequalities (LMIs) that guarantee the stability and extended dissipativity of the delayed neural networks. Numerical examples are given to illustrate the effectiveness and reduced conservatism of the proposed methods.
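For orientation, a sketch of the performance index commonly used to define extended dissipativity in this literature is given below; the output $y(t)$, disturbance $\omega(t)$, horizon $T$, and weighting matrices $\Psi_{1},\ldots,\Psi_{4}$ are standard symbols assumed here for illustration, not notation taken from this abstract:
\[
\int_{0}^{T}\!\bigl[\,y^{T}(t)\Psi_{1}y(t)+2\,y^{T}(t)\Psi_{2}\omega(t)
  +\omega^{T}(t)\Psi_{3}\omega(t)\,\bigr]\,dt
  \;\ge\; \sup_{0\le t\le T} y^{T}(t)\Psi_{4}\,y(t),
  \qquad \forall\, T\ge 0,
\]
with $\Psi_{1}\le 0$, $\Psi_{3}>0$, and $\Psi_{4}\ge 0$. The usual special cases are then recovered by particular choices of the weights: $\Psi_{1}=-I$, $\Psi_{2}=0$, $\Psi_{3}=\gamma^{2}I$, $\Psi_{4}=0$ gives the $H_{\infty}$ criterion; $\Psi_{1}=\Psi_{2}=0$, $\Psi_{3}=\gamma^{2}I$, $\Psi_{4}=I$ gives the $\ell_{2}$-$\ell_{\infty}$ performance; $\Psi_{1}=0$, $\Psi_{2}=I$, $\Psi_{3}=\gamma I$, $\Psi_{4}=0$ gives passivity; and $\Psi_{1}=Q$, $\Psi_{2}=S$, $\Psi_{3}=R-\alpha I$, $\Psi_{4}=0$ gives $(Q,S,R)$-dissipativity. These choices are the conventional ones in the extended dissipativity literature rather than specifics stated in this abstract.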