Journal of the Franklin Institute 338 (2001) 481–495

Stability of stochastic delay neural networks

Steve Blythe^a, Xuerong Mao^a,*, Xiaoxin Liao^b

^a Department of Statistics and Modelling Science, University of Strathclyde, Glasgow G1 1XH, UK
^b Department of Control, Huazhong University of Science and Technology, Wuhan, People's Republic of China

Received 17 March 2000; received in revised form 20 February 2001
Abstract

The authors, in their earlier papers (Liao and Mao, Stochast. Anal. Appl. 14 (2) (1996a) 165–185; Neural, Parallel Sci. Comput. 4 (2) (1996b) 205–224), initiated the study of stability and instability of stochastic neural networks, and this paper is the continuation of their research in this area. The main aim of this paper is to discuss almost sure exponential stability for a stochastic delay neural network $dx(t) = [-Bx(t) + Ag(x_\tau(t))]\,dt + \sigma(x(t), x_\tau(t), t)\,dw(t)$. The techniques used in this paper are different from those in the earlier papers; in particular, the nonnegative semimartingale convergence theorem will play an important role. Several examples are also given for illustration. © 2001 The Franklin Institute. Published by Elsevier Science Ltd. All rights reserved.

Keywords: Delay neural network; Brownian motion; Martingale convergence theorem; Lyapunov exponent; Exponential stability
1. Introduction

Theoretical understanding of neural network dynamics has advanced greatly in the past 15 years [1–6]. In many networks, time delays cannot be avoided. For example, in electronic neural networks, time delays will be present due to the finite switching speed of amplifiers. Marcus and Westervelt [7] proposed, in a similar way to Hopfield [2], a model for a network with delays as follows:

$$C_i \dot{u}_i(t) = -\frac{1}{R_i}u_i(t) + \sum_{j=1}^{n} T_{ij}\, g_j(u_j(t - \tau_j)), \quad 1 \le i \le n, \eqno(1.1)$$
Supported by the Royal Society and the EPSRC/BBSRC.
* Corresponding author. Tel.: +44 141 548 3669; fax: +44 141 552 2079.
E-mail addresses: xuerong@stams.strath.ac.uk (X. Mao), liaoxx@public.wh.hb.cn (X. Liao).
on $t \ge 0$. The variable $u_i(t)$ represents the voltage on the input of the $i$th neuron. Each neuron is characterized by an input capacitance $C_i$, a time delay $\tau_i$ and a transfer function $g_i(u)$. The connection matrix element $T_{ij}$ has a value $+1/R_{ij}$ when the noninverting output of the $j$th neuron is connected to the input of the $i$th neuron through a resistance $R_{ij}$, and a value $-1/R_{ij}$ when the inverting output of the $j$th neuron is connected to the input of the $i$th neuron through a resistance $R_{ij}$. The parallel resistance at the input of each neuron is defined as $R_i = (\sum_{j=1}^{n}|T_{ij}|)^{-1}$. The nonlinear transfer function $g_i(u)$ is sigmoidal, saturating at $\pm 1$ with maximum slope at $u = 0$. That is, in mathematical terms, $g_i(u)$ is nondecreasing and

$$|g_i(u)| \le 1 \wedge \beta_i|u| \quad \text{for all } -\infty < u < \infty, \eqno(1.2)$$

where $\beta_i$ is the slope of $g_i(u)$ at $u = 0$ and is supposed to be finite. By defining

$$b_i = \frac{1}{C_i R_i}, \qquad a_{ij} = \frac{T_{ij}}{C_i},$$
Eq. (1.1) can be rewritten as

$$\dot{u}_i(t) = -b_i u_i(t) + \sum_{j=1}^{n} a_{ij}\, g_j(u_j(t - \tau_j)), \quad 1 \le i \le n \eqno(1.3)$$

or, in matrix form,

$$\dot{u}(t) = -Bu(t) + Ag(u_\tau(t)), \quad t \ge 0, \eqno(1.4)$$

where

$$u(t) = (u_1(t), \ldots, u_n(t))^T, \quad u_\tau(t) = (u_1(t - \tau_1), \ldots, u_n(t - \tau_n))^T,$$
$$B = \operatorname{diag}(b_1, \ldots, b_n), \quad A = (a_{ij})_{n \times n}, \quad g(u) = (g_1(u_1), \ldots, g_n(u_n))^T.$$

Moreover, there is a relationship

$$b_i = \sum_{j=1}^{n}|a_{ij}|, \quad 1 \le i \le n. \eqno(1.5)$$

It is clear that whenever the initial data $u(s) = \xi(s)$ for $-\bar{\tau} \le s \le 0$ is given, Eq. (1.4) has a unique global solution on $t \ge 0$, where $\bar{\tau} = \max_{1 \le i \le n}\tau_i$ and $\xi = \{\xi(s) : -\bar{\tau} \le s \le 0\}$ is a $C([-\bar{\tau}, 0]; \mathbb{R}^n)$-valued function.
Haykin [8] points out that in real nervous systems, synaptic transmission "... is a noisy process brought on by random fluctuations from the release of neurotransmitters, and other probabilistic causes" (pp. 309–310). One approach to the mathematical incorporation of such effects is to use probabilistic threshold models (e.g. [8]); the approach used in the current paper is to view neural networks as nonlinear dynamical systems with intrinsic noise, that is, to include a representation of the inherent stochasticity in the neurodynamics. Le Cun et al. [9] describe a network where noise is injected into the first hidden layer (only), and use the result to obtain error derivatives, thereby avoiding backpropagation.

We therefore suppose that there exists a stochastic perturbation to the neural network, and that the stochastically perturbed network with delays is described by a stochastic differential delay equation

$$dx(t) = [-Bx(t) + Ag(x_\tau(t))]\,dt + \sigma(x(t), x_\tau(t), t)\,dw(t) \quad \text{on } t \ge 0,$$
$$x(s) = \xi(s) \quad \text{on } -\bar{\tau} \le s \le 0. \eqno(1.6)$$
Here $w(t) = (w_1(t), \ldots, w_m(t))^T$ is an $m$-dimensional Brownian motion defined on a complete probability space $(\Omega, \mathcal{F}, P)$ with a natural filtration $\{\mathcal{F}_t\}_{t \ge 0}$ (i.e. $\mathcal{F}_t = \sigma\{w(s) : 0 \le s \le t\}$), $x(t) = (x_1(t), \ldots, x_n(t))^T$, $x_\tau(t) = (x_1(t - \tau_1), \ldots, x_n(t - \tau_n))^T$ and $\sigma : \mathbb{R}^n \times \mathbb{R}^n \times \mathbb{R}_+ \to \mathbb{R}^{n \times m}$, i.e. $\sigma(x, y, t) = (\sigma_{ij}(x, y, t))_{n \times m}$. Assume, throughout this paper, that $\sigma(x, y, t)$ is locally Lipschitz continuous and satisfies the linear growth condition as well. It is then known (cf. [10–12]) that Eq. (1.6) has a unique global solution on $t \ge 0$, which is denoted by $x(t; \xi)$. Moreover, assume also that $\sigma(0, 0, t) \equiv 0$ for the stability purposes of this paper, so that Eq. (1.6) admits an equilibrium solution $x(t; 0) \equiv 0$.
The stability of stochastic differential delay equations has been studied intensively, and the reader is referred, for example, to Arnold [13], Friedman [14], Has'minskii [15], Kolmanovskii and Myshkis [16] and Mao [10,11]. However, stochastic delay neural networks have their own characteristics, and it is desirable to obtain stability criteria that make full use of these characteristics. It was in this spirit that the authors, in their earlier papers Liao and Mao [17,18], initiated the study of stability and instability of stochastic neural networks, and this paper is the continuation of their research in this area. The main aim of this paper is to discuss the almost sure exponential stability of the stochastic delay neural network (1.6). Liao and Mao [18] mainly discussed mean square exponential stability, from which, along with an additional condition, almost sure exponential stability follows. This paper instead investigates almost sure exponential stability directly, and the criteria obtained here are new. Moreover, the techniques used in this paper are different from those in the authors' earlier papers; in particular, the nonnegative semimartingale convergence theorem will play an important role. This paper is organized as follows. The main results are developed in Section 2, where several sufficient criteria are established for the almost sure exponential stability of the stochastic delay neural network (1.6). By making use of the special construction of the network, a number of very useful corollaries are obtained in Section 3. These corollaries are described only in terms of given system parameters and hence are extremely useful in applications. Finally, a number of examples are provided in Section 4 as illustrations of the use of the theorems.
2. Main results

Let $C^{2,1}(\mathbb{R}^n \times \mathbb{R}_+; \mathbb{R}_+)$ denote the family of all nonnegative functions $V(x, t)$ on $\mathbb{R}^n \times \mathbb{R}_+$ which are continuously twice differentiable in $x$ and once differentiable in $t$. For each $V \in C^{2,1}(\mathbb{R}^n \times \mathbb{R}_+; \mathbb{R}_+)$, define an operator $LV$, associated with the stochastic delay neural network (1.6), from $\mathbb{R}^n \times \mathbb{R}^n \times \mathbb{R}_+$ to $\mathbb{R}$ by

$$LV(x, y, t) = V_t(x, t) + V_x(x, t)[-Bx + Ag(y)] + \tfrac{1}{2}\operatorname{trace}[\sigma^T(x, y, t)V_{xx}(x, t)\sigma(x, y, t)],$$

where

$$V_t(x, t) = \frac{\partial V(x, t)}{\partial t}, \quad V_x(x, t) = \left(\frac{\partial V(x, t)}{\partial x_1}, \ldots, \frac{\partial V(x, t)}{\partial x_n}\right), \quad V_{xx}(x, t) = \left(\frac{\partial^2 V(x, t)}{\partial x_i \partial x_j}\right)_{n \times n}.$$

Let us stress that $LV$ is defined on $\mathbb{R}^n \times \mathbb{R}^n \times \mathbb{R}_+$ while $V$ is defined on $\mathbb{R}^n \times \mathbb{R}_+$. Let $C(\mathbb{R}^n; \mathbb{R}_+)$ denote the family of all continuous functions from $\mathbb{R}^n$ to $\mathbb{R}_+$, and define $C([-\bar{\tau}, 0]; \mathbb{R}^n)$, $C(\mathbb{R}; \mathbb{R}_+)$, etc. similarly.

Although conditions (1.2) and (1.5) are characteristics of the delay network and are of course assumed to hold throughout this paper, they will be mentioned whenever they are used explicitly.
Theorem 2.1. Assume that there exist functions $V \in C^{2,1}(\mathbb{R}^n \times \mathbb{R}_+; \mathbb{R}_+)$, $f \in C(\mathbb{R}^n; \mathbb{R}_+)$, $f_i \in C(\mathbb{R}; \mathbb{R}_+)$ $(1 \le i \le n)$ and two constants $\lambda_1 > \lambda_2 \ge 0$ such that

$$LV(x, y, t) \le -\lambda_1 f(x) + \lambda_2\sum_{i=1}^{n} f_i(y_i), \quad (x, y, t) \in \mathbb{R}^n \times \mathbb{R}^n \times \mathbb{R}_+, \eqno(2.1)$$

$$V(x, t) \le f(x), \quad (x, t) \in \mathbb{R}^n \times \mathbb{R}_+ \eqno(2.2)$$

and

$$\sum_{i=1}^{n} f_i(x_i) \le f(x), \quad x \in \mathbb{R}^n. \eqno(2.3)$$

Then, for every $\xi \in C([-\bar{\tau}, 0]; \mathbb{R}^n)$, the solution of Eq. (1.6) has the property

$$\limsup_{t \to \infty}\frac{1}{t}\log(V(x(t; \xi), t)) \le -\gamma \quad \text{a.s.}, \eqno(2.4)$$

where $\gamma \in (0, \lambda_1 - \lambda_2)$ is the unique root of

$$\lambda_1 = \gamma + \lambda_2 e^{\gamma\bar{\tau}}. \eqno(2.5)$$

(Recall $\bar{\tau} = \max_{1 \le i \le n}\tau_i$.)
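Given $\lambda_1$, $\lambda_2$ and $\bar{\tau}$, the root $\gamma$ of (2.5) has no closed form but is easy to find numerically: $h(\gamma) = \gamma + \lambda_2 e^{\gamma\bar{\tau}} - \lambda_1$ is strictly increasing, with $h(0) = \lambda_2 - \lambda_1 < 0$ and $h(\lambda_1 - \lambda_2) = \lambda_2(e^{(\lambda_1 - \lambda_2)\bar{\tau}} - 1) \ge 0$, so bisection applies. A minimal sketch (the function name and the sample values are illustrative, not from the paper):

```python
from math import exp

def gamma_root(lam1, lam2, tau_bar, tol=1e-12):
    """Solve lam1 = g + lam2*exp(g*tau_bar) for g in (0, lam1 - lam2] by bisection.

    h(g) = g + lam2*exp(g*tau_bar) - lam1 is strictly increasing with
    h(0) < 0 and h(lam1 - lam2) >= 0, so the root is unique and bracketed.
    """
    lo, hi = 0.0, lam1 - lam2
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mid + lam2 * exp(mid * tau_bar) - lam1 < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative values: lam1 = 2, lam2 = 1, tau_bar = 0.5 gives a root near 0.63.
g = gamma_root(2.0, 1.0, 0.5)
```

Note that $\gamma$ decreases as $\bar{\tau}$ grows, so a longer maximal delay always weakens the guaranteed decay rate in (2.4).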
The proof of this theorem is based on the following semimartingale convergence theorem established by Liptser and Shiryayev [19, Theorem 7, p. 139].

Lemma 2.2. Let $A(t)$ and $U(t)$ be two continuous adapted increasing processes on $t \ge 0$ with $A(0) = U(0) = 0$ a.s. Let $M(t)$ be a real-valued continuous local martingale with $M(0) = 0$ a.s. Let $\zeta$ be a nonnegative $\mathcal{F}_0$-measurable random variable with $E\zeta < \infty$. Define

$$X(t) = \zeta + A(t) - U(t) + M(t) \quad \text{for } t \ge 0.$$

If $X(t)$ is nonnegative, then

$$\Bigl\{\lim_{t \to \infty} A(t) < \infty\Bigr\} \subset \Bigl\{\lim_{t \to \infty} X(t) < \infty\Bigr\} \cap \Bigl\{\lim_{t \to \infty} U(t) < \infty\Bigr\} \quad \text{a.s.},$$

where $B \subset D$ a.s. means $P(B \cap D^c) = 0$. In particular, if $\lim_{t \to \infty} A(t) < \infty$ a.s., then for almost all $\omega \in \Omega$

$$\lim_{t \to \infty} X(t, \omega) < \infty \quad \text{and} \quad \lim_{t \to \infty} U(t, \omega) < \infty,$$

that is, both $X(t)$ and $U(t)$ converge to finite random variables.
Proof. Fix initial data $\xi \in C([-\bar{\tau}, 0]; \mathbb{R}^n)$ arbitrarily and write simply $x(t; \xi) = x(t)$. Define

$$U(x, t) = e^{\gamma t}V(x, t) \quad \text{for } (x, t) \in \mathbb{R}^n \times \mathbb{R}_+,$$

which is obviously in $C^{2,1}(\mathbb{R}^n \times \mathbb{R}_+; \mathbb{R}_+)$. Using conditions (2.1) and (2.2), one can compute

$$LU(x, y, t) = e^{\gamma t}[\gamma V(x, t) + LV(x, y, t)] \le e^{\gamma t}\left[-(\lambda_1 - \gamma)f(x) + \lambda_2\sum_{i=1}^{n} f_i(y_i)\right].$$

The Itô formula shows that for any $t \ge 0$

$$e^{\gamma t}V(x(t), t) = V(x(0), 0) + \int_0^t LU(x(s), x_\tau(s), s)\,ds + \int_0^t e^{\gamma s}V_x(x(s), s)\sigma(x(s), x_\tau(s), s)\,dw(s)$$
$$\le V(x(0), 0) - (\lambda_1 - \gamma)\int_0^t e^{\gamma s}f(x(s))\,ds + \lambda_2\sum_{i=1}^{n}\int_0^t e^{\gamma s}f_i(x_i(s - \tau_i))\,ds$$
$$\;+ \int_0^t e^{\gamma s}V_x(x(s), s)\sigma(x(s), x_\tau(s), s)\,dw(s). \eqno(2.6)$$
On the other hand, it is easy to see that

$$\int_{t-\tau_i}^{t} e^{\gamma s}f_i(x_i(s))\,ds = \int_{-\tau_i}^{t} e^{\gamma s}f_i(x_i(s))\,ds - \int_{0}^{t} e^{\gamma(s - \tau_i)}f_i(x_i(s - \tau_i))\,ds$$
$$\le \int_{-\bar{\tau}}^{t} e^{\gamma s}f_i(x_i(s))\,ds - e^{-\gamma\bar{\tau}}\int_{0}^{t} e^{\gamma s}f_i(x_i(s - \tau_i))\,ds.$$

This, together with condition (2.3), implies

$$\sum_{i=1}^{n}\int_{t-\tau_i}^{t} e^{\gamma s}f_i(x_i(s))\,ds \le \int_{-\bar{\tau}}^{t} e^{\gamma s}f(x(s))\,ds - e^{-\gamma\bar{\tau}}\sum_{i=1}^{n}\int_{0}^{t} e^{\gamma s}f_i(x_i(s - \tau_i))\,ds. \eqno(2.7)$$
It then follows from (2.6) and (2.7) that

$$e^{\gamma t}V(x(t), t) + \lambda_2 e^{\gamma\bar{\tau}}\sum_{i=1}^{n}\int_{t-\tau_i}^{t} e^{\gamma s}f_i(x_i(s))\,ds$$
$$\le V(x(0), 0) + \lambda_2 e^{\gamma\bar{\tau}}\int_{-\bar{\tau}}^{0} e^{\gamma s}f(x(s))\,ds - (\lambda_1 - \gamma - \lambda_2 e^{\gamma\bar{\tau}})\int_{0}^{t} e^{\gamma s}f(x(s))\,ds$$
$$\;+ \int_{0}^{t} e^{\gamma s}V_x(x(s), s)\sigma(x(s), x_\tau(s), s)\,dw(s).$$

Making use of (2.5) yields

$$e^{\gamma t}V(x(t), t) + \lambda_2 e^{\gamma\bar{\tau}}\sum_{i=1}^{n}\int_{t-\tau_i}^{t} e^{\gamma s}f_i(x_i(s))\,ds \le X(t), \eqno(2.8)$$

where

$$X(t) := V(x(0), 0) + \lambda_2 e^{\gamma\bar{\tau}}\int_{-\bar{\tau}}^{0} e^{\gamma s}f(x(s))\,ds + \int_{0}^{t} e^{\gamma s}V_x(x(s), s)\sigma(x(s), x_\tau(s), s)\,dw(s),$$

which is a nonnegative martingale, and Lemma 2.2 shows

$$\lim_{t \to \infty} X(t) < \infty \quad \text{a.s.}$$

It therefore follows from (2.8) that

$$\limsup_{t \to \infty}\,[e^{\gamma t}V(x(t), t)] < \infty \quad \text{a.s.},$$

which implies

$$\limsup_{t \to \infty}\frac{1}{t}\log(V(x(t), t)) \le -\gamma \quad \text{a.s.}$$

as required. The proof is complete. □
Theorem 2.3. Let (1.2) hold. Assume that there exist symmetric nonnegative-definite matrices $C_1$, $C_2$ and $C_3 = \operatorname{diag}(\delta_1, \ldots, \delta_n)$ such that

$$\operatorname{trace}[\sigma^T(x, y, t)\sigma(x, y, t)] \le x^T C_1 x + g^T(y)C_2 g(y) + y^T C_3 y \eqno(2.9)$$

for all $(x, y, t) \in \mathbb{R}^n \times \mathbb{R}^n \times \mathbb{R}_+$. Assume also that there exists a positive-definite diagonal matrix $D = \operatorname{diag}(d_1, \ldots, d_n)$ such that the symmetric matrix

$$H = \begin{pmatrix} -2B + C_1 + C_3 + \bar{D} & A \\ A^T & -D + C_2 \end{pmatrix}$$

is negative-definite, where $\bar{D} = \operatorname{diag}(d_1\beta_1^2, \ldots, d_n\beta_n^2)$. Let $-\lambda = \lambda_{\max}(H)$, the biggest eigenvalue of $H$, so $\lambda > 0$. Then, for every $\xi \in C([-\bar{\tau}, 0]; \mathbb{R}^n)$, the sample Lyapunov exponent of the solution of Eq. (1.6) can be estimated as

$$\limsup_{t \to \infty}\frac{1}{t}\log(|x(t; \xi)|) \le -\frac{\gamma}{2} \quad \text{a.s.}, \eqno(2.10)$$

where $\gamma > 0$ is the unique root of the equation

$$\lambda_1 = \gamma + \lambda_1\lambda_2 e^{\gamma\bar{\tau}} \eqno(2.11)$$

with

$$\lambda_1 = \min_{1 \le i \le n}(\lambda + \delta_i + d_i\beta_i^2) \quad \text{and} \quad \lambda_2 = \max_{1 \le i \le n}\frac{\delta_i + (d_i - \lambda)\beta_i^2}{\lambda + \delta_i + d_i\beta_i^2}. \eqno(2.12)$$

In other words, the stochastic delay neural network (1.6) is almost surely exponentially stable.
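Verifying the hypothesis of Theorem 2.3 reduces to checking that the symmetric matrix $H$ is negative-definite. One standard way, sketched below in pure Python for small matrices (in practice a library eigenvalue or Cholesky routine would be used), is to attempt a Cholesky factorisation of $-H$: $H$ is negative-definite if and only if $-H$ admits a Cholesky factor with strictly positive diagonal.

```python
import math

def is_negative_definite(H):
    """Check that a symmetric matrix H is negative-definite by attempting a
    Cholesky factorisation of -H. The factorisation succeeds with strictly
    positive pivots if and only if -H is positive-definite."""
    n = len(H)
    A = [[-H[i][j] for j in range(n)] for i in range(n)]  # -H should be positive-definite
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                d = A[i][i] - s
                if d <= 0.0:          # non-positive pivot: -H is not positive-definite
                    return False
                L[i][i] = math.sqrt(d)
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return True

# Toy checks: diag(-1, -2) is negative-definite; [[-1, 2], [2, -1]] is not.
```

This test is exact up to rounding and costs $O(n^3)$, the same as one factorisation.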
Proof. Let $V(x, t) = |x|^2$. Then the operator $LV$ has the form

$$LV(x, y, t) = 2x^T[-Bx + Ag(y)] + \operatorname{trace}[\sigma^T(x, y, t)\sigma(x, y, t)].$$

Compute, by the hypotheses,

$$LV(x, y, t) \le -2x^T Bx + x^T Ag(y) + g^T(y)A^T x + x^T C_1 x + g^T(y)C_2 g(y) + y^T C_3 y$$
$$= x^T(-2B + C_1 + C_3 + \bar{D})x + x^T Ag(y) + g^T(y)A^T x + g^T(y)(-D + C_2)g(y)$$
$$\;- x^T(C_3 + \bar{D})x + y^T C_3 y + g^T(y)Dg(y)$$
$$= (x^T, g^T(y))\,H\binom{x}{g(y)} - x^T(C_3 + \bar{D})x + y^T C_3 y + g^T(y)Dg(y)$$
$$\le -\lambda(|x|^2 + |g(y)|^2) - x^T(C_3 + \bar{D})x + y^T C_3 y + g^T(y)Dg(y)$$
$$= -\sum_{i=1}^{n}(\lambda + \delta_i + d_i\beta_i^2)x_i^2 + \sum_{i=1}^{n}[\delta_i y_i^2 + (d_i - \lambda)g_i^2(y_i)].$$

It is easy to see from the construction of $H$ that $\lambda \le d_i$ for all $1 \le i \le n$. Using (1.2) one can then derive that

$$LV(x, y, t) \le -\sum_{i=1}^{n}(\lambda + \delta_i + d_i\beta_i^2)x_i^2 + \sum_{i=1}^{n}[\delta_i + (d_i - \lambda)\beta_i^2]y_i^2. \eqno(2.13)$$
In order to apply Theorem 2.1, define $f \in C(\mathbb{R}^n; \mathbb{R}_+)$ and $f_i \in C(\mathbb{R}; \mathbb{R}_+)$ by

$$f(x) = \frac{1}{\lambda_1}\sum_{i=1}^{n}(\lambda + \delta_i + d_i\beta_i^2)x_i^2 \quad \text{and} \quad f_i(y_i) = \frac{1}{\lambda_1}(\lambda + \delta_i + d_i\beta_i^2)y_i^2.$$

It is obvious that

$$f(x) \ge |x|^2 = V(x, t) \quad \text{and} \quad f(x) = \sum_{i=1}^{n} f_i(x_i).$$

Moreover,

$$LV(x, y, t) \le -\lambda_1 f(x) + \sum_{i=1}^{n}\frac{\delta_i + (d_i - \lambda)\beta_i^2}{\lambda + \delta_i + d_i\beta_i^2}\,\lambda_1 f_i(y_i) \le -\lambda_1 f(x) + \lambda_1\lambda_2\sum_{i=1}^{n} f_i(y_i).$$

By Theorem 2.1, for every $\xi \in C([-\bar{\tau}, 0]; \mathbb{R}^n)$, the solution of Eq. (1.6) has the property

$$\limsup_{t \to \infty}\frac{1}{t}\log(|x(t; \xi)|^2) \le -\gamma \quad \text{a.s.}$$

and the required assertion (2.10) follows. The proof is complete. □
Theorem 2.4. Let (1.2) hold. Assume that there exist nonnegative numbers $\mu_i$, $\theta_i$ and $\delta_i$ such that

$$\operatorname{trace}[\sigma^T(x, y, t)\sigma(x, y, t)] \le \sum_{i=1}^{n}[\mu_i x_i^2 + \theta_i g_i^2(y_i) + \delta_i y_i^2] \eqno(2.14)$$

for all $(x, y, t) \in \mathbb{R}^n \times \mathbb{R}^n \times \mathbb{R}_+$. Assume also that there exists a positive-definite diagonal matrix $D = \operatorname{diag}(d_1, \ldots, d_n)$ such that the symmetric matrix

$$\bar{H} = \begin{pmatrix} -2B + \bar{D} & A \\ A^T & -D \end{pmatrix}$$

is negative-definite, where $\bar{D}$ is the same as defined in Theorem 2.3, namely $\bar{D} = \operatorname{diag}(d_1\beta_1^2, \ldots, d_n\beta_n^2)$. Let $-\bar{\lambda} = \lambda_{\max}(\bar{H})$, so $\bar{\lambda} > 0$. If

$$(\mu_i + \delta_i) \vee \theta_i < \bar{\lambda}, \quad 1 \le i \le n, \eqno(2.15)$$

then the stochastic delay neural network (1.6) is almost surely exponentially stable. Moreover, the sample Lyapunov exponent (i.e. the left-hand side of (2.10)) can be estimated by (2.10) as long as the $\lambda$ in (2.12) is determined by

$$\lambda = \min_{1 \le i \le n}[\bar{\lambda} - (\mu_i + \delta_i) \vee \theta_i]. \eqno(2.16)$$
Proof. Set

$$C_1 = \operatorname{diag}(\mu_1, \ldots, \mu_n), \quad C_2 = \operatorname{diag}(\theta_1, \ldots, \theta_n), \quad C_3 = \operatorname{diag}(\delta_1, \ldots, \delta_n).$$

Then (2.14) can be written as

$$\operatorname{trace}[\sigma^T(x, y, t)\sigma(x, y, t)] \le x^T C_1 x + g^T(y)C_2 g(y) + y^T C_3 y.$$

In view of Theorem 2.3, it is sufficient to verify that the matrix $H$ defined there is negative-definite. To do so, for any $x, y \in \mathbb{R}^n$, compute

$$(x^T, y^T)H\binom{x}{y} = (x^T, y^T)\bar{H}\binom{x}{y} + (x^T, y^T)\begin{pmatrix} C_1 + C_3 & 0 \\ 0 & C_2 \end{pmatrix}\binom{x}{y}$$
$$\le -\bar{\lambda}(|x|^2 + |y|^2) + \sum_{i=1}^{n}[(\mu_i + \delta_i)x_i^2 + \theta_i y_i^2] \le -\lambda(|x|^2 + |y|^2),$$

where $\lambda$ is defined by (2.16) and is positive due to (2.15). The proof is therefore complete. □
3. Further results

In the previous section several general criteria were obtained for the almost sure exponential stability of the delay network. The use of these criteria depends very much on the construction of the Lyapunov function $V$ (Theorem 2.1) or the choice of the positive numbers $d_i$ (Theorems 2.3 and 2.4). However, it would be convenient to have criteria based only on the system parameters, e.g. $b_i$ and $\beta_i$. Moreover, relations (1.2) and (1.5) are both properties of the neural network, but the criteria established in the previous section have only used property (1.2). In this section we shall also make use of the nice property (1.5) in order to obtain further results. In particular, the new criteria here will be described only in terms of the system parameters $b_i$, $\beta_i$, etc. given by (1.2), (1.5) and (2.14); hence these criteria can be verified easily and should prove very useful in applications.
Corollary 3.1. Let (1.2), (1.5) and (2.14) hold. Assume

$$b_i > \beta_i^2\sum_{j=1}^{n}|a_{ji}| \quad \text{for all } 1 \le i \le n. \eqno(3.1)$$

Let

$$\bar{\lambda} = \min_{1 \le i \le n}\frac{b_i - \beta_i^2\sum_{j=1}^{n}|a_{ji}|}{1 + \beta_i^2}. \eqno(3.2)$$

If

$$(\mu_i + \delta_i) \vee \theta_i < \bar{\lambda}, \quad 1 \le i \le n, \eqno(3.3)$$

then the stochastic delay neural network (1.6) is almost surely exponentially stable.
Proof. Choose

$$d_i = \frac{b_i + \sum_{j=1}^{n}|a_{ji}|}{1 + \beta_i^2} \quad \text{for } 1 \le i \le n$$

and then define the symmetric matrix $\bar{H}$ the same as in Theorem 2.4. For any $x, y \in \mathbb{R}^n$, compute

$$(x^T, y^T)\bar{H}\binom{x}{y} = \sum_{i=1}^{n}(-2b_i + d_i\beta_i^2)x_i^2 + 2\sum_{i,j=1}^{n}a_{ij}x_i y_j - \sum_{i=1}^{n}d_i y_i^2$$
$$\le \sum_{i=1}^{n}(-2b_i + d_i\beta_i^2)x_i^2 + \sum_{i,j=1}^{n}|a_{ij}|(x_i^2 + y_j^2) - \sum_{i=1}^{n}d_i y_i^2$$
$$= -\sum_{i=1}^{n}(b_i - d_i\beta_i^2)x_i^2 - \sum_{i=1}^{n}\left(d_i - \sum_{j=1}^{n}|a_{ji}|\right)y_i^2,$$

where condition (1.5) has been used. But

$$b_i - d_i\beta_i^2 = d_i - \sum_{j=1}^{n}|a_{ji}| = \frac{b_i - \beta_i^2\sum_{j=1}^{n}|a_{ji}|}{1 + \beta_i^2} \ge \bar{\lambda}.$$

So

$$(x^T, y^T)\bar{H}\binom{x}{y} \le -\bar{\lambda}(|x|^2 + |y|^2),$$

which implies $\lambda_{\max}(\bar{H}) \le -\bar{\lambda}$. Now the conclusion of this corollary follows from Theorem 2.4. The proof is complete. □
In practice, networks are often symmetric in the sense $|a_{ij}| = |a_{ji}|$. For such symmetric networks the following result is particularly useful.

Corollary 3.2. Let (1.2), (1.5) and (2.14) hold. Assume the network is symmetric in the sense

$$|a_{ij}| = |a_{ji}| \quad \text{for all } 1 \le i, j \le n. \eqno(3.4)$$

Assume

$$\beta_i < 1 \quad \text{for all } 1 \le i \le n. \eqno(3.5)$$

Let

$$\bar{\lambda} = \min_{1 \le i \le n}\frac{b_i(1 - \beta_i^2)}{1 + \beta_i^2}. \eqno(3.6)$$

If (3.3) holds, then the stochastic delay network (1.6) is almost surely exponentially stable.
Proof. By (1.5), (3.4) and (3.5),

$$\beta_i^2\sum_{j=1}^{n}|a_{ji}| < \sum_{j=1}^{n}|a_{ij}| = b_i$$

for all $1 \le i \le n$. Also

$$\min_{1 \le i \le n}\frac{b_i - \beta_i^2\sum_{j=1}^{n}|a_{ji}|}{1 + \beta_i^2} = \min_{1 \le i \le n}\frac{b_i(1 - \beta_i^2)}{1 + \beta_i^2}.$$

Hence the conclusion follows from Corollary 3.1 directly. The proof is complete. □
In the sequel we shall denote by $||A||$ the operator norm of a matrix $A$, i.e. $||A|| = \sup\{|Ax| : x \in \mathbb{R}^n, |x| = 1\}$.
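For a real matrix, this operator norm coincides with the largest singular value, i.e. $||A|| = \sqrt{\lambda_{\max}(A^T A)}$, which is how it is computed in practice. A small pure-Python sketch via power iteration on $A^T A$ (a library routine would normally be used instead; the function name is illustrative):

```python
import math

def operator_norm(A, iters=500):
    """Spectral norm ||A|| = sup{|Ax| : |x| = 1} of a nonzero real matrix,
    i.e. its largest singular value, computed by power iteration on the
    symmetric matrix A^T A, whose largest eigenvalue is ||A||^2."""
    n = len(A[0])
    v = [1.0] * n
    for _ in range(iters):
        # w = A^T (A v)
        Av = [sum(row[j] * v[j] for j in range(n)) for row in A]
        w = [sum(A[i][k] * Av[i] for i in range(len(A))) for k in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    Av = [sum(row[j] * v[j] for j in range(n)) for row in A]
    return math.sqrt(sum(x * x for x in Av))

# Sanity check: for diag(3, -5) the operator norm is 5.
```

Note that for a non-symmetric $A$ this norm can strictly exceed the largest absolute eigenvalue of $A$, which is why the singular value, not the eigenvalue, is the right quantity in Corollary 3.3.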
Corollary 3.3. Let (1.2) and (2.14) hold. Assume

$$2b_i > ||A||(1 + \beta_i^2) \quad \text{for all } 1 \le i \le n. \eqno(3.7)$$

Let

$$\bar{\lambda} = \min_{1 \le i \le n}\left(\frac{2b_i}{1 + \beta_i^2} - ||A||\right). \eqno(3.8)$$

If (3.3) holds, then the stochastic delay network (1.6) is almost surely exponentially stable.
Proof. Choose

$$d_i = \frac{2b_i}{1 + \beta_i^2} \quad \text{for } 1 \le i \le n$$

and then define the symmetric matrix $\bar{H}$ the same as in Theorem 2.4. For any $x, y \in \mathbb{R}^n$, compute

$$(x^T, y^T)\bar{H}\binom{x}{y} = \sum_{i=1}^{n}(-2b_i + d_i\beta_i^2)x_i^2 + 2x^T Ay - \sum_{i=1}^{n}d_i y_i^2$$
$$\le \sum_{i=1}^{n}(-2b_i + d_i\beta_i^2)x_i^2 + ||A||(|x|^2 + |y|^2) - \sum_{i=1}^{n}d_i y_i^2$$
$$= -\sum_{i=1}^{n}(2b_i - d_i\beta_i^2 - ||A||)x_i^2 - \sum_{i=1}^{n}(d_i - ||A||)y_i^2.$$

Note that

$$2b_i - d_i\beta_i^2 - ||A|| = d_i - ||A|| = \frac{2b_i}{1 + \beta_i^2} - ||A|| \ge \bar{\lambda}.$$

So

$$(x^T, y^T)\bar{H}\binom{x}{y} \le -\bar{\lambda}(|x|^2 + |y|^2),$$

which yields $\lambda_{\max}(\bar{H}) \le -\bar{\lambda}$. Now the conclusion of this corollary follows from Theorem 2.4. The proof is complete. □
4. Examples

This section provides examples illustrating the use of the theorems. The networks described here are somewhat artificial, but they clearly illustrate how the theory of this paper can be applied.

The use of the criteria established in Section 2 depends very much on the construction of the Lyapunov function $V$ (Theorem 2.1) or the choice of the positive numbers $d_i$ (Theorems 2.3 and 2.4). The following example illustrates how to choose the positive numbers $d_i$ so that Theorem 2.3 can be applied. It also illustrates how to estimate the sample Lyapunov exponent.
Example 4.1. Consider the stochastic delay network

$$d\binom{x_1(t)}{x_2(t)} = \begin{pmatrix} -4 & 0 \\ 0 & -2 \end{pmatrix}\binom{x_1(t)}{x_2(t)}dt + \begin{pmatrix} -2 & 2 \\ 1 & 1 \end{pmatrix}\binom{g_1(x_1(t - \tau_1))}{g_2(x_2(t - \tau_2))}dt + \binom{0.2\,x_2(t - \tau_2)}{0.5\,x_1(t - \tau_1)}dw(t), \eqno(4.1)$$

where $w(t)$ is a real-valued Brownian motion, $\tau_1$ and $\tau_2$ are both positive numbers, while

$$g_i(u_i) = \frac{1 - e^{-u_i}}{1 + e^{-u_i}}.$$

It is easily shown that (1.2) is satisfied with $\beta_1 = \beta_2 = 1$. To apply Theorem 2.3, note in this example that

$$\sigma(x, y, t) = (0.2y_2, 0.5y_1)^T, \quad \text{whence} \quad \sigma^T(x, y, t)\sigma(x, y, t) = 0.25y_1^2 + 0.04y_2^2.$$

Hence condition (2.9) is satisfied with $C_1 = C_2 = 0$ and $C_3 = \operatorname{diag}(0.25, 0.04)$. Now choose $D = \operatorname{diag}(d_1, d_2)$ such that $-2B + C_3 + \bar{D} = -D$, namely

$$\begin{pmatrix} -8 & 0 \\ 0 & -4 \end{pmatrix} + \begin{pmatrix} 0.25 & 0 \\ 0 & 0.04 \end{pmatrix} = \begin{pmatrix} -2d_1 & 0 \\ 0 & -2d_2 \end{pmatrix},$$

which gives $d_1 = 3.875$ and $d_2 = 1.98$. So the matrix $H$ defined in Theorem 2.3 becomes

$$H = \begin{pmatrix} -3.875 & 0 & -2 & 2 \\ 0 & -1.98 & 1 & 1 \\ -2 & 1 & -3.875 & 0 \\ 2 & 1 & 0 & -1.98 \end{pmatrix}.$$

Compute $\lambda_{\max}(H) = -0.19578$, which means that $H$ is negative-definite. By Theorem 2.3, the delay network (4.1) is almost surely exponentially stable. To estimate the sample Lyapunov exponent, compute, by (2.12), $\lambda_1 = 2.21578$ and $\lambda_2 = 0.9094$, so (2.11) becomes

$$2.21578 = \gamma + 2.015 e^{\gamma\bar{\tau}}. \eqno(4.2)$$

If both $\tau_1$ and $\tau_2$ are known, e.g. $\tau_1 = \tau_2 = 0.1$, then $\bar{\tau} = 0.1$ and (4.2) reduces to

$$2.21578 = \gamma + 2.015 e^{0.1\gamma},$$

which has a unique root $\gamma = 0.1668$. Therefore, Theorem 2.3 shows that the sample Lyapunov exponent of the solution of Eq. (4.1) should not be greater than $-0.0834$.
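The numbers quoted in this example can be checked numerically. The plain-Python sketch below recomputes $\lambda_{\max}(H)$ by shifted power iteration and then solves (2.11) by bisection; note that the sign pattern of $A$, and hence of the off-diagonal blocks of $H$, is a reconstruction here (taking $a_{11} = -2$ and the other entries positive reproduces the value $\lambda_{\max}(H) = -0.19578$):

```python
import math

def lambda_max_symmetric(mat, iters=3000):
    """Largest eigenvalue of a symmetric matrix via shifted power iteration.

    Plain power iteration finds the eigenvalue of largest magnitude, m;
    iterating on mat + (m + 1) I, whose eigenvalues are all positive,
    then isolates the largest eigenvalue of mat.  A pure-Python sketch for
    small symmetric matrices; a library routine would normally be used."""
    n = len(mat)

    def dominant(m):
        v = [1.0 / math.sqrt(n) + 0.01 * i for i in range(n)]
        lam = 0.0
        for _ in range(iters):
            w = [sum(m[i][j] * v[j] for j in range(n)) for i in range(n)]
            norm = math.sqrt(sum(x * x for x in w))
            v = [x / norm for x in w]
            lam = sum(v[i] * sum(m[i][j] * v[j] for j in range(n)) for i in range(n))
        return lam

    shift = abs(dominant(mat)) + 1.0
    shifted = [[mat[i][j] + (shift if i == j else 0.0) for j in range(n)] for i in range(n)]
    return dominant(shifted) - shift

# H from Example 4.1 (sign pattern of A reconstructed as ((-2, 2), (1, 1)))
H = [[-3.875, 0.0, -2.0, 2.0],
     [0.0, -1.98, 1.0, 1.0],
     [-2.0, 1.0, -3.875, 0.0],
     [2.0, 1.0, 0.0, -1.98]]
lam = -lambda_max_symmetric(H)  # about 0.19578, so H is negative-definite

# lambda_1, lambda_2 from (2.12), with beta_1 = beta_2 = 1
d = [3.875, 1.98]
delta = [0.25, 0.04]
lam1 = min(lam + delta[i] + d[i] for i in range(2))
lam2 = max((delta[i] + d[i] - lam) / (lam + delta[i] + d[i]) for i in range(2))

# Solve (2.11): lam1 = g + lam1*lam2*exp(0.1*g), by bisection on [0, 1]
lo, hi = 0.0, 1.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mid + lam1 * lam2 * math.exp(0.1 * mid) < lam1:
        lo = mid
    else:
        hi = mid
gamma = 0.5 * (lo + hi)  # about 0.1668
```

Any sign pattern of $A$ obtained from this one by flipping whole rows or columns gives a matrix $H$ with the same spectrum, so the stability conclusion is unaffected by that ambiguity.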
By making use of the special construction of the network, a number of very useful results were obtained in Section 3. Since these results are described only in terms of given system parameters, they are very convenient to use in applications. The following example illustrates how, when the neural network is symmetric, we can make use of this nice symmetry property to establish stability.
Example 4.2. Consider a three-dimensional stochastic delay neural network

$$dx(t) = [-Bx(t) + Ag(x_\tau(t))]\,dt + B_1 x(t)\,dw_1(t) + (\theta_1\sin(x_1(t - \tau_1)), \theta_2\sin(x_2(t - \tau_2)), \theta_3\sin(x_3(t - \tau_3)))^T\,dw_2(t). \eqno(4.3)$$

Here $(w_1(t), w_2(t))$ is a two-dimensional Brownian motion, $B_1$ is a $3 \times 3$ constant matrix, the $\theta_i$'s are real numbers and the $\tau_i$'s positive constants, while

$$B = \operatorname{diag}(2, 3, 4), \quad A = \begin{pmatrix} 0 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 2 \end{pmatrix},$$

$$g_i(y_i) = (\beta_i y_i \wedge 1) \vee (-1) \quad \text{with } \beta_1 = 0.4,\ \beta_2 = 0.5,\ \beta_3 = 0.6,$$

$$g(y) = (g_1(y_1), g_2(y_2), g_3(y_3))^T.$$

Clearly (1.2) and (1.5) hold, and the network is symmetric, so (3.4) is satisfied as well. By (3.6), compute

$$\bar{\lambda} = \min\left\{\frac{2(1 - 0.4^2)}{1 + 0.4^2}, \frac{3(1 - 0.5^2)}{1 + 0.5^2}, \frac{4(1 - 0.6^2)}{1 + 0.6^2}\right\} = 1.448.$$
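As a quick numerical check, (3.6) and the corresponding bounds $\theta_i^2 < \bar{\lambda}\beta_i^2$ from condition (3.3) can be evaluated directly (plain Python, parameter values from this example):

```python
# Parameters of Example 4.2
beta = [0.4, 0.5, 0.6]
b = [2.0, 3.0, 4.0]

# (3.6): lambda-bar = min_i b_i (1 - beta_i^2) / (1 + beta_i^2)
lam_bar = min(b[i] * (1.0 - beta[i] ** 2) / (1.0 + beta[i] ** 2) for i in range(3))

# Since (2.14) holds with theta-coefficients theta_i^2 / beta_i^2, condition (3.3)
# bounds each theta_i^2 by lam_bar * beta_i^2
thresholds = [lam_bar * bi ** 2 for bi in beta]  # about (0.2317, 0.3621, 0.5214)
```

These match the thresholds quoted in the example up to the rounding of $\bar{\lambda}$ to 1.448.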
On the other hand, note in this example that

$$\sigma(x, y, t) = (B_1 x, (\theta_1\sin y_1, \theta_2\sin y_2, \theta_3\sin y_3)^T).$$

Note also

$$\sin^2 y_i \le ((y_i \wedge 1) \vee (-1))^2 \le \frac{1}{\beta_i^2}((\beta_i y_i \wedge 1) \vee (-1))^2 = \frac{g_i^2(y_i)}{\beta_i^2}.$$

Consequently,

$$\operatorname{trace}[\sigma^T(x, y, t)\sigma(x, y, t)] \le ||B_1||^2|x|^2 + \frac{\theta_1^2}{0.16}g_1^2(y_1) + \frac{\theta_2^2}{0.25}g_2^2(y_2) + \frac{\theta_3^2}{0.36}g_3^2(y_3). \eqno(4.4)$$
By Corollary 3.2, if

$$||B_1||^2 < 1.448, \quad \theta_1^2 < 0.23168, \quad \theta_2^2 < 0.362, \quad \theta_3^2 < 0.52128, \eqno(4.5)$$

then the stochastic delay network (4.3) is almost surely exponentially stable. Of course, one may estimate $\operatorname{trace}[\sigma^T(x, y, t)\sigma(x, y, t)]$ in a different way in order to obtain an alternative result. For instance, given

$$||B_1||^2 = 0.5, \quad \theta_1 = \theta_2 = \theta_3 = 1, \eqno(4.6)$$

(4.5) is not satisfied. However, one can estimate

$$\operatorname{trace}[\sigma^T(x, y, t)\sigma(x, y, t)] \le 0.5|x|^2 + \sin^2(y_1) + \sin^2(y_2) + \sin^2(y_3)$$
$$\le 0.5|x|^2 + 0.8|y|^2 + 0.2[\sin^2(y_1) + \sin^2(y_2) + \sin^2(y_3)]$$
$$\le 0.5|x|^2 + 0.8|y|^2 + 1.25g_1^2(y_1) + 0.8g_2^2(y_2) + 0.56g_3^2(y_3).$$

So (3.3) is satisfied, and Corollary 3.2 shows that the stochastic delay network (4.3) is still almost surely exponentially stable under condition (4.6).
The following example illustrates how, for a given nonsymmetric network, we can obtain a robust bound on the intensity of the noise once the system parameters are specified.
Example 4.3. Consider another three-dimensional stochastic delay neural network

$$dx(t) = [-Bx(t) + Ag(x_\tau(t))]\,dt + B_1 g(x_\tau(t))\,dw(t), \eqno(4.7)$$

where $w(t)$ is a scalar Brownian motion, $B_1$ is a $3 \times 3$ constant matrix and

$$B = \operatorname{diag}(3, 4, 3), \quad A = \begin{pmatrix} 1 & 0 & 2 \\ 1 & 2 & 1 \\ 1 & 1 & 1 \end{pmatrix},$$

$$g(y) = (g_1(y_1), g_2(y_2), g_3(y_3))^T, \quad g_i(y_i) = \frac{e^{\beta_i y_i} - e^{-\beta_i y_i}}{e^{\beta_i y_i} + e^{-\beta_i y_i}}$$

with $\beta_1 = 0.4$, $\beta_2 = 0.5$, $\beta_3 = 0.4$. Clearly (1.2) and (1.5) are satisfied. Note that $\sigma(x, y, t) = B_1 g(y)$ and, for any $\varepsilon > 0$,

$$\sigma^T(x, y, t)\sigma(x, y, t) \le ||B_1||^2|g(y)|^2 \le ||B_1||^2[\varepsilon\beta_1^2 y_1^2 + \varepsilon\beta_2^2 y_2^2 + \varepsilon\beta_3^2 y_3^2 + (1 - \varepsilon)|g(y)|^2]$$
$$\le ||B_1||^2[0.25\varepsilon|y|^2 + (1 - \varepsilon)|g(y)|^2].$$

Letting $\varepsilon = 0.8$ gives

$$\sigma^T(x, y, t)\sigma(x, y, t) \le 0.2||B_1||^2(|y|^2 + |g(y)|^2).$$

That is, (2.14) holds with $\mu_i = 0$ and $\theta_i = \delta_i = 0.2||B_1||^2$. To apply Corollary 3.1, compute by (3.2) that

$$\bar{\lambda} = \min\left\{\frac{3 - 0.16 \times 3}{1 + 0.16}, \frac{4 - 0.25 \times 3}{1 + 0.25}, \frac{3 - 0.16 \times 4}{1 + 0.16}\right\} = 2.034.$$

Hence, (3.3) becomes

$$0.2||B_1||^2 < 2.034, \quad \text{namely} \quad ||B_1|| < 3.189. \eqno(4.8)$$

Corollary 3.1 therefore concludes that the delay network (4.7) is almost surely exponentially stable as long as $||B_1|| < 3.189$.
Acknowledgements

The authors would like to thank the referees for their very helpful suggestions. They would also like to thank the Royal Society (UK) and the EPSRC/BBSRC for their financial support.
References

[1] M.A. Cohen, S. Grossberg, Absolute stability of global pattern formation and parallel memory storage by competitive neural networks, IEEE Trans. Systems Man Cybernet. 13 (1983) 815–826.
[2] J.J. Hopfield, Neural networks and physical systems with emergent collective computational abilities, Proc. Natl. Acad. Sci. USA 79 (1982) 2554–2558.
[3] J.J. Hopfield, Neurons with graded response have collective computational properties like those of two-state neurons, Proc. Natl. Acad. Sci. USA 81 (1984) 3088–3092.
[4] J.J. Hopfield, D.W. Tank, Computing with neural circuits: a model, Science 233 (1986) 625–633.
[5] X.X. Liao, Absolute Stability of Nonlinear Control Systems, Kluwer Academic Publishers, Dordrecht, 1993.
[6] A. Guez, V. Protopopescu, J. Barhen, On the stability, storage capacity and design of nonlinear continuous neural networks, IEEE Trans. Systems Man Cybernet. 18 (1988) 80–87.
[7] C.M. Marcus, R.M. Westervelt, Stability of analog neural networks with delay, Physical Review A 39 (1) (1989) 347–359.
[8] S. Haykin, Neural Networks, Prentice-Hall, NJ, 1994.
[9] Y. Le Cun, C.C. Galland, G.E. Hinton, GEMINI: gradient estimation through matrix inversion after noise injection, in: D.S. Touretzky (Ed.), Advances in Neural Information Processing Systems, Vol. I, Morgan Kaufmann, San Mateo, 1989, pp. 141–148.
[10] X. Mao, Exponential Stability of Stochastic Differential Equations, Marcel Dekker, New York, 1994.
[11] X. Mao, Stochastic Differential Equations and Applications, Horwood Publishing, 1997.
[12] S.E.A. Mohammed, Stochastic Functional Differential Equations, Longman, New York, 1986.
[13] L. Arnold, Stochastic Differential Equations: Theory and Applications, Wiley, New York, 1972.
[14] A. Friedman, Stochastic Differential Equations and Applications, Academic Press, New York, 1976.
[15] R.Z. Has'minskii, Stochastic Stability of Differential Equations, Sijthoff and Noordhoff, Alphen, 1981.
[16] V.B. Kolmanovskii, A. Myshkis, Applied Theory of Functional Differential Equations, Kluwer Academic Publishers, Dordrecht, 1992.
[17] X.X. Liao, X. Mao, Exponential stability and instability of stochastic neural networks, Stochast. Anal. Appl. 14 (2) (1996a) 165–185.
[18] X.X. Liao, X. Mao, Stability of stochastic neural networks, Neural, Parallel Sci. Comput. 4 (2) (1996b) 205–224.
[19] R.Sh. Liptser, A.N. Shiryayev, Theory of Martingales, Kluwer Academic Publishers, Dordrecht, 1986.