
Almost sure asymptotic stability and almost sure exponential stability of split-step theta methods to delayed stochastic Hopfield neural networks

Priya Nair ∗

Department of Mathematics, Anna University, Chennai - 600025, India.

Abstract

Numerical studies of delayed stochastic Hopfield neural networks are important due to the lack of analytical methods for solving these equations. Based on a literature survey, numerical studies of the almost sure exponential stability of delayed stochastic Hopfield neural networks are limited. Therefore, the almost sure asymptotic stability and almost sure exponential stability of delayed stochastic Hopfield neural networks are considered in this paper. The main aim of this paper is to prove the almost sure asymptotic stability and the almost sure exponential stability of split-step theta methods and stochastic linear theta methods using the semi-martingale convergence technique. It is proved that, under some reasonable conditions, almost sure asymptotic stability and almost sure exponential stability are guaranteed. Numerical examples are given to verify the theoretical results, and numerical simulations illustrate the stability for different values of θ.

Keywords: Stochastic differential equations; Delayed stochastic Hopfield neural networks; Split-step theta methods; Almost sure exponential stability; Almost sure asymptotic stability.
MSC: 60H10; 60H35; 65L20; 60G48; 60G42.

I. INTRODUCTION

Due to the variety of applications, we are interested in the study of delayed stochastic Hopfield neural networks, a subclass of delayed stochastic differential equations. The delayed stochastic Hopfield neural network has wide applications in the study of signal transmission, where it serves as a natural model because of the inherent randomness. Models related to nervous systems have also been studied and developed with the idea of neural networks. There are two types of Hopfield neural networks. One is the discrete Hopfield neural network, which has wide applications in pattern recognition, addressable memory and optimization problems. The other is the continuous Hopfield neural network. The mathematical analysis and practical applications of Hopfield neural networks have gained considerable research attention. Stability properties of delayed Hopfield neural networks such as exponential stability, asymptotic stability, trajectory stability and mean-square stability have been studied. It can be noted that there has been very little research on the almost sure asymptotic stability of delayed stochastic Hopfield neural networks. Early studies of the stability of neural networks dealt with models in which updating and propagation are assumed to occur instantaneously, i.e., no time delay is considered. In many applications of neural networks, it is essential to require the designed neural network to possess a certain type of stability property, and interest in analyzing stability properties has increased considerably in recent years.

The study of the stability of numerical solutions to the stochastic delayed Hopfield neural network is one of the important problems in this research field. Many authors have studied the stability of stochastic differential

∗Corresponding author: [email protected] (Priya Nair)

JASC: Journal of Applied Science and Computations, Volume VI, Issue I, January/2019, ISSN NO: 1076-5131.


equations in [12] and [18]. Higham et al. [3] introduced the split-step backward Euler method for solving nonlinear autonomous stochastic differential equations. Ding et al. [2] analyzed the split-step θ methods for stochastic differential equations. In [6] and [14] the authors studied the split-step theta method for stochastic delayed Hopfield neural networks. Later, Zhong [23] proved that the stochastic linear theta method can inherit the exponential mean-square stability of stochastic delay differential equations. Chen [1] investigated the global exponential stability of delayed Hopfield neural networks. The mean-square stability of numerical solutions to stochastic delay differential equations was studied in [5], and the mean-square exponential stability of stochastic delay Hopfield neural networks was discussed in [19]. The exponential stability of recurrent neural networks was studied in [7], and the exponential stability of stochastic delayed Hopfield neural networks was studied in [16] and [22]. Convergence of the discrete delayed Hopfield neural network was established in [11]. Wu et al. [20] studied almost sure exponential stability for stochastic differential equations. Huang et al. [4] and Liu and Zhu [9] investigated the almost sure exponential stability of stochastic delay Hopfield neural networks with the help of the Lyapunov function and the semi-martingale convergence theorem. Rodkina and Schurz [15] studied the almost sure asymptotic stability of drift-implicit theta methods. The almost sure asymptotic stability of neutral stochastic delay differential equations with Markovian switching was studied in [8], [13] and [21]. Schurz [17] proved the almost sure asymptotic stability of the trivial solution and the almost sure convergence of stochastic theta methods applied to systems of linear stochastic differential equations in R^d. Recently, Liu et al. [10] investigated the stability of two classes of theta methods for numerical solutions to delayed stochastic Hopfield neural networks. This motivated us to establish the almost sure asymptotic stability and almost sure exponential stability of the delayed Hopfield neural network. To the best of the author's knowledge, there has been no work on the almost sure asymptotic stability and almost sure exponential stability of a class of split-step theta methods for delayed stochastic Hopfield neural networks. The aim of this paper is to prove the almost sure asymptotic stability and almost sure exponential stability for delayed stochastic Hopfield neural networks. The split-step theta method reduces to the Euler method for θ = 0, and the stochastic linear theta method reduces to the backward Euler method for θ = 1.

The rest of the paper is organized as follows. Section 2 discusses the basic notation and lemmas which will be useful in establishing the stability of the systems. The stability of the split-step theta methods is established in Section 3, and the stability of the stochastic linear theta methods is investigated in Section 4. In Section 5, a numerical example is provided to illustrate the efficiency of the theoretical results proved in Sections 3 and 4. The paper concludes in Section 6.

II. MODEL DESCRIPTION AND LEMMAS

Consider the delayed stochastic Hopfield neural network:
\[
\begin{aligned}
dy_i(t) &= \Big[-c_i y_i(t) + \sum_{j=1}^{n} a_{ij} f_j(y_j(t)) + \sum_{j=1}^{n} b_{ij} g_j(y_j(t-\tau_j))\Big]dt + \sum_{j=1}^{n}\sigma_{ij}\big(y_j(t), y_j(t-\tau_j)\big)\,dW_j(t), \quad t > 0,\\
y_i(t) &= \xi_i(t), \quad t \in [-\tau_i, 0], \quad (1)
\end{aligned}
\]

for \(i = 1, 2, \ldots, n\), and \(\tau = \max_{1\le i\le n}\tau_i\). Here \(n\) is the number of neurons in the network, \(y_i\) denotes the state variable of the \(i\)th neuron at time \(t\), and \(f_j\) and \(g_j\) are the yields obtained when the \(j\)th neuron reacts at time \(t\) and \(t-\tau_j\), respectively. \(\sigma_{ij}\) is a continuous function; \(c_i\) represents the rate with which the \(i\)th unit resets its potential to the resting state when isolated from the other networks, and is a positive constant; \(\tau_j\) is the transmission delay and is a non-negative constant; \(a_{ij}\) and \(b_{ij}\) weight the strength of the \(j\)th unit on the \(i\)th unit at time \(t\) and \(t-\tau_j\), respectively. \(W(t) = (W_1(t), \ldots, W_n(t))^T\) is an \(n\)-dimensional Brownian motion defined on a complete probability space \((\Omega, \mathcal F, \mathbb P)\) with a natural filtration \(\{\mathcal F_t\}_{t\ge 0}\) generated by \(W(t)\), where we associate \(\Omega\) with the canonical space generated


by \(W(t)\), and denote by \(\mathcal F\) the associated \(\sigma\)-algebra generated by \(\{W(s) : 0 \le s \le t\}\) with the probability measure \(\mathbb P\). Moreover, \(y_i(t, \xi_i)\) represents the exact solution of (1) for the initial data \(\xi_i\).

For the delayed stochastic Hopfield neural network (1), we consider the following numerical approximation:

\[
\begin{aligned}
y_i^k &= x_i^k + \theta\Delta\Big[-c_i y_i^k + \sum_{j=1}^{n} a_{ij} f_j(y_j^k) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{k-m_j}\big)\Big],\\
x_i^{k+1} &= x_i^k + \Delta\Big[-c_i y_i^k + \sum_{j=1}^{n} a_{ij} f_j(y_j^k) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{k-m_j}\big)\Big] + \sum_{j=1}^{n}\sigma_{ij}\big(y_j^k, y_j^{k-m_j}\big)\Delta W_j^k,\\
y_i^{-l} &= \xi_i(-l\Delta), \quad x_i^0 = \xi_i(0), \quad l = 0, 1, \cdots, m_i. \quad (2)
\end{aligned}
\]

Equation (2) can be rewritten as

\[
\begin{aligned}
y_i^{k+1} &= y_i^k + (1-\theta)\Delta\Big[-c_i y_i^k + \sum_{j=1}^{n} a_{ij} f_j(y_j^k) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{k-m_j}\big)\Big] + \sum_{j=1}^{n}\sigma_{ij}\big(y_j^k, y_j^{k-m_j}\big)\Delta W_j^k,\\
y_i^{-l} &= \xi_i(-l\Delta), \quad x_i^0 = \xi_i(0), \quad l = 0, 1, \cdots, m_i,
\end{aligned}
\]

where \(i = 1, 2, \cdots, n\). Here \(\Delta > 0\) is a step size with \(\tau_j = m_j\Delta\) for some positive integers \(m_j\), \(t_k = k\Delta\), and \(y_i^k\) is an approximation to \(y_i(t_k)\). We refer to the numerical approximation \(x_i^k\) as the split-step theta approximation. If we denote \(m = \max_{1\le j\le n} m_j\), then \(\tau = m\Delta\).
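As an illustration of how scheme (2) can be simulated, the following Python sketch advances one sample path, solving the implicit stage by fixed-point iteration. The parameter layout, the helper names (`split_step_theta`, `history`, `m_steps`) and the vectorized form of `sigma` are our own illustrative assumptions, not prescribed by the paper.

```python
import numpy as np

def split_step_theta(c, a, b, f, g, sigma, history, m_steps, theta, dt, N, rng):
    """Simulate one path of the split-step theta scheme (2).

    history : array of shape (m+1, n) holding xi(-m*dt), ..., xi(0)
    m_steps : integer delays m_j, with tau_j = m_j * dt
    sigma   : sigma(y, yd) -> n-by-n matrix with entries sigma_ij(y_j, yd_j)
    """
    n = len(c)
    m = max(m_steps)
    y = np.zeros((m + N + 1, n))
    y[: m + 1] = history                 # y^{-m}, ..., y^0
    x = history[-1].copy()               # x^0 = xi(0)
    for k in range(N):
        # delayed states y_j^{k - m_j}
        yd = np.array([y[m + k - m_steps[j], j] for j in range(n)])
        # implicit stage y^k = x^k + theta*dt*F(y^k, yd), by fixed-point iteration
        yk = x.copy()
        for _ in range(50):
            F = -c * yk + a @ f(yk) + b @ g(yd)
            new = x + theta * dt * F
            if np.max(np.abs(new - yk)) < 1e-12:
                yk = new
                break
            yk = new
        y[m + k] = yk
        # explicit stage x^{k+1} = x^k + dt*F + sum_j sigma_ij(y_j, yd_j) dW_j
        F = -c * yk + a @ f(yk) + b @ g(yd)
        dW = rng.normal(0.0, np.sqrt(dt), n)
        x = x + dt * F + sigma(yk, yd) @ dW
    return y[m : m + N]
```

For θ = 0 the implicit stage is trivial and the scheme reduces to the Euler method; the fixed-point iteration converges for small θ∆ because the map is a contraction under the Lipschitz assumption A1.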

The classical theta approximation is

\[
\begin{aligned}
y_i^{k+1} &= y_i^k + \theta\Delta\Big[-c_i y_i^{k+1} + \sum_{j=1}^{n} a_{ij} f_j\big(y_j^{k+1}\big) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{k-m_j+1}\big)\Big]\\
&\quad + (1-\theta)\Delta\Big[-c_i y_i^k + \sum_{j=1}^{n} a_{ij} f_j(y_j^k) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{k-m_j}\big)\Big] + \sum_{j=1}^{n}\sigma_{ij}\big(y_j^k, y_j^{k-m_j}\big)\Delta W_j^k, \quad (3)
\end{aligned}
\]

which we refer to as the stochastic linear theta method. We now give the following definitions of almost sure asymptotic stability and almost sure exponential stability for delayed stochastic Hopfield neural networks.

Definition 1: [18] The solution \(y_i(t, \xi_i)\) of equation (1) is said to be almost surely asymptotically stable if
\[
\lim_{t\to\infty}|y_i(t, \xi_i)| = 0 \quad a.s. \quad (4)
\]
for any initial data \(\xi_i \in C_{\mathcal F_0}^{b}([-\tau_i, 0]; \mathbb R^n)\).

Definition 2: [12] The trivial solution of equation (1) is said to be almost surely exponentially stable if, for almost all sample paths of the solution \(y_i(t, \xi_i)\), we have
\[
\limsup_{t\to\infty}\frac{1}{t}\log|y_i(t, \xi_i)| \le 0, \quad (5)
\]
for any initial data \(\xi_i \in C_{\mathcal F_0}^{b}([-\tau_i, 0]; \mathbb R^n)\).

Now let us impose the following assumptions on the model (1):

A1: \(f_j(0) = g_j(0) = \sigma_{ij}(0, 0) = 0\), and \(f_j\), \(g_j\) and \(\sigma_{ij}\) satisfy the global Lipschitz condition with Lipschitz constants \(\alpha_j > 0\), \(\beta_j > 0\), \(l_j \ge 0\) and \(p_j \ge 0\), i.e.,
\[
|f_j(x) - f_j(y)| \le \alpha_j|x - y|, \quad (6)
\]
\[
|g_j(x) - g_j(y)| \le \beta_j|x - y|, \quad (7)
\]
\[
|\sigma_{ij}(x_1, y_1) - \sigma_{ij}(x_2, y_2)| \le l_j|x_1 - x_2| + p_j|y_1 - y_2|, \quad j = 1, \cdots, n. \quad (8)
\]


A2: For \(i = 1, 2, \cdots, n\),
\[
c_i - \sum_{j=1}^{n}|a_{ij}|\alpha_j - \sum_{j=1}^{n}|b_{ij}|\beta_j - \sum_{j=1}^{n}\big(p_j^2 + l_j^2\big) > 0. \quad (9)
\]
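Condition (9) is easy to check numerically for a given network. The helper below is our own illustrative code, not from the paper; it returns the left-hand side of (9) for each neuron, and A2 holds when every entry is positive.

```python
import numpy as np

def check_A2(c, a, b, alpha, beta, l, p):
    """Left-hand side of condition (9), per neuron i:
    c_i - sum_j |a_ij|*alpha_j - sum_j |b_ij|*beta_j - sum_j (p_j^2 + l_j^2).
    A2 holds when every entry of the returned vector is positive."""
    return c - np.abs(a) @ alpha - np.abs(b) @ beta - np.sum(p**2 + l**2)
```

For example, with c_i = 2, |a_ij| = |b_ij| = 0.1, α_j = β_j = 1 and l_j = p_j = 0.1 for a two-neuron network, each entry equals 2 − 0.2 − 0.2 − 0.04 = 1.56 > 0, so A2 holds.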

Now, let us present the semi-martingale convergence theorem.

Lemma 1: [18] Let \(\{A_i\}\) and \(\{U_i\}\) be two sequences of non-negative random variables such that both \(A_i\) and \(U_i\) are \(\mathcal F_{i-1}\)-measurable for \(i = 1, 2, \cdots\), and \(A_0 = U_0 = 0\) a.s. Let \(M_i\) be a real-valued local martingale with \(M_0 = 0\) a.s. Let \(\xi\) be a non-negative \(\mathcal F_0\)-measurable random variable. Assume that \(X_i\) is a non-negative semi-martingale with the Doob-Meyer decomposition
\[
X_i = \xi + A_i - U_i + M_i.
\]
If \(\lim_{i\to\infty} A_i < \infty\) a.s., then for almost all \(\omega \in \Omega\), \(\lim_{i\to\infty} X_i < \infty\) and \(\lim_{i\to\infty} U_i < \infty\); that is, both \(X_i\) and \(U_i\) converge to finite random variables.

Consider a stochastic differential equation of the form

\[
dx(t) = f(x(t), x(t-\tau), t)\,dt + g(x(t), x(t-\tau), t)\,dW(t), \quad t \ge 0. \quad (10)
\]

Lemma 2: Let both \(f\) and \(g\) satisfy the local Lipschitz condition. Assume that there exist four constants \(\lambda_i\) \((i = 1, 2, 3, 4)\) such that
\[
2x^T f(x, 0, t) \le -\lambda_1|x|^2, \quad (11)
\]
\[
|f(x, y, t) - f(x, 0, t)| \le \lambda_2|y|, \quad (12)
\]
\[
|g(x, y, t)|^2 \le \lambda_3|x|^2 + \lambda_4|y|^2, \quad (13)
\]
for all \(t \ge 0\) and \(x, y \in \mathbb R^n\). If \(\lambda_1 - \lambda_3 > \lambda_2 + \lambda_4\), then for any initial data \(\xi_i \in C_{\mathcal F_0}^{b}([-\tau_i, 0]; \mathbb R^n)\) there exists a unique global solution \(x(t, \xi_i)\) to equation (10), and the solution \(x(t, \xi_i)\) is almost surely asymptotically stable.

Lemma 3: Suppose \(f\) and \(g\) satisfy equations (12) and (13). If \(\lambda_1 > 2\lambda_2 + \lambda_3 + \lambda_4\), then for any initial data \(\xi_i \in C_{\mathcal F_0}^{b}([-\tau_i, 0]; \mathbb R^n)\) there exists a unique global solution \(x(t, \xi_i)\) to equation (10), and the solution has the property that
\[
\limsup_{t\to\infty}\frac{1}{t}\log|x(t, \xi_i)| \le -\frac{\gamma_i}{2} \quad a.s.,
\]
where \(\gamma_i > 0\) is the unique positive root of the equation
\[
\lambda_1 - \lambda_2 - \lambda_3 - \gamma_i = (\lambda_2 + \lambda_4)e^{\gamma_i\tau},
\]

i.e.,
\[
2c_i - 2\sum_{j=1}^{n}|a_{ij}|\alpha_j - \sum_{j=1}^{n}|b_{ij}|\beta_j - 2\sum_{j=1}^{n} l_j^2 - \gamma_i = \Big(\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2\Big)e^{\gamma_i\tau}. \quad (14)
\]

Then the solution is almost surely exponentially stable, i.e., there exists a constant \(\eta_i > 0\) such that
\[
\limsup_{t\to\infty}\frac{1}{t}\log|y_i(t, \xi_i)| \le -\eta_i \quad a.s.
\]
for any initial data \(\xi_i \in C_{\mathcal F_0}^{b}([-\tau_i, 0]; \mathbb R^n)\).
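Equation (14) has no closed-form solution, but its left-hand side minus its right-hand side is strictly decreasing in γ, so the positive root γ_i is unique whenever the value at γ = 0 is positive and can be approximated by bisection. The sketch below and its shorthand names `A`, `B`, `L2`, `P2` are our own illustrative assumptions, as stated in the docstring.

```python
import math

def gamma_root(ci, A, B, L2, P2, tau, hi=50.0, tol=1e-12):
    """Bisection for the unique positive root gamma_i of equation (14):
    2*c_i - 2*A - B - 2*L2 - g = (B + 2*P2) * exp(g * tau),
    where (our shorthand) A = sum_j |a_ij|*alpha_j, B = sum_j |b_ij|*beta_j,
    L2 = sum_j l_j^2 and P2 = sum_j p_j^2."""
    h = lambda g: 2*ci - 2*A - B - 2*L2 - g - (B + 2*P2) * math.exp(g * tau)
    if h(0.0) <= 0.0:
        raise ValueError("no positive root: stability margin is not positive")
    lo = 0.0
    while h(hi) > 0.0:            # h is strictly decreasing; widen the bracket
        hi *= 2.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if h(mid) > 0.0 else (lo, mid)
    return 0.5 * (lo + hi)
```

Strict monotonicity holds because dh/dg = −1 − τ(B + 2·P2)·e^{gτ} < 0, so the bracketing step always terminates.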


III. STABILITY OF SPLIT-STEP THETA METHODS

In this section, we investigate the almost sure asymptotic stability and almost sure exponential stability of the split-step theta methods (2).

Definition 3: The numerical solution \(y_i^k\) of equation (2) is said to be almost surely asymptotically stable if
\[
\lim_{k\to\infty}\big|y_i^k\big| = 0 \quad a.s. \quad (15)
\]

Theorem 1: Let assumptions A1 and A2 hold. Then there exists a \(\Delta_0 > 0\) such that if \(\Delta < \Delta_0\), then for any finite-valued \(\mathcal F_0\)-measurable random variables \(\xi_i(k\Delta)\), \(k = -m, -m+1, \cdots, 0\), the split-step theta approximate solution (2) is almost surely asymptotically stable.
Proof: By assumptions A1 and A2, we have

\[
\begin{aligned}
\big|y_i^{k+1}\big|^2 &= \Big|y_i^k + (1-\theta)\Delta\Big[-c_i y_i^k + \sum_{j=1}^{n} a_{ij} f_j(y_j^k) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{k-m_j}\big)\Big] + \sum_{j=1}^{n}\sigma_{ij}\big(y_j^k, y_j^{k-m_j}\big)\Delta W_j^k\Big|^2\\
&\le \big|y_i^k\big|^2 - 2c_i(1-\theta)\Delta\big|y_i^k\big|^2 + 2(1-\theta)\Delta\, y_i^k\sum_{j=1}^{n}|a_{ij}|\alpha_j\big|y_j^k\big| + 2(1-\theta)\Delta\, y_i^k\sum_{j=1}^{n}|b_{ij}|\beta_j\big|y_j^{k-m_j}\big|\\
&\quad + c_i^2(1-\theta)^2\Delta^2\big|y_i^k\big|^2 + 2(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big(\sum_{r=1}^{n}|a_{ir}|\alpha_r\Big)\big|y_j^k\big|^2\\
&\quad + 2(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big(\sum_{r=1}^{n}|b_{ir}|\beta_r\Big)\big|y_j^{k-m_j}\big|^2 - 2c_i(1-\theta)^2\Delta^2 y_i^k\sum_{j=1}^{n}|a_{ij}|\alpha_j\big|y_j^k\big|\\
&\quad - 2c_i(1-\theta)^2\Delta^2 y_i^k\sum_{j=1}^{n}|b_{ij}|\beta_j\big|y_j^{k-m_j}\big| + \Big(\sum_{j=1}^{n}\sigma_{ij}\big(y_j^k, y_j^{k-m_j}\big)\Delta W_j^k\Big)^2\\
&\quad + 2\Big\langle y_i^k + (1-\theta)\Delta\Big[-c_i y_i^k + \sum_{j=1}^{n} a_{ij} f_j(y_j^k) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{k-m_j}\big)\Big],\ \sum_{j=1}^{n}\sigma_{ij}\big(y_j^k, y_j^{k-m_j}\big)\Delta W_j^k\Big\rangle.
\end{aligned}
\]

Let us take \(|Y^k| = \max_{1\le j\le n}|y_j^k|\), \(|Y^{k+1}| = \max_{1\le j\le n}|y_j^{k+1}|\) and \(|Y^{k-m_j}| = \max_{1\le j\le n}|y_j^{k-m_j}|\). Then, we get

\[
\begin{aligned}
\big|Y^{k+1}\big|^2 &\le |Y^k|^2 + \Big(-2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + (1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta^2\\
&\quad + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\\
&\quad + 2\sum_{j=1}^{n} l_j^2\big(\Delta W_j^k\big)^2\Big)|Y^k|^2 + \Big((1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2\\
&\quad - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2\big(\Delta W_j^k\big)^2\Big)\big|Y^{k-m_j}\big|^2\\
&\quad + 2\Big\langle y_i^k + (1-\theta)\Delta\Big[-c_i y_i^k + \sum_{j=1}^{n} a_{ij} f_j(y_j^k) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{k-m_j}\big)\Big],\ \sum_{j=1}^{n}\sigma_{ij}\big(y_j^k, y_j^{k-m_j}\big)\Delta W_j^k\Big\rangle. \quad (16)
\end{aligned}
\]


It follows that
\[
\begin{aligned}
\big|Y^{k+1}\big|^2 - |Y^k|^2 &\le \Big(-2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + (1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta^2\\
&\quad + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\\
&\quad + 2\sum_{j=1}^{n} l_j^2\big(\Delta W_j^k\big)^2\Big)|Y^k|^2 + \Big((1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2\\
&\quad - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2\big(\Delta W_j^k\big)^2\Big)\big|Y^{k-m_j}\big|^2 + m_k,
\end{aligned}
\]

where
\[
m_k = 2\Big\langle y_i^k + (1-\theta)\Delta\Big[-c_i y_i^k + \sum_{j=1}^{n} a_{ij} f_j(y_j^k) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{k-m_j}\big)\Big],\ \sum_{j=1}^{n}\sigma_{ij}\big(y_j^k, y_j^{k-m_j}\big)\Delta W_j^k\Big\rangle,
\]

which implies that
\[
\begin{aligned}
\big|Y^k\big|^2 - |Y^0|^2 &\le \Big(-2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + (1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta^2\\
&\quad + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\\
&\quad + 2\sum_{j=1}^{n} l_j^2\Delta\Big)\sum_{p=0}^{k-1}|Y^p|^2 + \Big((1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2\\
&\quad - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2\Delta\Big)\sum_{p=0}^{k-1}\big|Y^{p-m_j}\big|^2 + M_k,
\end{aligned}
\]
where \(M_k = \sum_{p=0}^{k-1} m_p\). Since
\[
\sum_{p=0}^{k-1}\big|Y^{p-m}\big|^2 = \sum_{p=-m}^{-1}|Y^p|^2 + \sum_{p=0}^{k-1}|Y^p|^2 - \sum_{p=k-m}^{k-1}|Y^p|^2.
\]

Then, we get
\[
\begin{aligned}
|Y^k|^2 &+ \Big((1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2\Delta\Big)\sum_{p=k-m}^{k-1}|Y^p|^2\\
&\le |Y^0|^2 + \Big((1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\\
&\quad + 2\sum_{j=1}^{n} p_j^2\Delta\Big)\sum_{p=-m}^{-1}|Y^p|^2 + \Big[-2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + 2(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j\\
&\quad + c_i^2(1-\theta)^2\Delta^2 + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\\
&\quad + 2\sum_{j=1}^{n} l_j^2\Delta + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 + 2\sum_{j=1}^{n} p_j^2\Delta\Big]\sum_{p=0}^{k-1}|Y^p|^2 + M_k,
\end{aligned}
\]
where \(M_k = \sum_{p=0}^{k-1} m_p\) is a martingale with \(M_0 = 0\). Therefore, by assumption A2, we get

\[
\begin{aligned}
&-2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + 2(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta^2\\
&\quad + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\\
&\quad + 2\sum_{j=1}^{n} l_j^2\Delta + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 + 2\sum_{j=1}^{n} p_j^2\Delta < 0, \quad \text{for } 0 < \Delta < \Delta_0,
\end{aligned}
\]

where
\[
\Delta_0 = \frac{2(1-\theta)\Big(c_i - \sum_{j=1}^{n}|a_{ij}|\alpha_j - \sum_{j=1}^{n}|b_{ij}|\beta_j\Big) - 2\sum_{j=1}^{n}\big(l_j^2 + p_j^2\big)}{(1-\theta)^2\Big(c_i - \sum_{j=1}^{n}|a_{ij}|\alpha_j - \sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2}. \quad (17)
\]
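The bound (17) can be evaluated directly. The minimal helper below, with our own shorthand names as stated in the docstring, assumes θ < 1 so the denominator is nonzero.

```python
def delta0(ci, A, B, S, theta):
    """Step-size bound (17):
    Delta_0 = [2(1-theta)(c_i - A - B) - 2*S] / [(1-theta)^2 (c_i - A - B)^2],
    where (our shorthand) A = sum_j |a_ij|*alpha_j, B = sum_j |b_ij|*beta_j,
    S = sum_j (l_j^2 + p_j^2). Requires theta < 1."""
    margin = ci - A - B
    return (2.0 * (1.0 - theta) * margin - 2.0 * S) / ((1.0 - theta)**2 * margin**2)
```

Note that the bound shrinks as θ approaches 1 whenever S > 0, since the numerator decreases linearly in (1 − θ) while the denominator decreases quadratically.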

Hence, we conclude that for \(0 < \Delta < \Delta_0\),
\[
\lim_{k\to\infty}\sum_{p=0}^{k-1}|Y^p|^2 < \infty \quad a.s., \qquad \lim_{k\to\infty}\big|y_i^k\big|^2 = 0 \quad a.s.
\]

Remark 1: For θ = 0 this approximation reduces to the Euler method, and for θ = 1 it is the split-step backward Euler method.

Definition 4: The approximate solution \(y_i^k\) of equation (2) is said to be almost surely exponentially stable if there exists a constant \(\eta_i > 0\) such that
\[
\limsup_{k\to\infty}\frac{1}{k\Delta}\log\big|y_i^k\big| \le -\eta_i \quad a.s.
\]

Theorem 2: Let assumptions A1 and A2 hold. Let \(\gamma_i > 0\) be the number defined by equation (14) and let \(\varepsilon \in (0, \gamma_i^*/2)\) be arbitrary, where \(\gamma_i^* = \max\{\gamma_i, \gamma_i\}\). Then there exists a \(\Delta_1^* > 0\) such that if \(\Delta < \Delta_1^*\), then for any finite-valued \(\mathcal F_0\)-measurable random variables \(\xi_i(k\Delta)\), \(k = -m, -m+1, \cdots, 0\), the split-step theta approximate solution (2) is almost surely exponentially stable, i.e.,
\[
\limsup_{k\to\infty}\frac{1}{k\Delta}\log\big|y_i^k\big| \le -\frac{\gamma_i^*}{2} + \varepsilon \quad a.s.
\]

Proof: Following the same steps as in Theorem 1 and using equation (16), we obtain the following. For any constant \(C > 1\), we have
\[
C^{(k+1)\Delta}\big|Y^{k+1}\big|^2 - C^{k\Delta}|Y^k|^2 = C^{(k+1)\Delta}\big(|Y^{k+1}|^2 - |Y^k|^2\big) + \big(C^{(k+1)\Delta} - C^{k\Delta}\big)|Y^k|^2.
\]


Then
\[
\begin{aligned}
C^{(k+1)\Delta}\big|Y^{k+1}\big|^2 - C^{k\Delta}|Y^k|^2 &\le C^{(k+1)\Delta}\Big[-2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + (1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j\\
&\quad + c_i^2(1-\theta)^2\Delta^2 + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j\\
&\quad - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} l_j^2\big(\Delta W_j^k\big)^2 + \big(1 - C^{-\Delta}\big)\Big]|Y^k|^2\\
&\quad + C^{(k+1)\Delta}\Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2\\
&\quad - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2\big(\Delta W_j^k\big)^2\Big]\big|Y^{k-m_j}\big|^2\\
&\quad + 2C^{(k+1)\Delta}\Big\langle y_i^k + (1-\theta)\Delta\Big[-c_i y_i^k + \sum_{j=1}^{n} a_{ij} f_j(y_j^k) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{k-m_j}\big)\Big],\ \sum_{j=1}^{n}\sigma_{ij}\big(y_j^k, y_j^{k-m_j}\big)\Delta W_j^k\Big\rangle,
\end{aligned}
\]

which implies that
\[
\begin{aligned}
C^{k\Delta}|Y^k|^2 &\le |Y^0|^2 + \Big[-2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + (1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta^2\\
&\quad + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\\
&\quad + \big(1 - C^{-\Delta}\big)\Big]\sum_{p=0}^{k-1}C^{(p+1)\Delta}|Y^p|^2 + 2\sum_{j=1}^{n} l_j^2\sum_{p=0}^{k-1}C^{(p+1)\Delta}\big(\Delta W_j^p\big)^2|Y^p|^2\\
&\quad + \Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]\sum_{p=0}^{k-1}C^{(p+1)\Delta}\big|Y^{p-m_j}\big|^2\\
&\quad + 2\sum_{j=1}^{n} p_j^2\sum_{p=0}^{k-1}C^{(p+1)\Delta}\big(\Delta W_j^p\big)^2\big|Y^{p-m_j}\big|^2\\
&\quad + 2\sum_{p=0}^{k-1}C^{(p+1)\Delta}\Big\langle y_i^p + (1-\theta)\Delta\Big[-c_i y_i^p + \sum_{j=1}^{n} a_{ij} f_j(y_j^p) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{p-m_j}\big)\Big],\ \sum_{j=1}^{n}\sigma_{ij}\big(y_j^p, y_j^{p-m_j}\big)\Delta W_j^p\Big\rangle. \quad (18)
\end{aligned}
\]

Let \(m_k^{(1)} = \sum_{p=0}^{k-1} C^{(p+1)\Delta}|Y^p|^2\big((\Delta W_j^p)^2 - \Delta\big)\). Noting that \(E\big[(\Delta W_j^p)^2 - \Delta \mid \mathcal F_{p\Delta}\big] = 0\) and \(Y^p\) is \(\mathcal F_{p\Delta}\)-measurable, we have \(E\big(m_k^{(1)} \mid \mathcal F_{(k-1)\Delta}\big) = m_{k-1}^{(1)}\), which implies that \(m_k^{(1)}\) is a martingale. Similarly, \(m_k^{(2)} = \sum_{p=0}^{k-1} C^{(p+1)\Delta}\big|Y^{p-m_j}\big|^2\big((\Delta W_j^p)^2 - \Delta\big)\) is a martingale, and
\[
m_k^{(3)} = 2\sum_{p=0}^{k-1}C^{(p+1)\Delta}\Big\langle y_i^p + (1-\theta)\Delta\Big[-c_i y_i^p + \sum_{j=1}^{n} a_{ij} f_j(y_j^p) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{p-m_j}\big)\Big],\ \sum_{j=1}^{n}\sigma_{ij}\big(y_j^p, y_j^{p-m_j}\big)\Delta W_j^p\Big\rangle
\]
is also a martingale. Therefore \(M_k = \sum_{j=1}^{n}\big(2l_j^2 m_k^{(1)} + 2p_j^2 m_k^{(2)}\big) + m_k^{(3)}\) is a martingale with \(M_0 = 0\). Then

\[
\begin{aligned}
C^{k\Delta}|Y^k|^2 &\le |Y^0|^2 + \Big[-2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + (1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta^2\\
&\quad + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\\
&\quad + 2\sum_{j=1}^{n} l_j^2\Delta + \big(1 - C^{-\Delta}\big)\Big]\sum_{p=0}^{k-1}C^{(p+1)\Delta}|Y^p|^2 + \Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j\\
&\quad + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2\Delta\Big]\sum_{p=0}^{k-1}C^{(p+1)\Delta}\big|Y^{p-m_j}\big|^2 + M_k.
\end{aligned}
\]

Since
\[
\sum_{p=0}^{k-1}C^{(p+1)\Delta}\big|Y^{p-m_j}\big|^2 = \sum_{p=-m}^{-1}C^{(p+m+1)\Delta}|Y^p|^2 + \sum_{p=0}^{k-1}C^{(p+m+1)\Delta}|Y^p|^2 - \sum_{p=k-m}^{k-1}C^{(p+m+1)\Delta}|Y^p|^2,
\]

we have
\[
\begin{aligned}
C^{k\Delta}|Y^k|^2 &+ \Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\\
&\quad + 2\sum_{j=1}^{n} p_j^2\Delta\Big]\sum_{p=k-m}^{k-1}C^{(p+m+1)\Delta}|Y^p|^2 \le X_k, \quad (19)
\end{aligned}
\]

where
\[
\begin{aligned}
X_k &= |Y^0|^2 + \Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\\
&\quad + 2\sum_{j=1}^{n} p_j^2\Delta\Big]\sum_{p=-m}^{-1}C^{(p+m+1)\Delta}|Y^p|^2 + \Big[-2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j\\
&\quad + (1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta^2 + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j\\
&\quad - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} l_j^2\Delta + \big(1 - C^{-\Delta}\big) + \Big((1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j\\
&\quad + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2\Delta\Big)C^{m\Delta}\Big]\sum_{p=0}^{k-1}C^{(p+1)\Delta}|Y^p|^2 + M_k.
\end{aligned}
\]


Now, let us introduce the function
\[
\begin{aligned}
\varphi(C) &= \Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2\Delta\Big]C^{(m+1)\Delta}\\
&\quad + \Big[-2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + (1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta^2\\
&\quad + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 1\\
&\quad + 2\sum_{j=1}^{n} l_j^2\Delta\Big]C^{\Delta} - 1. \quad (20)
\end{aligned}
\]

Obviously, we have \(\varphi'(C) > 0\) for any \(C > 1\). Moreover,
\[
\begin{aligned}
\varphi(1) &= 2(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2\Delta\\
&\quad - 2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i^2(1-\theta)^2\Delta^2 + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2\\
&\quad - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + 2\sum_{j=1}^{n} l_j^2\Delta.
\end{aligned}
\]

Hence, for any
\[
\Delta < \Delta_1^* = \frac{2(1-\theta)\Big(c_i - \sum_{j=1}^{n}|a_{ij}|\alpha_j - \sum_{j=1}^{n}|b_{ij}|\beta_j\Big) - 2\sum_{j=1}^{n}\big(p_j^2 + l_j^2\big)}{(1-\theta)^2\Big(c_i - \sum_{j=1}^{n}|a_{ij}|\alpha_j - \sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2},
\]
we obtain \(\varphi(1) < 0\), which implies that for any \(\Delta < \Delta_1^*\) there exists a unique \(C_\Delta^* > 1\) such that \(\varphi(C_\Delta^*) = 0\). Choosing \(C = C_\Delta^*\), we have

\[
X_k = |Y^0|^2 + \Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2\Delta\Big]\sum_{p=-m}^{-1}C^{(p+m+1)\Delta}|Y^p|^2 + M_k.
\]

It follows from Lemma 1 that, for \(C = C_\Delta^*\),
\[
\lim_{k\to\infty} X_k < \infty \quad a.s.
\]

By equation (19), we have
\[
\begin{aligned}
\limsup_{k\to\infty} C_\Delta^{*\,k\Delta}|Y^k|^2 &\le \limsup_{k\to\infty}\Big[C_\Delta^{*\,k\Delta}|Y^k|^2 + \Big((1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2\\
&\quad - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2\Delta\Big)\sum_{p=k-m}^{k-1}C_\Delta^{*\,(p+m+1)\Delta}|Y^p|^2\Big]\\
&\le \lim_{k\to\infty} X_k < \infty \quad a.s. \quad (21)
\end{aligned}
\]

Since \(\tau_j = m_j\Delta\), it follows from equation (20) that
\[
\begin{aligned}
&\Big[(1-\theta)\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2\Big]C_\Delta^{*\,\tau_j}\\
&\quad + \frac{1}{\Delta}\big(1 - C_\Delta^{*\,-\Delta}\big) - 2c_i(1-\theta) + 2(1-\theta)\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i^2(1-\theta)^2\Delta + (1-\theta)^2\Delta\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2\\
&\quad + (1-\theta)\sum_{j=1}^{n}|b_{ij}|\beta_j - 2c_i(1-\theta)^2\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + 2\sum_{j=1}^{n} l_j^2 - c_i(1-\theta)^2\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j = 0. \quad (22)
\end{aligned}
\]

Choosing the constant \(C = e^{\mu}\), we obtain \(1 - C^{-\Delta} = 1 - e^{-\mu\Delta}\). Define
\[
\begin{aligned}
\varphi_\Delta(\mu) &= \Big[(1-\theta)\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2\Big]e^{\mu\tau_j}\\
&\quad + \frac{1}{\Delta}\big(1 - e^{-\mu\Delta}\big) - 2c_i(1-\theta) + 2(1-\theta)\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i^2(1-\theta)^2\Delta + (1-\theta)^2\Delta\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2\\
&\quad - 2c_i(1-\theta)^2\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + 2\sum_{j=1}^{n} l_j^2 - c_i(1-\theta)^2\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)\sum_{j=1}^{n}|b_{ij}|\beta_j.
\end{aligned}
\]

Let \(\mu_\Delta^* = \log(C_\Delta^*)\); then it follows from equation (22) that for any \(\Delta < \Delta_1^*\),
\[
\varphi_\Delta(\mu_\Delta^*) = 0. \quad (23)
\]

Noting that \(\lim_{\Delta\to 0}(1 - e^{-\mu\Delta})/\Delta = \mu\), we have
\[
\lim_{\Delta\to 0}\varphi_\Delta(\mu) = \Big[(1-\theta)\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2\Big]e^{\mu\tau_j} + \mu - 2c_i(1-\theta) + 2(1-\theta)\sum_{j=1}^{n}|a_{ij}|\alpha_j + 2\sum_{j=1}^{n} l_j^2 + (1-\theta)\sum_{j=1}^{n}|b_{ij}|\beta_j. \quad (24)
\]

Setting \(\gamma_i = \mu\) in equations (23) and (24), we obtain
\[
\lim_{\Delta\to 0}\mu_\Delta^* = \gamma_i,
\]
which implies that for any positive \(\varepsilon \in (0, \gamma_i^*/2)\), where \(\gamma_i^* = \max\{\gamma_i, \gamma_i\}\), there exists a \(\Delta_2^* > 0\) such that for any \(\Delta < \Delta_2^*\) we have \(\mu_\Delta^* > \gamma_i^* - 2\varepsilon\). From equation (21) and the definition of \(\mu_\Delta^*\), we see that
\[
\limsup_{k\to\infty} e^{\mu_\Delta^* k\Delta}|Y^k|^2 < \infty,
\]
and so \(\limsup_{k\to\infty} e^{\mu_\Delta^* k\Delta}|y_i^k|^2 < \infty\). We therefore obtain that for any \(\Delta < \Delta_3^* = \Delta_1^* \wedge \Delta_2^*\),
\[
\limsup_{k\to\infty}\frac{1}{k\Delta}\log\big|y_i^k\big| \le -\frac{\gamma_i^*}{2} + \varepsilon \quad a.s.,
\]

as required.


IV. STABILITY OF STOCHASTIC LINEAR THETA METHODS

In this section, we establish the almost sure asymptotic stability and almost sure exponential stability of a class of stochastic linear theta methods and investigate for which values of θ the stability of the numerical solutions is maintained.

Theorem 3: Suppose that equation (3) satisfies assumptions A1 and A2. Then there exists a \(\Delta_1 > 0\) such that if \(\Delta < \Delta_1\), then for any finite-valued \(\mathcal F_0\)-measurable random variables \(\xi_i(k\Delta)\), \(k = -m, -m+1, \cdots, 0\), the solution of equation (3) is almost surely asymptotically stable.
Proof: By assumptions A1 and A2, we have

\[
\begin{aligned}
\big|y_i^{k+1}\big|^2 &= \Big|y_i^k + \theta\Delta\Big[-c_i y_i^{k+1} + \sum_{j=1}^{n} a_{ij} f_j\big(y_j^{k+1}\big) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{k-m_j+1}\big)\Big]\\
&\qquad + (1-\theta)\Delta\Big[-c_i y_i^k + \sum_{j=1}^{n} a_{ij} f_j(y_j^k) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{k-m_j}\big)\Big] + \sum_{j=1}^{n}\sigma_{ij}\big(y_j^k, y_j^{k-m_j}\big)\Delta W_j^k\Big|^2\\
&\le \big|y_i^k\big|^2 + \Big[-2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + (1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta^2\\
&\qquad + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\\
&\qquad + 2\sum_{j=1}^{n} l_j^2\big(\Delta W_j^k\big)^2\Big]\big|y_i^k\big|^2 + \Big[c_i^2\theta^2\Delta^2 + \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j\\
&\qquad - c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]\big|y_j^{k+1}\big|^2 + \Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2\\
&\qquad - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2\big(\Delta W_j^k\big)^2\Big]\big|y_j^{k-m_j}\big|^2 + \Big[\theta^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2\\
&\qquad - c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]\big|y_j^{k-m_j+1}\big|^2 + \Big\langle y_i^k,\ \theta\Delta\Big[-c_i y_i^{k+1} + \sum_{j=1}^{n} a_{ij} f_j\big(y_j^{k+1}\big) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{k-m_j+1}\big)\Big]\Big\rangle\\
&\qquad + \Big\langle y_i^k,\ \sum_{j=1}^{n}\sigma_{ij}\big(y_j^k, y_j^{k-m_j}\big)\Delta W_j^k\Big\rangle + \Big\langle \theta\Delta\Big[-c_i y_i^{k+1} + \sum_{j=1}^{n} a_{ij} f_j\big(y_j^{k+1}\big) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{k-m_j+1}\big)\Big],\\
&\qquad\quad \sum_{j=1}^{n}\sigma_{ij}\big(y_j^k, y_j^{k-m_j}\big)\Delta W_j^k\Big\rangle + \Big\langle (1-\theta)\Delta\Big[-c_i y_i^k + \sum_{j=1}^{n} a_{ij} f_j(y_j^k) + \sum_{j=1}^{n} b_{ij} g_j\big(y_j^{k-m_j}\big)\Big],\\
&\qquad\quad \sum_{j=1}^{n}\sigma_{ij}\big(y_j^k, y_j^{k-m_j}\big)\Delta W_j^k\Big\rangle.
\end{aligned}
\]

Let |Y k| = max1≤j≤n |ykj |, |Y k+1| = max1≤j≤n |yk+1j |, |Y k−mj | = max1≤j≤n |yk−mj

j | and

JASC: Journal of Applied Science and Computations

Volume VI, Issue I, January/2019

ISSN NO: 1076-5131

Page No:791

$|Y^{k-m_j+1}| = \max_{1\le j\le n}|y_j^{k-m_j+1}|$; we get

\[
\begin{aligned}
\Big[1 &- c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 + 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]|Y^{k+1}|^2 \\
\le{}& |Y^k|^2 + \Big[-2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + (1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta^2 \\
&+ (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j \\
&+ 2\sum_{j=1}^{n} l_j^2(\Delta W_j^k)^2\Big]|Y^k|^2 + \Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 \\
&- c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2(\Delta W_j^k)^2\Big]|Y^{k-m_j}|^2 + \Big[\theta^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 \\
&- c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]|Y^{k-m_j+1}|^2 + \Big\langle y_i^k,\ \theta\Delta\Big[-c_i y_i^{k+1} + \sum_{j=1}^{n} a_{ij} f_j(y_j^{k+1}) + \sum_{j=1}^{n} b_{ij} g_j(y_j^{k-m_j+1})\Big]\Big\rangle \\
&+ \Big\langle y_i^k,\ \sum_{j=1}^{n} \sigma_{ij}(y_j^k, y_j^{k-m_j})\,\Delta W_j^k\Big\rangle \\
&+ \Big\langle \theta\Delta\Big[-c_i y_i^{k+1} + \sum_{j=1}^{n} a_{ij} f_j(y_j^{k+1}) + \sum_{j=1}^{n} b_{ij} g_j(y_j^{k-m_j+1})\Big],\ \sum_{j=1}^{n} \sigma_{ij}(y_j^k, y_j^{k-m_j})\,\Delta W_j^k\Big\rangle \\
&+ \Big\langle (1-\theta)\Delta\Big[-c_i y_i^{k} + \sum_{j=1}^{n} a_{ij} f_j(y_j^{k}) + \sum_{j=1}^{n} b_{ij} g_j(y_j^{k-m_j})\Big],\ \sum_{j=1}^{n} \sigma_{ij}(y_j^k, y_j^{k-m_j})\,\Delta W_j^k\Big\rangle . \qquad (25)
\end{aligned}
\]

Then

\[
\begin{aligned}
|Y^{k+1}|^2 \le{}& \frac{1}{P}\Big\{|Y^k|^2 + \Big[-2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + (1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta^2 \\
&+ (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j \\
&+ 2\sum_{j=1}^{n} l_j^2(\Delta W_j^k)^2\Big]|Y^k|^2 + \Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 \\
&- c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n} p_j^2(\Delta W_j^k)^2\Big]|Y^{k-m_j}|^2 \\
&+ \Big[\theta^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]|Y^{k-m_j+1}|^2 + m_k\Big\},
\end{aligned}
\]

where

\[
P = 1 - c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 + 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j
\]


and

\[
\begin{aligned}
m_k ={}& \Big\langle y_i^k,\ \theta\Delta\Big[-c_i y_i^{k+1} + \sum_{j=1}^{n} a_{ij} f_j(y_j^{k+1}) + \sum_{j=1}^{n} b_{ij} g_j(y_j^{k-m_j+1})\Big]\Big\rangle + \Big\langle y_i^k,\ \sum_{j=1}^{n}\sigma_{ij}(y_j^k, y_j^{k-m_j})\,\Delta W_j^k\Big\rangle \\
&+ \Big\langle \theta\Delta\Big[-c_i y_i^{k+1} + \sum_{j=1}^{n} a_{ij} f_j(y_j^{k+1}) + \sum_{j=1}^{n} b_{ij} g_j(y_j^{k-m_j+1})\Big],\ \sum_{j=1}^{n}\sigma_{ij}(y_j^k, y_j^{k-m_j})\,\Delta W_j^k\Big\rangle \\
&+ \Big\langle (1-\theta)\Delta\Big[-c_i y_i^{k} + \sum_{j=1}^{n} a_{ij} f_j(y_j^{k}) + \sum_{j=1}^{n} b_{ij} g_j(y_j^{k-m_j})\Big],\ \sum_{j=1}^{n}\sigma_{ij}(y_j^k, y_j^{k-m_j})\,\Delta W_j^k\Big\rangle .
\end{aligned}
\]

Following the same steps as in Theorem 1 of Section III and taking 1/P > 0, we get

\[
\Delta_1 = \frac{2(1-\theta)\Big(c_i - \sum_{j=1}^{n}|a_{ij}|\alpha_j - \sum_{j=1}^{n}|b_{ij}|\beta_j\Big) - 2\sum_{j=1}^{n}(l_j^2 + p_j^2)}{(1-\theta)^2\Big(c_i - \sum_{j=1}^{n}|a_{ij}|\alpha_j - \sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 + \theta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i\theta^2\sum_{j=1}^{n}|b_{ij}|\beta_j} . \qquad (26)
\]

Hence we can conclude that

\[
\lim_{k\to\infty}\sum_{p=0}^{k-1}|Y^p|^2 < \infty \ \ \text{a.s.}, \qquad 0 < \Delta < \Delta_1,
\]
\[
\lim_{k\to\infty}|y_i^k|^2 = 0 \ \ \text{a.s.}, \qquad 0 < \Delta < \Delta_1.
\]

Remark 2: For θ = 0, this approximation reduces to the Euler method, and for θ = 1 it is the backward Euler method.
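The scheme and the bound (26) are straightforward to put into code. The following sketch is ours, not the authors' implementation: `delta1` evaluates the step-size bound ∆1 of (26) for one neuron, and `slt_step` advances the stochastic linear theta scheme (3) by one step. The Picard (fixed-point) solver for the implicit part, and every function and parameter name here, are our own assumptions.

```python
import numpy as np

# Illustrative sketch (ours, not the paper's code) of the stochastic linear
# theta (SLT) scheme and of the step-size bound Delta_1 in equation (26).

def delta1(c, absA, absB, alpha, beta, l, p, theta, i):
    """Evaluate the bound Delta_1 of (26) for neuron i (0-based index)."""
    Sa = absA[i] @ alpha                      # sum_j |a_ij| * alpha_j
    Sb = absB[i] @ beta                       # sum_j |b_ij| * beta_j
    num = 2.0 * (1.0 - theta) * (c[i] - Sa - Sb) - 2.0 * np.sum(l**2 + p**2)
    den = ((1.0 - theta)**2 * (c[i] - Sa - Sb)**2
           + theta**2 * Sb**2 - c[i] * theta**2 * Sb)
    return num / den

def slt_step(yk, ydel, ydel1, dW, dt, theta, c, A, B, f, g, sigma):
    """One step of scheme (3).  ydel, ydel1 hold the delayed states
    y^{k-m_j} and y^{k-m_j+1}; sigma(yk, ydel) returns a diffusion matrix."""
    expl = -c * yk + A @ f(yk) + B @ g(ydel)   # explicit part of the drift
    y_new = yk.copy()
    for _ in range(60):                        # Picard iteration, implicit part
        impl = -c * y_new + A @ f(y_new) + B @ g(ydel1)
        y_new = (yk + dt * (theta * impl + (1.0 - theta) * expl)
                 + sigma(yk, ydel) @ dW)
    return y_new
```

For θ = 0 the iteration collapses to an explicit Euler step, in line with Remark 2.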

Theorem 4: Let assumptions A1 and A2 hold. Let γi > 0 be the number defined by equation (14), and let ε ∈ (0, γ*i/2) be arbitrary, where γ*i = max{γi}. Then there exists a ∆*4 > 0 such that if ∆ < ∆*4, then for any finite-valued F0-measurable random variables ξi(k∆), k = −m, −m + 1, · · · , 0, the stochastic linear theta approximate solution (3) is almost surely exponentially stable, i.e.

\[
\limsup_{k\to\infty}\frac{1}{k\Delta}\log|y_i^k| \le -\frac{\gamma_i^*}{2} + \varepsilon \quad \text{a.s.}
\]

Proof: For any constant C > 1, we have

\[
C^{(k+1)\Delta}|Y^{k+1}|^2 - C^{k\Delta}|Y^k|^2 = C^{(k+1)\Delta}\big(|Y^{k+1}|^2 - |Y^k|^2\big) + \big(C^{(k+1)\Delta} - C^{k\Delta}\big)|Y^k|^2 .
\]

Then equation (25) becomes

\[
\begin{aligned}
\Big[1 &- c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 + 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]\Big[C^{(k+1)\Delta}|Y^{k+1}|^2 - C^{k\Delta}|Y^k|^2\Big] \\
\le{}& \Big[1 - 2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + (1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta^2 \\
&+ (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j \\
&+ 2\sum_{j=1}^{n}l_j^2(\Delta W_j^k)^2 - \Big(1 - c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 + 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j \\
&+ c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)C^{-\Delta}\Big]C^{(k+1)\Delta}|Y^k|^2 + \Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 \\
&- c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n}p_j^2(\Delta W_j^k)^2\Big]C^{(k+1)\Delta}|Y^{k-m_j}|^2 \\
&+ \Big[\theta^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]C^{(k+1)\Delta}|Y^{k-m_j+1}|^2 + C^{(k+1)\Delta}m_k ,
\end{aligned}
\]

which implies that

\[
\begin{aligned}
\Big[1 &- c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 + 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]\sum_{p=0}^{k-1}C^{p\Delta}|Y^p|^2 \\
\le{}& \Big[1 - c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 + 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]|Y^0|^2 \\
&+ \Big[1 - 2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + (1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta^2 \\
&\quad + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n}l_j^2\Delta \\
&\quad - \Big(1 - c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 + 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)C^{-\Delta}\Big]\sum_{p=0}^{k-1}C^{(p+1)\Delta}|Y^p|^2 \\
&+ \Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n}p_j^2\Delta\Big]\sum_{p=0}^{k-1}C^{(p+1)\Delta}|Y^{p-m_j}|^2 \\
&+ \Big[\theta^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]\sum_{p=0}^{k-1}C^{(p+1)\Delta}|Y^{p-m_j+1}|^2 + M_k , \qquad (27)
\end{aligned}
\]

where $M_k = \sum_{p=0}^{k-1} C^{(p+1)\Delta} m_p$ is a martingale with $M_0 = 0$. Since

\[
\begin{aligned}
\sum_{p=0}^{k-1}C^{(p+1)\Delta}|Y^{p-m_j+1}|^2 &= C^{(m-1)\Delta}\sum_{p=-m+1}^{-1}C^{(p+1)\Delta}|Y^p|^2 + C^{(m-1)\Delta}\sum_{p=0}^{k-1}C^{(p+1)\Delta}|Y^p|^2 - C^{(m-1)\Delta}\sum_{p=k-m+1}^{k-1}C^{(p+1)\Delta}|Y^p|^2 , \\
\sum_{p=0}^{k-1}C^{(p+1)\Delta}|Y^{p-m}|^2 &= C^{m\Delta}\sum_{p=-m}^{-1}C^{(p+1)\Delta}|Y^p|^2 + C^{m\Delta}\sum_{p=0}^{k-1}C^{(p+1)\Delta}|Y^p|^2 - C^{m\Delta}\sum_{p=k-m}^{k-1}C^{(p+1)\Delta}|Y^p|^2 .
\end{aligned}
\]


We therefore obtain

\[
\begin{aligned}
\Big[1 &- c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 + 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]\sum_{p=0}^{k-1}C^{p\Delta}|Y^p|^2 \\
&+ \Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j \\
&\qquad + 2\sum_{j=1}^{n}p_j^2\Delta\Big]C^{m\Delta}\sum_{p=k-m}^{k-1}C^{(p+1)\Delta}|Y^p|^2 \\
&+ \Big[\theta^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]C^{(m-1)\Delta}\sum_{p=k-m+1}^{k-1}C^{(p+1)\Delta}|Y^p|^2 \ \le\ Y_k , \qquad (28)
\end{aligned}
\]

where

\[
\begin{aligned}
Y_k ={}& \Big[1 - c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 + 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]|Y^0|^2 \\
&+ \Big[\theta^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]C^{(m-1)\Delta}\sum_{p=-m+1}^{-1}C^{(p+1)\Delta}|Y^p|^2 \\
&+ \Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n}p_j^2\Delta\Big]C^{m\Delta}\sum_{p=-m}^{-1}C^{(p+1)\Delta}|Y^p|^2 \\
&+ \Big[1 - 2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + (1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta^2 \\
&\quad + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n}l_j^2\Delta \\
&\quad - \Big(1 - c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 + 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)C^{-\Delta} \\
&\quad + \Big((1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n}p_j^2\Delta\Big)C^{m\Delta} \\
&\quad + \Big(\theta^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)C^{(m-1)\Delta}\Big]\sum_{p=0}^{k-1}C^{(p+1)\Delta}|Y^p|^2 + M_k .
\end{aligned}
\]

Now, let us introduce the following function:

\[
\begin{aligned}
\psi(C) ={}& \Big[\theta^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]C^{m\Delta} \\
&+ \Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n}p_j^2\Delta\Big]C^{(m+1)\Delta} \\
&+ \Big[1 - 2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + (1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta^2 \\
&\quad + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n}l_j^2\Delta\Big]C^{\Delta} \\
&- 1 + c_i^2\theta^2\Delta^2 + \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j - c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j . \qquad (29)
\end{aligned}
\]

Obviously, we have that ψ′(C) > 0 for any C > 1. Moreover

\[
\begin{aligned}
\psi(1) ={}& \theta^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 \\
&- 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n}p_j^2\Delta + c_i^2\theta^2\Delta^2 + \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j \\
&- c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j - 2c_i(1-\theta)\Delta + 2(1-\theta)\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i^2(1-\theta)^2\Delta^2 \\
&+ (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + 2\sum_{j=1}^{n}l_j^2\Delta .
\end{aligned}
\]

We therefore obtain from assumption A2 that ψ(1) < 0, which implies that there exists a unique C*∆ > 1 such that ψ(C*∆) = 0. Choosing C = C*∆, we have

\[
\begin{aligned}
Y_k ={}& \Big[1 - c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 + 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]|Y^0|^2 \\
&+ \Big[\theta^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]C^{(m-1)\Delta}\sum_{p=-m+1}^{-1}C^{(p+1)\Delta}|Y^p|^2 \\
&+ \Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j \\
&\qquad + 2\sum_{j=1}^{n}p_j^2\Delta\Big]C^{m\Delta}\sum_{p=-m}^{-1}C^{(p+1)\Delta}|Y^p|^2 + M_k .
\end{aligned}
\]

It follows from Lemma 1 that, for C = C*∆,

\[
\lim_{k\to\infty} Y_k < \infty \quad \text{a.s.}
\]

By equation (28), we get

\[
\begin{aligned}
\limsup_{k\to\infty}&\Big[1 - c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 + 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]C^{k\Delta}|Y^k|^2 \\
\le{}& \limsup_{k\to\infty}\Big\{\Big[1 - c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 + 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]C^{k\Delta}|Y^k|^2 \\
&+ \Big[(1-\theta)\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j \\
&\qquad + 2\sum_{j=1}^{n}p_j^2\Delta\Big]C^{m\Delta}\sum_{p=-m}^{-1}C^{(p+1)\Delta}|Y^p|^2 \\
&+ \Big[\theta^2\Delta^2\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]C^{(m-1)\Delta}\sum_{p=-m+1}^{-1}C^{(p+1)\Delta}|Y^p|^2\Big\} \\
\le{}& \lim_{k\to\infty} Y_k < \infty \quad \text{a.s.} \qquad (30)
\end{aligned}
\]

Since τj = mj∆, it follows from equation (29) that

\[
\begin{aligned}
&\Big[\theta^2\Delta\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i\theta^2\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]C_\Delta^{\tau_j}C_\Delta^{-\Delta} + \Big[(1-\theta)\sum_{j=1}^{n}|b_{ij}|\beta_j + (1-\theta)^2\Delta\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 \\
&- c_i(1-\theta)^2\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n}p_j^2\Big]C_\Delta^{\tau_j} + \frac{1}{\Delta}\Big[1 - \Big(1 - c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 \\
&+ 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)C_\Delta^{-\Delta}\Big] - 2c_i(1-\theta) + 2(1-\theta)\sum_{j=1}^{n}|a_{ij}|\alpha_j \\
&+ (1-\theta)\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta + (1-\theta)^2\Delta\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 - 2c_i(1-\theta)^2\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j \\
&- c_i(1-\theta)^2\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n}l_j^2 = 0 . \qquad (31)
\end{aligned}
\]

Choosing the constant C = eη, we obtain 1− C−∆ = 1− e−η∆. Define

\[
\begin{aligned}
\psi_\Delta(\eta) ={}& \Big[\theta^2\Delta\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i\theta^2\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]e^{\eta\tau_j}e^{-\eta\Delta} + \Big[(1-\theta)\sum_{j=1}^{n}|b_{ij}|\beta_j \\
&+ (1-\theta)^2\Delta\Big(\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)^2 - c_i(1-\theta)^2\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n}p_j^2\Big]e^{\eta\tau_j} \\
&+ \frac{1}{\Delta}\Big[1 - \Big(1 - c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 + 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big)e^{-\eta\Delta}\Big] \\
&- 2c_i(1-\theta) + 2(1-\theta)\sum_{j=1}^{n}|a_{ij}|\alpha_j + (1-\theta)\sum_{j=1}^{n}|b_{ij}|\beta_j + c_i^2(1-\theta)^2\Delta + (1-\theta)^2\Delta\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 \\
&- 2c_i(1-\theta)^2\Delta\sum_{j=1}^{n}|a_{ij}|\alpha_j - c_i(1-\theta)^2\Delta\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n}l_j^2 . \qquad (32)
\end{aligned}
\]

Let η*∆ = log(C*∆); then it follows from equation (31) that

\[
\psi_\Delta(\eta_\Delta^*) = 0 . \qquad (33)
\]


Noting that $\lim_{\Delta\to 0}(1 - e^{-\eta\Delta})/\Delta = \eta$, we have

\[
\lim_{\Delta\to 0}\psi_\Delta(\eta) = \Big[(1-\theta)\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n}p_j^2\Big]e^{\eta\tau_j} + \eta - 2c_i(1-\theta) + 2(1-\theta)\sum_{j=1}^{n}|a_{ij}|\alpha_j + (1-\theta)\sum_{j=1}^{n}|b_{ij}|\beta_j + 2\sum_{j=1}^{n}l_j^2 . \qquad (34)
\]

Since γi is the root η of the right-hand side of (34), it follows from equations (33) and (34) that

\[
\lim_{\Delta\to 0}\eta_\Delta^* = \gamma_i ,
\]

which implies that for any positive ε ∈ (0, γ*i/2), where γ*i = max{γi}, there exists a ∆*4 > 0 such that for any ∆ < ∆*4 we have η*∆ > γ*i − 2ε. From equation (30) and the definition of η*∆, we see that

\[
\limsup_{k\to\infty}\Big[1 - c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 + 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]e^{\eta_\Delta^* k\Delta}|Y^k|^2 < \infty ,
\]

which implies

\[
\limsup_{k\to\infty}\Big[1 - c_i^2\theta^2\Delta^2 - \theta^2\Delta^2\Big(\sum_{j=1}^{n}|a_{ij}|\alpha_j\Big)^2 + 2c_i\theta^2\Delta^2\sum_{j=1}^{n}|a_{ij}|\alpha_j + c_i\theta^2\Delta^2\sum_{j=1}^{n}|b_{ij}|\beta_j\Big]e^{\eta_\Delta^* k\Delta}|y_i^k|^2 < \infty .
\]

We therefore obtain that for any ∆ < ∆*4,

\[
\limsup_{k\to\infty}\frac{1}{k\Delta}\log|y_i^k| \le -\frac{\gamma_i^*}{2} + \varepsilon \quad \text{a.s.},
\]

as required.
Remark 3: When θ = 1, the above method reduces to the backward Euler method, and it can be noted that the result coincides with the almost sure exponential stability proved in [9].
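The limiting decay rate γi is characterized as the root in η of the right-hand side of (34). Since that function is increasing in η and, under assumption A2, negative at η = 0, the root can be located by bisection. The following sketch is our own illustration, not part of the paper: `Sa` and `Sb` abbreviate Σj |aij|αj and Σj |bij|βj, and all concrete parameter values supplied to it are assumptions.

```python
import numpy as np

# Our own numerical sketch: locate the root of the limit function in
# equation (34), i.e. the decay rate gamma_i, by bisection.

def psi0(eta, c_i, Sa, Sb, sum_l2, sum_p2, tau, theta):
    """Right-hand side of equation (34) as a function of eta."""
    return (((1.0 - theta) * Sb + 2.0 * sum_p2) * np.exp(eta * tau)
            + eta - 2.0 * c_i * (1.0 - theta)
            + 2.0 * (1.0 - theta) * Sa + (1.0 - theta) * Sb + 2.0 * sum_l2)

def decay_rate(c_i, Sa, Sb, sum_l2, sum_p2, tau, theta, hi=200.0, tol=1e-10):
    """Bisection for the unique positive root; assumes psi0(0) < 0,
    which is what assumption A2 provides."""
    lo = 0.0
    assert psi0(lo, c_i, Sa, Sb, sum_l2, sum_p2, tau, theta) < 0.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if psi0(mid, c_i, Sa, Sb, sum_l2, sum_p2, tau, theta) < 0.0:
            lo = mid                # still below the root
        else:
            hi = mid                # past the root
    return 0.5 * (lo + hi)
```

The almost sure decay seen in the numerical experiments corresponds to the exponent −γ*i/2 + ε of Theorem 4.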

V. NUMERICAL EXAMPLES

Example 1: Consider the following two-dimensional delayed stochastic Hopfield neural network:

\[
d\begin{pmatrix} x_1(t) \\ x_2(t) \end{pmatrix} = C\begin{pmatrix} x_1(t) \\ x_2(t) \end{pmatrix}dt + A\begin{pmatrix} f(x_1(t)) \\ f(x_2(t)) \end{pmatrix}dt + B\begin{pmatrix} g(x_1(t-1)) \\ g(x_2(t-2)) \end{pmatrix}dt + \begin{pmatrix} x_1(t) & x_1(t-1) \\ x_2(t) & x_2(t-2) \end{pmatrix}\sigma\, dW(t), \qquad (35)
\]

on t ≥ 0 with initial segment ξ1(t) = t + 1 for t ∈ [−1, 0] and ξ2(t) = t + 1 for t ∈ [−2, 0]. Let f(x) = x, g(x) = sin x,

\[
C = \begin{pmatrix} -20 & 0 \\ 0 & -20 \end{pmatrix}, \quad A = \begin{pmatrix} 4 & 0 \\ 0 & 3 \end{pmatrix}, \quad B = \begin{pmatrix} -4 & 2 \\ 2 & 1 \end{pmatrix}, \quad \sigma = \begin{pmatrix} 1 \\ -\sqrt{5} \end{pmatrix}.
\]

It is obvious that αj = βj = 1 for j = 1, 2, l1 = l2 = 1 and p1 = p2 = √5. By computation,

\[
2c_i - 2\sum_{j=1}^{n}|a_{ij}|\alpha_j - 2\sum_{j=1}^{n}|b_{ij}|\beta_j - 2\sum_{j=1}^{n}l_j^2 - 2\sum_{j=1}^{n}p_j^2 = 6 > 0, \qquad i = 1,
\]
\[
2c_i - 2\sum_{j=1}^{n}|a_{ij}|\alpha_j - 2\sum_{j=1}^{n}|b_{ij}|\beta_j - 2\sum_{j=1}^{n}l_j^2 - 2\sum_{j=1}^{n}p_j^2 = 14 > 0, \qquad i = 2,
\]

which verifies assumption A2. Therefore, the Euler-Maruyama method is almost surely asymptotically stable and almost surely exponentially stable for step sizes ∆ ∈ (0, 0.05), as illustrated in Figures 1 and 2 respectively. The split-step theta method and the stochastic linear theta method with θ = 0.5, however, are almost surely asymptotically stable and almost surely exponentially stable for all step sizes, as illustrated in Figures 3-6. Figures 7 and 8 illustrate the almost sure asymptotic stability and almost sure exponential stability of the backward Euler-Maruyama method.
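Example 1 can be reproduced with a short simulation. The sketch below is our own, not the paper's code: it applies the Euler-Maruyama scheme (θ = 0) to system (35); the step size, horizon and random seed are illustrative choices, not values fixed by the paper.

```python
import numpy as np

# Our own simulation sketch of Example 1 / equation (35) with the
# Euler-Maruyama scheme (theta = 0).

rng = np.random.default_rng(0)
dt = 0.01                                  # inside the stable range (0, 0.05)
tau = np.array([1.0, 2.0])                 # delays tau_1 = 1, tau_2 = 2
m = np.round(tau / dt).astype(int)         # delay steps m_j = tau_j / dt
N = 2000                                   # number of steps (final time t = 20)

C = np.diag([-20.0, -20.0])
A = np.array([[4.0, 0.0], [0.0, 3.0]])
B = np.array([[-4.0, 2.0], [2.0, 1.0]])
sigma = np.array([1.0, -np.sqrt(5.0)])     # diffusion weights for (x_i, delayed x_i)
f = lambda x: x
g = np.sin

mmax = int(m.max())
y = np.zeros((N + mmax + 1, 2))
t_hist = np.arange(-mmax, 1) * dt
for i in range(2):
    # initial segment xi_i(t) = t + 1 on [-tau_i, 0]; earlier filler is never read
    y[: mmax + 1, i] = np.where(t_hist >= -tau[i] - 1e-9, t_hist + 1.0, 0.0)

for k in range(mmax, mmax + N):
    yk = y[k]
    ydel = np.array([y[k - m[0], 0], y[k - m[1], 1]])   # x_j(t - tau_j)
    drift = C @ yk + A @ f(yk) + B @ g(ydel)
    dW = rng.normal(0.0, np.sqrt(dt))                   # one scalar Brownian increment
    y[k + 1] = yk + drift * dt + (sigma[0] * yk + sigma[1] * ydel) * dW

print(np.abs(y[-1]).max())   # the trajectory decays toward zero, as in Figures 1-2
```

Rerunning with dt = 0.1 reproduces the unstable Euler case of Figure 2, while the theta schemes with θ = 0.5 remain stable at that step size.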

VI. CONCLUSION

In this paper, we have obtained the almost sure asymptotic stability and almost sure exponential stability of split-step theta methods for delayed stochastic Hopfield neural networks under some reasonable assumptions. From the numerical experiments, it is noted that for particular values of θ the split-step theta methods reduce to the Euler method, the split-step backward Euler method and the backward Euler method. The stability of the solution is maintained in each case, which shows the efficiency of the methods. As future work, we are interested in the study of the class of split-step theta methods for neutral stochastic delay differential equations.

REFERENCES

[1] T. Chen, Global exponential stability of delayed Hopfield neural networks, Neural Netw. 14 (2001) 977-980.
[2] X. Ding, Q. Ma, L. Zhang, Convergence and stability of the split-step θ-method for stochastic differential equations, Comput. Math. Appl. 60 (2010) 1310-1321.
[3] D.J. Higham, X. Mao, A.M. Stuart, Strong convergence of Euler-type methods for nonlinear stochastic differential equations, SIAM J. Numer. Anal. 40 (2002) 1041-1063.
[4] C. Huang, P. Chen, Y. He, L. Huang, W. Tan, Almost sure exponential stability of delayed Hopfield neural networks, Appl. Math. Lett. 21 (2008) 701-705.
[5] C. Huang, Mean square stability and dissipativity of two classes of theta methods for systems of stochastic delay differential equations, J. Comput. Appl. Math. 259 (2014) 77-86.
[6] F. Jiang, Y. Shen, Stability in the numerical simulation of stochastic delayed Hopfield neural networks, Neural Comput. Appl. 22 (2013) 1493-1498.
[7] S. Kuang, Y. Peng, F. Deng, W. Gao, Exponential stability and numerical methods of stochastic recurrent neural networks with delays, Abstract and Applied Analysis, Article ID 761237, 11 pages (2013).
[8] X. Li, X. Mao, A note on almost sure asymptotic stability of neutral stochastic delay differential equations with Markovian switching, Automatica 48 (2012) 2329-2334.
[9] L. Liu, Q. Zhu, Almost sure exponential stability of numerical solutions to stochastic delay Hopfield neural networks, Appl. Math. Comput. 266 (2015) 698-712.
[10] L. Liu, F. Deng, Q. Zhu, Mean square stability of two classes of theta methods for numerical computation and simulation of delayed stochastic Hopfield neural networks, J. Comput. Appl. Math. 343 (2018) 428-447.
[11] R. Maa, Y. Xie, S. Zhang, W. Liu, Convergence of discrete delayed Hopfield neural networks, Comput. Math. Appl. 57 (2009) 1869-1876.
[12] X. Mao, Stochastic Differential Equations and Applications, Horwood Publishing, Chichester (1997).
[13] X. Mao, Y. Shen, C. Yuan, Almost surely asymptotic stability of neutral stochastic differential delay equations with Markovian switching, Stochastic Process. Appl. 118 (2008) 1385-1406.
[14] A. Rathinasamy, The split-step theta-methods for stochastic delay Hopfield neural networks, Appl. Math. Comput. 36 (2012) 3477-3485.
[15] A. Rodkina, H. Schurz, Almost sure asymptotic stability of the drift-implicit θ-methods for bilinear ordinary stochastic differential equations in R1, J. Comput. Appl. Math. 180 (2005) 13-31.
[16] L. Ronghua, W.K. Pang, P.K. Leung, Exponential stability of numerical solutions to stochastic delay Hopfield neural networks, Neurocomputing 73 (2010) 920-926.
[17] H. Schurz, Almost sure asymptotic stability and convergence of stochastic theta methods applied to systems of linear stochastic differential equations in Rd, Random Oper. Stoch. Equ. 19 (2011) 111-129.
[18] A.N. Shiryaev, Probability, Springer, Berlin (1996).
[19] L. Wan, J. Sun, Mean square exponential stability of stochastic delayed Hopfield neural networks, Phys. Lett. A 343 (2005) 306-318.
[20] F. Wu, X. Mao, L. Szpruch, Almost sure exponential stability of numerical solutions for stochastic delay differential equations, Numer. Math. 115 (2010) 681-697.
[21] Z. Yu, M. Liu, Almost surely asymptotic stability of numerical solutions for neutral stochastic delay differential equations, Discrete Dyn. Nat. Soc., Article ID 217672, 11 pages (2011).



Fig. 1: Almost surely asymptotic stability of the Euler method with ∆ = 0.01 (Top) and ∆ = 0.1 (Bottom).

[22] Q. Zhou, L. Wan, Exponential stability of stochastic delayed Hopfield neural networks, Appl. Math. Comput. 199 (2008) 84-89.
[23] X. Zong, F. Wu, C. Huang, Preserving exponential mean-square stability and decay rates in two classes of theta approximations of stochastic differential equations, J. Difference Equ. Appl. 20 (2014) 1091-1111.



Fig. 2: Almost surely exponential stability of the Euler method with ∆ = 0.01 (Top), the stable case, and ∆ = 0.1 (Bottom), the unstable case.



Fig. 3: Almost surely asymptotic stability of the split-step theta method for θ = 0.5 with step-size ∆ = 0.05 (Top) and ∆ = 0.1 (Bottom).



Fig. 4: Almost surely asymptotic stability of the stochastic linear theta method for θ = 0.5 with step-size ∆ = 0.05 (Top) and ∆ = 0.1 (Bottom).



Fig. 5: Almost surely exponential stability of the split-step theta method for θ = 0.5 with step-size ∆ = 0.05 (Top) and ∆ = 0.1 (Bottom).



Fig. 6: Almost surely exponential stability of the stochastic linear theta method for θ = 0.5 with step sizes ∆ = 0.05 (Top) and ∆ = 0.1 (Bottom).



Fig. 7: Almost surely asymptotic stability of the backward-Euler method with step size ∆ = 0.1.


Fig. 8: Almost surely exponential stability of the backward-Euler method with step size ∆ = 0.1.
