
Global Journal of Pure and Applied Mathematics. ISSN 0973-1768, Volume 13, Number 2 (2017), pp. 589–616. © Research India Publications. http://www.ripublication.com/gjpam.htm

Existence and stability of neutral–type BAM neural networks with time delay in the neutral and leakage terms on time scales^1

Hui Zhou^2

School of Mathematical Sciences, University of Science and Technology of China, Hefei, 230026, PR China.

School of Mathematics and Statistics, Hefei Normal University, Hefei, 230601, PR China.

Jehad Alzabut

Department of Mathematics and Physical Sciences, Prince Sultan University,

P. O. Box 66833, Riyadh 11586, Saudi Arabia.

Abstract

By employing the theory of exponential dichotomy on time scales, the Banach fixed point theorem and the method of Lyapunov functionals, new sufficient conditions are established to ensure the existence and exponential stability of almost periodic solutions for neutral–type BAM neural networks. Unlike previously considered networks, the addressed system involves time delays in both the neutral derivative and the leakage terms. For numerical treatment, we construct a concrete example to illustrate the effectiveness of the proposed results. Our theorems are essentially new and extend existing results in the literature.

AMS subject classification:
Keywords: Neutral–type BAM neural networks, Almost periodic solutions, Time scales, Leakage delays, Exponential stability.

^1 This work is supported by the National Natural Science Foundation of China (11201109) and the Natural Science Foundation of Anhui Province (KJ2012Z335).

^2 Corresponding author, E-mail: [email protected]



1. Introduction

In [1], Kosko proposed Bidirectional Associative Memories (BAM) as faithful models of neurodynamics. As such, they are able to develop attractors that correspond to desired associations, which makes them noise tolerant and able to perform pattern completion. Over the years, many researchers have found BAM networks undoubtedly important in performing multi–step pattern recognition, learning real–valued correlated patterns, increasing storage capacity and extracting features from given inputs [2–8]. Due to the finite switching speed of neurons and amplifiers, it is well known that time delay has a major and unavoidable impact on the hardware implementation of neural networks. Indeed, time delay often causes instability or oscillation in the processes that neural networks describe. In practical applications, many types of time delay arise, such as discrete delays, time–varying delays, distributed delays and leakage delays (or forgetting delays). The incorporation of delays in BAM networks has been reported by many authors, who studied the dynamic behavior of their solutions by employing several methods and techniques such as fixed point theorems, the theory of set–valued maps, functional differential inclusions, matrix theory, Lyapunov functionals and certain mathematical inequalities [9–19].

The leakage delay in particular, which appears in the negative feedback term of BAM networks, has recently emerged as a research topic of great significance. In [20], Gopalsamy was the first to investigate the stability of BAM neural networks with constant leakage delays. Later on, substantial achievements were made regarding the dynamics of BAM networks with delay in the leakage term [21–25]. On the other hand, it is natural that a BAM network carries certain information about the derivative of the past state. This is expressed via the inclusion of delay in the neutral derivative of the BAM system, which often appears in the study of automatic control, population dynamics and vibrating masses attached to an elastic bar; the reader may consult the papers [26–31] for more details.

The theory of time scales, which has recently received considerable attention, was introduced by Stefan Hilger in his Ph.D. dissertation in 1988. The two main objectives of this theory are unification and extension. Indeed, it allows simultaneous treatment of both differential and difference equations and extends them to so-called dynamic equations [32]. This theory has recently been assimilated into the study of BAM networks [34–43]. To the best of our knowledge, however, progress in this direction is still limited and needs additional investigation. Inspired by the above discussion, we consider the problem of existence and stability for BAM neural networks on time scales. Unlike previously proposed and investigated network models, we precisely carry out our investigations for neutral–type BAM neural networks involving


delays in the neutral derivative and leakage terms on time scales of the form
\[
\begin{cases}
x_i^\nabla(t) = -a_i(t)x_i(t-\alpha_i(t)) + \displaystyle\sum_{j=1}^{m} b_{ij}(t)g_j\big(y_j(t-\tau_j(t))\big) \\
\qquad\quad + \displaystyle\sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(t)p_j\big(y_j^\nabla(t-\tau_j(t))\big)q_l\big(y_l(t-\tau_l(t))\big) + I_i(t), \quad i = 1, 2, \ldots, n, \\[1ex]
y_j^\nabla(t) = -d_j(t)y_j(t-\beta_j(t)) + \displaystyle\sum_{i=1}^{n} c_{ji}(t)f_i\big(x_i(t-\delta_i(t))\big) \\
\qquad\quad + \displaystyle\sum_{i=1}^{n}\sum_{l=1}^{n} s_{jil}(t)u_i\big(x_i^\nabla(t-\delta_i(t))\big)v_l\big(x_l(t-\delta_l(t))\big) + J_j(t), \quad j = 1, 2, \ldots, m.
\end{cases}
\tag{1.1}
\]
To see the significant differences between system (1.1) and those obtained in [39–41], the reader is recommended to check the concluding remark at the end of this paper. In system (1.1), $x_i(t)$ and $y_j(t)$ denote the potential (or voltage) of the cell at time $t$, where $n$ and $m$ correspond to the number of units in the network; $a_i > 0$ and $d_j > 0$ denote, respectively, the rates at which the $i$th and $j$th cells reset their potential to the resting state when isolated from the other cells and inputs; $g_j, p_j, f_i, u_i$ are the activation functions; $q_l$ and $v_l$ are transmission functions used to regulate the rate of the activation functions; $\alpha_i, \beta_j, \tau_j$ and $\delta_i$ are nonnegative delays corresponding to the finite speed of axonal signal transmission, satisfying $t-\alpha_i(t)$, $t-\beta_j(t)$, $t-\tau_j(t)$, $t-\delta_i(t) \in \mathbb{T}$; $b_{ij}, c_{ji}$ and $e_{ijl}, s_{jil}$ are the first– and second–order connection weights of the neural network, respectively; and $I_i$ and $J_j$ denote the $i$th and $j$th components of the external input.

The main features of this paper that make it distinctive and innovative are as follows: the general structure of system (1.1), which covers many particular cases; the incorporation of regulator functions, which helps measure the rate of the activation functions; the insertion of delays in the neutral derivative and leakage terms, which provides a better description of the dynamics of neural reactions; and the establishment of a unified framework that handles both the continuous and discrete BAM cases through time scales calculus.
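As a concrete illustration of this unification (our remark, not stated in this form in the paper), the two classical special cases of system (1.1) are recovered by choosing the time scale: for $\mathbb{T} = \mathbb{R}$ the nabla derivative is the ordinary derivative, while for $\mathbb{T} = \mathbb{Z}$ it is the backward difference $f(t) - f(t-1)$. For instance, the first equation of (1.1) becomes

```latex
% T = R: a neutral functional differential equation
x_i'(t) = -a_i(t)\,x_i(t-\alpha_i(t)) + \sum_{j=1}^{m} b_{ij}(t)\,g_j(y_j(t-\tau_j(t)))
        + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(t)\,p_j(y_j'(t-\tau_j(t)))\,q_l(y_l(t-\tau_l(t))) + I_i(t),

% T = Z: the corresponding neutral difference equation, with x^\nabla(t) = x(t)-x(t-1)
x_i(t) - x_i(t-1) = -a_i(t)\,x_i(t-\alpha_i(t)) + \sum_{j=1}^{m} b_{ij}(t)\,g_j(y_j(t-\tau_j(t)))
        + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(t)\,p_j\big(y_j(t-\tau_j(t)) - y_j(t-\tau_j(t)-1)\big)\,q_l(y_l(t-\tau_l(t))) + I_i(t).
```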

Let $\mathbb{R} = (-\infty, +\infty)$, $\mathbb{R}^+ = (0, +\infty)$, and let $\mathbb{T}$ be an almost periodic time scale. The definitions and properties of $\mathbb{T}$ and of almost periodic functions on time scales are stated in Section 2. Denote $h^+ = \sup_{t\in\mathbb{T}} |h(t)|$ and $h^- = \inf_{t\in\mathbb{T}} |h(t)|$, where $h : \mathbb{T} \to \mathbb{R}$ is an almost periodic function.

Set
\[
\mathbb{X} = \big\{\phi = (\varphi_1, \ldots, \varphi_n, \psi_1, \ldots, \psi_m)^T : \varphi_i, \psi_j \in C^1(\mathbb{T}, \mathbb{R}) \text{ are almost periodic on } \mathbb{T},\; i = 1, \ldots, n,\; j = 1, \ldots, m\big\}
\]
with the norm $\|\phi\| = \max\{|\varphi|_1, |\psi|_1\}$, where $|\varphi|_1 = \max\{|\varphi|_0, |\varphi^\nabla|_0\}$, $|\psi|_1 = \max\{|\psi|_0, |\psi^\nabla|_0\}$, $|\varphi|_0 = \max_{1\le i\le n} \varphi_i^+$, $|\varphi^\nabla|_0 = \max_{1\le i\le n} (\varphi_i^\nabla)^+$, $|\psi|_0 = \max_{1\le j\le m} \psi_j^+$, $|\psi^\nabla|_0 = \max_{1\le j\le m} (\psi_j^\nabla)^+$, and $C^1(\mathbb{T}, \mathbb{R})$ is the set of continuous functions with continuous nabla derivatives on $\mathbb{T}$. Clearly $\mathbb{X}$ is a


Banach space. The initial values associated with system (1.1) are given by
\[
x_i(s) = \varphi_i(s), \quad y_j(s) = \psi_j(s), \quad s \in [-\sigma, 0]_{\mathbb{T}} = \{t : t \in [-\sigma, 0] \cap \mathbb{T}\}, \tag{1.2}
\]
where $\varphi_i, \psi_j \in C^1([-\sigma, 0]_{\mathbb{T}}, \mathbb{R})$ and $\sigma = \max_{1\le i\le n,\, 1\le j\le m}\{\tau_j^+, \delta_i^+\}$. Throughout the remaining part of the paper, the following conditions are assumed to hold:

H.1 $b_{ij}, e_{ijl}, I_i, c_{ji}, s_{jil}, J_j \in C(\mathbb{T}, \mathbb{R})$ are all almost periodic functions for $i = 1, \ldots, n$, $j = 1, \ldots, m$, $l = 1, \ldots, m$.

H.2 $a_i, \alpha_i, \delta_i, d_j, \beta_j, \tau_j \in C(\mathbb{T}, \mathbb{R}^+)$ are all almost periodic functions for $i = 1, \ldots, n$, $j = 1, \ldots, m$.

H.3 $f_i, g_j \in C(\mathbb{R}, \mathbb{R})$ and there exist positive constants $L^f_i, L^g_j, L^p_j, L^q_l, L^u_i, L^v_l, P_j, Q_l, U_i$ and $V_l$ such that
\[
|f_i(s_1) - f_i(s_2)| \le L^f_i |s_1 - s_2|, \qquad |g_j(s_1) - g_j(s_2)| \le L^g_j |s_1 - s_2|,
\]
\[
|p_j(s_1) - p_j(s_2)| \le L^p_j |s_1 - s_2|, \qquad |q_l(s_1) - q_l(s_2)| \le L^q_l |s_1 - s_2|,
\]
\[
|u_i(s_1) - u_i(s_2)| \le L^u_i |s_1 - s_2|, \qquad |v_l(s_1) - v_l(s_2)| \le L^v_l |s_1 - s_2|,
\]
and
\[
|p_j(s)| \le P_j, \qquad |q_l(s)| \le Q_l, \qquad |u_i(s)| \le U_i, \qquad |v_l(s)| \le V_l,
\]
for all $s_1, s_2, s \in \mathbb{R}$ and all admissible indices $i, j, l$.

It is obvious that condition (H.3) is of Lipschitz type; it will be used to ensure the existence and uniqueness of solutions for system (1.1).
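For instance (an illustrative choice of ours, not taken from the paper), condition (H.3) is satisfied by the standard choices

```latex
g_j(x) = f_i(x) = \tanh x \quad (L^g_j = L^f_i = 1), \qquad
p_j(x) = q_l(x) = u_i(x) = v_l(x) = \tfrac{1}{2}\big(|x+1| - |x-1|\big),
```

the latter piecewise-linear function being Lipschitz with constant $1$ and bounded by $1$, so one may take $L^p_j = L^q_l = L^u_i = L^v_l = 1$ and $P_j = Q_l = U_i = V_l = 1$.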

Our approach is based on the Banach fixed point theorem and the method of Lyapunov functionals to establish new sufficient conditions for the existence and exponential stability of almost periodic solutions of system (1.1). We employ the theory of exponential dichotomy on time scales and apply certain inequality techniques to prove the main results.

The contents of the paper are outlined as follows. Section 2 is devoted to basic notations, lemmas and definitions on $\mathbb{T}$ that are used in the sequel. Section 3 provides the existence and uniqueness of the almost periodic solution of system (1.1). The exponential stability of this almost periodic solution is proved in Section 4. In Section 5, the proposed results are verified by a numerical example. We finish the paper with a concluding remark.


2. Preliminaries

For the sake of convenience, we recall some basic notations, definitions and lemmas on time scales that are needed to prove the main results.

Let $\mathbb{T}$ be a time scale, i.e., a closed subset of $\mathbb{R}$. For $t \in \mathbb{T}$, the forward and backward jump operators $\sigma, \rho : \mathbb{T} \to \mathbb{T}$ are defined, respectively, by
\[
\sigma(t) = \inf\{s \in \mathbb{T} : s > t\} \quad \text{and} \quad \rho(t) = \sup\{s \in \mathbb{T} : s < t\}.
\]
The point $t \in \mathbb{T}$ is called left–dense if $t > \inf \mathbb{T}$ and $\rho(t) = t$, left–scattered if $\rho(t) < t$, right–dense if $t < \sup \mathbb{T}$ and $\sigma(t) = t$, and right–scattered if $\sigma(t) > t$. If $\mathbb{T}$ has a right–scattered minimum $m$, set $\mathbb{T}_k := \mathbb{T} \setminus \{m\}$; otherwise $\mathbb{T}_k := \mathbb{T}$. The backward graininess $\nu : \mathbb{T}_k \to [0, +\infty)$ is defined by $\nu(t) = t - \rho(t)$.

A function $f : \mathbb{T} \to \mathbb{R}$ is called left–dense continuous (ld-continuous) provided it is continuous at left–dense points of $\mathbb{T}$ and its right–sided limits exist at right–dense points of $\mathbb{T}$.

Definition 2.1. [32] Let $f : \mathbb{T} \to \mathbb{R}$ be a function and $t \in \mathbb{T}_k$. Define $f^\nabla(t)$ to be the number (provided it exists) with the property that for any $\varepsilon > 0$ there exists a neighborhood $U$ of $t$ such that
\[
\big|f(\rho(t)) - f(s) - f^\nabla(t)(\rho(t) - s)\big| \le \varepsilon |\rho(t) - s|
\]
for all $s \in U$; $f^\nabla$ is called the nabla derivative of $f$ at $t$.

If $f$ is ld-continuous, then there exists a function $F$ such that $F^\nabla(t) = f(t)$, and we define
\[
\int_a^b f(t)\,\nabla t = F(b) - F(a).
\]
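To make these notions concrete, the following sketch (our illustration; the function names are ours) checks numerically that on the time scale $\mathbb{T} = \mathbb{Z}$, where $\rho(t) = t - 1$ and $\nu(t) \equiv 1$, the nabla derivative is the backward difference and the nabla exponential of a constant $p < 1$ is $e_p(t, s) = (1-p)^{-(t-s)}$.

```python
def nabla_derivative(f, t):
    # On T = Z, rho(t) = t - 1 and nu(t) = 1, so the nabla derivative
    # f^nabla(t) reduces to the backward difference f(t) - f(t - 1).
    return f(t) - f(t - 1)

def nabla_exp(p, t, s):
    # Nabla exponential e_p(t, s) on T = Z for a constant p < 1:
    # xi_1(p) = -log(1 - p), hence e_p(t, s) = (1 - p)^(-(t - s)).
    return (1.0 - p) ** (-(t - s))

# e_p(., s) solves the dynamic equation y^nabla(t) = p * y(t) with y(s) = 1:
p = 0.25
y = lambda t: nabla_exp(p, t, 0)
assert y(0) == 1.0
for t in range(1, 6):
    assert abs(nabla_derivative(y, t) - p * y(t)) < 1e-12
```

The defining property $y^\nabla = p\,y$ holds exactly here: $y(t) - y(t-1) = (1-p)^{-t}\big(1 - (1-p)\big) = p\,y(t)$.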

A function $p : \mathbb{T} \to \mathbb{R}$ is $\nu$–regressive if $1 - \nu(t)p(t) \ne 0$ for all $t \in \mathbb{T}_k$. Denote
\[
\mathcal{R}_\nu = \{p : \mathbb{T} \to \mathbb{R} \mid p \text{ is ld-continuous and } \nu\text{–regressive}\}
\]
and $\mathcal{R}_\nu^+ = \{p \in \mathcal{R}_\nu : 1 - \nu(t)p(t) > 0 \text{ for all } t \in \mathbb{T}\}$.

The nabla exponential function is defined by
\[
e_p(t, s) = \exp\Big\{\int_s^t \xi_{\nu(\tau)}(p(\tau))\,\nabla\tau\Big\}, \quad s, t \in \mathbb{T}, \; p \in \mathcal{R}_\nu,
\]
where the $\nu$–cylinder transformation is given by
\[
\xi_h(z) =
\begin{cases}
-\dfrac{\log(1 - hz)}{h}, & h \ne 0, \\[1ex]
z, & h = 0.
\end{cases}
\]

Definition 2.2. [32, 33] If $p, q \in \mathcal{R}_\nu$, then the circle plus addition is defined by $(p \oplus_\nu q)(t) := p(t) + q(t) - p(t)q(t)\nu(t)$ for all $t \in \mathbb{T}_k$. For $p \in \mathcal{R}_\nu$, we define the circle minus of $p$ by
\[
\ominus_\nu p := -\frac{p}{1 - \nu p}.
\]

Lemma 2.3. [32, 33] Let $p, q \in \mathcal{R}_\nu$ and $s, t, r \in \mathbb{T}$. Then


I. $e_0(t, s) \equiv 1$ and $e_p(t, t) \equiv 1$;
II. $e_p(\rho(t), s) = (1 - \nu(t)p(t))e_p(t, s)$;
III. $e_p(t, s) = 1/e_p(s, t)$;
IV. $e_p(t, r)e_p(r, s) = e_p(t, s)$;
V. $(e_p(t, s))^\nabla = p(t)e_p(t, s)$.

Lemma 2.4. [33] Let $f, g$ be nabla differentiable functions on $\mathbb{T}$. Then

I. $(v_1 f + v_2 g)^\nabla = v_1 f^\nabla + v_2 g^\nabla$ for any constants $v_1, v_2$;
II. $(fg)^\nabla(t) = f^\nabla(t)g(t) + f(\rho(t))g^\nabla(t) = f(t)g^\nabla(t) + f^\nabla(t)g(\rho(t))$;
III. If $f$ and $f^\nabla$ are continuous, then
\[
\Big(\int_a^t f(t, s)\,\nabla s\Big)^\nabla = f(\rho(t), t) + \int_a^t f^\nabla(t, s)\,\nabla s.
\]

Definition 2.5. [33] A time scale $\mathbb{T}$ is called an almost periodic time scale if
\[
\Pi := \{\tau \in \mathbb{R} : t \pm \tau \in \mathbb{T}, \; \forall t \in \mathbb{T}\} \ne \{0\}.
\]

Definition 2.6. [33] Let $\mathbb{T}$ be an almost periodic time scale. A function $f \in C(\mathbb{T}, \mathbb{R})$ is called almost periodic if the $\varepsilon$–translation set of $f$,
\[
E\{\varepsilon, f\} = \big\{\tau \in \Pi : |f(t + \tau) - f(t)| < \varepsilon, \; \forall t \in \mathbb{T}\big\},
\]
is a relatively dense subset of $\mathbb{T}$ for all $\varepsilon > 0$; that is, for any given $\varepsilon > 0$ there exists a constant $l(\varepsilon) > 0$ such that each interval of length $l(\varepsilon)$ contains a $\tau(\varepsilon) \in E\{\varepsilon, f\}$ with
\[
|f(t + \tau) - f(t)| < \varepsilon, \quad \forall t \in \mathbb{T}.
\]
Here $\tau$ is called the $\varepsilon$–translation number of $f$ and $l(\varepsilon)$ the inclusion length of $E\{\varepsilon, f\}$.

Definition 2.7. [32, 34] Let $A(t)$ be an $n \times n$ matrix–valued function on $\mathbb{T}$. The linear system
\[
x^\nabla(t) = A(t)x(t), \quad t \in \mathbb{T}, \tag{2.1}
\]
is said to admit an exponential dichotomy on $\mathbb{T}$ if there exist positive constants $k_i, \alpha_i$ ($i = 1, 2$), a projection $P$ and a fundamental solution matrix $X(t)$ of Eq. (2.1) satisfying
\[
\|X(t)PX^{-1}(s)\| \le k_1 e_{\ominus_\nu \alpha_1}(t, s), \quad s, t \in \mathbb{T}, \; t \ge s,
\]
\[
\|X(t)(I - P)X^{-1}(s)\| \le k_2 e_{\ominus_\nu \alpha_2}(s, t), \quad s, t \in \mathbb{T}, \; t \le s.
\]


Lemma 2.8. [34] If the linear system (2.1) admits an exponential dichotomy, then the almost periodic system
\[
x^\nabla(t) = A(t)x(t) + g(t), \quad t \in \mathbb{T}, \tag{2.2}
\]
has a unique almost periodic solution $x(t)$, given by
\[
x(t) = \int_{-\infty}^{t} X(t)PX^{-1}(\rho(s))g(s)\,\nabla s - \int_{t}^{+\infty} X(t)(I - P)X^{-1}(\rho(s))g(s)\,\nabla s,
\]
where $X(t)$ is the fundamental solution matrix of Eq. (2.1).
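In the scalar case this formula can be checked numerically. The sketch below (our illustration; the function names and truncation depth are assumptions of ours) takes $\mathbb{T} = \mathbb{Z}$ and $A(t) \equiv -c$ with $c > 0$, so that $P = I$, the second integral vanishes, and $X(t)X^{-1}(\rho(s)) = e_{-c}(t, s-1) = (1+c)^{-(t-s+1)}$; it verifies that the resulting sum satisfies the nabla equation $x(t) - x(t-1) = -c\,x(t) + g(t)$.

```python
import math

def almost_periodic_solution(g, t, c, depth=200):
    # Scalar Lemma 2.8 on T = Z with A(t) = -c, c > 0, P = I:
    # x(t) = sum_{s <= t} e_{-c}(t, rho(s)) g(s),
    # where e_{-c}(t, s - 1) = (1 + c)^(-(t - s + 1)).
    # The infinite tail is truncated after `depth` terms (numerical approximation).
    return sum((1.0 + c) ** (-(t - s + 1)) * g(s) for s in range(t - depth, t + 1))

c = 0.5
g = lambda s: 1.0 + 0.5 * math.sin(s)   # a bounded, almost periodic forcing term
x = lambda t: almost_periodic_solution(g, t, c)

# x satisfies x^nabla(t) = -c x(t) + g(t), i.e. x(t) - x(t-1) = -c x(t) + g(t):
for t in range(3):
    assert abs((x(t) - x(t - 1)) - (-c * x(t) + g(t))) < 1e-9
```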

Lemma 2.9. [33, 34] Assume that $c_i(t) \in \mathcal{R}_\nu^+$ ($i = 1, 2, \ldots, n$) are almost periodic on $\mathbb{T}$ and $\min_{1\le i\le n}\{\inf_{t\in\mathbb{T}} c_i(t)\} = \tilde{m} > 0$. Then the linear system
\[
x^\nabla(t) = \mathrm{diag}\big(-c_1(t), -c_2(t), \ldots, -c_n(t)\big)x(t)
\]
admits an exponential dichotomy on $\mathbb{T}$.

Definition 2.10. [33, 34] Let $z^*(t) = (x_1^*(t), \ldots, x_n^*(t), y_1^*(t), \ldots, y_m^*(t))^T$ be a continuously differentiable almost periodic solution of system (1.1) with initial value $\phi^*(s) = (\varphi_1^*(s), \ldots, \varphi_n^*(s), \psi_1^*(s), \ldots, \psi_m^*(s))^T$. If there exists a constant $\lambda > 0$ such that every solution $z(t) = (x_1(t), \ldots, x_n(t), y_1(t), \ldots, y_m(t))^T$ of system (1.1) with initial value $\phi(s) = (\varphi_1(s), \ldots, \varphi_n(s), \psi_1(s), \ldots, \psi_m(s))^T$ satisfies
\[
x_i(t) - x_i^*(t) = O\big(e_{\ominus_\nu\lambda}(t, 0)\big), \quad y_j(t) - y_j^*(t) = O\big(e_{\ominus_\nu\lambda}(t, 0)\big), \quad i = 1, \ldots, n, \; j = 1, \ldots, m,
\]
then the solution $z^*(t)$ is said to be exponentially stable.
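To see why $e_{\ominus_\nu\lambda}(t, 0)$ is an exponential decay rate (our remark, not stated in the paper), note that on $\mathbb{T} = \mathbb{Z}$, where $\nu \equiv 1$, a constant $\lambda \in (0, 1)$ gives

```latex
\ominus_\nu \lambda = -\frac{\lambda}{1 - \lambda},
\qquad
e_{\ominus_\nu \lambda}(t, 0)
  = \bigl(1 - (\ominus_\nu \lambda)\bigr)^{-t}
  = \Bigl(\frac{1}{1 - \lambda}\Bigr)^{-t}
  = (1 - \lambda)^{t} \longrightarrow 0 \quad (t \to \infty),
```

so Definition 2.10 reduces to the usual geometric convergence of discrete-time systems, while on $\mathbb{T} = \mathbb{R}$ one has $e_{\ominus_\nu\lambda}(t, 0) = e^{-\lambda t}$.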

3. The main results

This section is devoted to the main results of this paper, in which we establish sufficient conditions for the existence and exponential stability of almost periodic solutions of system (1.1). As expected, the intricate structure of system (1.1) leads to quite lengthy computations. The analysis is presented in two separate subsections.

3.1. Existence of almost periodic solutions

Define $\phi^0(t) = \big(\varphi_1^0(t), \ldots, \varphi_n^0(t), \psi_1^0(t), \ldots, \psi_m^0(t)\big)^T$, where
\[
\varphi_i^0(t) = \int_{-\infty}^{t} e_{-a_i}(t, \rho(s))I_i(s)\,\nabla s, \quad i = 1, 2, \ldots, n,
\]
and
\[
\psi_j^0(t) = \int_{-\infty}^{t} e_{-d_j}(t, \rho(s))J_j(s)\,\nabla s, \quad j = 1, 2, \ldots, m.
\]

Let $L$ be a constant satisfying
\[
\max\Big\{\|\phi^0\|, \max_{1\le j\le m}|g_j(0)|, \max_{1\le i\le n}|f_i(0)|, \max_{1\le j\le m}|p_j(0)|, \max_{1\le j\le m}|q_j(0)|, \max_{1\le i\le n}|u_i(0)|, \max_{1\le i\le n}|v_i(0)|\Big\} \le L,
\]
and assume the following conditions:

H.4 There exists a constant $\rho > 0$ such that $\chi_k(t) \ge \rho$ for $t \in \mathbb{T}$, $k = 1, 2, \ldots, n + m$, where
\[
\chi_k(t) =
\begin{cases}
a_i(t), & k = i, \; i = 1, 2, \ldots, n, \\
d_j(t), & k = n + j, \; j = 1, 2, \ldots, m.
\end{cases}
\]

H.5 For $i = 1, \ldots, n$, $j = 1, \ldots, m$,
\[
\max\Big\{\max_{1\le i\le n}\Big\{\Theta_i, \Big(1 + \frac{a_i^+}{a_i^-}\Big)\Theta_i\Big\}, \max_{1\le j\le m}\Big\{\Xi_j, \Big(1 + \frac{d_j^+}{d_j^-}\Big)\Xi_j\Big\}\Big\} \le 1,
\]
and
\[
\max\Big\{\max_{1\le i\le n}\{\kappa_{i1}, \kappa_{i2}\}, \max_{1\le j\le m}\{\varepsilon_{j1}, \varepsilon_{j2}\}\Big\} < 1,
\]
where
\[
\Theta_i = 2a_i^+\alpha_i^+ + 2\sum_{j=1}^{m} b_{ij}^+ L^g_j + \sum_{j=1}^{m} b_{ij}^+ + L\sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}^+(2L^p_j + 1)(2L^q_l + 1),
\]
\[
\Xi_j = 2d_j^+\beta_j^+ + 2\sum_{i=1}^{n} c_{ji}^+ L^f_i + \sum_{i=1}^{n} c_{ji}^+ + L\sum_{i=1}^{n}\sum_{l=1}^{n} s_{jil}^+(2L^u_i + 1)(2L^v_l + 1),
\]
\[
\kappa_{i1} = \frac{1}{a_i^-}\Big[a_i^+\alpha_i^+ + \sum_{j=1}^{m} b_{ij}^+ L^g_j + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}^+(Q_l L^p_j + P_j L^q_l)\Big],
\]
\[
\kappa_{i2} = \Big(1 + \frac{a_i^+}{a_i^-}\Big)\Big[a_i^+\alpha_i^+ + \sum_{j=1}^{m} b_{ij}^+ L^g_j + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}^+(Q_l L^p_j + P_j L^q_l)\Big],
\]
\[
\varepsilon_{j1} = \frac{1}{d_j^-}\Big[d_j^+\beta_j^+ + \sum_{i=1}^{n} c_{ji}^+ L^f_i + \sum_{i=1}^{n}\sum_{l=1}^{n} s_{jil}^+(V_l L^u_i + U_i L^v_l)\Big],
\]
and
\[
\varepsilon_{j2} = \Big(1 + \frac{d_j^+}{d_j^-}\Big)\Big[d_j^+\beta_j^+ + \sum_{i=1}^{n} c_{ji}^+ L^f_i + \sum_{i=1}^{n}\sum_{l=1}^{n} s_{jil}^+(V_l L^u_i + U_i L^v_l)\Big].
\]


Clearly, condition (H.5) is used to dominate certain quantities in the inequalities appearing in the proofs of the main results.
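As a quick numerical illustration (the parameter values below are hypothetical choices of ours, not the paper's example from Section 5), one can verify condition (H.5) for a scalar network ($n = m = 1$); here only the $x$–side quantities $\Theta_1, \kappa_{11}, \kappa_{12}$ are checked, and the $y$–side quantities $\Xi_1, \varepsilon_{11}, \varepsilon_{12}$ follow in the same way.

```python
# Numerical sanity check of condition (H.5) for a hypothetical scalar network
# (n = m = 1), x-side only. All parameter values are illustrative assumptions.
a_plus, a_minus, alpha_plus = 2.0, 2.0, 0.02   # a_1^+, a_1^-, alpha_1^+
b_plus, Lg = 0.1, 0.5                          # b_11^+, L^g_1
e_plus, Lp, Lq, P, Q = 0.05, 0.5, 0.5, 0.5, 0.5
L = 1.0

# Theta_1 from (H.5)
Theta = (2 * a_plus * alpha_plus + 2 * b_plus * Lg + b_plus
         + L * e_plus * (2 * Lp + 1) * (2 * Lq + 1))
# common bracket of kappa_{11} and kappa_{12}
bracket = a_plus * alpha_plus + b_plus * Lg + e_plus * (Q * Lp + P * Lq)
kappa1 = bracket / a_minus
kappa2 = (1 + a_plus / a_minus) * bracket

assert max(Theta, (1 + a_plus / a_minus) * Theta) <= 1   # first part of (H.5)
assert max(kappa1, kappa2) < 1                           # second part of (H.5)
```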

Theorem 3.1. Suppose that assumptions H.1–H.5 are satisfied. Then system (1.1) has a unique almost periodic solution in
\[
\mathbb{X}_0 = \{\phi \in \mathbb{X} : \|\phi - \phi^0\| \le L\},
\]
where $\phi(t) = (\varphi_1(t), \ldots, \varphi_n(t), \psi_1(t), \ldots, \psi_m(t))^T$.

Proof. First, we rewrite system (1.1) in the form
\[
\begin{cases}
x_i^\nabla(t) = -a_i(t)x_i(t) + a_i(t)\displaystyle\int_{t-\alpha_i(t)}^{t} x_i^\nabla(s)\,\nabla s + \sum_{j=1}^{m} b_{ij}(t)g_j(y_j(t-\tau_j(t))) \\
\qquad\quad + \displaystyle\sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(t)p_j(y_j^\nabla(t-\tau_j(t)))q_l(y_l(t-\tau_l(t))) + I_i(t), \quad i = 1, \ldots, n, \\[1ex]
y_j^\nabla(t) = -d_j(t)y_j(t) + d_j(t)\displaystyle\int_{t-\beta_j(t)}^{t} y_j^\nabla(s)\,\nabla s + \sum_{i=1}^{n} c_{ji}(t)f_i(x_i(t-\delta_i(t))) \\
\qquad\quad + \displaystyle\sum_{i=1}^{n}\sum_{l=1}^{n} s_{jil}(t)u_i(x_i^\nabla(t-\delta_i(t)))v_l(x_l(t-\delta_l(t))) + J_j(t), \quad j = 1, \ldots, m.
\end{cases}
\tag{3.3}
\]
For any given $\phi \in \mathbb{X}$, we consider the following auxiliary almost periodic system on the time scale:
\[
\begin{cases}
x_i^\nabla(t) = -a_i(t)x_i(t) + F_i(t, \varphi, \psi) + I_i(t), & i = 1, \ldots, n, \\
y_j^\nabla(t) = -d_j(t)y_j(t) + G_j(t, \varphi, \psi) + J_j(t), & j = 1, \ldots, m,
\end{cases}
\tag{3.4}
\]
where
\[
\begin{cases}
F_i(t, \varphi, \psi) = a_i(t)\displaystyle\int_{t-\alpha_i(t)}^{t} \varphi_i^\nabla(s)\,\nabla s + \sum_{j=1}^{m} b_{ij}(t)g_j(\psi_j(t-\tau_j(t))) \\
\qquad\qquad\quad + \displaystyle\sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(t)p_j(\psi_j^\nabla(t-\tau_j(t)))q_l(\psi_l(t-\tau_l(t))), \\[1ex]
G_j(t, \varphi, \psi) = d_j(t)\displaystyle\int_{t-\beta_j(t)}^{t} \psi_j^\nabla(s)\,\nabla s + \sum_{i=1}^{n} c_{ji}(t)f_i(\varphi_i(t-\delta_i(t))) \\
\qquad\qquad\quad + \displaystyle\sum_{i=1}^{n}\sum_{l=1}^{n} s_{jil}(t)u_i(\varphi_i^\nabla(t-\delta_i(t)))v_l(\varphi_l(t-\delta_l(t))).
\end{cases}
\tag{3.5}
\]

From (H.4) and Lemma 2.9, it follows that the linear system
\[
\begin{cases}
x_i^\nabla(t) = -a_i(t)x_i(t), & i = 1, \ldots, n, \\
y_j^\nabla(t) = -d_j(t)y_j(t), & j = 1, \ldots, m,
\end{cases}
\]
admits an exponential dichotomy on $\mathbb{T}$. Therefore, by Lemma 2.8, system (3.4) has a unique almost periodic solution, which takes the form
\[
\begin{cases}
x_i^\varphi(t) = \displaystyle\int_{-\infty}^{t} e_{-a_i}(t, \rho(s))\big[F_i(s, \varphi, \psi) + I_i(s)\big]\,\nabla s, & i = 1, \ldots, n, \\[1ex]
y_j^\psi(t) = \displaystyle\int_{-\infty}^{t} e_{-d_j}(t, \rho(s))\big[G_j(s, \varphi, \psi) + J_j(s)\big]\,\nabla s, & j = 1, \ldots, m.
\end{cases}
\tag{3.6}
\]
For $\phi \in \mathbb{X}_0$, we have $\|\phi\| \le \|\phi - \phi^0\| + \|\phi^0\| \le 2L$. Define the nonlinear operator $\Phi$ by
\[
\Phi : \mathbb{X}_0 \to \mathbb{X}_0, \quad (\varphi_1, \ldots, \varphi_n, \psi_1, \ldots, \psi_m)^T \mapsto (x_1^\varphi, \ldots, x_n^\varphi, y_1^\psi, \ldots, y_m^\psi)^T, \tag{3.7}
\]
where $x_i^\varphi$ ($i = 1, \ldots, n$) and $y_j^\psi$ ($j = 1, \ldots, m$) are defined by (3.6). In what follows, we show that $\Phi$ is a contraction mapping.

First, we prove that $\Phi\phi \in \mathbb{X}_0$ for any $\phi \in \mathbb{X}_0$. It follows from (3.5), condition (H.3) and $\|\phi\| \le 2L$ that
\begin{align*}
\big|F_i(t, \varphi, \psi)\big|
&= \Big|a_i(t)\int_{t-\alpha_i(t)}^{t} \varphi_i^\nabla(s)\,\nabla s + \sum_{j=1}^{m} b_{ij}(t)g_j(\psi_j(t-\tau_j(t))) \\
&\qquad + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(t)p_j(\psi_j^\nabla(t-\tau_j(t)))q_l(\psi_l(t-\tau_l(t)))\Big| \\
&\le a_i(t)\int_{t-\alpha_i(t)}^{t} |\varphi_i^\nabla(u)|\,\nabla u + \sum_{j=1}^{m} b_{ij}(t)\big(|g_j(\psi_j(t-\tau_j(t))) - g_j(0)| + |g_j(0)|\big) \\
&\qquad + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(t)\big(|p_j(\psi_j^\nabla(t-\tau_j(t))) - p_j(0)| + |p_j(0)|\big)\big(|q_l(\psi_l(t-\tau_l(t))) - q_l(0)| + |q_l(0)|\big) \\
&\le a_i^+\alpha_i^+|\varphi^\nabla|_0 + \sum_{j=1}^{m} b_{ij}^+ L^g_j|\psi|_0 + \sum_{j=1}^{m} b_{ij}^+|g_j(0)| + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}^+\big(L^p_j|\psi^\nabla|_0 + |p_j(0)|\big)\big(L^q_l|\psi|_0 + |q_l(0)|\big) \\
&\le 2a_i^+\alpha_i^+ L + 2\sum_{j=1}^{m} b_{ij}^+ L^g_j L + \sum_{j=1}^{m} b_{ij}^+ L + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}^+(2L^p_j L + L)(2L^q_l L + L) \\
&\le \Big[2a_i^+\alpha_i^+ + 2\sum_{j=1}^{m} b_{ij}^+ L^g_j + \sum_{j=1}^{m} b_{ij}^+ + L\sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}^+(2L^p_j + 1)(2L^q_l + 1)\Big]L \\
&:= \Theta_i L, \quad i = 1, 2, \ldots, n, \tag{3.8}
\end{align*}
and, similarly,
\begin{align*}
\big|G_j(t, \varphi, \psi)\big|
&= \Big|d_j(t)\int_{t-\beta_j(t)}^{t} \psi_j^\nabla(s)\,\nabla s + \sum_{i=1}^{n} c_{ji}(t)f_i(\varphi_i(t-\delta_i(t))) \\
&\qquad + \sum_{i=1}^{n}\sum_{l=1}^{n} s_{jil}(t)u_i(\varphi_i^\nabla(t-\delta_i(t)))v_l(\varphi_l(t-\delta_l(t)))\Big| \\
&\le d_j(t)\int_{t-\beta_j(t)}^{t} |\psi_j^\nabla(u)|\,\nabla u + \sum_{i=1}^{n} c_{ji}(t)\big(|f_i(\varphi_i(t-\delta_i(t))) - f_i(0)| + |f_i(0)|\big) \\
&\qquad + \sum_{i=1}^{n}\sum_{l=1}^{n} s_{jil}(t)\big(|u_i(\varphi_i^\nabla(t-\delta_i(t))) - u_i(0)| + |u_i(0)|\big)\big(|v_l(\varphi_l(t-\delta_l(t))) - v_l(0)| + |v_l(0)|\big) \\
&\le d_j^+\beta_j^+|\psi^\nabla|_0 + \sum_{i=1}^{n} c_{ji}^+ L^f_i|\varphi|_0 + \sum_{i=1}^{n} c_{ji}^+|f_i(0)| + \sum_{i=1}^{n}\sum_{l=1}^{n} s_{jil}^+\big(L^u_i|\varphi^\nabla|_0 + |u_i(0)|\big)\big(L^v_l|\varphi|_0 + |v_l(0)|\big) \\
&\le 2d_j^+\beta_j^+ L + 2\sum_{i=1}^{n} c_{ji}^+ L^f_i L + \sum_{i=1}^{n} c_{ji}^+ L + \sum_{i=1}^{n}\sum_{l=1}^{n} s_{jil}^+(2L^u_i L + L)(2L^v_l L + L) \\
&\le \Big[2d_j^+\beta_j^+ + 2\sum_{i=1}^{n} c_{ji}^+ L^f_i + \sum_{i=1}^{n} c_{ji}^+ + L\sum_{i=1}^{n}\sum_{l=1}^{n} s_{jil}^+(2L^u_i + 1)(2L^v_l + 1)\Big]L \\
&:= \Xi_j L, \quad j = 1, 2, \ldots, m, \tag{3.9}
\end{align*}

where $\Theta_i$ and $\Xi_j$ are defined in (H.5). By virtue of (3.5)–(3.7), we get
\begin{align*}
\big|(\Phi\phi - \phi^0)_i(t)\big|
&= \Big|\int_{-\infty}^{t} e_{-a_i}(t, \rho(s))F_i(s, \varphi, \psi)\,\nabla s\Big|
\le \int_{-\infty}^{t} e_{-a_i}(t, \rho(s))|F_i(s, \varphi, \psi)|\,\nabla s \\
&\le \int_{-\infty}^{t} e_{-a_i}(t, \rho(s))\Theta_i L\,\nabla s
\le \frac{\Theta_i L}{a_i^-}, \quad i = 1, 2, \ldots, n, \tag{3.10}
\end{align*}
and
\begin{align*}
\big|(\Phi\phi - \phi^0)_{n+j}(t)\big|
&= \Big|\int_{-\infty}^{t} e_{-d_j}(t, \rho(s))G_j(s, \varphi, \psi)\,\nabla s\Big|
\le \int_{-\infty}^{t} e_{-d_j}(t, \rho(s))|G_j(s, \varphi, \psi)|\,\nabla s \\
&\le \int_{-\infty}^{t} e_{-d_j}(t, \rho(s))\Xi_j L\,\nabla s
\le \frac{\Xi_j L}{d_j^-}, \quad j = 1, 2, \ldots, m. \tag{3.11}
\end{align*}


On the other hand, we obtain
\begin{align*}
\big|(\Phi\phi - \phi^0)_i^\nabla(t)\big|
&= \Big|\Big(\int_{-\infty}^{t} e_{-a_i}(t, \rho(s))F_i(s, \varphi, \psi)\,\nabla s\Big)^\nabla\Big| \\
&= \Big|F_i(t, \varphi, \psi) - a_i(t)\int_{-\infty}^{t} e_{-a_i}(t, \rho(s))F_i(s, \varphi, \psi)\,\nabla s\Big| \\
&\le |F_i(t, \varphi, \psi)| + a_i(t)\int_{-\infty}^{t} e_{-a_i}(t, \rho(s))|F_i(s, \varphi, \psi)|\,\nabla s \\
&\le \Theta_i L + \frac{a_i^+}{a_i^-}\Theta_i L = \Big(1 + \frac{a_i^+}{a_i^-}\Big)\Theta_i L, \quad i = 1, 2, \ldots, n, \tag{3.12}
\end{align*}
and
\begin{align*}
\big|(\Phi\phi - \phi^0)_{n+j}^\nabla(t)\big|
&= \Big|\Big(\int_{-\infty}^{t} e_{-d_j}(t, \rho(s))G_j(s, \varphi, \psi)\,\nabla s\Big)^\nabla\Big| \\
&= \Big|G_j(t, \varphi, \psi) - d_j(t)\int_{-\infty}^{t} e_{-d_j}(t, \rho(s))G_j(s, \varphi, \psi)\,\nabla s\Big| \\
&\le |G_j(t, \varphi, \psi)| + d_j(t)\int_{-\infty}^{t} e_{-d_j}(t, \rho(s))|G_j(s, \varphi, \psi)|\,\nabla s \\
&\le \Xi_j L + \frac{d_j^+}{d_j^-}\Xi_j L = \Big(1 + \frac{d_j^+}{d_j^-}\Big)\Xi_j L, \quad j = 1, 2, \ldots, m. \tag{3.13}
\end{align*}

In view of (3.10)–(3.13) and (H.5), we have
\[
\|\Phi\phi - \phi^0\| \le \max\Big\{\max_{1\le i\le n}\Big\{\Theta_i, \Big(1 + \frac{a_i^+}{a_i^-}\Big)\Theta_i\Big\}, \max_{1\le j\le m}\Big\{\Xi_j, \Big(1 + \frac{d_j^+}{d_j^-}\Big)\Xi_j\Big\}\Big\}L \le L,
\]
which yields that $\Phi\phi \in \mathbb{X}_0$.

Hereafter, we show that $\Phi$ is a contraction. For
$\phi = (\varphi_1, \ldots, \varphi_n, \psi_1, \ldots, \psi_m)^T$, $\bar\phi = (\bar\varphi_1, \ldots, \bar\varphi_n, \bar\psi_1, \ldots, \bar\psi_m)^T \in \mathbb{X}_0$, we have
\begin{align*}
\big|(\Phi\phi - \Phi\bar\phi)_i(t)\big|
&= \Big|\int_{-\infty}^{t} e_{-a_i}(t, \rho(s))\Big[a_i(s)\int_{s-\alpha_i(s)}^{s} \big(\varphi_i^\nabla(u) - \bar\varphi_i^\nabla(u)\big)\,\nabla u \\
&\qquad + \sum_{j=1}^{m} b_{ij}(s)\big(g_j(\psi_j(s-\tau_j(s))) - g_j(\bar\psi_j(s-\tau_j(s)))\big) \\
&\qquad + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(s)\big(p_j(\psi_j^\nabla(s-\tau_j(s)))q_l(\psi_l(s-\tau_l(s))) - p_j(\bar\psi_j^\nabla(s-\tau_j(s)))q_l(\bar\psi_l(s-\tau_l(s)))\big)\Big]\nabla s\Big| \\
&\le \int_{-\infty}^{t} e_{-a_i}(t, \rho(s))\Big[a_i^+\alpha_i^+|\varphi - \bar\varphi|_1 + \sum_{j=1}^{m} b_{ij}^+ L^g_j\big|\psi_j(s-\tau_j(s)) - \bar\psi_j(s-\tau_j(s))\big| \\
&\qquad + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}^+\Big(Q_l L^p_j\big|\psi_j^\nabla(s-\tau_j(s)) - \bar\psi_j^\nabla(s-\tau_j(s))\big| + P_j L^q_l\big|\psi_l(s-\tau_l(s)) - \bar\psi_l(s-\tau_l(s))\big|\Big)\Big]\nabla s \\
&\le \frac{1}{a_i^-}\Big[a_i^+\alpha_i^+ + \sum_{j=1}^{m} b_{ij}^+ L^g_j + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}^+(Q_l L^p_j + P_j L^q_l)\Big]\|\phi - \bar\phi\| = \kappa_{i1}\|\phi - \bar\phi\|, \tag{3.14}
\end{align*}
where we used the estimate $|p_j(x)q_l(y) - p_j(\bar x)q_l(\bar y)| \le Q_l L^p_j|x - \bar x| + P_j L^q_l|y - \bar y|$, which follows from (H.3).


Furthermore,
\begin{align*}
\big|(\Phi\phi - \Phi\bar\phi)_i^\nabla(t)\big|
&= \Big|\Big\{\int_{-\infty}^{t} e_{-a_i}(t, \rho(s))\Big[a_i(s)\int_{s-\alpha_i(s)}^{s} \big(\varphi_i^\nabla(u) - \bar\varphi_i^\nabla(u)\big)\,\nabla u \\
&\qquad + \sum_{j=1}^{m} b_{ij}(s)\big(g_j(\psi_j(s-\tau_j(s))) - g_j(\bar\psi_j(s-\tau_j(s)))\big) \\
&\qquad + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(s)\big(p_j(\psi_j^\nabla(s-\tau_j(s)))q_l(\psi_l(s-\tau_l(s))) - p_j(\bar\psi_j^\nabla(s-\tau_j(s)))q_l(\bar\psi_l(s-\tau_l(s)))\big)\Big]\nabla s\Big\}^\nabla\Big|.
\end{align*}
Taking the nabla derivative produces the integrand evaluated at $t$ minus $a_i(t)$ times the integral itself. Estimating both terms as in (3.14), it follows that
\begin{align*}
\big|(\Phi\phi - \Phi\bar\phi)_i^\nabla(t)\big|
&\le \Big[a_i^+\alpha_i^+|\varphi - \bar\varphi|_1 + \sum_{j=1}^{m} b_{ij}^+ L^g_j|\psi - \bar\psi|_1 + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}^+(Q_l L^p_j + P_j L^q_l)|\psi - \bar\psi|_1\Big] \\
&\qquad + \frac{a_i^+}{a_i^-}\Big[a_i^+\alpha_i^+|\varphi - \bar\varphi|_1 + \sum_{j=1}^{m} b_{ij}^+ L^g_j|\psi - \bar\psi|_1 + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}^+(Q_l L^p_j + P_j L^q_l)|\psi - \bar\psi|_1\Big] \\
&\le \Big(1 + \frac{a_i^+}{a_i^-}\Big)\Big[a_i^+\alpha_i^+ + \sum_{j=1}^{m} b_{ij}^+ L^g_j + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}^+(Q_l L^p_j + P_j L^q_l)\Big]\|\phi - \bar\phi\| = \kappa_{i2}\|\phi - \bar\phi\|. \tag{3.15}
\end{align*}

By applying the same arguments, one can deduce that
\[
\big|(\Phi\phi - \Phi\bar\phi)_{n+j}(t)\big| \le \frac{1}{d_j^-}\Big[d_j^+\beta_j^+ + \sum_{i=1}^{n} c_{ji}^+ L^f_i + \sum_{i=1}^{n}\sum_{l=1}^{n} s_{jil}^+(V_l L^u_i + U_i L^v_l)\Big]\|\phi - \bar\phi\| = \varepsilon_{j1}\|\phi - \bar\phi\|, \tag{3.16}
\]
and
\[
\big|(\Phi\phi - \Phi\bar\phi)_{n+j}^\nabla(t)\big| \le \Big(1 + \frac{d_j^+}{d_j^-}\Big)\Big[d_j^+\beta_j^+ + \sum_{i=1}^{n} c_{ji}^+ L^f_i + \sum_{i=1}^{n}\sum_{l=1}^{n} s_{jil}^+(V_l L^u_i + U_i L^v_l)\Big]\|\phi - \bar\phi\| = \varepsilon_{j2}\|\phi - \bar\phi\|. \tag{3.17}
\]


From (H.5), we have
\[
\max\Big\{\max_{1\le i\le n}\{\kappa_{i1}, \kappa_{i2}\}, \max_{1\le j\le m}\{\varepsilon_{j1}, \varepsilon_{j2}\}\Big\} < 1. \tag{3.18}
\]
It follows from (3.14)–(3.18) that $\|\Phi\phi - \Phi\bar\phi\| < \|\phi - \bar\phi\|$, which means that $\Phi$ is a contraction mapping. By the Banach fixed point theorem, $\Phi$ has a unique fixed point $\phi^* \in \mathbb{X}_0$ such that $\Phi\phi^* = \phi^*$; that is, system (3.4), and hence system (1.1), has a unique almost periodic solution $\phi^* \in \mathbb{X}_0$. The proof of Theorem 3.1 is completed. □

3.2. Exponential stability of almost periodic solution

We now impose one further assumption.

H.6 For t ∈ (0, ∞)_T, there exist positive constants λ ∈ R⁺_ν, γ_i and χ_j such that
\[
\begin{cases}
-\big[a_i^{-}-(a_i^{+})^{2}\big]\gamma_i
+\displaystyle\sum_{j=1}^{m}\chi_j\Big[b_{ij}^{+}L_j^{g}
+\sum_{l=1}^{m}e_{ijl}^{+}\big(Q_lL_j^{p}+P_jL_l^{q}\big)\Big]<0,\\[2mm]
-\big[d_j^{-}-(d_j^{+})^{2}\big]\chi_j
+\displaystyle\sum_{i=1}^{n}\gamma_i\Big[c_{ji}^{+}L_i^{f}
+\sum_{l=1}^{n}s_{jil}^{+}\big(V_lL_i^{u}+U_iL_l^{v}\big)\Big]<0,
\end{cases}
\]
and
\[
\begin{cases}
\Big[(1-\nu\lambda)a_i^{+}+\Big(\lambda+(1-\nu\lambda)a_i^{+}
\displaystyle\int_{t-\alpha_i(t)}^{t}e_{\lambda}(t,\omega)\nabla\omega\Big)\Big]
+e_{\lambda}(\rho(t),0)\,\gamma_i^{-1}\displaystyle\sum_{j=1}^{m}\chi_j
\Big[b_{ij}^{+}L_j^{g}\,e_{\lambda}(0,t-\tau_j(t))
+\sum_{l=1}^{m}e_{ijl}^{+}\big(Q_lL_j^{p}+P_jL_l^{q}\big)e_{\lambda}(0,t-\tau_l(t))\Big]<1,\\[2mm]
\Big[(1-\nu\lambda)d_j^{+}+\Big(\lambda+(1-\nu\lambda)d_j^{+}
\displaystyle\int_{t-\beta_j(t)}^{t}e_{\lambda}(t,\omega)\nabla\omega\Big)\Big]
+e_{\lambda}(\rho(t),0)\,\chi_j^{-1}\displaystyle\sum_{i=1}^{n}\gamma_i
\Big[c_{ji}^{+}L_i^{f}\,e_{\lambda}(0,t-\delta_i(t))
+\sum_{l=1}^{n}s_{jil}^{+}\big(V_lL_i^{u}+U_iL_l^{v}\big)e_{\lambda}(0,t-\delta_l(t))\Big]<1.
\end{cases}
\]
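For orientation only, the integral appearing in H.6 can be evaluated in closed form in the continuous case T = R (so that ν = 0, ρ(t) = t and e_λ(t, ω) = e^{λ(t−ω)}); this specialization is an illustration on our part, not part of the hypothesis:

\[
\int_{t-\alpha_i(t)}^{t}e_{\lambda}(t,\omega)\,\nabla\omega
=\int_{t-\alpha_i(t)}^{t}e^{\lambda(t-\omega)}\,d\omega
=\frac{e^{\lambda\alpha_i(t)}-1}{\lambda}
\le\frac{e^{\lambda\alpha_i^{+}}-1}{\lambda}.
\]

As λ → 0⁺ the right-hand side tends to α_i^+, which indicates why the strict inequalities in H.6 can be satisfied for a sufficiently small λ > 0 once their λ = 0 counterparts hold.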

Theorem 3.2. Let H.1–H.6 hold. Then the almost periodic solution of system (1.1) is exponentially stable.

Proof. In view of Theorem 3.1, system (1.1) has an almost periodic solution
\[
z^{*}(t)=\big(x_1^{*}(t),x_2^{*}(t),\dots,x_n^{*}(t),
y_1^{*}(t),y_2^{*}(t),\dots,y_m^{*}(t)\big)^{T}
\]


with initial condition
\[
\phi^{*}(s)=\big(\varphi_1^{*}(s),\varphi_2^{*}(s),\dots,\varphi_n^{*}(s),
\psi_1^{*}(s),\psi_2^{*}(s),\dots,\psi_m^{*}(s)\big)^{T}.
\]
Suppose that z(t) = (x₁(t), x₂(t), . . . , xₙ(t), y₁(t), y₂(t), . . . , yₘ(t))ᵀ is an arbitrary solution of system (1.1) with initial condition φ(s) = (ϕ₁(s), ϕ₂(s), . . . , ϕₙ(s), ψ₁(s), ψ₂(s), . . . , ψₘ(s))ᵀ. For i = 1, 2, . . . , n and j = 1, 2, . . . , m, let
\[
\begin{cases}
\mu_i(t)=x_i(t)-x_i^{*}(t),\\
\zeta_j(t)=y_j(t)-y_j^{*}(t),
\end{cases}
\qquad(4.1)
\]

and
\[
\begin{cases}
G_j(t)=g_j\big(y_j(t-\tau_j(t))\big)-g_j\big(y_j^{*}(t-\tau_j(t))\big),\\
F_i(t)=f_i\big(x_i(t-\delta_i(t))\big)-f_i\big(x_i^{*}(t-\delta_i(t))\big),\\
H_{jl}(t)=p_j\big(y_j^{\nabla}(t-\tau_j(t))\big)q_l\big(y_l(t-\tau_l(t))\big)
-p_j\big(y_j^{*\nabla}(t-\tau_j(t))\big)q_l\big(y_l^{*}(t-\tau_l(t))\big),\\
N_{il}(t)=u_i\big(x_i^{\nabla}(t-\delta_i(t))\big)v_l\big(x_l(t-\delta_l(t))\big)
-u_i\big(x_i^{*\nabla}(t-\delta_i(t))\big)v_l\big(x_l^{*}(t-\delta_l(t))\big).
\end{cases}
\qquad(4.2)
\]

Then
\[
\begin{cases}
\mu_i^{\nabla}(t)=-a_i(t)\mu_i(t)
+a_i(t)\displaystyle\int_{t-\alpha_i(t)}^{t}\mu_i^{\nabla}(\omega)\nabla\omega
+\sum_{j=1}^{m}b_{ij}(t)G_j(t)
+\sum_{j=1}^{m}\sum_{l=1}^{m}e_{ijl}(t)H_{jl}(t),\\[2mm]
\zeta_j^{\nabla}(t)=-d_j(t)\zeta_j(t)
+d_j(t)\displaystyle\int_{t-\beta_j(t)}^{t}\zeta_j^{\nabla}(\omega)\nabla\omega
+\sum_{i=1}^{n}c_{ji}(t)F_i(t)
+\sum_{i=1}^{n}\sum_{l=1}^{n}s_{jil}(t)N_{il}(t).
\end{cases}
\qquad(4.3)
\]

Define
\[
X_i(t)=e_{\lambda}(t,0)\mu_i(t),\qquad
Y_j(t)=e_{\lambda}(t,0)\zeta_j(t),\qquad
i=1,2,\dots,n,\ j=1,2,\dots,m.
\qquad(4.4)
\]

It follows from (4.3) and (4.4) that
\[
\begin{aligned}
X_i^{\nabla}(t)&=\lambda e_{\lambda}(t,0)\mu_i(t)+e_{\lambda}(\rho(t),0)\mu_i^{\nabla}(t)\\
&=\lambda X_i(t)+e_{\lambda}(\rho(t),0)\Big[-a_i(t)\mu_i(t)
+a_i(t)\int_{t-\alpha_i(t)}^{t}\mu_i^{\nabla}(\omega)\nabla\omega
+\sum_{j=1}^{m}b_{ij}(t)G_j(t)
+\sum_{j=1}^{m}\sum_{l=1}^{m}e_{ijl}(t)H_{jl}(t)\Big]\\
&=\lambda X_i(t)-(1-\nu\lambda)a_i(t)X_i(t)
+(1-\nu\lambda)a_i(t)\int_{t-\alpha_i(t)}^{t}e_{\lambda}(t,\omega)X_i^{\nabla}(\omega)\nabla\omega\\
&\quad+e_{\lambda}(\rho(t),0)\sum_{j=1}^{m}\Big[b_{ij}(t)G_j(t)
+\sum_{l=1}^{m}e_{ijl}(t)H_{jl}(t)\Big],\qquad i=1,2,\dots,n.
\qquad(4.5)
\end{aligned}
\]
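As a sanity check on the first equality in (4.5) (an illustrative specialization on our part, not part of the proof), take T = R, where ρ(t) = t, ν = 0 and e_λ(t, 0) = e^{λt}; the ∇-product rule then reduces to the ordinary product rule:

\[
X_i'(t)=\frac{d}{dt}\Big(e^{\lambda t}\mu_i(t)\Big)
=\lambda e^{\lambda t}\mu_i(t)+e^{\lambda t}\mu_i'(t)
=\lambda e_{\lambda}(t,0)\mu_i(t)+e_{\lambda}(\rho(t),0)\mu_i^{\nabla}(t),
\]

since ρ(t) = t on this time scale.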


In a similar manner, we have
\[
\begin{aligned}
Y_j^{\nabla}(t)&=\lambda Y_j(t)-(1-\nu\lambda)d_j(t)Y_j(t)
+(1-\nu\lambda)d_j(t)\int_{t-\beta_j(t)}^{t}e_{\lambda}(t,\omega)Y_j^{\nabla}(\omega)\nabla\omega\\
&\quad+e_{\lambda}(\rho(t),0)\sum_{i=1}^{n}\Big[c_{ji}(t)F_i(t)
+\sum_{l=1}^{n}s_{jil}(t)N_{il}(t)\Big],\qquad j=1,2,\dots,m.
\end{aligned}
\]

Define two continuous functions Ψ_i(ξ) and Γ_j(ξ), i = 1, 2, . . . , n, j = 1, 2, . . . , m, by setting
\[
\begin{cases}
\Psi_i(\xi)=-\Big[(1-\nu\xi)a_i^{-}-\Big(\xi+(1-\nu\xi)a_i^{+}
\displaystyle\int_{t-\alpha_i(t)}^{t}e_{\xi}(t,\omega)\nabla\omega\Big)\Big]\gamma_i
+e_{\xi}(\rho(t),0)\displaystyle\sum_{j=1}^{m}\chi_j
\Big[b_{ij}^{+}L_j^{g}\,e_{\xi}(0,t-\tau_j(t))
+\sum_{l=1}^{m}e_{ijl}^{+}\big(Q_lL_j^{p}+P_jL_l^{q}\big)e_{\xi}(0,t-\tau_l(t))\Big],\\[2mm]
\Gamma_j(\xi)=-\Big[(1-\nu\xi)d_j^{-}-\Big(\xi+(1-\nu\xi)d_j^{+}
\displaystyle\int_{t-\beta_j(t)}^{t}e_{\xi}(t,\omega)\nabla\omega\Big)\Big]\chi_j
+e_{\xi}(\rho(t),0)\displaystyle\sum_{i=1}^{n}\gamma_i
\Big[c_{ji}^{+}L_i^{f}\,e_{\xi}(0,t-\delta_i(t))
+\sum_{l=1}^{n}s_{jil}^{+}\big(V_lL_i^{u}+U_iL_l^{v}\big)e_{\xi}(0,t-\delta_l(t))\Big].
\end{cases}
\qquad(4.6)
\]

Then, for i = 1, 2, . . . , n, j = 1, 2, . . . , m,
\[
\begin{cases}
\Psi_i(0)=-\big[a_i^{-}-(a_i^{+})^{2}\big]\gamma_i
+\displaystyle\sum_{j=1}^{m}\chi_j\Big[b_{ij}^{+}L_j^{g}
+\sum_{l=1}^{m}e_{ijl}^{+}\big(Q_lL_j^{p}+P_jL_l^{q}\big)\Big]<0,\\[2mm]
\Gamma_j(0)=-\big[d_j^{-}-(d_j^{+})^{2}\big]\chi_j
+\displaystyle\sum_{i=1}^{n}\gamma_i\Big[c_{ji}^{+}L_i^{f}
+\sum_{l=1}^{n}s_{jil}^{+}\big(V_lL_i^{u}+U_iL_l^{v}\big)\Big]<0,
\end{cases}
\]

which, by continuity in ξ of the functions defined in (4.6), means that there exists a constant λ > 0 such that
\[
\begin{cases}
-\Big[(1-\nu\lambda)a_i^{-}-\Big(\lambda+(1-\nu\lambda)a_i^{+}
\displaystyle\int_{t-\alpha_i(t)}^{t}e_{\lambda}(t,\omega)\nabla\omega\Big)\Big]\gamma_i
+e_{\lambda}(\rho(t),0)\displaystyle\sum_{j=1}^{m}\chi_j
\Big[b_{ij}^{+}L_j^{g}\,e_{\lambda}(0,t-\tau_j(t))
+\sum_{l=1}^{m}e_{ijl}^{+}\big(Q_lL_j^{p}+P_jL_l^{q}\big)e_{\lambda}(0,t-\tau_l(t))\Big]<0,\\[2mm]
-\Big[(1-\nu\lambda)d_j^{-}-\Big(\lambda+(1-\nu\lambda)d_j^{+}
\displaystyle\int_{t-\beta_j(t)}^{t}e_{\lambda}(t,\omega)\nabla\omega\Big)\Big]\chi_j
+e_{\lambda}(\rho(t),0)\displaystyle\sum_{i=1}^{n}\gamma_i
\Big[c_{ji}^{+}L_i^{f}\,e_{\lambda}(0,t-\delta_i(t))
+\sum_{l=1}^{n}s_{jil}^{+}\big(V_lL_i^{u}+U_iL_l^{v}\big)e_{\lambda}(0,t-\delta_l(t))\Big]<0.
\end{cases}
\]


Let
\[
M=\max\Big\{\max_{1\le i\le n}\{|X_i(s)|,|X_i^{\nabla}(s)|\},\
\max_{1\le j\le m}\{|Y_j(s)|,|Y_j^{\nabla}(s)|\}:\ s\in[-\sigma,0]_{\mathbb T}\Big\}.
\]
Then we can choose a constant K > 0 so large that, for any t ∈ [−σ, 0]_T,
\[
\begin{cases}
|X_i(t)|\le M<K\gamma_i, & |X_i^{\nabla}(t)|\le M<K\gamma_i,\\
|Y_j(t)|\le M<K\chi_j, & |Y_j^{\nabla}(t)|\le M<K\chi_j.
\end{cases}
\]

Next, we prove that
\[
\begin{cases}
|X_i(t)|<K\gamma_i, & |X_i^{\nabla}(t)|<K\gamma_i, & t\in\mathbb T,\ i=1,2,\dots,n,\\
|Y_j(t)|<K\chi_j, & |Y_j^{\nabla}(t)|<K\chi_j, & t\in\mathbb T,\ j=1,2,\dots,m.
\end{cases}
\]

For the sake of contradiction, suppose this fails. Then one can find two indices i ∈ {1, 2, . . . , n}, j ∈ {1, 2, . . . , m} and a constant θ ∈ (0, ∞)_T such that one of the following cases holds:

(i) |X_i(θ)| = Kγ_i and |X_i(t)| < Kγ_i for all t ∈ [−σ, θ)_T, while |X_i^∇(t)| < Kγ_i, |Y_j(t)| < Kχ_j, |Y_j^∇(t)| < Kχ_j for all t ∈ [−σ, θ]_T;

(ii) |X_i^∇(θ)| = Kγ_i and |X_i^∇(t)| < Kγ_i for all t ∈ [−σ, θ)_T, while |X_i(t)| < Kγ_i, |Y_j(t)| < Kχ_j, |Y_j^∇(t)| < Kχ_j for all t ∈ [−σ, θ]_T;

(iii) |Y_j(θ)| = Kχ_j and |Y_j(t)| < Kχ_j for all t ∈ [−σ, θ)_T, while |X_i(t)| < Kγ_i, |X_i^∇(t)| < Kγ_i, |Y_j^∇(t)| < Kχ_j for all t ∈ [−σ, θ]_T;

(iv) |Y_j^∇(θ)| = Kχ_j and |Y_j^∇(t)| < Kχ_j for all t ∈ [−σ, θ)_T, while |X_i(t)| < Kγ_i, |X_i^∇(t)| < Kγ_i, |Y_j(t)| < Kχ_j for all t ∈ [−σ, θ]_T.

Let (i) hold. Then either

(i)′ X_i(θ) = Kγ_i, X_i^∇(θ) ≥ 0 and |X_i(t)| < Kγ_i for all t ∈ [−σ, θ)_T, while |X_i^∇(t)| < Kγ_i, |Y_j(t)| < Kχ_j, |Y_j^∇(t)| < Kχ_j for all t ∈ [−σ, θ]_T; or

(i)″ X_i(θ) = −Kγ_i, X_i^∇(θ) ≤ 0 and |X_i(t)| < Kγ_i for all t ∈ [−σ, θ)_T, while |X_i^∇(t)| < Kγ_i, |Y_j(t)| < Kχ_j, |Y_j^∇(t)| < Kχ_j for all t ∈ [−σ, θ]_T.


For (i)′, from H.6 and equation (4.5), we get
\[
\begin{aligned}
0\le X_i^{\nabla}(\theta)&=\lambda X_i(\theta)-(1-\nu\lambda)a_i(\theta)X_i(\theta)
+(1-\nu\lambda)a_i(\theta)\int_{\theta-\alpha_i(\theta)}^{\theta}
e_{\lambda}(\theta,\omega)X_i^{\nabla}(\omega)\nabla\omega\\
&\quad+e_{\lambda}(\rho(\theta),0)\sum_{j=1}^{m}\Big[b_{ij}(\theta)G_j(\theta)
+\sum_{l=1}^{m}e_{ijl}(\theta)H_{jl}(\theta)\Big]\\
&\le-\big[(1-\nu\lambda)a_i^{-}-\lambda\big]X_i(\theta)
+(1-\nu\lambda)a_i^{+}\int_{\theta-\alpha_i(\theta)}^{\theta}
e_{\lambda}(\theta,\omega)|X_i^{\nabla}(\omega)|\nabla\omega\\
&\quad+e_{\lambda}(\rho(\theta),0)\sum_{j=1}^{m}
\Big[b_{ij}^{+}L_j^{g}\,e_{\lambda}(0,\theta-\tau_j(\theta))|Y_j(\theta-\tau_j(\theta))|
+\sum_{l=1}^{m}e_{ijl}^{+}\big(Q_lL_j^{p}+P_jL_l^{q}\big)
e_{\lambda}(0,\theta-\tau_l(\theta))|Y_j^{\nabla}(\theta-\tau_l(\theta))|\Big]\\
&<-\big[(1-\nu\lambda)a_i^{-}-\lambda\big]K\gamma_i
+(1-\nu\lambda)a_i^{+}\int_{\theta-\alpha_i(\theta)}^{\theta}
e_{\lambda}(\theta,\omega)K\gamma_i\nabla\omega\\
&\quad+e_{\lambda}(\rho(\theta),0)\sum_{j=1}^{m}
\Big[b_{ij}^{+}L_j^{g}\,e_{\lambda}(0,\theta-\tau_j(\theta))K\chi_j
+\sum_{l=1}^{m}e_{ijl}^{+}\big(Q_lL_j^{p}+P_jL_l^{q}\big)
e_{\lambda}(0,\theta-\tau_l(\theta))K\chi_j\Big]\\
&=\Big\{-\Big[(1-\nu\lambda)a_i^{-}-\Big(\lambda+(1-\nu\lambda)a_i^{+}
\int_{\theta-\alpha_i(\theta)}^{\theta}e_{\lambda}(\theta,\omega)\nabla\omega\Big)\Big]\gamma_i\\
&\qquad+e_{\lambda}(\rho(\theta),0)\sum_{j=1}^{m}\chi_j
\Big[b_{ij}^{+}L_j^{g}\,e_{\lambda}(0,\theta-\tau_j(\theta))
+\sum_{l=1}^{m}e_{ijl}^{+}\big(Q_lL_j^{p}+P_jL_l^{q}\big)
e_{\lambda}(0,\theta-\tau_l(\theta))\Big]\Big\}K<0,
\end{aligned}
\]
which is a contradiction.

For (i)″, from H.6 and equation (4.5), we get

\[
\begin{aligned}
0\ge X_i^{\nabla}(\theta)&\ge-\big[(1-\nu\lambda)a_i(\theta)-\lambda\big]X_i(\theta)
-(1-\nu\lambda)a_i(\theta)\int_{\theta-\alpha_i(\theta)}^{\theta}
e_{\lambda}(\theta,\omega)|X_i^{\nabla}(\omega)|\nabla\omega\\
&\quad-e_{\lambda}(\rho(\theta),0)\sum_{j=1}^{m}
\Big[b_{ij}(\theta)L_j^{g}\,e_{\lambda}(0,\theta-\tau_j(\theta))|Y_j(\theta-\tau_j(\theta))|
+\sum_{l=1}^{m}e_{ijl}(\theta)\big(Q_lL_j^{p}+P_jL_l^{q}\big)
e_{\lambda}(0,\theta-\tau_l(\theta))|Y_j^{\nabla}(\theta-\tau_l(\theta))|\Big]\\
&>\big[(1-\nu\lambda)a_i^{-}-\lambda\big]K\gamma_i
-(1-\nu\lambda)a_i^{+}\int_{\theta-\alpha_i(\theta)}^{\theta}
e_{\lambda}(\theta,\omega)K\gamma_i\nabla\omega\\
&\quad-e_{\lambda}(\rho(\theta),0)\sum_{j=1}^{m}
\Big[b_{ij}^{+}L_j^{g}\,e_{\lambda}(0,\theta-\tau_j(\theta))K\chi_j
+\sum_{l=1}^{m}e_{ijl}^{+}\big(Q_lL_j^{p}+P_jL_l^{q}\big)
e_{\lambda}(0,\theta-\tau_l(\theta))K\chi_j\Big]\\
&=\Big\{\Big[(1-\nu\lambda)a_i^{-}-\Big(\lambda+(1-\nu\lambda)a_i^{+}
\int_{\theta-\alpha_i(\theta)}^{\theta}e_{\lambda}(\theta,\omega)\nabla\omega\Big)\Big]\gamma_i\\
&\qquad-e_{\lambda}(\rho(\theta),0)\sum_{j=1}^{m}\chi_j
\Big[b_{ij}^{+}L_j^{g}\,e_{\lambda}(0,\theta-\tau_j(\theta))
+\sum_{l=1}^{m}e_{ijl}^{+}\big(Q_lL_j^{p}+P_jL_l^{q}\big)
e_{\lambda}(0,\theta-\tau_l(\theta))\Big]\Big\}K>0,
\end{aligned}
\]

which is also a contradiction. Thus, (i) does not hold.

Now let (ii) hold. From H.6 and equation (4.5), we obtain

\[
\begin{aligned}
K\gamma_i=|X_i^{\nabla}(\theta)|&\le(1-\nu\lambda)a_i(\theta)|X_i(\theta)|+\lambda|X_i(\theta)|
+(1-\nu\lambda)a_i(\theta)\int_{\theta-\alpha_i(\theta)}^{\theta}
e_{\lambda}(\theta,\omega)|X_i^{\nabla}(\omega)|\nabla\omega\\
&\quad+e_{\lambda}(\rho(\theta),0)\sum_{j=1}^{m}
\Big[b_{ij}(\theta)L_j^{g}\,e_{\lambda}(0,\theta-\tau_j(\theta))|Y_j(\theta-\tau_j(\theta))|
+\sum_{l=1}^{m}e_{ijl}(\theta)\big(Q_lL_j^{p}+P_jL_l^{q}\big)
e_{\lambda}(0,\theta-\tau_l(\theta))|Y_j^{\nabla}(\theta-\tau_l(\theta))|\Big]\\
&<\big[(1-\nu\lambda)a_i^{+}+\lambda\big]K\gamma_i
+(1-\nu\lambda)a_i^{+}\int_{\theta-\alpha_i(\theta)}^{\theta}
e_{\lambda}(\theta,\omega)K\gamma_i\nabla\omega\\
&\quad+e_{\lambda}(\rho(\theta),0)\sum_{j=1}^{m}
\Big[b_{ij}^{+}L_j^{g}\,e_{\lambda}(0,\theta-\tau_j(\theta))K\chi_j
+\sum_{l=1}^{m}e_{ijl}^{+}\big(Q_lL_j^{p}+P_jL_l^{q}\big)
e_{\lambda}(0,\theta-\tau_l(\theta))K\chi_j\Big]\\
&=\Big\{\Big[(1-\nu\lambda)a_i^{+}+\Big(\lambda+(1-\nu\lambda)a_i^{+}
\int_{\theta-\alpha_i(\theta)}^{\theta}e_{\lambda}(\theta,\omega)\nabla\omega\Big)\Big]\\
&\qquad+e_{\lambda}(\rho(\theta),0)\,\gamma_i^{-1}\sum_{j=1}^{m}\chi_j
\Big[b_{ij}^{+}L_j^{g}\,e_{\lambda}(0,\theta-\tau_j(\theta))
+\sum_{l=1}^{m}e_{ijl}^{+}\big(Q_lL_j^{p}+P_jL_l^{q}\big)
e_{\lambda}(0,\theta-\tau_l(\theta))\Big]\Big\}K\gamma_i\\
&<K\gamma_i,
\end{aligned}
\]


which is a contradiction. So (ii) does not hold. Applying similar arguments, one can also show that (iii) and (iv) do not hold. From the above discussion, it follows that
\[
\begin{cases}
|x_i(t)-x_i^{*}(t)|\le e_{\ominus_{\nu}\lambda}(t,0)K\gamma_i, & t\in\mathbb T,\ i=1,2,\dots,n,\\
|y_j(t)-y_j^{*}(t)|\le e_{\ominus_{\nu}\lambda}(t,0)K\chi_j, & t\in\mathbb T,\ j=1,2,\dots,m.
\end{cases}
\]

Thus, the almost periodic solution of system (1.1) on the time scale T is exponentially stable. The proof is finished. □
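In the continuous case T = R (again, only an illustrative specialization on our part), ν = 0 gives e_{⊖_ν λ}(t, 0) = e^{−λt}, so the final estimate takes the familiar exponential-decay form

\[
|x_i(t)-x_i^{*}(t)|\le K\gamma_i\,e^{-\lambda t},\qquad
|y_j(t)-y_j^{*}(t)|\le K\chi_j\,e^{-\lambda t},\qquad t\ge 0,
\]

with λ the constant selected above and K determined by the initial data.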

In this section, an example is given to illustrate the effectiveness of the results obtained in the previous sections.

Example 3.3. In view of system (1.1), consider the following neutral–type BAM neural network with delays in the neutral derivative and leakage terms:
\[
\begin{cases}
x_1^{\nabla}(t)=-a_1(t)x_1(t-\alpha_1(t))
+\displaystyle\sum_{j=1}^{2}b_{1j}(t)g_j\big(y_j(t-\tau_j(t))\big)
+\sum_{j=1}^{2}\sum_{l=1}^{2}e_{1jl}(t)p_j\big(y_j^{\nabla}(t-\tau_j(t))\big)
q_l\big(y_l(t-\tau_l(t))\big)+I_1(t),\\[1mm]
x_2^{\nabla}(t)=-a_2(t)x_2(t-\alpha_2(t))
+\displaystyle\sum_{j=1}^{2}b_{2j}(t)g_j\big(y_j(t-\tau_j(t))\big)
+\sum_{j=1}^{2}\sum_{l=1}^{2}e_{2jl}(t)p_j\big(y_j^{\nabla}(t-\tau_j(t))\big)
q_l\big(y_l(t-\tau_l(t))\big)+I_2(t),\\[1mm]
y_1^{\nabla}(t)=-d_1(t)y_1(t-\beta_1(t))
+\displaystyle\sum_{i=1}^{2}c_{1i}(t)f_i\big(x_i(t-\delta_i(t))\big)
+\sum_{i=1}^{2}\sum_{l=1}^{2}s_{1il}(t)u_i\big(x_i^{\nabla}(t-\delta_i(t))\big)
v_l\big(x_l(t-\delta_l(t))\big)+J_1(t),\\[1mm]
y_2^{\nabla}(t)=-d_2(t)y_2(t-\beta_2(t))
+\displaystyle\sum_{i=1}^{2}c_{2i}(t)f_i\big(x_i(t-\delta_i(t))\big)
+\sum_{i=1}^{2}\sum_{l=1}^{2}s_{2il}(t)u_i\big(x_i^{\nabla}(t-\delta_i(t))\big)
v_l\big(x_l(t-\delta_l(t))\big)+J_2(t),
\end{cases}
\qquad(5.1)
\]

where
\[
\begin{bmatrix}a_1(t)&a_2(t)\\ d_1(t)&d_2(t)\end{bmatrix}
=\begin{bmatrix}0.6+0.3\sin t&0.7+0.3\cos t\\ 0.8+0.4\sin t&0.7+0.5\cos t\end{bmatrix},\qquad
\begin{bmatrix}b_{11}(t)&b_{12}(t)\\ b_{21}(t)&b_{22}(t)\end{bmatrix}
=\begin{bmatrix}0.09+0.03\sin t&0.04+0.02\cos t\\ 0.07+0.02\sin t&0.06+0.03\cos t\end{bmatrix},
\]
\[
\begin{bmatrix}e_{111}(t)&e_{112}(t)\\ e_{121}(t)&e_{122}(t)\end{bmatrix}
=\begin{bmatrix}0.08+0.03\sin t&0.06+0.01\cos t\\ 0.05+0.01\sin t&0.07+0.02\cos t\end{bmatrix},\qquad
\begin{bmatrix}e_{211}(t)&e_{212}(t)\\ e_{221}(t)&e_{222}(t)\end{bmatrix}
=\begin{bmatrix}0.07+0.02\sin t&0.05+0.01\cos t\\ 0.08+0.03\sin t&0.05+0.02\cos t\end{bmatrix},
\]
\[
\begin{bmatrix}s_{111}(t)&s_{112}(t)\\ s_{121}(t)&s_{122}(t)\end{bmatrix}
=\begin{bmatrix}0.06+0.02\sin t&0.07+0.03\cos t\\ 0.08+0.04\sin t&0.07+0.02\cos t\end{bmatrix},\qquad
\begin{bmatrix}s_{211}(t)&s_{212}(t)\\ s_{221}(t)&s_{222}(t)\end{bmatrix}
=\begin{bmatrix}0.09+0.04\sin t&0.07+0.02\cos t\\ 0.08+0.02\sin t&0.06+0.01\cos t\end{bmatrix},
\]
\[
\begin{bmatrix}\alpha_1(t)&\alpha_2(t)\\ \beta_1(t)&\beta_2(t)\end{bmatrix}
=\begin{bmatrix}0.04+0.02\sin t&0.05+0.03\cos t\\ 0.03+0.01\sin t&0.04+0.01\cos t\end{bmatrix},\qquad
\begin{bmatrix}\tau_1(t)&\tau_2(t)\\ \delta_1(t)&\delta_2(t)\end{bmatrix}
=\begin{bmatrix}0.05+0.02\sin t&0.04+0.03\cos t\\ 0.06+0.03\sin t&0.04+0.02\cos t\end{bmatrix},
\]
\[
\begin{bmatrix}I_1(t)&I_2(t)\\ J_1(t)&J_2(t)\end{bmatrix}
=\begin{bmatrix}0.0027\sin t&0.0035\cos t\\ 0.0054\sin t&0.0069\cos t\end{bmatrix},
\]
\[
\begin{bmatrix}g_1(y_1)&g_2(y_2)\\ f_1(x_1)&f_2(x_2)\end{bmatrix}
=\begin{bmatrix}3\sin\frac{\sqrt3}{3}y_1&3\cos\frac{\sqrt5}{3}y_2\\[1mm]
2\sin\frac{\sqrt2}{3}x_1&2\cos\frac{\sqrt7}{3}x_2\end{bmatrix},\qquad
\begin{bmatrix}p_1(y_1)&p_2(y_2)\\ q_1(y_1)&q_2(y_2)\end{bmatrix}
=\begin{bmatrix}3\sin\frac{\sqrt3}{3}y_1&3\cos\frac{\sqrt5}{3}y_2\\[1mm]
3\sin\frac{\sqrt2}{3}y_1&3\cos\frac{\sqrt7}{3}y_2\end{bmatrix},
\]
\[
\begin{bmatrix}u_1(x_1)&u_2(x_2)\\ v_1(x_1)&v_2(x_2)\end{bmatrix}
=\begin{bmatrix}0.45\sin\frac{\sqrt3}{3}x_1&0.75\cos\frac{\sqrt5}{3}x_2\\[1mm]
0.45\sin\frac{\sqrt2}{3}x_1&0.75\cos\frac{\sqrt7}{3}x_2\end{bmatrix}.
\]

Consider the case T = R, for which ρ(t) = t and ν(t) = 0. Let γ₁ = γ₂ = χ₁ = χ₂ = 1 and λ = 0.003. Then it is not difficult to check that all assumptions H.1–H.6 are satisfied. Hence, by Theorem 3.1 and Theorem 3.2, we conclude that system (5.1) has exactly one almost periodic solution, which is exponentially stable.
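Verifying H.1–H.6 starts from the coefficient bounds a_i^± = inf/sup a_i(t), d_j^±, b_ij^+, and so on. As a quick numeric sketch (the helper `coeff_bounds` is our own illustrative construction, not from the paper), these sup/inf values follow immediately from the sinusoidal form of the coefficients of (5.1):

```python
# Sketch: extract the bounds a_i^-, a_i^+, d_j^-, d_j^+, b_ij^+, ... used in
# H.1-H.6 for system (5.1).  Every coefficient has the form c + r*sin(t) or
# c + r*cos(t) with r > 0, so its infimum over R is c - r and its supremum c + r.
def coeff_bounds(c, r):
    """Return (inf, sup) of t -> c + r*sin(t) (or c + r*cos(t)) over R."""
    return c - r, c + r

a1_minus, a1_plus = coeff_bounds(0.6, 0.3)   # a1(t) = 0.6 + 0.3 sin t
a2_minus, a2_plus = coeff_bounds(0.7, 0.3)   # a2(t) = 0.7 + 0.3 cos t
d1_minus, d1_plus = coeff_bounds(0.8, 0.4)   # d1(t) = 0.8 + 0.4 sin t
d2_minus, d2_plus = coeff_bounds(0.7, 0.5)   # d2(t) = 0.7 + 0.5 cos t
b11_plus = coeff_bounds(0.09, 0.03)[1]       # b11(t) = 0.09 + 0.03 sin t

assert abs(a1_minus - 0.3) < 1e-9 and abs(a1_plus - 0.9) < 1e-9
assert abs(d2_minus - 0.2) < 1e-9 and abs(d2_plus - 1.2) < 1e-9
assert abs(b11_plus - 0.12) < 1e-9
```

The remaining coefficients of (5.1) are handled the same way before the inequalities of H.6 are checked with γ₁ = γ₂ = χ₁ = χ₂ = 1 and λ = 0.003.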

Remark 3.4. In view of the above example, one can easily see that the results proposed in [39–41] cannot be applied to comment on the dynamic behavior of system (5.1). Therefore, our theorems extend the existing results in the literature.


4. A concluding remark

This paper considers neutral–type BAM neural networks on time scales with delays imposed in the neutral derivative and leakage terms. It has been evidenced that the presence of time delay in BAM neural networks has serious consequences for hardware implementation: the time delay is capable of destabilizing the system and causing oscillations in the networks. A primary objective of this paper is to investigate how influential these delays are on the dynamics of system (1.1). The exponential dichotomy of linear dynamic equations on time scales, the Banach fixed point theorem and the method of Lyapunov functionals are the main techniques used to prove the main results of this paper.

It is worth mentioning here that the paper [39] dealt with cellular neural networks of the form
\[
x_i^{\nabla}(t)=-c_i(t)x_i(t-\eta_i(t))
+\sum_{j=1}^{n}a_{ij}(t)g_j\big(x_j(t-\tau_{ij}(t))\big)
+\sum_{j=1}^{n}b_{ij}(t)\int_{0}^{\infty}k_{ij}(u)g_j\big(x_j(t-u)\big)\nabla u+I_i,
\qquad i=1,2,\dots,n.
\qquad(6.19)
\]

The authors studied the existence and global stability of (6.19) with delays in the leakage terms. In [40], on the other hand, the first author of the present paper considered neutral–type BAM neural networks involving a leakage term with distributed delay; however, that model does not include regulator transmission functions, which noticeably play an important role in measuring the rate of the activation functions. In the recent paper [41], Du et al. investigated neutral–type neural networks with distributed leakage delay of the form

\[
(A_ix_i)^{\nabla}(t)=-a_i(t)\int_{0}^{\infty}k_i(s)x_i(t-s)\nabla s
+\sum_{j=1}^{n}b_{ij}(t)f_j\big(x_j(t)\big)
+\sum_{j=1}^{n}d_{ij}(t)g_j\big(x_j(t-\tau_{ij}(t))\big)+I_i,
\qquad i=1,2,\dots,n,
\]
where
\[
(A_ix_i)(t)=x_i(t)-\sum_{j=1}^{n}c_{ij}(t)x_i(t-\delta_{ij}(t)).
\]
A main feature of that paper is the incorporation of the operator A in the description of the networks and the implementation of a new technique based on the discrete–continuous analysis method. The conditions proposed in the present paper to ensure the existence and stability of solutions of system (1.1) are unlike those established in [39–41] and have never been considered before. Indeed, they are entirely new, have their own merit and extend those existing in the literature.


References

[1] B. Kosko, Bidirectional associative memories, IEEE Trans. Syst. Man Cybern. 18 (1988) 49–60.

[2] F. Camastra, A. Vinciarelli, Combining neural gas and learning vector quantization for cursive character recognition, Neurocomputing 51 (2003) 147–159.

[3] P. Liu, F. Yi, Q. Guo, W. Wu, Analysis on global exponential robust stability of reaction–diffusion neural networks with S–type distributed delays, Physica D: Nonlinear Phenomena 237 (2008) 475–485.

[4] S. Senan, S. Arık, D. Liu, New robust stability results for bidirectional associative memory neural networks with multiple time delays, Appl. Math. Comput. 218 (2012) 11472–11482.

[5] S. Mohamad, K. Gopalsamy, Exponential stability preservation in semi-discretisations of BAM networks with nonlinear impulses, Communications in Nonlinear Science and Numerical Simulation 14 (2009) 27–50.

[6] R. Sharma, Application of neural network models in recognition field: A survey, International Journal of Scientific and Engineering Research 4 (2) (2013).

[7] Z. J. Wang, Y. C. Liang, Application of BAM network in fault diagnosis of oil–immersed transformer, Applied Mechanics and Materials 325–326 (2013) 424–430.

[8] M. Saylı, E. Yılmaz, Global robust asymptotic stability of variable–time impulsive BAM neural networks, Neural Networks 60 (2014) 67–73.

[9] X. Liu, M. Tang, R. Martin, X. Liu, Discrete–time BAM neural networks with variable delays, Phys. Lett. A 367 (2007) 322–330.

[10] M. Gao, B. Cui, Global robust exponential stability of discrete–time interval BAM neural networks with time–varying delays, Appl. Math. Modelling 33 (2009) 1270–1284.

[11] X. H. Tian, R. Xu, Q. T. Gan, Hopf bifurcation analysis of a BAM neural network with multiple time delays and diffusion, Appl. Math. Comput. 266 (2015) 909–926.

[12] L. Q. Zhou, Novel global exponential stability criteria for hybrid BAM neural networks with proportional delays, Neurocomputing 161 (2015) 99–106.

[13] C. J. Xu, Q. M. Zhang, Existence and global exponential stability of anti–periodic solutions for BAM neural networks with inertial term and delay, Neurocomputing 153 (2015) 108–116.

[14] Y. K. Du, R. Xu, Multistability and multiperiodicity for a class of Cohen–Grossberg BAM neural networks with discontinuous activation functions and time delays, Neural Processing Letters 42 (2015) 417–435.

[15] Z. Zhang, W. Liu, D. Zhou, Global asymptotic stability to a generalized Cohen–Grossberg BAM neural networks of neutral type delays, Neural Networks 25 (2012) 94–105.


[16] Y. K. Li, C. Wang, Existence and global exponential stability of equilibrium for discrete–time fuzzy BAM neural networks with variable delays and impulses, Fuzzy Sets and Systems 217 (2013) 62–79.

[17] I. M. Stamova, G. Tr. Stamov, J. O. Alzabut, Global exponential stability for a class of impulsive BAM neural networks with distributed delays, Applied Mathematics and Information Sciences 7 (4) (2013) 1539–1546.

[18] S. Senthilraj, R. Raja, F. Jiang, Q. X. Zhu, R. Samidurai, New delay–interval–dependent stability analysis of neutral type BAM neural networks with successive time delay components, Neurocomputing 171 (2016) 1265–1280.

[19] L. Duan, L. Huang, Z. Cai, Existence and stability of periodic solution for mixed time–varying delayed neural networks with discontinuous activations, Neurocomputing 123 (2014) 255–265.

[20] K. Gopalsamy, Leakage delays in BAM, J. Math. Anal. Appl. 325 (2007) 1117–1132.

[21] S. Peng, Global attractive periodic solutions of BAM neural networks with continuously distributed delays in the leakage terms, Nonlinear Anal.: Real World Appl. 11 (2010) 2141–2151.

[22] P. Balasubramaniam, M. Kalpana, R. Rakkiyappan, Global asymptotic stability of BAM fuzzy cellular neural networks with time delay in the leakage term, discrete and unbounded distributed delays, Math. Comput. Model. 53 (2011) 839–853.

[23] B. Liu, Global exponential stability for BAM neural networks with time–varying delays in the leakage terms, Nonlinear Anal.: Real World Appl. 14 (2013) 559–566.

[24] S. Lakshmanan, J. Park, T. Lee, H. Jung, R. Rakkiyappan, Stability criteria for BAM neural networks with leakage delays and probabilistic time–varying delays, Appl. Math. Comput. 219 (2013) 9408–9423.

[25] H. Zhang, Existence and stability of almost periodic solutions for CNNs with continuously distributed leakage delays, Neural Comput. Appl. 24 (2014) 1135–1146.

[26] J. Feng, S. Xu, Y. Zou, Delay–dependent stability of neutral type neural networks with distributed delays, Neurocomputing 72 (2009) 2576–2580.

[27] P. Balasubramaniam, R. Rakkiyappan, Global exponential stability for neutral–type BAM neural networks with time–varying delays, Int. J. Comput. Math. 87 (9) (2010) 2064–2075.

[28] P. Balasubramaniam, V. Vembarasan, Asymptotic stability of BAM neural networks of neutral–type with impulsive effects and time delay in the leakage term, Int. J. Comput. Math. 88 (15) (2011) 3271–3291.

[29] Z. Chen, A shunting inhibitory cellular neural network with leakage delays and continuously distributed delays of neutral type, Neural Comput. Appl. 23 (2013) 2429–2434.


[30] Y. K. Li, Y. Q. Li, Existence and exponential stability of almost periodic solution for neutral delay BAM neural networks with time–varying delays in leakage terms, J. Franklin Inst. 350 (2013) 2808–2825.

[31] C. J. Xu, P. L. Li, Existence and exponential stability of anti–periodic solutions for neutral BAM neural networks with time–varying delays in the leakage terms, J. Nonlinear Sci. Appl. 9 (3) (2016) 1285–1305.

[32] M. Bohner, A. Peterson, Dynamic Equations on Time Scales: An Introduction with Applications, Birkhäuser, Boston, 2001.

[33] L. Bourdin, Nonshifted calculus of variations on time scales with ∇-differentiable σ, J. Math. Anal. Appl. 411 (2014) 543–554.

[34] Y. K. Li, C. Wang, Uniformly almost periodic functions and almost periodic solutions to dynamic equations on time scales, Abstr. Appl. Anal. 2011 (2011) 1–12 (Article ID 341520).

[35] Z. Zhang, K. Liu, Existence and global exponential stability of a periodic solution to interval general bidirectional associative memory neural networks with multiple delays on time scales, Neural Networks 24 (2011) 427–439.

[36] Y. K. Li, C. Wang, Existence and global exponential stability of almost periodic solutions for BAM neural networks with variable coefficients on time scales, Adv. Syst. Appl. 7 (2012) 19–40.

[37] W. Yang, Existence and stability of periodic solutions of BAM high–order Hopfield neural networks with impulses and delays on time scales, Electronic Journal of Differential Equations 2012 (2012), No. 38, pp. 122.

[38] Y. Li, L. Yang, Almost automorphic solution for neutral type high–order Hopfield neural networks with delays in leakage terms on time scales, Appl. Math. Comput. 242 (2014) 679–693.

[39] J. Gao, Q. R. Wang, L. W. Zhang, Existence and stability of almost-periodic solutions for cellular neural networks with time-varying delays in leakage terms on time scales, Appl. Math. Comput. 237 (2014) 639–649.

[40] H. Zhou, Z. Zhou, W. Jiang, Almost periodic solutions for neutral type BAM neural networks with distributed leakage delays on time scales, Neurocomputing 157 (2015) 223–230.

[41] B. Du, Y. Liu, H. A. Batarfi, F. E. Alsaadi, Almost periodic solution for a neutral–type neural networks with distributed leakage delays on time scales, Neurocomputing 173 (3) (2016) 921–929.

[42] Y. K. Li, L. Yang, W. Q. Wu, Anti–periodic solution for impulsive BAM neural networks with time–varying leakage delays on time scales, Neurocomputing 149 (2015) 536–545.

[43] C. Wang, R. P. Agarwal, Almost periodic dynamics for impulsive delay neural networks of a general type on almost periodic time scales, Commun. Nonlinear Sci. Numer. Simul. 36 (2016) 238–251.