Novel Lagrange sense exponential stability criteria for time-delayed stochastic Cohen–Grossberg neural networks with Markovian jump parameters: A graph-theoretic approach

Iswarya Manickam (Department of Mathematics, Alagappa University, Karaikudi-630 004, India), Raja Ramachandran (Ramanujan Centre for Higher Mathematics, Alagappa University, Karaikudi-630 004, India), Grienggrai Rajchakit (Department of Mathematics, Faculty of Science, Maejo University, Chiang Mai, Thailand), Jinde Cao (School of Mathematics, Southeast University, Nanjing 211189, China; jdcao@seu.edu.cn), Chuangxia Huang (Department of Applied Mathematics, Changsha University of Science and Technology, Hunan Provincial Key Laboratory of Mathematical Modeling and Analysis in Engineering, Changsha 410114, China)


Introduction
Cohen and Grossberg introduced the Cohen-Grossberg neural networks (CGNNs) in 1983 [6]. Among the different types of neural networks (NNs), CGNNs have attracted considerable research attention owing to their broad applicability in domains such as biology and computing technology. Time delays are unavoidable in the implementation of NNs, and they may generate unwanted dynamic responses such as instability, oscillation, or chaotic behavior [19]. The influence of time delay on the dynamic behavior of systems has therefore become a fundamental problem that must be taken into consideration in natural science and engineering technology [3,4,8-10,12-14,16,17,21,28].
Furthermore, synaptic transmission in the nervous system is a noisy process, and a sufficiently strong stochastic input can stabilize or destabilize a NN [31]. Consequently, stability principles for stochastic delayed neural networks have become a research topic of prime importance; see [1,5]. Generally, information processing in NNs takes place in modes, which can be regarded as finite-state descriptions extracted from a trained network. A NN therefore possesses a finite number of modes, which may switch from one to another, and such switching can be modeled as being governed by a Markovian jump process. NNs with Markovian jumping have been a hot research topic, and many researchers have devoted themselves to this field during the last decade; see, e.g., [35,36].
An important and beneficial property of CGNNs is Lyapunov global stability. From the dynamical-systems viewpoint, a unique equilibrium point that attracts every trajectory is called monostable [1,6,19,20,25,26,28,30,35]. It is worth noting that Lagrange stability concerns the stability of the total system and does not depend on any particular equilibrium point. In the Lagrange setting, the Poincaré inequality, Lyapunov-Krasovskii functionals, and stochastic analysis have been utilized. In [28], stability results were obtained using Lyapunov-Krasovskii functionals and stochastic theory. In [22,23], the authors studied the stability of coupled systems using the Kronecker product method. With the help of Kirchhoff's matrix-tree theorem, Li et al. [15] originated a novel graph-theoretic approach. Building on this pioneering work, several researchers have applied the approach: in [30], boundedness of stochastic van der Pol oscillators was studied; in [32], boundedness of stochastic differential equations; and in [33], stability of neutral networks. In [31], boundedness of stochastic Cohen-Grossberg neural networks was studied using M-matrix theory, Lyapunov theory, and graph theory. Although the graph-theoretic approach is frequently used in stability studies (see [7,11,27,31,34]), to the best of the authors' knowledge it has rarely been combined with NNs; only one or two such works exist. Inspired by the aforementioned works, we consider the Lagrange stability analysis of CGNNs with Markovian jumping and mixed time delays, a problem not yet addressed in the existing literature.
To fill this gap, in this paper we incorporate concepts from graph theory into NNs. Specifically, the mathematical representation of a NN is taken to be a directed graph whose vertex systems are individual neurons and whose directed arcs are the synaptic interconnections among neurons. On the one hand, the Lyapunov method plays a major role in stability theory; on the other hand, constructing a suitable Lyapunov function directly is quite hard. To overcome this difficulty, a novel technique is explored on the basis of results from graph theory. Compared with existing work, the contributions of this paper are: (i) a novel method combining graph-theoretic principles with the Lyapunov method, applied to investigate the global exponential stability (GES) problem in the Lagrange sense for a new class of SCGNNs with delays and Markovian jump effects; (ii) new sufficient conditions given by Lyapunov-type and coefficient-type theorems, respectively; (iii) avoidance of the complications of finding a Lyapunov function directly, by constructing a suitable Lyapunov function for each vertex system using results from graph theory; (iv) illustrative examples and simulations that demonstrate the results.
The remainder of this work is organized as follows. In Section 2, preliminaries and the model description are given. The main results are presented in Section 3, and two numerical examples with simulations are given in Section 4. Finally, Section 5 concludes the paper.

Preliminaries and model descriptions
We begin this section with some notations and basic concepts related to graph theory, which are extensively applied throughout this study.

Notations
All stochastic processes are defined on a complete probability space [2] (Ξ, F, {F_t}_{t≥0}, P) with a one-dimensional Brownian motion W(·) and a filtration {F_t}_{t≥0} satisfying the usual conditions; P denotes the probability measure. Let R, R^n, and R^{m×n} denote, respectively, the set of all real numbers, Euclidean n-space, and the set of m×n real matrices. The norm of a vector in R^n is the Euclidean norm, denoted |·|. E(·) denotes the mathematical expectation with respect to P, L = {1, 2, ..., l}, and R^n_+ = {u ∈ R^n : u_k > 0, k = 1, 2, ..., n}. A right-continuous Markovian chain s(t) taking values in a finite state space S = {1, 2, ..., N} has generator S = (s_ij)_{N×N} defined by

P(s(t + ϑ) = j | s(t) = i) = s_ij ϑ + o(ϑ) if i ≠ j, and 1 + s_ii ϑ + o(ϑ) if i = j,

where ϑ > 0 and lim_{ϑ→0} o(ϑ)/ϑ = 0. Here s_ij ≥ 0 is the transition rate from i to j when i ≠ j, while s_ii = −Σ_{j≠i} s_ij.

A digraph G = (V, A) [29] consists of the set of vertices V = {1, 2, ..., l}, one per neuron, and a set A of arcs (k, h) leading from the kth neuron to the hth neuron. A subgraph S of a digraph G is said to be spanning if the vertex set of S is the same as that of G. If each arc (k, h) is assigned a positive weight a_kh, then the digraph G is called a weighted digraph W. The weight W(S) of a subgraph S is the product of the weights of all its arcs. A directed path P in G is a sequence of distinct vertices {P_1, P_2, ..., P_m} such that (P_k, P_{k+1}) is an arc for k = 1, 2, ..., m − 1; if in addition P_m = P_1, the path is a directed cycle Q. A unicyclic graph U in G is a subgraph consisting of a disjoint union of rooted trees whose roots form a directed cycle. The weight matrix A = (a_kh)_{l×l} of a weighted digraph W has entry a_kh > 0 equal to the weight of arc (k, h) if that arc exists, and a_kh = 0 otherwise. G is said to be strongly connected if, for any two distinct vertices, there is a directed path from each to the other. The Laplacian matrix of (G, A) is defined by L = (l_kh)_{l×l} with l_kh = −a_kh for k ≠ h and l_kk = Σ_{h≠k} a_kh.
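As an aside, the graph quantities used throughout (the Laplacian, the cofactors c_k of its diagonal elements, and strong connectedness) are straightforward to compute numerically. The following is a minimal sketch, assuming NumPy; the function names are ours, not from the paper.

```python
import numpy as np

def laplacian(A):
    """Laplacian of a weighted digraph (paper's convention, no self-loops):
    l_kh = -a_kh for k != h, and l_kk = sum over h != k of a_kh."""
    return np.diag(A.sum(axis=1)) - A

def cofactors(L):
    """Cofactor c_k of the kth diagonal element: the determinant of L with
    row k and column k deleted (the sign (-1)^(k+k) is +1)."""
    n = L.shape[0]
    return np.array([np.linalg.det(np.delete(np.delete(L, k, 0), k, 1))
                     for k in range(n)])

def strongly_connected(A):
    """Check strong connectedness via the boolean transitive closure."""
    n = A.shape[0]
    R = (A > 0) | np.eye(n, dtype=bool)
    for _ in range(n):
        R = R | ((R.astype(int) @ R.astype(int)) > 0)
    return bool(R.all())

# A 3-cycle with unit weights: strongly connected, every cofactor equals 1
A3 = np.array([[0.0, 1.0, 0.0],
               [0.0, 0.0, 1.0],
               [1.0, 0.0, 0.0]])
```

For a strongly connected weighted digraph all of these cofactors are positive, which is exactly the property Lemma 1 below relies on.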

Model description
In this paper, we consider the following new class of stochastic Cohen-Grossberg neural networks with mixed delays and Markovian jumping:

du_k(t) = −a_k(u_k(t), s(t)) [ b_k(u_k(t), s(t)) − Σ_{h=1}^{l} a_kh(s(t)) f_h1(u_h(t)) − Σ_{h=1}^{l} b_kh(s(t)) f_h2(u_h(t − τ_d(t))) − Σ_{h=1}^{l} c_kh(s(t)) ∫_{t−h_d(t)}^{t} f_h3(u_h(v)) dv − J_k(s(t)) ] dt + Σ_{h=1}^{l} t_kh(s(t)) [ g_h1(u_h(t)) + g_h2(u_h(t − τ_d(t))) + g_h3( ∫_{t−h_d(t)}^{t} u_h(v) dv ) ] dW(t),   (1)

with initial conditions of the form

u_k(t) = ϕ_k(t), t ∈ [−ξ, 0],   (2)

for k = 1, 2, ..., l. Here the state u_k(t) ∈ R is associated with the kth neuron at time t; a_k(u_k(t), s(t)) denotes the amplification function, and b_k(u_k(t), s(t)) the behaved function. The neuron transfer functions are f_hi(·), i = 1, 2, 3. The matrices (a_kh)_{l×l}, (b_kh)_{l×l}, (c_kh)_{l×l} are, respectively, the connection weight matrix and the discretely and distributively delayed connection weight matrices, and J_k(s(t)) ∈ R stands for the external input of the kth neuron with |J_k(s(t))| ≤ J_k (k = 1, 2, ..., l). W(·) and s(·) denote the white noise and the Markovian switching, respectively. τ_d(t) and h_d(t) denote, respectively, the discrete and distributed time delays, with 0 ≤ τ_d(t) ≤ τ_d and 0 ≤ h_d(t) ≤ h_d; (t_kh)_{l×l} is the noise connection weight matrix, and g_hi(·), i = 1, 2, 3, are the activation functions in the diffusion term. Here ξ = max{τ_d, h_d}, and ϕ_k(·) (k ∈ L) is a continuous real-valued function on [−ξ, 0]. Under suitable conditions, system (1) admits a unique solution.

Lemma 1. (See [15].) For l ≥ 2, the following identity holds:

Σ_{k=1}^{l} Σ_{h=1}^{l} c_k a_kh F_kh(u_k(t), u_h(t), i) = Σ_{U∈U} W(U) Σ_{(h,k)∈Q_U} F_kh(u_k(t), u_h(t), i).

Here, for k, h ∈ L, c_k denotes the cofactor of the kth diagonal element of the Laplacian matrix of (G, A), F_kh(u_k(t), u_h(t), i) is an arbitrary function, U is the collection of all spanning unicyclic graphs of (G, A), W(U) is the weight of U, and Q_U denotes the directed cycle of U. In addition, c_k > 0 for every k whenever (G, A) is strongly connected.
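To make the model structure concrete, the sketch below simulates a two-neuron instance of (1) with the Euler–Maruyama method, assuming NumPy. All parameter values are hypothetical, the mode-dependent functions are taken linear in u with tanh transfer functions, and the distributed-delay terms are omitted for brevity; this illustrates the model's structure, not the paper's examples.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, T, tau = 1e-3, 10.0, 0.2            # step size, horizon, discrete delay
steps, lag = int(T / dt), int(tau / dt)

# Two modes i = 0, 1; generator with rates s_01 = 1, s_10 = 2 (hypothetical)
S = np.array([[-1.0,  1.0],
              [ 2.0, -2.0]])

# Hypothetical mode-dependent data: a_k constant, b_k linear in u
a_amp  = [1.0, 1.2]                                      # amplification a_k(., i)
b_rate = [np.array([1.5, 1.5]), np.array([2.0, 2.0])]    # b_k(u, i) = b_rate * u
A = [np.array([[0.1, -0.2], [0.2, 0.1]]),                # instantaneous weights
     np.array([[0.2,  0.1], [-0.1, 0.2]])]
B = [np.array([[0.1,  0.1], [-0.1, 0.1]]),               # discretely delayed weights
     np.array([[0.05, -0.1], [0.1, 0.05]])]
Tn = [0.1 * np.eye(2), 0.2 * np.eye(2)]                  # noise weights t_kh(i)
J  = [np.array([0.5, -0.5]), np.array([0.2, 0.3])]       # external inputs J_k(i)

u = np.zeros((steps + 1, 2))
u[0] = [1.0, -1.0]                        # constant initial function on [-tau, 0]
mode = 0
for t in range(steps):
    if rng.random() < -S[mode, mode] * dt:    # jump with probability rate * dt
        mode = 1 - mode
    ud = u[max(t - lag, 0)]                   # u(t - tau); constant history before 0
    drift = -a_amp[mode] * (b_rate[mode] * u[t]
                            - A[mode] @ np.tanh(u[t])
                            - B[mode] @ np.tanh(ud)
                            - J[mode])
    dW = rng.normal(0.0, np.sqrt(dt))         # one-dimensional Brownian increment
    u[t + 1] = u[t] + drift * dt + (Tn[mode] @ np.tanh(u[t])) * dW
```

With these contractive choices the sample paths remain bounded, which is the qualitative behavior the Lagrange-sense results below certify.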
Definition 1. A compact set ω ⊆ R^n is called an attractive set of SCGNNs (1) if, for every solution u(t) ∈ R^n \ ω with initial condition (2), lim_{t→∞} ρ(u(t), ω) = 0. Here ω^c = R^n \ ω and ρ(u, ω) = inf_{z∈ω} |u − z| denotes the distance between u(t) and ω.

Definition 2. Suppose there exist a radially unbounded positive definite function V(u(t), i) ∈ C^{2,1}(R^n × S; R_+), two positive constants ζ and s, and a nonnegative continuous function k(·) such that, for any solution u(t) of system (1) with V(u(t), i) > s,

E V(u(t), i) − s ≤ k(ϕ) e^{−ζ t}, t ≥ 0.

Then the compact set ω = {u(t) ∈ R^n : V(u(t), i) ≤ s} is called a globally exponentially attractive (GEA) set of (1). Moreover, if network (1) has a GEA set, it is said to be globally exponentially stable (GES) in the Lagrange sense in the pth moment.
Throughout this study, we impose the following hypotheses to obtain Lagrange stability.

(A1) Let p ≥ 2 and let α_k, β_k, γ_k, δ_k, λ_k, η_k be positive constants. There exist a radially unbounded positive definite function V_k(u_k(t), i) ∈ C^{2,1}(R × S; R_+), an arbitrary function F_kh(u_k(t), u_h(t), i), and a matrix A = (a_kh)_{l×l} with a_kh(i) > 0 such that, for every k ∈ L, i ∈ S, t > 0,

λ_k |u_k(t)|^p ≤ V_k(u_k(t), i) ≤ η_k |u_k(t)|^p   (3)

and

LV_k(u_k(t), i) ≤ −α_k V_k(u_k(t), i) + β_k V_k(u_k(t − τ_d(t)), i) + γ_k ∫_{t−h_d(t)}^{t} V_k(u_k(v), i) dv + δ_k + Σ_{h=1}^{l} a_kh(i) F_kh(u_k(t), u_h(t), i).   (4)

(A2) For each i ∈ S, along every directed cycle Q of the weighted digraph (G, A),

Σ_{(k,h)∈Q} F_kh(u_k(t), u_h(t), i) ≤ 0.   (5)

(A3) a_k(u_k(t), i) is positive and bounded; that is, there are constants such that 0 < a_k ≤ a_k(u_k(t), i) ≤ ā_k for every k ∈ L, i ∈ S.

Exponential stability of stochastic Cohen-Grossberg neural networks
In this section, we examine the Lagrange stability of CGNNs. Our main results are given via two types of sufficient conditions: a Lyapunov-type theorem and a coefficient-type theorem.

Remark 1. Suppose the digraph (G, A) is balanced. Then the following identity holds:

Σ_{U∈U} W(U) Σ_{(k,h)∈Q_U} F_kh(u_k(t), u_h(t), i) = (1/2) Σ_{U∈U} W(U) Σ_{(k,h)∈Q_U} [F_kh(u_k(t), u_h(t), i) + F_hk(u_h(t), u_k(t), i)].   (6)

Condition (5) can therefore be relaxed to

Σ_{(k,h)∈Q} [F_kh(u_k(t), u_h(t), i) + F_hk(u_h(t), u_k(t), i)] ≤ 0.   (7)

From Fig. 1 we point out that there are two directed cycles Q_1 and Q_2; for all the functions involved, assumption (A2) is satisfied on this balanced digraph.
Corollary 1. Let (G, A) be a balanced digraph. If (7) is put in place of (5), then the Lyapunov-type theorem still holds.

Remark 2. Suppose that for each k, h there exist functions R_k(u_k) and R_h(u_h) such that

F_kh(u_k(t), u_h(t), i) ≤ R_k(u_k(t)) − R_h(u_h(t)).

Then, along any directed cycle Q,

Σ_{(k,h)∈Q} F_kh(u_k(t), u_h(t), i) ≤ Σ_{(k,h)∈Q} [R_k(u_k(t)) − R_h(u_h(t))] = 0,

since the sum telescopes around the cycle. In particular, for the dicycles in Fig. 1, assumption (A2) is satisfied.
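The cancellation behind Remark 2 is easy to check numerically: when F_kh has the form R_k − R_h, its sum around any directed cycle telescopes to zero, because each R_k enters once with a plus and once with a minus sign. A small sketch, assuming NumPy and arbitrary placeholder values:

```python
import numpy as np

rng = np.random.default_rng(1)
R = rng.normal(size=5)          # values R_k(u_k) at some state, k = 0, ..., 4
cycle = [0, 2, 4, 1, 3]         # a hypothetical directed cycle on 5 vertices

# F_kh(u_k, u_h) = R_k(u_k) - R_h(u_h): summing over the arcs of the cycle,
# each vertex value appears once positively and once negatively
arcs = list(zip(cycle, cycle[1:] + cycle[:1]))
total = sum(R[k] - R[h] for k, h in arcs)
```

The same cancellation holds for any cycle and any choice of the R_k, which is why the condition in Remark 2 suffices for (A2).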
Remark 3. Strong connectedness of the digraph (G, A) is essential in Theorem 1; in particular, the stability of the system depends on the topological properties of the CGNN. Conversely, if (G, A) is not strongly connected, then we may only examine the stability of part of the vertex systems. For instance, let (G, A) be a digraph with four vertices and unit arc weights that is not strongly connected, as shown in Fig. 2, with weight matrix A and Laplacian matrix L as listed there. By calculation we get c_1 = 0, c_2 = 0, c_3 = 0, c_4 = 3. That is, only the stability of a subsystem can be established: the stability of the 4th vertex system can be verified, but the stability of the complete system cannot be examined. Hence, from the graph-theoretic viewpoint, strong connectedness is needed.
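Remark 3's cofactor computation can be reproduced numerically. The paper's Fig. 2 digraph is not shown here, so the arc set below is a hypothetical four-vertex, unit-weight digraph with no arc leaving the 4th vertex; it is one choice consistent with the stated cofactor values c_1 = c_2 = c_3 = 0, c_4 = 3.

```python
import numpy as np

# Hypothetical arcs (0-indexed), unit weights; vertex 3 (the "4th") has no out-arcs
arcs = [(0, 1), (1, 0), (0, 3), (1, 3), (2, 3)]
A = np.zeros((4, 4))
for k, h in arcs:
    A[k, h] = 1.0

L = np.diag(A.sum(axis=1)) - A          # Laplacian, paper's convention
c = np.array([np.linalg.det(np.delete(np.delete(L, k, 0), k, 1))
              for k in range(4)])       # cofactors of the diagonal elements
```

Only the cofactor of the last vertex is positive, so Lemma 1's weighting discards every other vertex system; this is precisely why strong connectedness is required for a full-system conclusion.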
Remark 4. Suppose system (1) contains no noise term. It then reduces to the deterministic delayed CGNN

du_k(t)/dt = −a_k(u_k(t), s(t)) [ b_k(u_k(t), s(t)) − Σ_{h=1}^{l} a_kh(s(t)) f_h1(u_h(t)) − Σ_{h=1}^{l} b_kh(s(t)) f_h2(u_h(t − τ_d(t))) − Σ_{h=1}^{l} c_kh(s(t)) ∫_{t−h_d(t)}^{t} f_h3(u_h(v)) dv − J_k(s(t)) ],   k ∈ L, t ≥ 0,   (16)

which is the same as the system in [24, Eq. (1)].
Remark 5. Assume that hypotheses (A1)-(A2) hold; then system (16) is GES in the Lagrange sense. In this respect, our results strictly generalize those in [24].
Remark 6. Theorem 1 is proved using Lyapunov functions together with the topological structure of the CGNN. Consequently, it is essential to locate suitable Lyapunov functions in order to apply Theorem 1, and that is the role of Theorem 2. Theorem 2 specifies, in concrete terms, how to obtain the functions F_kh so that (3) and (4) in Theorem 1 hold, which shows that (3) and (4) are accessible and achievable.

Illustrative examples
In this section, we examine two examples with numerical simulations to demonstrate the correctness of this work.
Example 1. Consider two interacting neurons of SCGNNs with time delays and Markovian jump effects, denoted system (17). Here s(t) is a right-continuous Markovian chain taking values in S = {1, 2}, whose generator depends on a constant ν > 0, and the specified parameters are given in Table 1. Let p = 2; the chosen Lyapunov function V_k(u_k(t), i) is positive definite and radially unbounded, and moreover 0 < inf_{i∈S} a_k(u_k(t), i) ≤ a_k(u_k(t), i) ≤ sup_{i∈S} a_k(u_k(t), i) < +∞, so the required inequalities hold. Therefore, conditions (A4)-(A5) are satisfied by the given parameters. Hence, by Theorem 2, system (17) is GES in the Lagrange sense, and the trajectories of (17) converge to zero. Two trajectories of the system are displayed in Fig. 4. Table 1. Parameters for t ≥ 0 and k = 1. Table 2. Parameters for t ≥ 0 and k = 1.
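Since the entries of Table 1 are not reproduced here, the following stand-in illustrates what Lagrange-sense attraction looks like numerically: a Monte Carlo estimate of the second moment of a scalar Markov-switching SDE with hypothetical coefficients, started well outside the attractive set. It illustrates the notion in Definition 2, not system (17) itself.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, T, M = 1e-2, 5.0, 200                 # step, horizon, number of sample paths
steps = int(T / dt)

b  = np.array([1.0, 2.0])                 # hypothetical mode-dependent decay rates
J  = np.array([0.5, -0.5])                # hypothetical external inputs
sg = np.array([0.2, 0.3])                 # hypothetical noise intensities
rv = np.array([1.0, 2.0])                 # jump rate out of mode 0 and mode 1

u = np.full(M, 5.0)                       # every path starts far from the attractor
mode = np.zeros(M, dtype=int)
m2 = [np.mean(u ** 2)]                    # running estimate of E|u(t)|^2
for _ in range(steps):
    jump = rng.random(M) < rv[mode] * dt  # simulate the Markovian switching
    mode = np.where(jump, 1 - mode, mode)
    dW = rng.normal(0.0, np.sqrt(dt), M)  # Brownian increments, one per path
    u = u + (-b[mode] * u + J[mode]) * dt + sg[mode] * dW
    m2.append(np.mean(u ** 2))
```

The estimated second moment decays exponentially from its initial value into a small neighborhood of the attractive set and stays there, which is the behavior Definition 2 formalizes.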

Example 2. In this illustration, we examine a delayed SCGNN of the form (1) with Markovian jumping, denoted system (18), to show that our conditions have wider applicability than the results of [24]. Here s(t) is the right-continuous Markovian chain shown in Fig. 3, with generator S = (s_ij)_{2×2} whose rows are (−δ, δ) and (1, −1), taking values in S = {1, 2}, where δ > 0, and the initial conditions are of the form (2). Putting p = 2, the parameters in Table 2 satisfy all the conditions in the hypotheses of Theorem 2. Then, by that theorem, the trivial solution of (18) is Lagrange stable in mean square. Thus, our result is more easily verified and applied in practice than that of [14]. Two trajectories of system (18) are displayed in Fig. 5; clearly, the solution of (18) is exponentially stable in the Lagrange sense for p = 2.

Conclusions
This work concentrated on the stability analysis of CGNNs with mixed time-varying delays and Markovian switching via a graph-theoretic method. We established two novel sufficient criteria that ensure pth-moment exponential stability in the Lagrange sense for CGNNs. It is worth pointing out that the methods and techniques presented here differ from previous ones, such as linear matrix inequalities and the method of variation of parameters. Finally, two examples were given to justify the main results.