ARTICLE IN PRESS

Neurocomputing 69 (2005) 232–241
www.elsevier.com/locate/neucom

Letters
Chaos and transient chaos in simple Hopfield neural networks
Xiao-Song Yang a,b,*, Quan Yuan b

a Department of Mathematics, Huazhong University of Science and Technology, Wuhan 430074, China
b Institute for Nonlinear Systems, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
Received 10 May 2005; received in revised form 7 June 2005; accepted 7 June 2005
Available online 15 September 2005
Communicated by R.W. Newcomb
Abstract

In this paper a class of simple chaotic Hopfield neural networks is presented, and a bifurcation from transient chaos to chaos is discussed.

© 2005 Elsevier B.V. All rights reserved.
Keywords: Chaos; Transient chaos; Lyapunov exponent; Hopfield neural networks
1. Introduction

An understanding of the dynamical behavior of neural networks is of paramount significance to the study of the function of the brain in information processing and to possible engineering applications. Among the various dynamics of neural networks, deterministic chaos is of much interest: it has been regarded as a powerful mechanism for the storage, retrieval and creation of information in neural networks, and it has received considerable attention [1–5,7,11].
0925-2312/$ - see front matter © 2005 Elsevier B.V. All rights reserved.
doi:10.1016/j.neucom.2005.06.005

* Corresponding author. Department of Mathematics, Huazhong University of Science and Technology, Wuhan 430074, China.
E-mail addresses: [email protected] (X.-S. Yang), [email protected] (Q. Yuan).
As regards the Hopfield-type neural networks (HNNs) [8] described by autonomous ordinary differential equations, chaos was mainly observed in the case that the dimension is not less than four [2,3]. In the last few years chaos has also been numerically observed in three-dimensional HNNs with sigmoidal response functions and investigated in several papers [4,11]. In particular, Yang and Li [11] offered a rigorous analysis by means of topological horseshoe theory.

In this paper, we are concerned with chaos in the following 3-neuron Hopfield-type neural networks

    ẋ = −cx + W f(x),  x ∈ R^3,                                  (1)

where f(x) = (f(x1), f(x2), f(x3))^T with f_i being a monotone continuous function which is bounded above and below, and W = (w_ij) is a 3×3 connection matrix that describes the strength of connections between neurons. We report here our finding of a class of simple three-dimensional HNNs that exhibit chaotic and transient chaotic behavior.
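For readers who want to experiment, the right-hand side of (1) is easy to write down directly. A minimal sketch in Python (the paper's own computations use Matlab; the function names here are ours), with the tanh activation and the connection matrix (3) used later in Section 2:

```python
import numpy as np

def connection_matrix(p):
    """Connection matrix (3) of Section 2, parameterized by p."""
    return np.array([[ 2.0,     -1.2,  0.0 ],
                     [ 1.9 + p,  1.71, 1.15],
                     [-4.75,     0.0,  1.1 ]])

def hnn_rhs(x, W, c=1.0, f=np.tanh):
    """Right-hand side of (1): x' = -c*x + W f(x)."""
    x = np.asarray(x, dtype=float)
    return -c * x + W @ f(x)

# Since tanh(0) = 0, the origin is always an equilibrium of (1)
print(hnn_rhs([0.0, 0.0, 0.0], connection_matrix(0.1)))
```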
2. Simple Hopfield neural networks and their Lyapunov exponents
In this section we consider the following simple 3-neuron HNNs

    ẋ = −x + W f(x),  x ∈ R^3,                                   (2)

that have the connection topology illustrated in Fig. 1. Corresponding to this connection topology, we consider a connection matrix of the following form:

        [  2       −1.2   0    ]
    W = [  1.9+p    1.71  1.15 ]                                 (3)
        [ −4.75     0     1.1  ]

with the parameter p varying from −0.35 to 0.55, and f(x) = tanh(x) as illustrated in Fig. 2.

Fig. 1. The connection topology includes a directed loop.

Fig. 2. The sigmoidal function f(x) = tanh(x).

An analytic approach is usually limited to rather simple cases in the study of nonlinear systems, and numerical simulation by computer is more efficient (although heuristic) in studying the complex behavior of nonlinear systems. Therefore we study the dynamics of (2) through Lyapunov exponents (LEs) computed by numerical methods. The LEs are fundamental to the understanding of chaotic processes: they describe the time-asymptotic rate of separation of neighboring trajectories in a dynamical system, and positive LEs of a dynamical system indicate that the system is chaotic. For detailed discussions of the LEs and their relation to chaos, we refer the reader to [6,9,10]. The LEs for every p ∈ (−0.35, 0.55) are calculated by means of Matlab, and are illustrated in Fig. 3.

Fig. 3. Lyapunov exponents for parameters p ∈ (−0.35, 0.55).
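The LE computation described above follows the standard scheme of evolving the variational equation alongside the trajectory and periodically re-orthonormalizing the tangent vectors. A compact Python sketch of that scheme (the appendix gives the authors' Matlab version; the step sizes, renormalization interval, and RK4 integrator here are our illustrative assumptions):

```python
import numpy as np

def connection_matrix(p):
    """Connection matrix (3), parameterized by p."""
    return np.array([[ 2.0,     -1.2,  0.0 ],
                     [ 1.9 + p,  1.71, 1.15],
                     [-4.75,     0.0,  1.1 ]])

def lyapunov_spectrum(p, x0, t_renorm=0.5, n_renorm=2000, h=0.01):
    """Estimate the three LEs of system (2): evolve the trajectory together
    with a 3x3 matrix of tangent vectors and re-orthonormalize (QR) every
    t_renorm time units, accumulating log stretching factors."""
    W = connection_matrix(p)

    def rhs(u):
        x, Y = u[:3], u[3:].reshape(3, 3)
        dx = -x + W @ np.tanh(x)
        # Jacobian of the RHS: -I + W * diag(1 - tanh(x)^2)
        jac = -np.eye(3) + W * (1.0 - np.tanh(x) ** 2)
        return np.concatenate([dx, (jac @ Y).ravel()])

    def rk4_step(u):
        k1 = rhs(u); k2 = rhs(u + h / 2 * k1)
        k3 = rhs(u + h / 2 * k2); k4 = rhs(u + h * k3)
        return u + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

    u = np.concatenate([np.asarray(x0, dtype=float), np.eye(3).ravel()])
    sums = np.zeros(3)
    steps = int(round(t_renorm / h))
    for _ in range(n_renorm):
        for _ in range(steps):
            u = rk4_step(u)
        Q, R = np.linalg.qr(u[3:].reshape(3, 3))
        sums += np.log(np.abs(np.diag(R)))
        u[3:] = Q.ravel()  # reset to an orthonormal tangent basis
    return sums / (n_renorm * t_renorm)

# Short illustrative run at p = 0.1 (the paper reports one positive LE there;
# a run this short is only indicative)
print(lyapunov_spectrum(0.1, [-10.0, -20.0, 30.0], n_renorm=400))
```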
3. From transient chaos to chaos in simple Hopfield neural networks
From Fig. 3 it can be seen that (2) is chaotic for some parameters because of the corresponding positive LEs. Of special interest is that numerical studies reveal an unfamiliar bifurcation phenomenon of a route from periodic trajectory to chaos: a route from transient chaos to chaos, which seems different from the existing routes to chaos as discussed in [9].

Transient chaos is a ubiquitous phenomenon in nonlinear dynamical systems, which refers to the case where a trajectory typically behaves chaotically for a finite amount of time and then settles into a final nonchaotic state, such as a periodic trajectory or an equilibrium point.

For (2), this transition from transient chaos to chaos can take place when the parameter p varies from 0.099 to 0.1. Now we discuss this phenomenon for the parameter p ∈ [0.099, 0.1].

For the parameter p = 0.099, computer simulation demonstrates that by taking the two initial conditions (0.7454, 1.7542, −4.9817) and (−0.7454, −1.7542, 4.9817) we obtain two different stable periodic trajectories (limit cycles), as illustrated in Fig. 4.

For p = 0.1, one LE of (2) is positive and thus (2) is chaotic; computer simulation shows that (2) has a pair of chaotic attractors, as shown in Fig. 5, which can be located by numerical simulation of two trajectories with the two different initial conditions (−1.6855, 0.2929, 3.4720) and (1.6855, −0.2929, −3.4720).

When the parameter p increases from 0.099 to 0.1, HNN (2) undergoes an interesting bifurcation from limit cycles to chaotic attractors, which seems different from the period doubling, intermittency and crisis routes to chaos as well studied in the literature [9].
Fig. 4. Two different stable trajectories R1 and R2.
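The attractor-locating experiment just described is straightforward to reproduce. A sketch in Python (the paper used Matlab; the fixed-step RK4 integrator and step size here are our illustrative choices), using the two symmetric initial conditions quoted for p = 0.099:

```python
import numpy as np

def simulate_hnn(p, x0, t_end=200.0, h=0.01):
    """Integrate the 3-neuron HNN (2) with fixed-step RK4 and return the
    sampled trajectory as an array of shape (n_steps + 1, 3)."""
    W = np.array([[ 2.0,     -1.2,  0.0 ],
                  [ 1.9 + p,  1.71, 1.15],
                  [-4.75,     0.0,  1.1 ]])
    f = lambda x: -x + W @ np.tanh(x)
    n = int(round(t_end / h))
    traj = np.empty((n + 1, 3))
    x = np.asarray(x0, dtype=float)
    traj[0] = x
    for i in range(n):
        k1 = f(x); k2 = f(x + h / 2 * k1)
        k3 = f(x + h / 2 * k2); k4 = f(x + h * k3)
        x = x + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[i + 1] = x
    return traj

# Because the right-hand side is odd (tanh(-x) = -tanh(x)), the trajectories
# from these two mirror-image initial conditions are mirror images of each
# other, which is why the attractors in Figs. 4 and 5 come in symmetric pairs.
r1 = simulate_hnn(0.099, [ 0.7454,  1.7542, -4.9817])
r2 = simulate_hnn(0.099, [-0.7454, -1.7542,  4.9817])
```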
Fig. 5. Two different chaotic attractors A1 and A2.

Numerical simulation shows that the transient chaotic time becomes longer and longer as the parameter p varies increasingly from 0.099 to 0.1, and when p approaches 0.1 the transient chaotic time goes to infinity; thus a sustained chaotic behavior appears.

For the parameter p ∈ [0.099, 0.1), a typical trajectory exhibits a chaotic behavior for a finite amount of time and then goes asymptotically to a stable periodic trajectory. Take p = 0.0993 for example. Consider two (properly chosen) initial conditions (−10, −20, 30) and (10, 20, −30). The two trajectories with these two initial conditions display chaotic behavior for t ∈ (10, 200); the corresponding portraits of these two trajectories during this period of time are illustrated in Fig. 6. However, for t > 400, these two trajectories display periodic behavior, by approaching two different stable periodic trajectories r1 and r2, respectively, as shown in Fig. 7.

Now for p = 0.0997, the two trajectories with the same two initial conditions (−10, −20, 30) and (10, 20, −30) display chaotic behavior for t ∈ (10, 600); the corresponding portraits of these two trajectories during this period of time are illustrated in Fig. 8. For t > 800, these two trajectories display periodic behavior, approaching two different stable periodic trajectories, as shown in Fig. 9, respectively.

A great deal of numerical simulation shows that as long as an initial condition is taken not to be near these two stable periodic trajectories, the trajectory with this initial condition displays the transient chaotic behavior described above.

A possible underlying mechanism for this phenomenon is that the attraction basins of the stable periodic trajectories become smaller as the parameter increases. When the parameter approaches 0.1, the attraction basins turn out to disappear and the two periodic trajectories lose their stability. As for the chaotic sets illustrated in Figs. 6 and 8, they may change a little but begin to be attractive; consequently the chaotic behavior is sustained.

Fig. 6. Transient chaos for t ∈ (10, 200).

Fig. 7. Two different stable periodic trajectories r1 and r2.
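One way to watch the transient lengthen numerically is to track the distance between two trajectories started very close together: during the chaotic transient the distance grows and fluctuates at order one, and once both trajectories settle onto the same stable limit cycle it collapses again. This diagnostic is our illustrative choice, not a procedure from the paper; a sketch:

```python
import numpy as np

def separation_history(p, x0, delta=1e-6, t_end=500.0, h=0.01, sample_every=50):
    """Integrate two copies of (2) started delta apart (fixed-step RK4) and
    record the distance between them over time."""
    W = np.array([[ 2.0,     -1.2,  0.0 ],
                  [ 1.9 + p,  1.71, 1.15],
                  [-4.75,     0.0,  1.1 ]])
    f = lambda x: -x + W @ np.tanh(x)

    def step(x):
        k1 = f(x); k2 = f(x + h / 2 * k1)
        k3 = f(x + h / 2 * k2); k4 = f(x + h * k3)
        return x + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

    a = np.asarray(x0, dtype=float)
    b = a + delta
    times, dists = [0.0], [np.linalg.norm(a - b)]
    n = int(round(t_end / h))
    for i in range(1, n + 1):
        a, b = step(a), step(b)
        if i % sample_every == 0:
            times.append(i * h)
            dists.append(np.linalg.norm(a - b))
    return np.array(times), np.array(dists)

# For p = 0.0993 one can then look for the time past which the distance
# stays small, as a rough estimate of the transient chaotic time
t, d = separation_history(0.0993, [-10.0, -20.0, 30.0], t_end=200.0)
```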
Fig. 8. Transient chaos for t ∈ (10, 600).

Fig. 9. Two different stable periodic trajectories r̄1 and r̄2.

4. Conclusions

In this paper we have shown a rich dynamics in a new class of simple 3-dimensional autonomous continuous time HNNs described by (2). In particular, we have demonstrated that (2) with very simple connection matrices can exhibit transient chaos-to-chaos behavior when a parameter varies. It is expected that finding more such simple chaotic neural networks would be helpful for studying the role of chaos in neural networks' information processing.

However, the rich dynamics in the simple 3-dimensional autonomous HNNs discussed here is only demonstrated by means of calculation of the LEs. As is well known, the LEs calculated by computer are just an approximation of the original LEs because of the finite amount of time for calculation. Therefore a rigorous analysis remains to be carried out to prove the results obtained in this paper.
Acknowledgment

This work is supported in part by the Program for New Century Excellent Talents in University. The valuable comments by the reviewers are greatly appreciated.
Appendix A. The code for the Lyapunov exponents

% Driver script: compute and plot the LEs of (2) for p = 0.0997
[T,Res] = lyapunov(3,@new_ext,@ode45,0,0.5,2000,[-10 -20 30],10);
figure; plot(T,Res); grid on;

% --- file new_ext.m: system (2) extended with its variational equation ---
function f = new_ext(t,X)
x = X(1); y = X(2); z = X(3);
Y = [X(4), X(7), X(10);
     X(5), X(8), X(11);
     X(6), X(9), X(12)];
f = zeros(12,1);
f(1) = -x + 2*tanh(x) + (-1.2)*tanh(y);
f(2) = -y + (1.9+0.0997)*tanh(x) + (1.71)*tanh(y) + (1.15)*tanh(z);
f(3) = -z + (-4.75)*tanh(x) + (1.1)*tanh(z);
% Jacobian of the right-hand side
Jac = [-1+2*(1-tanh(x)^2),         -1.2*(1-tanh(y)^2),    0;
       (1.9+0.0997)*(1-tanh(x)^2), -1+1.71*(1-tanh(y)^2), 1.15*(1-tanh(z)^2);
       (-4.75)*(1-tanh(x)^2),      0,                     -1+1.1*(1-tanh(z)^2)];
f(4:12) = Jac*Y;

% --- file lyapunov.m: LEs by integration with Gram-Schmidt reorthonormalization ---
function [Texp,Lexp] = lyapunov(n,rhs_ext_fcn,fcn_integrator,tstart,stept,tend,ystart,ioutp)
n1 = n; n2 = n1*(n1+1);
% Number of steps
nit = round((tend-tstart)/stept);
% Memory allocation
y = zeros(n2,1); cum = zeros(n1,1); y0 = y;
gsc = cum; znorm = cum;
% Initial values
y(1:n) = ystart(:);
for i = 1:n1 y((n1+1)*i) = 1.0; end;
t = tstart;
% Main loop
for ITERLYAP = 1:nit
  % Solution of extended ODE system
  [T,Y] = feval(fcn_integrator,rhs_ext_fcn,[t, t+stept],y);
  t = t+stept;
  y = Y(size(Y,1),:);
  for i = 1:n1
    for j = 1:n1 y0(n1*i+j) = y(n1*j+i); end;
  end;
  % Construct new orthonormal basis by Gram-Schmidt
  znorm(1) = 0.0;
  for j = 1:n1 znorm(1) = znorm(1)+y0(n1*j+1)^2; end;
  znorm(1) = sqrt(znorm(1));
  for j = 1:n1 y0(n1*j+1) = y0(n1*j+1)/znorm(1); end;
  for j = 2:n1
    for k = 1:(j-1)
      gsc(k) = 0.0;
      for l = 1:n1 gsc(k) = gsc(k)+y0(n1*l+j)*y0(n1*l+k); end;
    end;
    for k = 1:n1
      for l = 1:(j-1)
        y0(n1*k+j) = y0(n1*k+j)-gsc(l)*y0(n1*k+l);
      end;
    end;
    znorm(j) = 0.0;
    for k = 1:n1 znorm(j) = znorm(j)+y0(n1*k+j)^2; end;
    znorm(j) = sqrt(znorm(j));
    for k = 1:n1 y0(n1*k+j) = y0(n1*k+j)/znorm(j); end;
  end;
  % Update running vector magnitudes
  for k = 1:n1 cum(k) = cum(k)+log(znorm(k)); end;
  % Normalize exponents
  for k = 1:n1
    lp(k) = cum(k)/(t-tstart);
  end;
  % Accumulate output
  if ITERLYAP == 1
    Lexp = lp; Texp = t;
  else
    Lexp = [Lexp; lp]; Texp = [Texp; t];
  end;
  if (mod(ITERLYAP,ioutp) == 0)
    fprintf('t = %6.4f',t);
    for k = 1:n1 fprintf(' %10.6f',lp(k)); end;
    fprintf('\n');
  end;
  for i = 1:n1
    for j = 1:n1
      y(n1*j+i) = y0(n1*i+j);
    end;
  end;
end;
References

[1] A. Babloyantz, C. Lourenco, Brain chaos and computation, Int. J. Neural Syst. 7 (4) (1996) 461–471.
[2] H. Bersini, The frustrated and compositional nature of chaos in small Hopfield networks, Neural Networks 11 (6) (1998) 1017–1025.
[3] H. Bersini, P. Sener, The connections between the frustrated chaos and the intermittency chaos in small Hopfield networks, Neural Networks 15 (10) (2002) 1197–1204.
[4] A. Das, P. Das, A.B. Roy, Chaos in a three-dimensional general model of neural network, Int. J. Bifurc. Chaos 12 (10) (2000) 2271–2281.
[5] G. Dror, M. Tsodyks, Chaos in neural networks with dynamic synapses, Neurocomputing 32–33 (2000) 365–370.
[6] J.P. Eckmann, D. Ruelle, Ergodic theory of chaos and strange attractors, Rev. Mod. Phys. 57 (3) (1985) 617–656.
[7] M.R. Guevara, L. Glass, et al., Chaos in neurobiology, IEEE Trans. Syst. Man Cybern. 13 (1983) 790–798.
[8] J.J. Hopfield, Neurons with graded response have collective computational properties like those of two-state neurons, Proc. Natl. Acad. Sci. USA 81 (10) (1984) 3088–3092.
[9] E. Ott, Chaos in Dynamical Systems, Cambridge University Press, New York, 1993.
[10] S. Wiggins, Introduction to Applied Nonlinear Dynamical Systems and Chaos, Springer, New York, 1990.
[11] X.-S. Yang, Q. Li, Horseshoe chaos in cellular neural networks, Int. J. Bifurc. Chaos 16 (1) (2006), to appear.
Xiao-Song Yang was born in China in 1964. He received his M.S. degree in Applied Mathematics from the Huazhong Normal University, Wuhan in 1991, and received his Ph.D degree in Pure Mathematics from the University of Science and Technology of China, Hefei in 1998.
He served as an Instructor of Mathematics at the Wuhan Institute of Chemical Technology from 1991 to 1995. In 1998, he served as a Director and an Associate Professor at the Institute of Nonlinear Systems at the Chongqing University of Posts and Telecommunications, Chongqing. In 2001, he was Professor of Mathematics and Electronic Engineering at the same university. In 2004, he joined Huazhong University of Science and Technology as a Professor of Mathematics and Systems Engineering.
He served as an Adjunct Professor of Automatic Control at the Xiamen University, Xiamen. He also served as an Adjunct Professor of Mathematics at the Southwest University, Chongqing. He is an Associate Editor of the Journal of Control and Application.
His interests include dynamical systems theory, chaos, control, neural information processing and neural networks, and robotics.

Quan Yuan was born in Nanjing, February 20, 1981. He graduated from the Nanjing University of Posts and Telecommunications in June 2003. He is now a graduate student at the Chongqing University of Posts and Telecommunications under the supervision of Prof. Xiao-Song Yang.
His interests include neural networks and communication.