Journal of Modern Physics, 2011, 2, 621-626
doi:10.4236/jmp.2011.226072 Published Online June 2011 (http://www.SciRP.org/journal/jmp)
Biological Evolution: Entropy, Complexity and Stability
Charu Gopal Chakrabarti, Koyel Ghosh
Department of Applied Mathematics, University of Calcutta, Kolkata, India
E-mail: cgc_math@rediffmail.com, koyelghosh1983@gmail.com
Received December 28, 2010; revised March 28, 2011; accepted April 10, 2011
Abstract
In the present paper we have made an attempt to investigate the importance of the concepts of dynamical stability and complexity, along with their interrelationship, in an evolving biological system described by a system of kinetic (both deterministic and chaotic) equations. The key to the investigation lies in the expression of a time-dependent Boltzmann-like entropy function derived from the dynamical model of the system. A significant result is the determination of the expression of the Boltzmann-entropy production rate of the evolving system, which leads to the well-known Pesin-type identity and provides an elegant and simple measure of dynamical complexity in terms of positive Lyapunov exponents. The expression of dynamical complexity has been found to be very suitable for studying the increase of dynamical complexity with the successive instabilities resulting from the appearance of new polymer species (or ecological species) in the original system. The increase of the dynamical complexity with the evolutionary process has been explained with a simple competitive model system leading to the “principle of natural selection”.
Keywords: Boltzmann-Like Entropy, Dynamical Stability, Pesin-Type Identity and Dynamical Complexity,
Lyapunov Exponents, Structural Instabilities, Biological Evolution
1. Introduction
In physical science, evolution refers to the approach of a system to thermodynamic equilibrium, characterized by the increase of entropy according to the second law of thermodynamics. Evolution in a physical system is always directed towards continuous disorganization, that is, towards the destruction of the structures introduced by the initial conditions [1-3]. Biological evolution, on the other hand, points precisely in the opposite direction. In biology the idea of evolution is associated with an irreversible increase of organization giving rise to the creation of more and more complex structures. These two aspects of evolution can be reconciled by the concept of the open-system model of a living system and the application of the second law of thermodynamics to the system as a whole (living system plus environment) [1-3]. As in physical science, entropy plays a significant role in biological evolution [4-12]. Both organization and complexity can be measured in terms of entropy. Another important idea which plays a significant role in the study of evolution is the concept of stability. Evolution can, in fact, be viewed as a problem of stability [1], and it can also be considered as the process that generates most, if not all, complex structures in nature [8]. The concepts of stability and complexity are interrelated and play a significant role in the process of evolution, as we are going to investigate.
In the present paper we have made an attempt to study the importance of the concepts of stability and complexity, together with their relationship, in the characterization of the evolution of a biological system. We have first considered a dynamical model of the biological system consisting of a number of interacting polymer species (or ecological species) described by a system of non-linear rate equations. We have studied the local dynamical behaviour of the system, such as the criteria of stability and complexity, around a stationary state. We have next considered a statistical mechanical model of the system around the stationary state and obtained the expression of a Boltzmann-like entropy of the evolving system. The rate of change of the Boltzmann entropy, that is, the Boltzmann-entropy production rate, leads to the well-known Pesin-type identity which provides an elegant and simple measure of dynamical complexity in terms of positive Lyapunov exponents [17-19]. The expression of dynamical complexity has been found to be very suitable in the study of the increase of complexity with the successive instabilities resulting from the appearance of new polymer species (or ecological species) into the original
system. The increase of the dynamical complexity with
the evolutionary process has been explained with a sim-
ple competitive model system leading to the “principle of
natural selection”.
2. Biological System: Dynamical Model and
Stability
Let us consider a biological system consisting of $n$ interacting components (e.g. polymer species or ecological species) of concentration $x_i$ ($i = 1, 2, \ldots, n$). The dynamical model consists of the set of rate equations in state space

$$\dot{x} = f(x, \mu) \qquad (2.1)$$

where $x = (x_1, x_2, \ldots, x_n)$ is a point in the $n$-dimensional state-space and $\mu$ is a control parameter. For simplicity we consider a single parameter. The functions $f = (f_1, f_2, \ldots, f_n)$ are assumed to be continuously differentiable in some open set $S = \{x_i;\ 0 < x_i < \infty,\ i = 1, 2, \ldots, n\}$ in the state-space. The system of equations (2.1) is in general non-linear and it is difficult to find the solution in closed form. We assume that for a certain value (or a range of values) of the control parameter $\mu$ the system (2.1) has a stationary solution or fixed point $x^* = (x_1^*, x_2^*, \ldots, x_n^*)$ (say). We consider a neighbouring point $x(t) = x^* + \delta x(t)$, where $\delta x(t)$ is the deviation of $x(t)$ from the stationary state or fixed point $x^*$. Linearizing the system of Equations (2.1) about the stationary point or fixed point $x^*$, the time-evolution of $\delta x(t)$ is given by

$$\delta \dot{x}(t) = A(x^*)\,\delta x(t) \qquad (2.2)$$

where $A(x^*) = \left(\partial f_i / \partial x_j\right)_{x = x^*}$ is the Jacobian matrix of the function $f$ at the stationary state $x^*$. The time derivative on the left-hand side represents the local derivative $\delta/\delta t$ instead of the total time-derivative $\mathrm{d}/\mathrm{d}t$. The solution of (2.2) is given by

$$\delta x(t) = e^{At}\,\delta x(0) \qquad (2.3)$$

where $\delta x(0)$ is the deviation of the initial state (or point) from the stationary point $x^*$. For an orthonormal representation of the Jacobian matrix $A(x^*)$ we have

$$A(x^*) = \mathrm{Diag}(\lambda_1, \lambda_2, \ldots, \lambda_n) \qquad (2.4)$$

where $\lambda_1, \lambda_2, \ldots, \lambda_n$ are the eigenvalues of the matrix $A(x^*)$. In this case the solution (2.3) reduces to the form

$$\delta x_i(t) = \delta x_i(0)\, e^{\lambda_i t}, \qquad i = 1, 2, \ldots, n \qquad (2.5)$$

The solution (2.5) shows that asymptotic stability of the stationary state $x^*$ requires that the real parts of all eigenvalues be negative: $\mathrm{Re}(\lambda_j) < 0$ for all $j$. This implies that all the deviations $\delta x_i(t)$ regress with time for asymptotic stability. If any one of the eigenvalues has a positive real part, the stationary state is unstable.
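The linear stability test based on (2.5) is easy to carry out numerically. The following Python sketch is only an illustration of the procedure, not part of the original analysis: the rate law, parameter values and fixed point are assumptions introduced for demonstration. It builds a finite-difference Jacobian $A(x^*)$ and checks whether all eigenvalues have negative real parts.

```python
import numpy as np

def jacobian(f, x_star, eps=1e-6):
    """Finite-difference Jacobian A(x*) = (df_i/dx_j) at the fixed point x_star."""
    n = len(x_star)
    A = np.zeros((n, n))
    f0 = np.asarray(f(x_star), dtype=float)
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        A[:, j] = (np.asarray(f(x_star + dx), dtype=float) - f0) / eps
    return A

def is_asymptotically_stable(f, x_star):
    """Asymptotic stability in the sense of (2.5): Re(lambda_j) < 0 for all j."""
    eigvals = np.linalg.eigvals(jacobian(f, x_star))
    return bool(np.all(eigvals.real < 0)), eigvals

# Illustrative two-species rate law (an assumption, used only as a test case).
def f(x):
    x1, x2 = x
    return [x1 * (1.0 - x1 - 0.5 * x2),
            x2 * (0.8 - x2 - 0.6 * x1)]

# The interior fixed point of this test system is (6/7, 2/7).
stable, lam = is_asymptotically_stable(f, np.array([6 / 7, 2 / 7]))
print(stable, lam)   # True; both eigenvalues have negative real parts
```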
We can extend the above result to chaotic systems. For the dynamical model system (2.1) let us consider two neighbouring trajectories, one arising from the reference point $x_r$ and another from the neighbouring point $x(t) = x_r + \delta x(t)$. The distance between the two trajectories at time $t$ is given by the Euclidean norm

$$\left\|\delta x(t)\right\| = \left[\delta x_1^2(t) + \delta x_2^2(t) + \cdots + \delta x_n^2(t)\right]^{1/2} \qquad (2.6)$$

For an orthonormal representation of the Jacobian matrix $A(x^*)$ we have

$$\delta x_i(t) = \delta x_i(0)\, e^{\lambda_i t}, \qquad i = 1, 2, \ldots, n \qquad (2.7)$$

where $\lambda_i$ ($i = 1, 2, \ldots, n$) is the Lyapunov exponent defined as the average rate of divergence of two neighbouring trajectories [20]:

$$\lambda_i = \lim_{\substack{t \to \infty \\ \delta x_i(0) \to 0}} \frac{1}{t} \log \left|\frac{\delta x_i(t)}{\delta x_i(0)}\right|, \qquad i = 1, 2, \ldots, n \qquad (2.8)$$

It is important to note that the Lyapunov exponent $\lambda_i$ is the value of the real part of the $i$th eigenvalue averaged over the trajectory under study [20]. For different values (zero, negative and positive) of the Lyapunov exponents we obtain different types of attractors, for example fixed points, limit cycles, quasiperiodic tori and chaotic attractors in three-dimensional state space [17].
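Definition (2.8) also suggests a direct numerical estimate of the largest Lyapunov exponent: integrate a reference trajectory together with a nearby one, accumulate the logarithmic growth of their separation, and rescale the separation at every step so that it stays infinitesimal. The sketch below is a minimal illustration of this standard two-trajectory procedure under assumed step sizes and integration length; it is not taken from the paper.

```python
import numpy as np

def largest_lyapunov(f, x0, dt=0.01, steps=200000, d0=1e-8):
    """Two-trajectory (Benettin-style) estimate of the largest Lyapunov exponent, cf. (2.8)."""
    def rk4(x, h):
        k1 = f(x); k2 = f(x + 0.5 * h * k1)
        k3 = f(x + 0.5 * h * k2); k4 = f(x + h * k3)
        return x + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

    x = np.asarray(x0, dtype=float)
    y = x + d0 * np.random.default_rng(0).standard_normal(x.size)
    log_sum = 0.0
    for _ in range(steps):
        x, y = rk4(x, dt), rk4(y, dt)
        d = np.linalg.norm(y - x)
        log_sum += np.log(d / d0)
        y = x + (d0 / d) * (y - x)          # rescale the separation back to d0
    return log_sum / (steps * dt)

# Example: the Rossler flow of Section 4 with a = b = 0.1, c = 14.
def rossler(v, a=0.1, b=0.1, c=14.0):
    x, y, z = v
    return np.array([-y - z, x + a * y, b + z * (x - c)])

print(largest_lyapunov(rossler, [1.0, 1.0, 0.0]))   # roughly 0.07
```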
3. Statistical Model: Entropy and Dynamical
Complexity
We now consider the concept of complexity. A system consisting of a large number of interacting or interrelated elements or components is called a complex system. How is complexity to be measured? There are different approaches to the concept of complexity. Entropy, which is at the heart of statistical mechanics and information theory, plays a vital role in the characterization of complexity [11]. For an entropic characterization of complexity we need a statistical mechanical model built from the dynamical model of the system. A statistical model is necessary in view of the enormous number of accessible microstates (or representative points) along the different trajectories from the initial state to the current state. Statistical mechanics is also necessary for chaotic systems, which are characterized by the exponential separation of nearby trajectories [18,19]. Both cases represent complexity in the dynamical behaviour of the system. To find a measure of complexity we require an appropriate measure of entropy characterizing the evolution of the system from the initial state x(0) to the current state x(t). For the system under consideration we
have, from (2.5),

$$\delta x(t) = e^{At}\,\delta x(0) = B(t)\,\delta x(0) \qquad (3.1)$$

where $B(t) = e^{At}$ is the evolution matrix and plays a crucial role in the evolution of the system. In the orthonormal representation the evolution matrix $e^{At}$ can be represented as the diagonal matrix $e^{At} = \mathrm{diag}\left(e^{\lambda_1 t}, e^{\lambda_2 t}, \ldots, e^{\lambda_n t}\right)$. The diagonal elements $e^{\lambda_i t}$, $i = 1, 2, \ldots, n$, characterize the different trajectories connecting the initial state to the final (current) state. The accessible microstates along the trajectories are thus characterized by the quantities $e^{\lambda_i t}$, $i = 1, 2, \ldots, n$, and the measure of all accessible microstates lying on the $i$-th trajectory can be taken to be proportional to the quantity $e^{\lambda_i t}$. The quantity

$$W(t) = \prod_{i=1}^{n} e^{\lambda_i t}$$

is then proportional to the measure (or volume) of the totality of all accessible microstates lying on the different trajectories for the evolution $\delta x(0) \to \delta x(t)$. With this interpretation of the quantity $W(t)$ we can define a Boltzmann-like entropy of the macrostate consisting of all accessible microstates in the evolution $\delta x(0) \to \delta x(t)$ as

$$H_B(t) = \ln W(t) = \ln \prod_{i=1}^{n} e^{\lambda_i t} = \sum_{i=1}^{n} \lambda_i\, t \qquad (3.2)$$
where we have taken the multiplicative constant factor to be equal to unity. The quantity (3.2) is the measure of entropy associated with the evolution $\delta x(0) \to \delta x(t)$. We can approach the determination of the expression of the entropy (3.4) without any consideration of the statistical model. The complexity of the evolving system lies with the entropy of the evolution matrix $B(t) = e^{At}$. The entropy of the non-probabilistic square matrix $B(t)$, consistent with the Boltzmann entropy, is given by [14]

$$H_B(t) = \log\left|B(t)\right| = \log \prod_{i=1}^{n} e^{\lambda_i t} \qquad (3.3)$$

where $|B(t)|$ is the determinant of the diagonal matrix $e^{At} = \mathrm{diag}\left(e^{\lambda_1 t}, e^{\lambda_2 t}, \ldots, e^{\lambda_n t}\right)$. A question may arise about the physical validity of the Boltzmann-like entropy for the system under consideration. Since the system is in the vicinity of a local stationary state with a certain value of the control parameter $\mu$, the use of the Boltzmann-like entropy is justified [19]. The exponentials in (3.3) with negative Lyapunov exponents $\lambda_i$ must be replaced by 1, since only one microstate with negative Lyapunov exponent is occupied ($\delta x_i \to 0$). So the entropy (3.2) reduces to the form

$$H_B(t) = \sum_{i=1}^{n} \lambda_i\, t \qquad (3.4)$$

where the summation on the right-hand side extends over all positive Lyapunov exponents only. Evidently the entropy $H_B(t)$ is zero at the initial time $t = 0$ and increases with the time evolution. We now proceed to measure the complexity associated with the evolution. The complexity, which we call dynamical complexity, is a property of the evolution of a state and not of the state itself [13]. We, therefore, define the dynamical complexity as the rate of change of the entropy (3.4), that is, the Boltzmann-entropy production rate

$$C = \dot{H}_B(t) = \sum_{i=1}^{n} \lambda_i \qquad (3.5)$$

where the right-hand side represents the sum of all positive Lyapunov exponents. The result (3.5) is analogous to the Kolmogorov-Sinai entropy rate (or simply K-S entropy) and corresponds to the well-known Pesin identity [17,21]. The Pesin-like identity plays a significant role in the characterization of the complexity of different types of dynamical systems, for example hyperbolic dynamical systems [22]. The complexity measure (3.5) depends entirely on the positive Lyapunov exponents of the system. It remains bounded for attractors, which may be a point, a closed curve or an unclosed but bounded orbit, even for a very complex one such as Rössler's strange attractor [15,16].
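As a concrete illustration of (3.4) and (3.5), the short Python snippet below takes a spectrum of Lyapunov exponents and returns the Boltzmann-like entropy at time $t$ and the dynamical complexity as the sum of the positive exponents. The example spectrum is the Rössler spectrum quoted in Section 4; the function names are illustrative only.

```python
def boltzmann_entropy(lyapunov_exponents, t):
    """H_B(t): sum of the positive Lyapunov exponents multiplied by t, cf. (3.4)."""
    return t * sum(lam for lam in lyapunov_exponents if lam > 0)

def dynamical_complexity(lyapunov_exponents):
    """C = dH_B/dt: sum of the positive Lyapunov exponents, cf. (3.5)."""
    return sum(lam for lam in lyapunov_exponents if lam > 0)

spectrum = [0.072, 0.0, -13.79]           # Rossler exponents quoted in Section 4
print(boltzmann_entropy(spectrum, 10.0))  # 0.72
print(dynamical_complexity(spectrum))     # 0.072
```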
4. Structural Instabilities: Increase of
Complexity and Biological Evolution
In biology and sociology the idea of evolution is associated with the increase of complexity or organization, giving rise to more and more complex structures [1]. In this section we wish to study evolution on the basis of the measure of complexity (3.5). To study evolution we have to start with appropriate rate equations or model equations describing the process. We assume that the system is maintained uniformly and that there exists at least one asymptotically stable stationary solution of the model system. This implies that all deviations or fluctuations regress in time, that is, all the eigenvalues of the characteristic equation have negative real parts. We now need to incorporate structural fluctuations resulting from the appearance of a new species or mutants in the system at a stable steady state. Like in many other systems, the evolution of the system under consideration depends on the control parameter $\mu$. As the system evolves and is continuously perturbed by the outside world, the parameter $\mu$ can change smoothly or abruptly. A change of the parameter $\mu$ generally changes the structure of the rate equations. As a result of the change of the parameter $\mu$ the stability of the system may be disturbed by the inclusion of a new species into the system. However, in view of the smallness of the change of the parameter $\mu$, the structural stability of the rate equations is assumed not to be disturbed; the enlarged system evolves eventually to a new stationary state, reached sooner or later depending
on the magnitude of the changing parameter $\mu$ [1,23].
We have to study the increase of complexity (under certain conditions) of the enlarged system with the successive instabilities resulting from the successive appearance of new species. According to the measure of complexity (3.5), the increase of complexity is equivalent to the appearance of a new positive Lyapunov exponent, or an eigenvalue with positive real part. Let us explain this with a simple competitive model system. Let us first consider a single polymer species (or an ecological species) of concentration $x_1$ in a medium of limited resources. The dynamical equation of this reference species is assumed to be governed by the logistic growth equation

$$\dot{x}_1 = x_1 (a_1 - b_1 x_1) \qquad (4.1)$$

with stationary states $x_1^* = a_1/b_1$ and $x_1^* = 0$. The stationary state $0$ is unstable whereas $a_1/b_1$ is stable. The point $a_1/b_1$ thus represents the stable steady state of the species $x_1$.
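The stability statement for (4.1) can be verified symbolically: the linearization at a fixed point $x_1^*$ is $f'(x_1^*)$, which equals $a_1$ at $x_1^* = 0$ and $-a_1$ at $x_1^* = a_1/b_1$. A minimal check (a sketch, not part of the original paper):

```python
import sympy as sp

x1 = sp.symbols('x1')
a1, b1 = sp.symbols('a1 b1', positive=True)

f = x1 * (a1 - b1 * x1)                   # right-hand side of (4.1)
fixed_points = sp.solve(sp.Eq(f, 0), x1)  # [0, a1/b1]
fprime = sp.diff(f, x1)
for xs in fixed_points:
    # The sign of f'(x1*) plays the role of the eigenvalue in (2.5).
    print(xs, sp.simplify(fprime.subs(x1, xs)))
# 0     -> a1   (positive: unstable)
# a1/b1 -> -a1  (negative: stable)
```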
Now we suppose that a new species appears by mutation, or a new species invades the system. Let $x_2$ be the concentration of the new species at some time $t$; it is in competition with the original species $x_1$ for the limited resource. Let the governing equations for the whole system be given by the Lotka-Volterra model equations of competition (for simplicity excluding the case of competitive exclusion) [24]:

$$\begin{aligned} \dot{x}_1 &= x_1\left[a_1 - b_1(x_1 + x_2)\right] \\ \dot{x}_2 &= x_2\left[a_2 - b_2(x_1 + x_2)\right] \end{aligned} \qquad (4.2)$$
The system (4.2) has three stationary states $(s_1, s_2, s_3)$:

$$s_1 = (0, 0), \qquad s_2 = \left(\frac{a_1}{b_1}, 0\right), \qquad s_3 = \left(0, \frac{a_2}{b_2}\right) \qquad (4.3)$$

The first stationary state $s_1 = (0, 0)$ is trivial and unstable. The second stationary state $s_2 = (a_1/b_1, 0)$ consists of the population of the first species only and corresponds to the moment of the appearance of the second species or external disturbance. The system can then evolve if the state $s_2 = (a_1/b_1, 0)$ is unstable, which requires the positivity of the Lyapunov exponent, or the positivity of the real part of the eigenvalue. This requires

$$a_2 - \frac{b_2 a_1}{b_1} > 0 \quad \text{or} \quad \frac{a_2}{b_2} > \frac{a_1}{b_1} \qquad (4.4)$$
This is the condition for the growth of the second species $x_2$, and the second species $x_2$ grows to some finite value $a_2/b_2$. The total system then evolves to the new (third) stationary state $s_3 = (0, a_2/b_2)$, implying the extinction of the first species $x_1$. The third stationary state $s_3 = (0, a_2/b_2)$ is stable if $a_2/b_2 > a_1/b_1$. With this criterion of stability of the third stationary state $s_3$ we can now introduce a new species $x_3$ into the state $s_3 = (0, a_2/b_2)$ to disturb its stability. The criterion of instability of the system with the inclusion of the third new species $x_3$ requires

$$\frac{a_3}{b_3} > \frac{a_2}{b_2} \qquad (4.5)$$
The process may go on with successive instabilities of the stationary states with the appearance of new species. Note that the appearance of successive instabilities implies the appearance of successive eigenvalues with positive real parts (or positive Lyapunov exponents) and hence a step-by-step increase of complexity. In ecology this transitional process corresponds to the process of ecological succession [24,25]. This result may be interpreted in another way: the system tends in the long run to the stationary state characterized by the maximum of the fitness function $a_i/b_i$:

$$\frac{a_n}{b_n} > \cdots > \frac{a_3}{b_3} > \frac{a_2}{b_2} > \frac{a_1}{b_1} \qquad (4.6)$$

which is nothing but the Gause-Volterra principle of ‘natural selection’ in a competitive system. We thus see that the process of natural selection lies in the increase of complexity. We have proved the increase of complexity given by (3.5) with the increase of the number of species and therefore with the number of differential equations describing the dynamics of the system. The validity of the above statement requires two conditions to be satisfied. First, the system must be at a stable stationary state just before the appearance of any new species and, secondly, the system must be unstable just after the appearance of the new species. However, if either of these conditions fails, the increase of complexity with the increase of the number of differential equations does not materialize.
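The successive-replacement picture above can also be checked numerically. The sketch below uses the reconstructed form of (4.2) with assumed parameter values and an assumed integrator; it starts the resident species at its steady state $a_1/b_1$, introduces a trace amount of an invader with larger fitness $a_2/b_2$, and integrates forward. The state approaches $(0, a_2/b_2)$, i.e. the species with the larger $a_i/b_i$ survives.

```python
import numpy as np

def competition(v, a, b):
    """Right-hand side of (4.2): dx_i/dt = x_i (a_i - b_i (x_1 + x_2))."""
    return v * (a - b * v.sum())

def integrate(v0, a, b, dt=0.01, steps=20000):
    """Heun (improved Euler) integration of the competition model."""
    v = np.asarray(v0, dtype=float)
    for _ in range(steps):
        k1 = competition(v, a, b)
        k2 = competition(v + dt * k1, a, b)
        v = v + 0.5 * dt * (k1 + k2)
    return v

a = np.array([1.0, 1.2])     # growth rates a_1, a_2 (assumed values)
b = np.array([1.0, 1.0])     # self-limitation coefficients b_1, b_2
# Resident at its steady state a_1/b_1 plus a tiny amount of the invader:
v_final = integrate([a[0] / b[0], 1e-6], a, b)
print(v_final)   # close to (0, a_2/b_2) = (0, 1.2), since a_2/b_2 > a_1/b_1
```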
We shall illustrate this with a model system. Let us consider the Rössler model system, which possesses the simplest possible strange attractor. The system is described by the system of three differential equations [16]

$$\dot{x} = -y - z \qquad (4.7a)$$
$$\dot{y} = x + ay \qquad (4.7b)$$
$$\dot{z} = b + z(x - c) \qquad (4.7c)$$

For the choice of parameters $a = 0.1$, $b = 0.1$ and $c = 14$ there is an apparently chaotic attractor. The Lyapunov exponents have been determined by computer simulation to be approximately $0.072$, $0$ and $-13.79$. The first two equations of the system are linear. We begin by looking at the dynamics in the $xy$-plane only. Setting $z = 0$ yields
$$\dot{x} = -y, \qquad \dot{y} = x + ay \qquad (4.8)$$

The origin $(0, 0)$ is a stationary point. The eigenvalues of the Jacobian matrix at $(0, 0)$ are

$$\lambda_{1,2} = \frac{a \pm \sqrt{a^2 - 4}}{2}$$

For $a > 0$ there is at least one eigenvalue with positive real part (or positive Lyapunov exponent), so the origin is unstable. We may consider the system (4.8) to represent the dynamics of a system of two species $x$ and $y$ having the unstable stationary state $(0, 0)$. The system (4.7) may then be considered as the enlarged form of the system (4.8) obtained when a new species $z$ is introduced into the original system (4.8). The measure of complexity of the system (4.7) according to the formula (3.5) is $C_2 = 0.072$, while that of the system (4.8) is $C_1 = 0.1$. We thus see a decrease of complexity in spite of the increase in the number of differential equations. This is due to the instability of the stationary state of the original system (4.8). As such, the Rössler system (4.7) cannot serve as a mathematical model for biological evolution.
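The statement about the planar subsystem is easy to verify: the Jacobian of (4.8) at the origin has eigenvalues $(a \pm \sqrt{a^2 - 4})/2$, so for $a = 0.1$ both real parts equal $0.05$ and the sum of the positive real parts, i.e. the complexity measure (3.5) applied to the linear system, is $C_1 = 0.1$. A minimal numerical check (illustrative sketch only):

```python
import numpy as np

a = 0.1
J = np.array([[0.0, -1.0],
              [1.0,   a]])          # Jacobian of (4.8) at the origin
eigvals = np.linalg.eigvals(J)
print(eigvals)                      # approximately 0.05 +/- 0.9987j

C1 = sum(lam.real for lam in eigvals if lam.real > 0)
print(C1)                           # approximately 0.1: complexity of the planar system (4.8)

C2 = 0.072                          # sum of positive Lyapunov exponents of (4.7), from the text
print(C2 < C1)                      # True: adding the species z lowers the complexity
```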
5. Conclusions
The main objective of the paper is to study the interrelationship between the concepts of dynamical stability and complexity, and to study the importance of this relationship in an evolving biological system, on the basis of a dynamical model (both deterministic and chaotic) of the system described by a set of kinetic equations. The characteristic features and results of the paper are as follows:
1) We have started with the dynamical model of a bio-
logical system consisting of a number of interacting bio-
polymer species (or ecological species). The study of
dynamical stability and dynamical complexity is con-
fined to the local behaviour or analysis of the system
around a stationary or reference state.
2) A significant step in the characterization of the statistical and chaotic behaviours of the evolving system near a stationary or reference state lies in the use of the Boltzmann-like entropy (3.4), which is valid subject to the local character of the system around a stationary state or fixed point with a certain value of the control parameter $\mu$ [1,23]. A completely different approach to the Boltzmann-like entropy (3.4) is provided by the entropy of the evolution matrix [14].
3) Another significant result is the expression of
Boltzmann-entropy production rate leading to the
well-known Pesin-type identity which provides an ele-
gant and simple measure of the dynamical complexity
(3.5) in terms of positive Lyapunov exponents.
4) The dependence of the measure of complexity (3.5)
on the positive Lyapunov exponents (or positivity of real
parts of eigenvalues) makes it very easy to understand
the relationship between the concepts of dynamical sta-
bility and dynamical complexity. This stability-com-
plexity relationship is of significant importance in the
study of evolution of the system.
5) The expression of dynamical complexity (3.5), which is the sum of the positive Lyapunov exponents, is very helpful in the study of the increase of complexity with the instability resulting from the appearance of a new species in a system at a stable steady state. This is an advantage of the expression of dynamical complexity (3.5).
6) An increase of the number of species, or of the number of differential equations describing the system, does not always imply an increase of the dynamical complexity. The increase of dynamical complexity is subject to two conditions being satisfied. Violation of either of these conditions results in the failure of the increase of complexity. In Section 4 we have explained this with an illustrative example.
7) In Section 4, using a simple competitive model system, we have illustrated the increase of dynamical complexity with successive instabilities. We have also shown how the increase of dynamical complexity leads to the Gause-Volterra “principle of natural selection” for the survival of the fittest [6]. The present study of biological evolution on the basis of the concepts of stability, entropy and complexity is in the spirit of the principle of “order through fluctuation” [1-3].
6. Acknowledgements
The authors wish to thank the learned referees for their
valuable comments and suggestions for the modification
and revision of the paper. The work was done under a
major research project sanctioned by U.G.C (India).
7. References
[1] G. Nicolis and I. Prigogine, “Self-Organization in Non-Equilibrium Systems,” Wiley and Sons, New York, 1977.
[2] I. Prigogine and G. Nicolis, “Biological Order, Structure
and Instability,” Quarterly Review of Biophysics, Vol. 4,
No. 2-3, 1971, pp. 107-148.
[3] I. Prigogine, G. Nicolis and A. Babloyantz, “Thermody-
namics of Evolution. Part-1,” Physics Today, Vol. 25, No.
11, 1972, p. 23; “Thermodynamics of Evolution. Part-2,”
Physics Today, Vol. 25, No. 11, 1972, p. 38.
doi:10.1063/1.3071140
[4] L. Demetrius, “Thermodynamics and Evolution,” Journal
of Theoretical Biology, Vol. 206, No. 1, 2000, pp. 1-16.
doi:10.1006/jtbi.2000.2106
[5] B. H. Weber, D. J. Depew and J. D. Smith, “Entropy,
Information and Evolution,” MIT Press, Cambridge,
2000.
[6] R. Feistel and W. Ebeling, “Evolution of Complex Systems,” Kluwer Academic Publishers, Dordrecht, 1989.
[7] W. Ebeling and R. Feistel, “Theory of Self-Organization
and Evolution. The Role of Entropy, Value and Informa-
tion,” Journal of Non-Equilibrium Thermodynamics, Vol.
17, No. 4, 1992, pp. 303-332.
[8] T. S. Ray, “Evolution, Complexity and Artificial Real-
ity,” Physica D, Vol. 75, No. 1-3, 1994, pp. 239-263.
doi:10.1016/0167-2789(94)90286-0
[9] B. Drossel, “Biological Evolution and Statistical Phys-
ics,” Advances in Physics, Vol. 50, No. 2, 2001, pp.
209-295. doi:10.1080/00018730110041365
[10] N. H. Barton and J. Coe, “In the Application of Statistical
Physics to Evolutionary Biology,” Journal of Theoretical
Biology, Vol. 259, No. 2, 2009, pp. 317-324.
doi:10.1016/j.jtbi.2009.03.019
[11] C. G. Chakrabarti and S. Ghosh, “Statistical Mechanics
of Complex System,” Indian Journal of Theoretical
Physics, Vol. 48, 1995, p. 43; “Entropic Models and
Analysis of Complex Systems,” System Analysis, Model
and Simulation, Vol. 23, 1996, p. 103.
[12] C. G. Chakrabarti and K. Ghosh, “Maximum-Entropy
Principle: Ecological Organization and Evolution,” Jour-
nal of Biological Physics, Vol. 36, No. 2, 2009, pp.
175-183.
[13] S. Lloyd and H. Pagels, “Complexity as Thermodynamic Depth,” Annals of Physics, Vol. 188, No. 1, 1988,
pp. 186-213. doi:10.1016/0003-4916(88)90094-2
[14] G. Jumarie, “Maximum Entropy, Information without
Probability and Complex Fractals,” Kluwer Academic
Publishers, Dordrecht, 2000.
[15] S. H. Strogatz, “Non-Linear Dynamics and Chaos,” Addison-Wesley Publishing Co. Inc., New York, 1994.
[16] K. T. Alligood, T. D. Sauer and J. A. Yorke, “Chaos: An Introduction to Dynamical Systems,” Springer, New York,
1997.
[17] R. C. Hilborn, “Chaos and Non-Linear Dynamics,” Ox-
ford University Press, Oxford, 2000.
[18] J. R. Dorfman, “An Introduction to Chaos and
Non-Equilibrium Statistical Mechanics,” Cambridge Uni-
versity Press, Cambridge, 1999.
[19] N. Korabel and E. Barkai, “Pesin Type Identity for In-
termittent Dynamics with Zero Lyapunov Exponent,”
Physical Review Letters, Vol. 102, No. 5, 2009, Article
ID 050601. doi:10.1103/PhysRevLett.102.050601
[20] V. S. Anishchenko et al., “Non-Linear Dynamics of Cha-
otic and Stochastic Systems,” Springer, New York, 2001.
[21] V. Latora and M. Baranger, “Kolmogorov-Sinai Entropy
Rate Vs Physical Entropy,” Physical Review Letters, Vol.
82, No. 3, 1999, pp. 520-523.
doi:10.1103/PhysRevLett.82.520
[22] C. Beck and F. Schlögl, “Thermodynamics of Chaotic Systems,” Cambridge University Press, Cambridge, 1997.
[23] H. Haken, “Advanced Synergetics,” Springer, New York,
1983.
[24] C. G. Chakrabarti, S. Ghosh and S. Bhadra, “Non-Equi-
librium Thermodynamics of Lotka-Volterra Ecosystem: Stability and Evolution,” Journal of Biological Physics,
Vol. 21, No. 4, 1995, pp. 273-284.
doi:10.1007/BF00700629
[25] R. V. Sole and J. Bascompte, “Self-Organization in Com-
plex Ecosystems,” Princeton University Press, Princeton,
2004.