Evolution of
Evolutionary Computation
Xin Yao
CERCIA
and
Natural Computation Group
School of Computer Science
The University of Birmingham, UK
http://www.cercia.ac.uk
What Is CERCIA
The Centre of Excellence for Research in Computational Intelligence and Applications
• Focuses on applications of Nature Inspired Computation to real-world business problems
• Knowledge transfer centre in the School of Computer Science, University of Birmingham
• Established January 2003 to bring cutting-edge technology to industry and businesses
The Birmingham Triangle
Basic Research: Natural Computation Group
Training and Teaching: MSc in Natural Computation
Applied Research and Knowledge Transfer: CERCIA
Outline of My Talk
Evolutionary Optimisation
Theoretical Foundations
Evolutionary Learning
Other Applications
Concluding Remarks
Evolutionary Optimisation
Numerical optimisation
Fast evolutionary algorithms
Hybrid local search and evolutionary algorithms
Constraint handling: stochastic ranking
Combinatorial optimisation
Simulated annealing
Genetic annealing
Hybrid algorithms
Fast Evolutionary Algorithms
Fast evolutionary programming (FEP), improved FEP (IFEP), and Lévy EP (LEP) for unconstrained optimisation
Based on a self-adaptive mixture of Gaussian, Cauchy and Lévy mutation operators
Both analytical and experimental studies have
shown their excellent performance on a wide
range of benchmark functions.
X. Yao, Y. Liu and G. Lin, “Evolutionary programming made faster,” IEEE Transactions on Evolutionary Computation, 3(2):82–102, July 1999.
C. Y. Lee and X. Yao, “Evolutionary programming using the mutations based on the Lévy probability distribution,” IEEE Transactions on Evolutionary Computation, 8(1):1–13, January 2004.
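The Cauchy-mutation idea behind FEP can be sketched in a few lines. The function below is an illustrative simplification, not the authors' implementation: the names (`fep_mutate`, `eta`) and parameter defaults are mine, following the self-adaptation rule commonly used in evolutionary programming.

```python
import math
import random

def fep_mutate(x, eta, tau=None, tau_prime=None):
    """One FEP-style mutation step (a sketch): self-adaptive step sizes
    eta are perturbed lognormally, then each variable gets a
    Cauchy-distributed perturbation, whose heavy tails make long escape
    jumps more likely than with Gaussian mutation."""
    n = len(x)
    tau = tau if tau is not None else 1.0 / math.sqrt(2.0 * math.sqrt(n))
    tau_prime = tau_prime if tau_prime is not None else 1.0 / math.sqrt(2.0 * n)
    g = random.gauss(0.0, 1.0)  # shared lognormal factor for this individual
    new_eta = [e * math.exp(tau_prime * g + tau * random.gauss(0.0, 1.0))
               for e in eta]
    # standard Cauchy variate generated via the tangent of a uniform angle
    new_x = [xi + ei * math.tan(math.pi * (random.random() - 0.5))
             for xi, ei in zip(x, new_eta)]
    return new_x, new_eta

# usage: mutate a 10-dimensional parent with initial step sizes of 3.0
parent = [0.0] * 10
steps = [3.0] * 10
child, child_steps = fep_mutate(parent, steps)
```

IFEP goes one step further: it creates both a Gaussian and a Cauchy offspring from each parent and keeps the better one.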
Benchmark functions are fine
for testing ideas and
publishing papers. What
about real applications?
Discovering New Physical “Laws” in Astrophysics: Modelling Radial Brightness Distributions in Elliptical Galaxies
Empirical laws are widely used in
astrophysics.
However, as the observational data
increase, some of these laws do not seem
to describe the data well.
Can EC evolve/discover new empirical laws
that describe the data better?
Galaxies
Galaxies are the basic building blocks of the universe
How galaxies form and evolve is not well
understood
The study of the appearance of a galaxy might
help answer the question (morphological
classification)
One of the exercises in astronomy is to obtain a
radial brightness profile of the galaxy
Two galaxy types: ellipticals and spirals
Monochromatic (Negative) Images of Galaxies
A typical elliptical galaxy
A typical spiral galaxy
bulge
disk
Current Approach
Pick a functional form in advance
Drawbacks: ad hoc, difficult to determine, and may only suit a small number of profiles
Apply fitting algorithms to find suitable parameters for the function, usually via non-linear reduced χ² minimisation
Drawbacks: difficult to set initial values; easily trapped in local minima

χ² = (1/ν) Σᵢ [(I_obs(i) − I_model(i)) / σᵢ]²

where I_obs(i) is the individual observed profile value, I_model(i) is the value calculated from the fitting function, ν is the number of degrees of freedom, and σᵢ is the standard deviation of the data.
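For concreteness, the reduced χ² statistic can be computed directly. This is a minimal sketch; the function name and the toy data are mine.

```python
def reduced_chi_squared(i_obs, i_model, sigma, n_params):
    """Reduced chi-squared for a profile fit:
    chi2 = (1/nu) * sum(((obs - model) / sigma)**2),
    where nu = number of data points minus number of fitted parameters."""
    nu = len(i_obs) - n_params  # degrees of freedom
    return sum(((o - m) / s) ** 2
               for o, m, s in zip(i_obs, i_model, sigma)) / nu

# a perfect fit gives chi2 = 0; values near 1 usually indicate a good fit
obs = [1.0, 2.0, 3.0, 4.0]
model = [1.0, 2.0, 3.0, 4.0]
sigma = [0.1] * 4
print(reduced_chi_squared(obs, model, sigma, n_params=2))  # → 0.0
```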
Our Evolutionary Approach
(1) Finding functional forms using GP :
A data-driven process without assuming a functional form in advance
A bottom-up process which suits modelling a large number of galaxy profiles without any prior knowledge of them
(2) Fitting parameters in the form found using FEP:
Not sensitive to initial setting values
More likely to find global minima
J. Li, X. Yao, C. Frayn, H. G. Khosroshahi and S. Raychaudhury, “An Evolutionary Approach to Modeling Radial Brightness Distributions in Elliptical Galaxies,” Proc. of the 8th International Conference on Parallel Problem Solving from Nature (PPSN VIII), Lecture Notes in Computer Science, Vol. 3242, pp. 591–601, Springer, September 2004.
Well …
It is debatable whether this is a
real application. It is still in the
scientific research domain, isn’t
it? …
Determining Unified Creep Damage Constitutive
Equations in Aluminium Alloy Modelling
Creep behaviours of different materials are often
described by physically based unified creep
damage constitutive equations.
Such equations are extremely complex.
They often contain undetermined constants (parameters).
Traditional approaches are unable to find good near-optimal values for the parameters.
EAs have been shown to be more effective.
Example Equations
Difficulties for Traditional Optimisation Algorithms
Large areas of approximate plateaus
Huge variation of gradients among different parameters, from 5.69E-6 to 8.23E15.
The objective functions have narrow ridges,
somewhat similar to the Rosenbrock function.
Traditional methods often find a solution which is
the same as the starting point!
Evolutionary Parameter Calibration and
Optimisation
Classical evolutionary programming (CEP), fast EP (FEP) and improved fast EP (IFEP) can be used to find parameters in a complex and non-differentiable space.
B. Li, J. Lin and X. Yao, “A novel evolutionary algorithm for determining unified creep damage constitutive equations,” International Journal of Mechanical Sciences, 44(2002):987–1002.
Experimental Results
Model with Evolved Parameters by IFEP
IFEP Worked Well
Evolutionary algorithms can find near optimal
parameter values for the constitutive equations.
Evolutionary algorithms can deal with complex and hard-to-differentiate functions.
Evolutionary algorithms can be used to discover
models and optimise their parameters.
But …
My problem has lots of constraints.
Existing constraint handling
algorithms look very complicated.
Any simple yet effective method?
Stochastic Ranking for
Constraint Handling
Replace the selection method in an EA with stochastic ranking, and you get a constrained optimisation algorithm!
Stochastic ranking is very simple to implement.
It’s basically a stochastic bubble sort algorithm.
It’s very effective.
T. P. Runarsson and X. Yao, “Stochastic Ranking for Constrained Evolutionary Optimization,” IEEE Transactions on Evolutionary Computation, 4(3):284–294, September 2000.
T. P. Runarsson and X. Yao, “Search Bias in Constrained Evolutionary Optimization,” IEEE Transactions on Systems, Man, and Cybernetics, Part C, 35(2):233–243, May 2005.
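The stochastic bubble sort itself fits in a short sketch. This is a simplification: `pf = 0.45` follows the value suggested in the 2000 paper, but the function names and the toy problem are mine.

```python
import random

def stochastic_rank(pop, f, phi, pf=0.45):
    """Stochastic ranking as a stochastic bubble sort.
    pop: list of individuals; f: objective (to be minimised);
    phi: total constraint violation (0 means feasible);
    pf: probability of comparing by f even when an individual is infeasible."""
    pop = list(pop)
    n = len(pop)
    for _ in range(n):  # at most n sweeps
        swapped = False
        for i in range(n - 1):
            a, b = pop[i], pop[i + 1]
            # compare by objective if both are feasible, or with probability pf
            if (phi(a) == 0 and phi(b) == 0) or random.random() < pf:
                in_order = f(a) <= f(b)
            else:
                in_order = phi(a) <= phi(b)
            if not in_order:
                pop[i], pop[i + 1] = b, a
                swapped = True
        if not swapped:
            break
    return pop

# toy usage: minimise x^2 subject to x >= 1 (violation = max(0, 1 - x))
xs = [0.0, 2.0, 1.5, -3.0]
ranked = stochastic_rank(xs, f=lambda x: x * x,
                         phi=lambda x: max(0.0, 1.0 - x))
```

The balance between objective-based and violation-based comparisons, controlled by `pf`, is what lets infeasible but promising individuals survive long enough to reach the feasible region.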
I wish …
The world is simple, but my problem
is really difficult, complicated,
complex, … Everything I’ve tried so
far has failed. …
If only I had simple problems all the time!
Wish Granted, at Least Partially
How about making a difficult problem
simpler before solving it?
Landscape approximation using a
quadratic function
Local search + EAs
K.-H. Liang, X. Yao and C. Newton, “Evolutionary search of approximated N-dimensional landscapes,” International Journal of Knowledge-Based Intelligent Engineering Systems, 4(3):172–183, July 2000.
All Sound Very Interesting, …
But I’m dealing with combinatorial
optimisation problems, not
numerical ones. …
Similar Ideas Can Be Applied to CO
EP algorithm for cutting stock problems, where knowledge-based mutation operators were used
K.-H. Liang, X. Yao, C. S. Newton and D. Hoffman, “A New Evolutionary Approach to Cutting Stock Problems With and Without Contiguity,” Computers and Operations Research, 29(12):1641–1659, October 2002.
Materialised view selection in data warehousing
using a discrete variant of stochastic ranking for
constraint handling
C. Zhang, X. Yao and J. Yang, “An Evolutionary Approach to Materialized Views Selection in a Data Warehouse Environment,” IEEE Transactions on Systems, Man and Cybernetics, Part C, 31(3):282–294, August 2001.
J. X. Yu, X. Yao, C.-H. Choi and G. Gou, “Materialized view selection as constrained evolutionary optimization,” IEEE Transactions on Systems, Man and Cybernetics, Part C, 33(4):458–467, November 2003.
Real World Applications?
Gritting Truck Route Optimisation
H. Handa, L. Chapman and X. Yao, “Dynamic Salting Route Optimisation using Evolutionary Computation,” Proc. of the 2005 Congress on Evolutionary Computation (CEC'05), Vol. 1, 2–5 September 2005, IEEE Press, Piscataway, NJ, USA, pp. 158–165.
South Gloucestershire Road Network
OK, OK, …
You’ve made your points: there are
novel EAs that are very good.
But all your results are experimental.
There are no theories behind them …
Are you sure you know what you are talking about?
Hmmm …. Maybe a Small Clue?
While this might have been true a few years ago, it is less so now.
Not only can we prove convergence of EAs, we can now analyse an EA's computation time (scalability) for a given class of combinatorial problems.
J. He and X. Yao, “Time Complexity Analysis of an Evolutionary Algorithm for Finding Nearly Maximum Cardinality Matching,” Journal of Computer Science and Technology, 19(4):450–458, July 2004.
J. He and X. Yao, “From an Individual to a Population: An Analysis of the First Hitting Time of Population-Based Evolutionary Algorithms,” IEEE Transactions on Evolutionary Computation, 6(5):495–511, October 2002.
J. He and X. Yao, “Drift Analysis and Average Time Complexity of Evolutionary Algorithms,” Artificial Intelligence, 127(1):57–85, March 2001.
Some Theoretical Results
We can show necessary and sufficient conditions under which an EA needs no more than polynomial time (or no less than exponential time) to find a global optimum for a given problem class
The particular analytical tool we used is drift analysis
J. He and X. Yao, “A study of drift analysis for estimating computation time of evolutionary algorithms,” Natural Computing, 3(1):21–35, January 2004.
J. He and X. Yao, “Towards an analytic framework for analysing the computation time of evolutionary algorithms,” Artificial Intelligence, 145(1–2):59–97, April 2003.
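To give a flavour of what drift analysis provides, the additive drift argument can be sketched as follows. The notation here is a simplified paraphrase, not a verbatim statement of the theorem.

```latex
% Additive drift, stated informally.
% d(X_t): distance of the population X_t from the set of global optima;
% T: first hitting time of a global optimum.
\text{If } \;
\mathbb{E}\bigl[\, d(X_t) - d(X_{t+1}) \,\big|\, d(X_t) > 0 \,\bigr]
  \;\ge\; c_{\mathrm{low}} > 0,
\quad \text{then} \quad
\mathbb{E}[\, T \mid X_0 \,] \;\le\; \frac{d(X_0)}{c_{\mathrm{low}}}.
```

Intuitively: if the EA makes, on average, at least a constant amount of progress towards the optimum per generation, the expected hitting time is bounded by the initial distance divided by that constant.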
EA-Hard Problems
There are two and only two types of EA-hard problems:
1. The “wide-gap” problems; and
2. The “long-path” problems.
Wait a Minute!
Optimisation is important, but I’m more
interested in learning!
Learning, adaptation, autonomy, …
These are trendy words that attract attention (e.g., funding).
Evolutionary Learning
Evolving neural network ensembles
Negative correlation learning
Artificial speciation and niching
Multi-objective approaches
Co-evolution
Iterated prisoner's dilemma games
Co-evolving neural networks
Evolutionary Learning in a Picture
Fitness Evaluation
Learning is Different From Optimisation
In optimisation, the fitness function reflects what is needed: the optimal value is always better than the second-best one.
In learning, it is hard to quantify generalisation exactly in practice.
Why
select the “best” individual in a
population as the final output?
X. Yao, Y. Liu and P. Darwen, “How to make best use of evolutionary learning,” Complexity International: An Electronic Journal of Complex Systems Research (ISSN 1320-0682), Vol. 3, July 1996.
Use Populations!
Instead of selecting the best individual as
the evolved solution, more (or even all)
individuals should be selected and
combined to form the final solution
The Simplest Approach
Do nothing during the evolution
Then combine all the individuals from
the last generation
X. Yao and Y. Liu, “Making use of population information in evolutionary artificial neural networks,” IEEE Trans. on Systems, Man, and Cybernetics, Part B: Cybernetics, 28(3):417–425, June 1998.
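The “do nothing during evolution, then combine everyone” idea can be illustrated with a minimal averaging sketch. The function name and the simple-average rule here are my simplification; the cited paper studies several combination methods.

```python
def ensemble_predict(population, x):
    """Combine every individual (e.g., every evolved network) from the
    final generation by averaging their outputs, instead of keeping only
    the single best one. Each individual is any callable x -> float."""
    outputs = [individual(x) for individual in population]
    return sum(outputs) / len(outputs)

# toy population of three 'models' standing in for evolved networks
pop = [lambda x: x + 1.0, lambda x: x - 1.0, lambda x: x]
print(ensemble_predict(pop, 2.0))  # → 2.0
```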
Speciation by Fitness Sharing or Negative
Correlation
Introduce diversity generation and maintenance
methods, such as fitness sharing or negative
correlation, during evolution (or gradient descent
training)
The aim is to form a population of diverse
specialists (species) automatically
Md. Monirul Islam, X. Yao and K. Murase, “A constructive algorithm for training cooperative neural network ensembles,” IEEE Transactions on Neural Networks, 14(4):820–834, July 2003.
Y. Liu, X. Yao and T. Higuchi, “Evolutionary Ensembles with Negative Correlation Learning,” IEEE Trans. on Evolutionary Computation, 4(4):380–387, 2000.
P. J. Darwen and X. Yao, “Speciation as automatic categorical modularization,” IEEE Transactions on Evolutionary Computation, 1(2):101–108, 1997.
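The negative correlation penalty can be sketched for a single training sample. This is a simplified rendering of the penalty term; the function name and the default λ are mine.

```python
def ncl_errors(outputs, target, lam=0.5):
    """Per-member error with a negative correlation penalty, for one sample.
    outputs: [F_1, ..., F_M], the outputs of the M ensemble members;
    target: desired value. The penalty p_i = (F_i - Fbar) * sum_{j != i}
    (F_j - Fbar) rewards members that deviate from the ensemble mean in a
    different direction from the rest, encouraging diverse specialists."""
    m = len(outputs)
    fbar = sum(outputs) / m  # ensemble (mean) output
    errors = []
    for i, fi in enumerate(outputs):
        others = sum(fj - fbar for j, fj in enumerate(outputs) if j != i)
        penalty = (fi - fbar) * others  # equals -(fi - fbar)**2
        errors.append(0.5 * (fi - target) ** 2 + lam * penalty)
    return errors

# three members around the target 1.0; the middle member sits on the mean
errs = ncl_errors([0.9, 1.1, 1.0], target=1.0, lam=0.5)
```

With λ = 0 this reduces to independent squared-error training; increasing λ trades individual accuracy for ensemble diversity.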
Multi-objective Approaches
Two-objective evolutionary learning:
Accuracy: minimising quadratic/mean square error;
Diversity: minimising mutual information, i.e., the negative correlation penalty term.
A. Chandra and X. Yao, “Ensemble learning using multi-objective evolutionary algorithms,” Journal of Mathematical Modelling and Algorithms, accepted May 2005.
A. Chandra and X. Yao, “DIVACE: Diverse and Accurate Ensemble Learning Algorithm,” Proc. of the Fifth International Conference on Intelligent Data Engineering and Automated Learning (IDEAL'04), Lecture Notes in Computer Science, Springer, Vol. 3177, pp. 619–625, August 2004.
Co-evolutionary Learning of Iterated Prisoner’s Dilemma (IPD) Game Strategies
IPD with more than two players
IPD with more than two choices
IPD with reputation
IPD with mistakes/noise
Representations and genotype-phenotype mapping
N-player IPD Games
Group Size
Multiple Choices
More Choices Discourage Cooperation
Reputation Helps
Noise/Mistakes
Low noise may promote cooperation,
while high noise discourages it
Other Applications
Evolutionary computation has “invaded” many other domains:
Search-based software engineering (SBSE)
Games and entertainment in general
Creative design
EPSRC (EP/C514297/1): Nature Inspired Creative Design. http://www.nature-inspired.org/
More information:
www.cercia.ac.uk
Concluding Remarks
Evolutionary computation (EC) is evolving in
four major areas: optimisation, learning, design
and theory.
EC is much richer than any single algorithm. Never think of EC as just a genetic algorithm.
EC is a problem solving approach as well as a
discovery engine.
This talk has only scratched the surface of EC.