Evolutionary Algorithms

Evolutionary Algorithms
Per Kristian Lehre
pkle@imm.dtu.dk
DTU – Technical University of Denmark
IMM – Department of Informatics
9th Estonian Summer School on
Computer and Systems Science, ESSCaSS 2010
Optimisation
Given a function f : X → R, find an x ∈ X
such that f(x) ≥ f(y) for all y ∈ X.
A general problem with lots of applications!
Can be solved efficiently in many special cases:
  mathematical optimisation techniques
  optimisation variants of problems in P
Optimisation is thought to be hard in general:
  approximation algorithms
  exact exponential algorithms
  problem-independent randomised search heuristics,
  e.g. evolutionary algorithms
Evolution
Selection
Variation
Evolutionary Algorithms
Generate the initial population P(0) at random, and set t ← 0.
repeat
    Evaluate the fitness of each individual in P(t).
    Select parents from P(t) based on their fitness.
    Obtain population P(t+1) by
    applying crossover and mutation to parents.
    Set t ← t + 1.
until termination criterion satisfied.
Basic idea from natural evolution and population genetics:
survival of the fittest.
Not A New Idea...
Der Spiegel, November 18th, 1964.
Evolution Strategies (ES), a type of EA, were invented by
Hans-Paul Schwefel and Ingo Rechenberg at TUB in 1963.
Optimisation of wing shape using Konrad Zuse's Z23.
The American and German schools first met at PPSN in Dortmund in 1990.
Nature Inspired Optimisation
Problems
Continuous vs. combinatorial
Single- vs. multi-objective
Dynamic and stochastic
...
Algorithms
Evolutionary Algorithms:
Genetic Algorithms, Evolution Strategies,
Genetic Programming, Estimation of Distribution, ...
Swarm Optimisation:
Ant Colony Optimisation, PSO, bee hives, ...
...
Outline
1. Introduction
   Nature Inspired Optimisation
2. Evolutionary Algorithms
   Representations
   Genetic Operators
   Selection Mechanisms
   Diversity Mechanisms
   Constraint Handling Techniques
3. Runtime Analysis
   The Black Box Scenario and No Free Lunch
   Runtime of (1+1) EA on OneMax
   Overview of Techniques and Results
4. Summary
A Simple Evolutionary Algorithm
Simple Evolutionary Algorithm
Generate the initial population P(0) at random, and set t ← 0.
repeat
    Evaluate the fitness of each individual in P(t).
    Select parents from P(t) based on their fitness.
    Obtain population P(t+1) by
    applying crossover and mutation to parents.
    Set t ← t + 1.
until termination criterion satisfied.
Basic idea from natural evolution and population genetics:
survival of the fittest.
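As a concrete illustration, the scheme above can be written out in a few lines of Python. This is a minimal sketch, not code from the slides: the binary tournament selection, one-point crossover, and the defaults pop_size = 50, p_c = 0.7 and p_m = 1/n are placeholder choices (the operators themselves are discussed on the following slides).

```python
import random

def simple_ea(fitness, n, pop_size=50, p_c=0.7, generations=200):
    """Generational EA on bitstrings: tournament selection,
    one-point crossover, and bitwise mutation with rate 1/n."""
    p_m = 1.0 / n
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        fits = [fitness(x) for x in pop]
        def select():  # binary tournament on the current population
            i, j = random.randrange(pop_size), random.randrange(pop_size)
            return pop[i] if fits[i] >= fits[j] else pop[j]
        nxt = []
        while len(nxt) < pop_size:
            x, y = select(), select()
            if random.random() < p_c:          # one-point crossover
                p = random.randrange(1, n)
                x = x[:p] + y[p:]
            nxt.append([b ^ (random.random() < p_m) for b in x])  # mutation
        pop = nxt
    return max(pop, key=fitness)

# Example: maximise the number of 1-bits (OneMax).
print(sum(simple_ea(sum, n=30)))
```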
Representations
Representations of candidate solutions
on which the genetic operators can operate.
Bitstrings are commonly used in combinatorial optimisation.
Other representations are possible
in conjunction with specialised genetic operators:
trees, permutations, etc.
In general, genotype-phenotype mappings φ : G → P
  G set of genotypes/chromosomes
  P set of phenotypes/solutions
Fitness function f : P → R
Locality in Representations [Rothlauf, 2006]
Rule of thumb
Small genotypic change → small phenotypic change.
Large genotypic change → large phenotypic change.
Exploration and Exploitation
Exploration of new parts of search space
Mutation operators
Recombination operators
Exploitation of promising genetic material
Selection mechanism
Mutation operators for bitstrings
The mutation operator introduces small,
random changes to an individual's chromosome.
Local Mutation
One randomly chosen bit is flipped.
Global Mutation
Each bit is flipped independently with a given probability p_m,
called the per-bit mutation rate, which is often 1/n,
where n is the chromosome length.
Pr[k bits flipped] = C(n, k) · p_m^k · (1 − p_m)^(n−k),
where C(n, k) is the binomial coefficient.
Mutation rate
Note the difference between per-bit (gene)
and per-chromosome (individual) mutation rates.
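Both operators are straightforward in Python. A minimal sketch (the function names are mine, not from the slides); the last lines check the binomial formula numerically:

```python
import math
import random

def local_mutation(x):
    """Flip exactly one uniformly chosen bit."""
    y = list(x)
    y[random.randrange(len(y))] ^= 1
    return y

def global_mutation(x, p_m=None):
    """Flip each bit independently with per-bit rate p_m (default 1/n)."""
    p_m = 1.0 / len(x) if p_m is None else p_m
    return [b ^ (random.random() < p_m) for b in x]

# Pr[k bits flipped] = C(n, k) * p_m^k * (1 - p_m)^(n - k):
n, p_m, k = 100, 1 / 100, 2
print(math.comb(n, k) * p_m**k * (1 - p_m)**(n - k))  # ~0.185
```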
Recombination operators - One point crossover
The recombination operator generates an offspring individual
whose chromosome is composed from the parents' chromosomes.
Crossover rate
probability of applying crossover to parents
One-point crossover between parents x and y
Randomly select a crossover point p in {1, 2, ..., n}.
Offspring 1 is x_1 ⋯ x_p · y_{p+1} ⋯ y_n.
Offspring 2 is y_1 ⋯ y_p · x_{p+1} ⋯ x_n.
Example
Parent x: 101011 | 1010    Offspring 1: 101011 | 1110
Parent y: 010100 | 1110    Offspring 2: 010100 | 1010
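A minimal Python sketch (the helper name is mine; restricting p to {1, ..., n−1} ensures both parents actually contribute):

```python
import random

def one_point_crossover(x, y):
    """Cut both parents at a random point p and exchange the tails."""
    p = random.randrange(1, len(x))  # crossover point in {1, ..., n-1}
    return x[:p] + y[p:], y[:p] + x[p:]

# Works on strings as well as lists:
print(one_point_crossover("1010111010", "0101001110"))
```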
Recombination operators - Multi-point crossover
k-point crossover between parents x and y
Randomly select k crossover points p_1 < ⋯ < p_k in {1, 2, ..., n}.
Offspring 1 is x_1 ⋯ x_{p_1} · y_{p_1+1} ⋯ y_{p_2} · x_{p_2+1} ⋯ x_{p_3} ⋯ etc.
Offspring 2 is y_1 ⋯ y_{p_1} · x_{p_1+1} ⋯ x_{p_2} · y_{p_2+1} ⋯ y_{p_3} ⋯ etc.
Example (2-point crossover)
Parent x: 101 | 011 | 1010    Offspring 1: 101 | 100 | 1010
Parent y: 010 | 100 | 1110    Offspring 2: 010 | 011 | 1110
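The general k-point operator alternates the source parent at each cut. A minimal sketch under the same assumptions as above:

```python
import random

def k_point_crossover(x, y, k):
    """Alternate source parent at each of k sorted crossover points."""
    n = len(x)
    points = sorted(random.sample(range(1, n), k)) + [n]
    off1, off2, prev = [], [], 0
    for i, p in enumerate(points):
        src1, src2 = (x, y) if i % 2 == 0 else (y, x)
        off1 += src1[prev:p]   # even segments from x, odd segments from y
        off2 += src2[prev:p]   # and vice versa for offspring 2
        prev = p
    return off1, off2
```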
Recombination operators - Uniform crossover
Uniform crossover between parents x and y
Select a bitstring z of length n uniformly at random.
for all i in 1 to n:
    if z_i = 1 then bit i in offspring 1 is x_i, else y_i.
    if z_i = 1 then bit i in offspring 2 is y_i, else x_i.
Example
z = 1010001110
Parent x: 1010111010    Offspring 1: 1111001010
Parent y: 0101001110    Offspring 2: 0000111110
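A minimal sketch of uniform crossover with an explicit random mask z, matching the description above:

```python
import random

def uniform_crossover(x, y):
    """Offspring 1 takes bit i from x where z_i = 1, else from y."""
    z = [random.randint(0, 1) for _ in range(len(x))]
    off1 = [xi if zi else yi for xi, yi, zi in zip(x, y, z)]
    off2 = [yi if zi else xi for xi, yi, zi in zip(x, y, z)]
    return off1, off2
```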
Selection and Reproduction
Selection emphasises the better solutions in a population:
One or more copies of good solutions.
Inferior solutions are much less likely to be selected.
Not normally considered a search operator,
but it influences the search significantly.
Selection can be used either before or after search operators.
When selection is used before search operators, the process of
choosing the next generation from the union of all parents
and offspring is sometimes called reproduction.
Generational gap of an EA
refers to the overlap (i.e., individuals that did not go through
any search operators) between the old and new generations.
The two extremes are generational EAs and steady-state EAs.
1-elitism can be regarded as having a generational gap of 1.
Fitness Proportional Selection
Probability of selecting individual x from population P is
Pr[x] = f(x) / Σ_{y∈P} f(y).
Uses raw fitness in computing selection probabilities.
Does not allow negative fitness values.
Also known as roulette wheel selection.
Weaknesses
Domination of "super individuals" in early generations.
Slow convergence in later generations.
Fitness scaling was often used in the early days to combat this problem:
the fitness function f is replaced with a scaled fitness function f̃.
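A minimal roulette-wheel sketch (helper names are mine). Python's built-in random.choices(pop, weights=fits) does the same job in one call:

```python
import random

def fitness_proportional_select(pop, fitness):
    """Pr[x] = f(x) / sum of f over P; requires non-negative fitness."""
    fits = [fitness(x) for x in pop]
    r = random.uniform(0, sum(fits))
    acc = 0.0
    for x, f in zip(pop, fits):
        acc += f
        if acc >= r:        # the wheel stops in x's sector
            return x
    return pop[-1]          # guard against floating-point rounding
```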
Ranking Selection
1. Sort the population from best to worst according to fitness:
   x_(λ−1), x_(λ−2), x_(λ−3), ..., x_(0).
2. Select the γ-ranked individual x_(γ) with probability Pr[γ],
   where Pr[γ] is a ranking function, e.g.
   linear ranking
   exponential ranking
   power ranking
   geometric ranking
Linear ranking
Population size λ, and rank γ, 0 ≤ γ ≤ λ − 1 (0 worst).
Linear ranking
Pr_linear[γ] := (α + (β − α) · γ/(λ − 1)) / λ,
where Σ_{γ=0}^{λ−1} Pr_linear[γ] = 1 implies
α + β = 2 and 1 ≤ β ≤ 2.
In expectation,
the best individual is reproduced β times and
the worst individual is reproduced α times.
(Figure: selection probability as a function of rank, rising linearly from α at rank 0 to β at rank λ − 1.)
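A minimal sketch of linear ranking selection; the default β = 1.5 is a placeholder choice, with α = 2 − β following from the normalisation above:

```python
import random

def linear_ranking_select(pop, fitness, beta=1.5):
    """Pr[gamma] = (alpha + (beta - alpha) * gamma/(lam - 1)) / lam,
    where rank gamma = 0 is the worst individual and alpha = 2 - beta."""
    lam, alpha = len(pop), 2.0 - beta
    ranked = sorted(pop, key=fitness)  # index = rank, 0 = worst
    probs = [(alpha + (beta - alpha) * g / (lam - 1)) / lam
             for g in range(lam)]
    return random.choices(ranked, weights=probs)[0]
```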
Other ranking functions
Power ranking
Pr_power[γ] := (α + (β − α) · (γ/(λ − 1))^k) / C
Geometric ranking
Pr_geom[γ] := α · (1 − α)^(λ−1−γ) / C
Exponential ranking
Pr_exp[γ] := (1 − e^(−γ)) / C
where C is a normalising factor and 0 < α < β.
Tournament Selection
Tournament selection with tournament size k:
Randomly sample a subset P′ of k individuals from population P.
Select the individual in P′ with highest fitness.
Often, tournament size k = 2 is used.
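Tournament selection is the shortest of these mechanisms to implement; a minimal sketch:

```python
import random

def tournament_select(pop, fitness, k=2):
    """Sample k individuals uniformly at random; return the fittest."""
    return max(random.sample(pop, k), key=fitness)
```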
(µ+λ) and (µ,λ) selection
Origins in Evolution Strategies.
(µ+λ)-selection
Parent population of size µ.
Generate λ offspring from randomly chosen parents.
Next population is the µ best among parents and offspring.
(µ,λ)-selection (where λ > µ)
Parent population of size µ.
Generate λ offspring from randomly chosen parents.
Next population is the µ best among offspring.
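Both schemes reduce to a sort and a truncation; a minimal sketch (the function names are mine):

```python
def plus_selection(parents, offspring, fitness, mu):
    """(mu+lambda): keep the mu best among parents and offspring."""
    return sorted(parents + offspring, key=fitness, reverse=True)[:mu]

def comma_selection(offspring, fitness, mu):
    """(mu,lambda): keep the mu best among the offspring only."""
    return sorted(offspring, key=fitness, reverse=True)[:mu]
```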
Selection pressure
Degree to which selection emphasises the better individuals.
How can selection pressure be measured and adjusted?
Take-over time τ* [Goldberg and Deb, 1991, Bäck, 1994]:
1. Initial population with a unique fittest individual x*.
2. Apply the selection operator repeatedly, with no other operators.
3. τ* is the number of generations until the population consists of x* only.
Higher take-over time → lower selection pressure.
Fitness prop.     τ* ≈ (λ ln λ) / c             assuming fitness f(x) = exp(cx)
Linear ranking    τ* ≈ 2 ln(λ − 1) / (β − 1)    for 1 < β < 2
Tournament        τ* ≈ (ln λ + ln ln λ) / ln k  for tournament size k
(µ,λ)             τ* = ln λ / ln(λ/µ)
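Take-over time is easy to estimate empirically. The sketch below repeatedly applies tournament selection with no variation and counts generations until the population is uniform; since in a finite run the initially best individual can occasionally go extinct, the loop simply waits for whichever individual wins. The setup (λ = 100, fitness = identity) is my own choice:

```python
import math
import random

def takeover_time(lam=100, k=2):
    """Generations of pure tournament selection until the
    population consists of copies of a single individual."""
    pop = list(range(lam))  # distinct fitness values; lam - 1 is the best
    t = 0
    while len(set(pop)) > 1:
        pop = [max(random.sample(pop, k)) for _ in range(lam)]
        t += 1
    return t

trials = [takeover_time() for _ in range(20)]
print(sum(trials) / len(trials))                                # empirical
print((math.log(100) + math.log(math.log(100))) / math.log(2))  # ~8.9
```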
Diversity Mechanisms
Fitness sharing
g(x) := f(x) / Σ_{y : d(x,y) ≤ σ} s(x, y),    where s(x, y) := 1 − d(x, y)/σ.
Crowding
Standard Crowding
Deterministic Crowding
[Sareni and Krahenbuhl, 1998], [Friedrich et al., 2009]
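A minimal sketch of fitness sharing; the distance function d is a parameter, e.g. Hamming distance on bitstrings:

```python
def shared_fitness(pop, fitness, d, sigma):
    """g(x) = f(x) / sum of s(x, y) over y in pop with d(x, y) <= sigma,
    where s(x, y) = 1 - d(x, y) / sigma."""
    def g(x):
        niche = sum(1 - d(x, y) / sigma for y in pop if d(x, y) <= sigma)
        return fitness(x) / niche  # niche >= 1, since x shares with itself
    return [g(x) for x in pop]

def hamming(x, y):
    return sum(a != b for a, b in zip(x, y))
```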
Constraint Handling Techniques
Constrained optimisation
f : X → R objective function
g_i : X → R inequality constraint(s)
Maximise f(x) while g_i(x) ≤ 0.
(Figure: search space divided into feasible and infeasible regions.)
Penalty approaches: death, static, dynamic, adaptive, ...
Multi-objective optimisation
Repair approaches
Decoders
[Coello, 2002]
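As an example of the penalty family, a static penalty simply derates the objective by the total constraint violation. A minimal sketch (the wrapper name and the weight c = 100 are mine):

```python
def static_penalty(f, constraints, c=100.0):
    """Penalised objective for 'maximise f(x) while g_i(x) <= 0':
    subtract c * max(0, g_i(x)) for every violated constraint."""
    def penalised(x):
        violation = sum(max(0.0, g(x)) for g in constraints)
        return f(x) - c * violation
    return penalised
```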
Analysis of Evolutionary Algorithms
Criteria for evaluating algorithms
1. Correctness
   Does the algorithm always give the correct output?
2. Computational Complexity
   How much computational resources does
   the algorithm require to solve the problem?
The same criteria are also applicable to evolutionary algorithms:
1. Correctness.
   Discover the global optimum in finite time?
2. Computational Complexity.
   Time (number of function evaluations) is the
   most relevant computational resource.
Computational Complexity of EAs
Prediction of resources needed for a given instance.
Usually runtime as a function of instance size:
the number of fitness evaluations before finding the optimum.
Exponential runtime =⇒ inefficient algorithm.
Polynomial runtime =⇒ "efficient" algorithm.
Black Box Scenario
Function class F.
(Figure: a black box holding an unknown f ∈ F; the algorithm A submits
queries x_1, x_2, x_3, ..., x_t and receives f(x_1), f(x_2), f(x_3), ..., f(x_t).)
Worst case runtime: max_{f∈F} T_{A,f}.
Average case runtime: Σ_{f∈F} Pr[f] · T(A, f).
[Droste et al., 2006]
No Free Lunch
Theorem ([Wolpert and Macready, 1997, Droste et al., 2002b])
Let F be a set of functions f : S → B,
where S and B are finite sets, B totally ordered.
If F is closed under permutations,
then the average case runtime over F
is the same for all search heuristics.
No search heuristic is best on all problems.
Need to consider algorithms on specific problem classes F.
Function classes closed under permutation are not interesting...
(NB! See [Auger and Teytaud, 2008] for continuous spaces.)
Expected Runtime and Success Probability
The runtime T_{A,f} is a random variable.
Expected runtime: E[T_{A,f}] = Σ_{t=1}^∞ t · Pr[T_{A,f} = t].
Success probability within t(n) steps: Pr[T_{A,f} ≤ t(n)].
(1+1) Evolutionary Algorithm
Sample x uniformly at random from {0,1}^n.
repeat
    x′ ← x.
    Flip each bit of x′ independently with probability 1/n.
    if f(x′) ≥ f(x) then x ← x′.
until termination condition met.
Special case of the (µ+λ) EA.
Starting point for rigorous runtime analysis of EAs, e.g.
[Mühlenbein, 1992, Garnier et al., 1999, Droste et al., 2002a].
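A minimal Python version of the (1+1) EA, instrumented to return the number of fitness evaluations (the stopping rule "fitness reaches opt" is an assumption for experiments on functions whose optimal value is known, such as OneMax):

```python
import random

def one_plus_one_ea(f, n, opt=None, max_evals=10**6):
    """(1+1) EA on {0,1}^n with mutation rate 1/n; returns the final
    search point and the number of fitness evaluations used."""
    opt = n if opt is None else opt   # optimum value, n for OneMax
    x = [random.randint(0, 1) for _ in range(n)]
    fx, evals = f(x), 1
    while fx < opt and evals < max_evals:
        y = [b ^ (random.random() < 1.0 / n) for b in x]
        fy = f(y)
        evals += 1
        if fy >= fx:                  # accept if at least as fit
            x, fx = y, fy
    return x, evals
```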
Artificial Fitness Levels - Upper bounds
Search space partitioned into m subsets A_1, A_2, ..., A_m,
with increasing fitness, i.e. f(A_i) < f(A_j) for all i < j,
and f(A_m) optimal.
(Figure: fitness levels A_1, A_2, A_3, ..., A_{m−1}, A_m ordered by increasing fitness.)
p_i: probability of jumping from A_i to any A_j, i < j.
T_i: time to jump from A_i to any A_j, i < j.
Expected runtime
E[T] ≤ E[T_1 + T_2 + ⋯ + T_m]
     = E[T_1] + E[T_2] + ⋯ + E[T_m]
     ≤ 1/p_1 + 1/p_2 + ⋯ + 1/p_m.
Artificial Fitness Levels - Upper bound on OneMax
Partitioning of the search space into fitness levels:
OneMax(x) := x_1 + x_2 + ⋯ + x_n.
A_i: all bitstrings with i 0-bits.
p_i: probability of increasing the number of 1-bits from within A_i
(at least the probability of flipping one 0-bit, and no other bits):
p_i ≥ i · (1/n) · (1 − 1/n)^(n−1) ≥ i/(en),
using (1 − 1/n)^(n−1) ≥ 1/e.
Expected runtime
E[T_OneMax] ≤ Σ_{i=1}^n 1/p_i ≤ Σ_{i=1}^n en/i = O(n ln n).
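The bound is easy to check empirically with the one_plus_one_ea sketch from the (1+1) EA slide above (f = sum is exactly OneMax on 0/1 lists); the comparison value e · n · H_n is the explicit constant behind the O(n ln n):

```python
import math

for n in (50, 100, 200):
    runs = [one_plus_one_ea(sum, n)[1] for _ in range(20)]
    bound = math.e * n * sum(1 / i for i in range(1, n + 1))  # e * n * H_n
    print(n, round(sum(runs) / len(runs)), round(bound))
```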
Artificial Fitness Levels - Exercise
Estimate an upper bound on the expected runtime of the (1+1) EA on
LeadingOnes(x) := Σ_{i=1}^n Π_{j=1}^i x_j.
x = 1111111111111111 0 ∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗∗
(leading 1-bits, then the first 0-bit, then a random suffix).
Artificial fitness levels, with level given by the number of leading 1-bits.
Probability of increasing the level: p_i ≥ 1/(en) for all i.
E[T_LeadingOnes] ≤ Σ_{i=1}^n 1/p_i ≤ en · n = O(n²).
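The same experiment works for the exercise, reusing one_plus_one_ea from the earlier sketch (the optimum value is again n, so the default stopping rule applies):

```python
def leading_ones(x):
    """LeadingOnes(x): number of consecutive 1-bits at the start of x."""
    count = 0
    for bit in x:
        if bit == 0:
            break
        count += 1
    return count

x, evals = one_plus_one_ea(leading_ones, 50)
print(evals)  # the fitness-level bound above gives at most e * 50^2 ~ 6800
```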
Analytical Tool Box
Artificial Fitness Levels [Wegener and Witt, 2005]
Markov's Inequality, Chernoff Bounds [Motwani and Raghavan, 1995]
Typical Runs
Expected Multiplicative Weight Decrease [Neumann and Wegener, 2007]
Drift Analysis [Hajek, 1982]
Branching Processes [Lehre, 2010]
Yao's Minimax Principle [Motwani and Raghavan, 1995]
State of the Art [Oliveto et al., 2007b]

Problem             Algorithm     Runtime                            Reference
OneMax              (1+1) EA      O(n log n)                         [Mühlenbein, 1992]
                    (1+λ) EA      O(λn + n log n)                    [Jansen et al., 2005]
                    (µ+1) EA      O(µn + n log n)                    [Witt, 2006]
                    1-ANT         O(n²) w.h.p.                       [Neumann and Witt, 2006]
                    (µ+1) IA      O(µn + n log n)                    [Zarges, 2009]
Linear Functions    (1+1) EA      Θ(n log n)                         [Droste et al., 2002a], [He and Yao, 2003]
                    cGA           Θ(n^(2+ε)), ε > 0 const.           [Droste, 2006]
Max. Matching       (1+1) EA      e^Ω(n), PRAS                       [Giel and Wegener, 2003]
Sorting             (1+1) EA      Θ(n² log n)                        [Scharnow et al., 2002]
SS Shortest Path    (1+1) EA      O(n³ log(n · w_max))               [Baswana et al., 2009]
                    MO (1+1) EA   O(n³)                              [Scharnow et al., 2002]
MST                 (1+1) EA      Θ(m² log(n · w_max))               [Neumann and Wegener, 2007]
                    (1+λ) EA      O(nλ log(n · w_max)), λ = ⌈m²/n⌉   [Neumann and Wegener, 2007]
                    1-ANT         O(mn log(n · w_max))               [Neumann and Witt, 2008]
Max. Clique         (1+1) EA      Θ(n⁵)                              [Storch, 2006]
(rand. planar)      (16n+1) RLS   Θ(n^(5/3))                         [Storch, 2006]
Eulerian Cycle      (1+1) EA      Θ(m² log m)                        [Doerr et al., 2007]
Partition           (1+1) EA      PRAS, avg.                         [Witt, 2005]
Vertex Cover        (1+1) EA      e^Ω(n), arb. bad approx.           [Friedrich et al., 2007], [Oliveto et al., 2007a]
Set Cover           (1+1) EA      e^Ω(n), arb. bad approx.           [Friedrich et al., 2007]
                    SEMO          Pol. O(log n)-approx.              [Friedrich et al., 2007]
Intersection of     (1+1) EA      1/p-approximation in               [Reichel and Skutella, 2008]
p ≥ 3 matroids                    O(|E|^(p+2) log(|E| · w_max))
UIO/FSM conf.       (1+1) EA      e^Ω(n)                             [Lehre and Yao, 2007]
Summary
Evolutionary Algorithms
Representations
Genetic Operators
Selection Mechanisms
Runtime Analysis
No Free Lunch Theorem
Expected Runtime & Success Probability
References

Auger, A. and Teytaud, O. (2008). Continuous lunches are free plus the design of optimal optimization algorithms. Algorithmica.

Bäck, T. (1994). Selective pressure in evolutionary algorithms: A characterization of selection mechanisms. In Proceedings of the 1st IEEE Conference on Evolutionary Computation (CEC 1994), pages 57–62. IEEE Press.

Baswana, S., Biswas, S., Doerr, B., Friedrich, T., Kurur, P. P., and Neumann, F. (2009). Computing single source shortest paths using single-objective fitness. In FOGA '09: Proceedings of the tenth ACM SIGEVO workshop on Foundations of Genetic Algorithms, pages 59–66, New York, NY, USA. ACM.

Coello, C. C. (2002). Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: A survey of the state of the art. Computer Methods in Applied Mechanics and Engineering, 191(11-12):1245–1287.

Doerr, B., Klein, C., and Storch, T. (2007). Faster evolutionary algorithms by superior graph representation. In Proceedings of the 1st IEEE Symposium on Foundations of Computational Intelligence (FOCI 2007), pages 245–250.

Droste, S. (2006). A rigorous analysis of the compact genetic algorithm for linear functions. Natural Computing, 5(3):257–283.

Droste, S., Jansen, T., and Wegener, I. (2002a). On the analysis of the (1+1) Evolutionary Algorithm. Theoretical Computer Science, 276:51–81.

Droste, S., Jansen, T., and Wegener, I. (2002b). Optimization with randomized search heuristics – the (A)NFL theorem, realistic scenarios, and difficult functions. Theoretical Computer Science, 287(1):131–144.

Droste, S., Jansen, T., and Wegener, I. (2006). Upper and lower bounds for randomized search heuristics in black-box optimization. Theory of Computing Systems, 39(4):525–544.

Friedrich, T., Hebbinghaus, N., Neumann, F., He, J., and Witt, C. (2007). Approximating covering problems by randomized search heuristics using multi-objective models. In Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation (GECCO 2007), pages 797–804, New York, NY, USA. ACM Press.

Friedrich, T., Oliveto, P. S., Sudholt, D., and Witt, C. (2009). Analysis of diversity-preserving mechanisms for global exploration. Evolutionary Computation, 17(4):455–476.

Garnier, J., Kallel, L., and Schoenauer, M. (1999). Rigorous hitting times for binary mutations. Evolutionary Computation, 7(2):173–203.

Giel, O. and Wegener, I. (2003). Evolutionary algorithms and the maximum matching problem. In Proceedings of the 20th Annual Symposium on Theoretical Aspects of Computer Science (STACS 2003), pages 415–426.

Goldberg, D. E. and Deb, K. (1991). A comparative analysis of selection schemes used in genetic algorithms. In Foundations of Genetic Algorithms, pages 69–93. Morgan Kaufmann.

Hajek, B. (1982). Hitting-time and occupation-time bounds implied by drift analysis with applications. Advances in Applied Probability, 14(3):502–525.

He, J. and Yao, X. (2003). Towards an analytic framework for analysing the computation time of evolutionary algorithms. Artificial Intelligence, 145(1-2):59–97.

Jansen, T., De Jong, K. A., and Wegener, I. (2005). On the choice of the offspring population size in evolutionary algorithms. Evolutionary Computation, 13(4):413–440.

Lehre, P. K. (2010). Negative drift in populations. In To appear in Proceedings of PPSN 2010 – 11th International Conference on Parallel Problem Solving From Nature.

Lehre, P. K. and Yao, X. (2007). Runtime analysis of (1+1) EA on computing unique input output sequences. In Proceedings of the 2007 IEEE Congress on Evolutionary Computation (CEC 2007), pages 1882–1889. IEEE Press.

Motwani, R. and Raghavan, P. (1995). Randomized Algorithms. Cambridge University Press.

Mühlenbein, H. (1992). How genetic algorithms really work I. Mutation and hillclimbing. In Proceedings of Parallel Problem Solving from Nature 2 (PPSN-II), pages 15–26. Elsevier.

Neumann, F. and Wegener, I. (2007). Randomized local search, evolutionary algorithms, and the minimum spanning tree problem. Theoretical Computer Science, 378(1):32–40.

Neumann, F. and Witt, C. (2006). Runtime analysis of a simple ant colony optimization algorithm. In Proceedings of the 17th International Symposium on Algorithms and Computation (ISAAC 2006), number 4288 in LNCS, pages 618–627.

Neumann, F. and Witt, C. (2008). Ant colony optimization and the minimum spanning tree problem. In Proceedings of Learning and Intelligent Optimization (LION 2008), pages 153–166.

Oliveto, P. S., He, J., and Yao, X. (2007a). Evolutionary algorithms and the vertex cover problem. In Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2007).

Oliveto, P. S., He, J., and Yao, X. (2007b). Time complexity of evolutionary algorithms for combinatorial optimization: A decade of results. International Journal of Automation and Computing, 4(1):100–106.

Reichel, J. and Skutella, M. (2008). Evolutionary algorithms and matroid optimization problems. Algorithmica.

Rothlauf, F. (2006). Representations for Genetic and Evolutionary Algorithms. Springer-Verlag Berlin Heidelberg.

Sareni, B. and Krahenbuhl, L. (1998). Fitness sharing and niching methods revisited. IEEE Transactions on Evolutionary Computation, 2(3):97–106.

Scharnow, J., Tinnefeld, K., and Wegener, I. (2002). Fitness landscapes based on sorting and shortest paths problems. In Proceedings of the 7th Conference on Parallel Problem Solving from Nature (PPSN VII), number 2439 in LNCS, pages 54–63.

Storch, T. (2006). How randomized search heuristics find maximum cliques in planar graphs. In Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation (GECCO 2006), pages 567–574, New York, NY, USA. ACM Press.

Wegener, I. and Witt, C. (2005). On the analysis of a simple evolutionary algorithm on quadratic pseudo-boolean functions. Journal of Discrete Algorithms, 3(1):61–78.

Witt, C. (2005). Worst-case and average-case approximations by simple randomized search heuristics. In Proceedings of the 22nd Annual Symposium on Theoretical Aspects of Computer Science (STACS 2005), number 3404 in LNCS, pages 44–56.

Witt, C. (2006). Runtime analysis of the (µ+1) EA on simple pseudo-boolean functions. Evolutionary Computation, 14(1):65–86.

Wolpert, D. H. and Macready, W. G. (1997). No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation, 1(1):67–82.

Zarges, C. (2009). On the utility of the population size for inversely fitness proportional mutation rates. In FOGA '09: Proceedings of the tenth ACM SIGEVO workshop on Foundations of Genetic Algorithms, pages 39–46, New York, NY, USA. ACM.