GAintro

jinksimaginary, AI and Robotics

Nov 7, 2013


Evolutionary Computing

Many different algorithms (or classes of algorithms):

- Genetic Algorithm (GA)
- Genetic Programming (GP)
- Learning Classifier System (LCS)
- Multi-Objective Genetic Algorithm (MOGA)
- Artificial Immune System (AIS)
- Evolution Strategies (ES)
- Gene Expression Programming (GEP)
- Estimation of Distribution Algorithm (EDA)
- Evolutionary Programming (EP)
- etc., etc.
F. Oppacher, Evolutionary Computation
GA (and EC in general)

evolution is a strategy to explore and adapt to complex, time-varying fitness landscapes, based on selection, inheritance, and blind variation...

    t := 0;
    randomly generate an initial population P(0);
    repeat
        compute and save fitness f(p) for each individual p in current population P(t);
        define selection probabilities pr(p), for each p, proportional to f(p);
        generate P(t + 1) by probabilistically selecting individuals from P(t)
            to produce offspring via genetic operators (mutation, xover);
        t := t + 1
    until done.
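The generational loop above can be sketched in Python. This is a minimal illustration, not the course's reference implementation: the OneMax fitness (count of 1-bits), the function names, and all parameter values are assumptions chosen for the demo.

```python
import random

def fitness(p):
    # illustrative fitness: number of 1-bits (OneMax); stands in for any f(p) >= 0
    return sum(p)

def ga(pop_size=20, length=10, generations=50, p_xover=0.6, p_mut=0.01, seed=0):
    rng = random.Random(seed)
    # randomly generate an initial population P(0)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for t in range(generations):
        fits = [fitness(p) for p in pop]
        # selection probabilities pr(p) proportional to f(p)
        # (tiny epsilon keeps zero-fitness individuals selectable)
        weights = [f + 1e-9 for f in fits]
        new_pop = []
        while len(new_pop) < pop_size:
            a, b = rng.choices(pop, weights=weights, k=2)
            if rng.random() < p_xover:          # one-point crossover
                cut = rng.randrange(1, length)
                a = a[:cut] + b[cut:]
            # bit-flip mutation on each position independently
            new_pop.append([bit ^ (rng.random() < p_mut) for bit in a])
        pop = new_pop                           # P(t + 1)
    return max(pop, key=fitness)
```

`random.choices` with proportional weights is a direct implementation of the roulette-wheel selection that appears later in these slides.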
Darwin’s theory of evolution

Darwin’s theory (influenced by Lyell, Malthus, & natural selection):

- the # of organisms of any one type can increase geometrically... but
- the actual # of organisms of any one type remains nearly constant for long periods
- no 2 individuals of a type are identical; some of this variation is inherited (but Darwin knew no mechanisms of inheritance)

thus:

- because organisms can produce more offspring than their surroundings can support, there must be a struggle to survive among them
- in this struggle, the organisms whose variations best adapt them to their environments survive; the others die
- because variations are heritable, there is change in the proportion of the variations from generation to generation, i.e. evolution
theory of evolution

read:
- C. Darwin, “Origin of Species”
- S. Jones, “Darwin’s Ghost”
- D. Dennett, “Darwin’s Dangerous Idea”, “Freedom Evolves”, “Breaking the Spell”
- R. Dawkins, “The Selfish Gene”, “The Blind Watchmaker”, “The Extended Phenotype”, “The Ancestors’ Tale”
- E. Mayr, “This is Biology”
- J. Holland, “Adaptation in Natural and Artificial Systems”
- A. Moya, E. Font, “Evolution: From Molecules to Ecosystems”
- M. Pagel, ed., “Encyclopedia of Evolution”
- etc., etc.
Lamarck’s theory of evolution

inheritance of acquired characteristics:

- changes in the environment act on individuals
- this produces new needs which must be met
- some individuals actively respond to new needs, creating new habits, new uses of body parts, etc.
- this changes their somatic structure... assumption: these changes are inscribed in the germ line
- these structural changes / habits are transmitted to offspring
- Mendelian laws are compatible with Lamarckianism (!) and thus cannot be a foundation for ET
- epigenetics!!!
conditions for evolution to happen

- heredity: hi-fi copying (offspring similar to parents)
- variability: imperfect copying (offspring not identical to parents)
- fecundity: variants leave different numbers of offspring; specific variations affect behavior; behavior affects reproductive success

given: scarce resources; change: individuals die; small differences affect longevity and fecundity --> poorer replicators decrease (genotype copying)
the basis of Darwin’s theory
nonrandom survival of small random hereditary changes leads cumulatively to nonrandom adaptations

- not all evolution is adaptive (Kimura’s neutral mutations)
- evolution of adaptations is gradualistic

consider the multi-dimensional landscape of all possible combinations of all genes in all genomes (the set of all possible ‘animals’ ;-) ):

- the proportion of viable organisms is very small
- given any starting point, #near neighbors << #distant neighbors

so: finding viable organisms in this space is like finding a few needles in a nearly infinite haystack; any mutation step must start from a viable organism, but any large mutation has a negligible chance of landing on another viable form; so the chance of finding another viable or improved form is hugely increased by searching in the immediate neighborhood --> gradualism.
simple evolution (non-living)

evolution happens whenever there is...

- competition for resources
- reproduction
- inheritance
- variability

life or self-replication is not necessary for evolution to occur, e.g.:

- Qβ replicase and RNA experiments (L. Orgel, Selection in vitro. Proc. Royal Soc. London, 79)
- SELEX (C. Tuerk, L. Gold, Systematic evolution of ligands by exponential enrichment. Science, 249:505-510, 90)
- D. Bartel, J. Szostak, Isolation of new ribozymes from a large pool of random sequences. Science, 261:1411-1418, 93.
Qβ replicase

The enzyme Qβ replicase copies strands of RNA, given sufficient monomers.

experiment:
- many test tubes with Qβ solution + monomers
- put one type of RNA into test tube 1; Qβ starts copying the RNA template right away
- after 30 minutes, put 1 drop from tube 1 into tube 2... then into tube 3, etc.

results:
- RNA evolves; e.g. RNA molecules in tube 75 are only 1/10 as long as the original template, with much faster replication
- different initial conditions result in different final mixes: with an antibiotic in the test tubes, the final RNA was resistant to this antibiotic!
- common end product: fast RNA, only 218 bases long (template > 4000 bases!)
SELEX

Systematic Evolution of Ligands by EXponential enrichment

- an initial mix with many different RNA molecules is passed through an ‘affinity column’ where RNA molecules can bind (even weakly) to a target molecule
- the binding RNA molecules - selected by the affinity column - are then replicated. Repeat....
- after ca. 4 ‘generations’, the RNA molecules have strong, selective binding to the target molecule!

iterative in vitro selection (Bartel, Szostak):
- SELEX-style evolution of ribozymes that can catalyze a chosen reaction up to 3 orders of magnitude faster than the best ribozyme found by random search
simple evolution, cont.

the RNA experiments show: simple evolution does not need

- living organisms
- self-replicating individuals
- sexual reproduction (crossover)
- ontogenesis (individual development)
- a phenotype / genotype distinction

simple evolution does need

- scarce resources in a somewhat regular environment
- a population of individuals
- hi-fi (but imperfect) copying (reproduction)
- variability from mutation, leading to fitness differences
- ‘natural selection of the fittest’
genotype / phenotype

genotype
- genetic instructions (linear sequences of nucleotide bases, DNA)
- machine specifications; local behavioral repertoire; context sensitive
- an unordered bag of low-level instructions describing local interactions

phenotype
- the organism itself; behaviors / structures resulting from executing GT rules in an environment; hierarchical
- PT traits at the organism level result from many nonlinear interactions among genes

nonlinear interactions among context-sensitive genes lead to global PT behaviors: development / morphogenesis

PTs are not predictable from GTs!
phenotype fitness = # of offspring surviving to reproduce

offspring are duplicates of fit individuals?
- fitness is preserved but there is no improvement
- in biota, no 2 individuals have identical chromosomes

offspring are randomly produced?
- novelty, but past advances are not preserved
- in biota, most possible allele combinations could not survive

thus: fitness is not preserved by duplication, and the observed variety is not due to random variation

how to retain past successes and use them to increase the probability of fit, novel variants?
retaining past successes and increasing the probability of fit, novel variants

fitness-proportional reproduction & genetic operators
- change / redistribute alleles, with no direct effect on population size
- alleles close together are rarely separated (xover frequency increases with distance!)

consider: adjacent alleles, occurring only in several fit chromosomes; each of these chromosomes will be duplicated an above-average # of times, so the alleles’ proportion in the population increases; close-together alleles will be transferred intact into many new chromosome contexts, to be tested further: if the correlation between these alleles and fitness was spurious, sustained association with extra-fit individuals is less likely; otherwise the new chromosomes will be extra fit...
stability and variability of inheritance

stability
- redundancy of the DNA base pair structure: e.g., CAG and CAA both code for glutamine
- repair: some DNA consists of instructions for DNA repair
- homologous recombination: crossover prevents fixing of negative mutations and helps repair damaged DNA

variability
- mutation: side effect of entropy, copy errors; additions and deletions of base pairs; large DNA sequence rearrangements (usually lethal)
- homologous and non-homologous transfer of genetic material: a bacterium injects a copy of genetic material into another bacterium (possibly of another species!)
- homologous recombination contributes more variability than mutation
homologous recombination

homologous crossover causes stability and variability
- occurs only between very similar DNA segments => sexual reproduction leads to clearly defined species
- occurs only if the 2 DNA segments can be matched up so that the swap point is at functionally identical points (the function of transcription segments and the lengths of DNA molecules are preserved)
- non-homologous crossover ≈ massive mutation
- is nearly universal
- advantageous for quickly accumulating good mutations
- requires fitness-decreasing ‘tags’ to recognize mates (peacock’s tail) and other energy costs of sexual reproduction ;)

GA has, GP lacks, homologous crossover.
GAs

- use a coding of the parameters
- avoid false peaks by working with a population of potential solutions
- use an ‘objective’ function but no auxiliary info (such as derivatives)
- use probabilistic genetic operators
- seek improvement - ‘satisficing’ - rather than an optimum

‘directed serendipity’:
- direction comes from fitness-proportional reproduction
- novelty comes from mutation and from randomly juxtaposing the best partial solutions
basic GA

solutions are decoded from chromosomes (here, bit strings):

    1100101010
    1011101110
    0011011001
    1100110001

the generational cycle: decoding -> fitness evaluation -> ranking or scaling -> roulette-wheel selection -> crossover and mutation -> new population -> ...

crossover (swap tails at a random cut point):

    110010|1010        110010|1110
    101110|1110  -->   101110|1010

mutation (flip a random bit):

    0011011001  -->  0011001001

offspring:

    1100101110
    1011101010
    0011001001
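The crossover and mutation steps shown in the diagram can be written directly; this small sketch uses the slide's own bit strings (the function names are mine).

```python
def one_point_crossover(a, b, cut):
    # swap the tails of two bit strings at position `cut`
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def point_mutation(s, i):
    # flip the single bit at index i
    return s[:i] + ('0' if s[i] == '1' else '1') + s[i + 1:]

# the slide's example: cut after bit 6, mutation at the 6th bit (index 5)
print(one_point_crossover('1100101010', '1011101110', 6))
# -> ('1100101110', '1011101010')
print(point_mutation('0011011001', 5))
# -> '0011001001'
```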
when to use a GA?

- multi-modal domain (else hill-climb)
- some epistasis (the effect of alleles depends on other alleles; the output of a sequence catalyzes an intermediate step!)
- detectable regularity in the domain (must be captured by the encoding)
components of GA

representation
- bit strings, lists, TSP structures, etc.
- each string represents a complete idea; substrings are notions of what’s important to a task
- the GA reproduces hi-quality notions and crosses them over with other hi-quality notions ==> the population gets better and better...

initialization
- random, or
- perturbing the results of a greedy algorithm or a human solution

evaluation / fitness function (“impoverished environment”)
- single value, more complex feedback, normalization
- implicit fitness

genetic operators, selection procedures
- incorporate heuristics...

parameters: pop size, xover prob, mut prob, ....
GA, hill climbing, simulated annealing

hill climbing
- exploits the best solution, neglects exploration

random search
- explores the search space, neglects exploiting promising regions

simulated annealing
- if the new point is better than the current point, its prob of acceptance, p, is 1; else p > 0 {important!}
- p depends on the temperature T (lowered in steps during execution): the lower T, the smaller the prob of accepting a new point

GA
- uses a population of potential solutions and exchanges information among relatively good (surviving) solutions: directed and stochastic
Hill Climbing

- the number of neighbors depends on the problem and on the variation operator
- examination of all neighbors is often prohibitively inefficient or impossible

Example: TSP (Traveling Salesperson Problem)
- 1000 cities
- variation operator: permute four cities
- number of neighbors: 952,593,869,250
- evaluating 100 neighbors per second --> 302 years per step
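The slide's neighbor count can be checked with a two-line computation: choose 4 of the 1000 cities, then rearrange them in any of the 4! - 1 = 23 non-identity orders. Reading "permute four cities" as choose-then-rearrange is my assumption, but it reproduces the slide's number exactly.

```python
from math import comb

cities = 1000
# choose 4 cities, then put them into any of the 23 non-identity orders
neighbors = comb(cities, 4) * 23
print(neighbors)                       # 952593869250, as on the slide

seconds = neighbors / 100              # evaluating 100 neighbors per second
years = seconds / (365 * 24 * 3600)
print(round(years))                    # 302 years for a single hill-climbing step
```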
hill climbing

    t := 0;
    repeat
        local := false;
        pick randomly current string vc;
        evaluate fitness f(vc);
        repeat
            consider all v' in the neighborhood of vc (flipping single bits);
            pick best vn among v';
            if f(vc) < f(vn) then vc := vn
            else local := true
        until local;
        t++;
    until t = max.
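A direct Python transcription of this pseudocode, with the outer t-loop realized as random restarts. The OneMax fitness in the comment and test is an illustrative assumption; any fitness over bit vectors works.

```python
import random

def hillclimb(fitness, length, max_restarts=10, seed=None):
    rng = random.Random(seed)
    best = None
    for t in range(max_restarts):
        # pick randomly current string vc
        vc = [rng.randint(0, 1) for _ in range(length)]
        local = False
        while not local:
            # consider all v' in the neighborhood of vc (flipping single bits)
            neighbors = [vc[:i] + [1 - vc[i]] + vc[i + 1:] for i in range(length)]
            vn = max(neighbors, key=fitness)   # pick best vn among v'
            if fitness(vc) < fitness(vn):
                vc = vn
            else:
                local = True                   # vc is a local optimum
        if best is None or fitness(vc) > fitness(best):
            best = vc
    return best
```

On a unimodal function like OneMax (fitness = number of 1-bits) this always reaches the all-ones optimum; the trivial example two slides below shows a function where it gets stuck.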
SA

    t := 0; initialize temperature T;
    select vc randomly;
    evaluate fitness f(vc);
    repeat
        repeat
            select vn in the neighborhood of vc;
            if f(vc) < f(vn) then vc := vn
            else if random[0,1) < exp((f(vn) - f(vc)) / T) then vc := vn
        until thermal equilibrium;
        T := g(T,t); // T decreases with each time step t
        t++
    until T is so small there are virtually no changes...
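A runnable sketch of the SA pseudocode for bit strings (maximizing, as in the slides). The geometric cooling schedule g(T, t) = 0.95 T and the fixed number of steps per temperature standing in for "thermal equilibrium" are my assumptions.

```python
import math
import random

def simulated_annealing(fitness, length, T0=10.0, cooling=0.95,
                        steps_per_T=50, T_min=1e-3, seed=None):
    rng = random.Random(seed)
    # select vc randomly, evaluate f(vc)
    vc = [rng.randint(0, 1) for _ in range(length)]
    T = T0
    while T > T_min:
        for _ in range(steps_per_T):       # crude stand-in for "thermal equilibrium"
            # select vn in the neighborhood of vc (flip one random bit)
            i = rng.randrange(length)
            vn = vc[:i] + [1 - vc[i]] + vc[i + 1:]
            delta = fitness(vn) - fitness(vc)
            # always accept improvements; accept worse moves with prob exp(delta / T)
            if delta > 0 or rng.random() < math.exp(delta / T):
                vc = vn
        T *= cooling                       # T := g(T, t), geometric cooling
    return vc
```

At high T almost every move is accepted (exploration); as T falls, the acceptance probability for worse moves shrinks toward zero and the search becomes greedy (exploitation).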
a trivial example

maximize f(v) = | 11 * ones(v) - 150 |, length(v) = 30

v1: 1 1 0 1 1 0 1 0 1 1 1 0 1 0 1 1 1 1 1 1 1 0 1 1 0 1 1 0 1 1
ones(v1) = 22, f(v1) = | 11 * 22 - 150 | = 92

global max vg: 1 1 1 ......... 1, f(vg) = 30*11 - 150 = 180
local max vl: 0 0 0 ......... 0, f(vl) = 150

13 ones: | 13*11 - 150 | = | 143 - 150 | = 7;
14 ones: 14*11 = 154, 154 - 150 = 4;
so any 1-step towards the global max is bad,
but 12 ones: 12*11 = 132, | 132 - 150 | = 18!

depending on the starting point, a hill climber gets stuck...
no problem for the GA
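The arithmetic on this slide can be checked in a few lines of Python; the deceptive one-bit neighborhood around 13 ones is the point.

```python
def f(v):
    # f(v) = | 11 * ones(v) - 150 |
    return abs(11 * sum(v) - 150)

v1 = [1,1,0,1,1,0,1,0,1,1,1,0,1,0,1,1,1,1,1,1,1,0,1,1,0,1,1,0,1,1]
print(sum(v1), f(v1))                 # 22 ones, f = 92
print(f([1] * 30), f([0] * 30))       # global max 180, all-zeros local max 150

# at 13 ones, adding a 1 looks worse (7 -> 4) while removing a 1 looks
# better (7 -> 18), so a bit-flip hill climber heads for all zeros
print(f([1]*13 + [0]*17), f([1]*14 + [0]*16), f([1]*12 + [0]*18))   # 7 4 18
```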
GAs often used for parameter optimization

maximize f(x1, ..., xk): R^k --> R

- each xi takes values from Di = [ai, bi] in R; f(x1, ..., xk) > 0 for all xi in Di
- suppose we want 6 decimal places of precision: each Di is divided into (bi - ai) * 10^6 ranges
- mi := smallest int s.t. (bi - ai) * 10^6 ≤ 2^mi - 1; each xi is coded as a string of length mi
- interpreting xi: xi = ai + decimal(101...0_2) * [(bi - ai) / (2^mi - 1)]
- length m of chromosome = sum over i = 1..k of mi

chromosome: 1011.......110010.............011010.................
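The rule for choosing mi can be computed directly. The helper name is mine, and the test values come from the 4-decimal-place example on the next slide.

```python
def bits_needed(a, b, places):
    # smallest m such that (b - a) * 10**places <= 2**m - 1
    ranges = (b - a) * 10 ** places
    m = 1
    while 2 ** m - 1 < ranges:
        m += 1
    return m

print(bits_needed(-3.0, 12.1, 4))   # 18, since 2**17 < 151000 <= 2**18
print(bits_needed(4.1, 5.8, 4))     # 15, since 2**14 < 17000 <= 2**15
```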
"bed of nails"

maximize f(x1, x2) = 21.5 + x1 * sin(4πx1) + x2 * sin(20πx2),
where -3.0 ≤ x1 ≤ 12.1, 4.1 ≤ x2 ≤ 5.8

- 4 decimal places of precision: length of domain of x1 = 15.1, so [-3.0, 12.1] is split into 15.1 * 10000 equal-size ranges; x1 needs 18 bits because 2^17 < 151000 ≤ 2^18
- length of domain of x2 = 1.7, so x2 needs 15 bits because 2^14 < 17000 ≤ 2^15
- chromosome length m = 18 + 15 = 33
- randomly initialize popSize chromosomes of length 33: 010001001011010000|111110010100010
- decoding:
  x1 = -3.0 + decimal(010001001011010000_2) * (12.1 - (-3.0)) / (2^18 - 1) = -3.0 + 70352 * (15.1 / 262143) = -3.0 + 4.052426 = 1.052426
  x2 = 4.1 + decimal(111110010100010_2) * (5.8 - 4.1) / (2^15 - 1) = 4.1 + 31906 * 1.7 / 32767 = 5.755330
- chromosome: [1.052426, 5.755330]; fitness(chromosome) = 20.252640
- after 400 generations, best individual: 38.827553
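The decoding and fitness arithmetic of this slide reproduces directly in Python (the function names are mine):

```python
import math

def decode(bits, a, b):
    # x = a + decimal(bits) * (b - a) / (2**len(bits) - 1)
    return a + int(bits, 2) * (b - a) / (2 ** len(bits) - 1)

def f(x1, x2):
    return 21.5 + x1 * math.sin(4 * math.pi * x1) + x2 * math.sin(20 * math.pi * x2)

chrom = "010001001011010000" + "111110010100010"   # the slide's 33-bit chromosome
x1 = decode(chrom[:18], -3.0, 12.1)
x2 = decode(chrom[18:], 4.1, 5.8)
print(round(x1, 6), round(x2, 6))   # 1.052426 5.755330
print(f(x1, x2))                    # the slide reports fitness 20.252640
```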
no convergence guarantee!!!

convergence without guaranteed optimality
- premature convergence vs finding interesting areas of a space quickly
- GAs find interesting hills, and locally convergent schemes can climb local hills

generally good:
- high xover prob: ≈ 0.6
- low mutation prob: ≈ 0.1 to 0.001
- population size: usually larger is better
- normalizing the fitness function is usually good

string size determines the search space
- e.g. for length = 30, 2^30 = 1.07 billion points in the space
- e.g. after 6 generations, the best individuals are in the top 0.19% of the space!!!
why premature convergence?

- the encoding may force the GA to search in a space ≠ the problem space
- more selective pressure (exploitation) --> less diversity (exploration) --> loss of diversity --> premature convergence
- less selective pressure (or a larger pop size!!) --> more exploration --> possibly ineffective search

sampling / selection issues
- non-0 selection prob for each individual?
- best individuals prevented from reproduction? (against 'super individuals')
- replace the worse child with the better parent? (and turn up mut/xover probs)
- elitism? (preserve the best chromosome)

improve use of available storage, i.e. the population
- steady state GAs
- reduce # of exact copies (e.g. crowding model)
Prisoner's Dilemma

- used to model arms races etc.
- it is always better to defect, except in iterated games...
- how can reciprocal cooperation arise?
- Axelrod's round-robin tournament: 63 programs (embodying different strategies, eg Markov processes, Bayesian inference etc); each plays 200 games with all others.
- the winner was always the simplest: TIT FOR TAT

payoff matrix (player A's score, player B's score):

                    B cooperates    B defects
    A cooperates       3, 3           0, 5
    A defects          5, 0           1, 1
Prisoner's Dilemma background

Important for evolutionary theory: mutual cooperation can evolve in a world of egoists without central control
- by starting - via mutation or migration - with a small cluster of cooperators (who can recognize previous partners and remember what happened).

There is no universally best strategy
- TIT FOR TWO TATS would have won the 1st tournament but did poorly in the 2nd tournament. Against unresponsive players, ALL D is best.
- often better than TIT FOR TAT: be better at discovering nonresponsiveness and then switch to ALL D.

TIT FOR TAT beats no one (!) but elicits cooperation
- nice, provokable, forgiving, easily recognizable strategies are robust in unpredictable, changing environments

Cooperation is an ESS (Evolutionarily Stable Strategy):
- 'The gearwheels of social evolution have a ratchet.'

Think about it: Prisoner's Dilemma == Tragedy of the Commons....
evolving strategies with a GA

how to represent a strategy?
- each player remembers one previous game: <me, opponent>, with 4 possible cases: cc, cd, dc, dd.
- TIT FOR TAT: cc → c, cd → d, dc → c, dd → d.
- for canonically ordered cases, TIT FOR TAT = cdcd.
- to use a string as a strategy, record the previous game (eg dc), find its position i in the table (eg i = 3), and select the ith letter of the strategy string as the next move (eg c).
- Axelrod's strategies remember 3 previous games, with 2^6 = 64 possible cases: cc cc cc, cc cc cd, ..., dd dd dc, dd dd dd.
- strategy: 70-letter string (64 next moves + 6 letters to record 3 previous games).
- # of strategies = 2^70
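The table-lookup scheme above is easy to run for the memory-one case. The play loop, the seeded 'previous game' for round 1, and all names below are my assumptions for the demo; the strategy encoding (cdcd for TIT FOR TAT) is the slide's.

```python
CASES = ["cc", "cd", "dc", "dd"]   # canonical order of <my move, opponent's move>

def next_move(strategy, prev_game):
    # find the previous game's position i in the table,
    # then play the i-th letter of the strategy string
    return strategy[CASES.index(prev_game)]

PAYOFF = {("c", "c"): (3, 3), ("c", "d"): (0, 5),
          ("d", "c"): (5, 0), ("d", "d"): (1, 1)}

def play(strat_a, strat_b, rounds=10, prev=("c", "c")):
    # `prev` seeds the remembered 'previous game' for round 1
    # (an assumption; the slide leaves the first move unspecified)
    a_prev, b_prev = prev
    score_a = score_b = 0
    for _ in range(rounds):
        a = next_move(strat_a, a_prev + b_prev)   # A's view: <me, opponent>
        b = next_move(strat_b, b_prev + a_prev)   # B's view: <me, opponent>
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        a_prev, b_prev = a, b
    return score_a, score_b

TIT_FOR_TAT = "cdcd"   # cc -> c, cd -> d, dc -> c, dd -> d
ALL_D = "dddd"
print(play(TIT_FOR_TAT, TIT_FOR_TAT))   # (30, 30): sustained mutual cooperation
print(play(TIT_FOR_TAT, ALL_D))         # (9, 14): TFT never wins a pairing,
                                        # but is exploited only once
```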
evolving strategies ...

fixed environment
- GA strategies play against the 8 best human strategies excluding TIT FOR TAT
- the GA usually finds (variants of) TIT FOR TAT
- the GA often finds better strategies than TIT FOR TAT by exploiting weaknesses of its fixed opponents (after testing only 1000 individuals!)

variable environment
- each strategy plays against all others in the population - here the fitness landscape always changes!
- many cooperators die out in the first 10-20 generations, but eventually variants of the 'generalist' TIT FOR TAT always evolve
- even better strategies evolve when the memory of previous games is allowed to change and when each strategy has a small probability (0.05) of making the opposite move...
iterated prisoner's dilemma

success of a strategy depends on
- the number of iterations
- the precise makeup of the other strategies
- the balance of nice and nasty strategies

ecological games
- scores are converted to # of offspring (implicit fitness); the population changes... fitnesses fluctuate...

tit-for-tat variants (slightly better)
- generous tit-for-tat: occasionally cooperate after a defection
- joss: occasionally defect after a cooperation

pavlov: (nasty) win-stay, lose-shift strategy
- when dealing with a nice strategy, pavlov keeps on defecting...

pavlov's success suggests:
- being nice is not good enough? evolution of cooperation more complicated than was thought?....
kinds of problems...
    model    input    output
    known    ?        specified    optimization & design: TSP, time tables, function optimization; jet nozzles...
    ?        known    known        modeling, system identification: evolutionary machine learning; predictive theory...
    known    known    ?            simulation: agent-based economics; incest is bad also in EC; multi-parent recombination is good....
shape optimization
design a vibration-resistant ladder or boom to connect the satellite body with the communications dish.

Traditional design: no air to damp vibrations...
Evolutionary design: 20,000% better!!!
evolving an antenna for NASA
[Figure 1.5 from "An Evolved Antenna for a NASA Mission": photographs of prototype evolved antennas, (a) ST5-3-10 and (b) ST5-4W-03]
the antenna should be able to receive signals of many frequencies,
from many directions...
Exploration vs. Exploitation

Exploration
- widely sampling the search space
- trying new possibilities
- taking "big" steps in the search space

Exploitation
- climbing the local peak in the fitness landscape
- trying "small" variations on the best solution(s) found so far: refining and fine-tuning

Evolutionary Algorithms (EAs) attempt to strike a balance between exploration and exploitation.
typical GA behavior
[three sketches of fitness vs. individuals]

- initially, the population is randomly spread out over the search space: exploration
- after a few generations, the population climbs hills because of selection and variation: exploration & exploitation
- later, most of the population converges on a few hills: exploitation
GA is an anytime algorithm
because you can stop it at any time and you get some solution, more or less good...

[sketches of best fitness vs. time]

- heuristic - instead of random - initialization may not be worth the effort: only a few, k, generations are needed to 'catch up'....
- long runs may not be worth the effort: better to restart the evolution... (most progress is early; late progress is slow)
Evolutionary Algorithms in general...
what about No Free Lunch?