# Lecture 7: Genetic Algorithms

Artificial Intelligence and Robotics

23 Oct 2013
Cognitive Systems II - Machine Learning, SS 2005

Part II: Special Aspects of Concept Learning: Genetic Algorithms, Genetic Programming, Models of Evolution
## Motivation

- genetic algorithms provide a learning method motivated by an analogy to biological evolution
- rather than searching from general-to-specific or from simple-to-complex, genetic algorithms generate successor hypotheses by repeatedly mutating and recombining parts of the best currently known hypotheses
- at each step, a collection of hypotheses, called the current population, is updated by replacing some fraction of it by offspring of the most fit current hypotheses
## Motivation

reasons for popularity:

- evolution is known to be a successful, robust method for adaptation within biological systems
- genetic algorithms can search spaces of hypotheses containing complex interacting parts, where the impact of each part on overall hypothesis fitness may be difficult to model
- genetic algorithms are easily parallelized

genetic programming ≈ entire computer programs are evolved to meet certain fitness criteria

evolutionary computation = genetic algorithms + genetic programming
## Genetic Algorithms

- problem: search a space of candidate hypotheses to identify the best hypothesis
- the best hypothesis is defined as the one that optimizes a predefined numerical measure, called fitness
  - e.g. fitness could be defined as the number of games won by the individual when playing against other individuals in the current population
- basic structure:
  - iteratively update a pool of hypotheses (the population)
  - on each iteration:
    - hypotheses are evaluated according to the fitness function
    - a new population is generated by probabilistically selecting the most fit individuals
    - some are carried forward, others are used for creating new offspring individuals
## Genetic Algorithms

GA(Fitness, Fitness_threshold, p, r, m)

Fitness: fitness function; Fitness_threshold: termination criterion; p: number of hypotheses in the population; r: fraction of the population to be replaced by crossover; m: mutation rate

- Initialize population: P ← generate p hypotheses at random
- Evaluate: for each h in P, compute Fitness(h)
- While max_h Fitness(h) < Fitness_threshold, do:
  1. Select: probabilistically select (1 − r)·p members of P and add them to P_S
  2. Crossover: probabilistically select (r·p)/2 pairs of hypotheses from P; for each pair ⟨h_1, h_2⟩, produce two offspring by applying the crossover operator, and add them to P_S
  3. Mutate: choose m percent of the members of P_S with uniform probability; for each, invert one randomly selected bit
  4. Update: P ← P_S
  5. Evaluate: for each h ∈ P, compute Fitness(h)
- Return the hypothesis from P that has the highest fitness.
## Remarks

- as specified above, each population P contains p hypotheses
- (1 − r)·p hypotheses are selected from P and added to P_S without changing them
  - the selection is probabilistic
  - the probability of selecting hypothesis h_i is

    Pr(h_i) = Fitness(h_i) / Σ_{j=1..p} Fitness(h_j)

- (r·p)/2 pairs of hypotheses are selected from P and added to P_S after applying the crossover operator
  - this selection is also probabilistic
- ⇒ (1 − r)·p + 2·(r·p)/2 = p, since r + (1 − r) = 1
## Representing Hypotheses

- hypotheses are often represented as bit strings so that they can easily be modified by genetic operators
- represented hypotheses can be quite complex
- each attribute can be represented as a substring with as many positions as there are possible values
- to obtain a fixed-length bit string, each attribute has to be considered, even in the most general case
- example:

  (Outlook = Overcast ∨ Rain) ∧ (Wind = Strong)

  is represented as: Outlook: 011, Wind: 10 ⇒ 01110
## Genetic Operators

- the generation of successors is determined by a set of operators that recombine and mutate selected members of the current population
- the operators correspond to idealized versions of the genetic operations found in biological evolution
- the two most common operators are crossover and mutation
## Genetic Operators

| Operator | Initial strings | Crossover mask | Offspring |
|---|---|---|---|
| Single-point crossover | 11101001000, 00001010101 | 11111000000 | 11101010101, 00001001000 |
| Two-point crossover | 11101001000, 00001010101 | 00111110000 | 00101000101, 11001011000 |
| Uniform crossover | 11101001000, 00001010101 | 10011010011 | 10001000100, 01101011001 |
| Point mutation | 11101001000 | | 11101011000 |
## Genetic Operators

- crossover:
  - produces two new offspring from two parent strings by copying selected bits from each parent
  - the bit at position i in each offspring is copied from the bit at position i in one of the two parents
  - the choice of which parent contributes bit i is determined by an additional string, called the crossover mask
  - single-point crossover: e.g. mask 11111000000
  - two-point crossover: e.g. mask 00111110000
  - uniform crossover: e.g. mask 01100110101
- mutation: produces bitwise random changes by inverting single bits
## Illustrative Example (GABIL)

- GABIL learns boolean concepts represented by a disjunctive set of propositional rules
- representation:
  - each hypothesis is encoded as shown above
  - the hypothesis space of rule preconditions consists of a conjunction of constraints on a fixed set of attributes
  - sets of rules are represented by concatenation
- example: a1, a2 boolean attributes, c target attribute

  IF a1 = T ∧ a2 = F THEN c = T; IF a2 = T THEN c = F

  ⇒ 10 01 1, 11 10 0 ⇒ 1001111100
## Illustrative Example (GABIL)

- genetic operators:
  - uses the standard mutation operator
  - the crossover operator is an extension of two-point crossover that manages variable-length rule sets
- fitness function:

  Fitness(h) = (correct(h))²

  based on classification accuracy, where correct(h) is the percent of all training examples correctly classified by hypothesis h
## Hypothesis Space Search

- the method is quite different from the other methods presented so far
- neither a general-to-specific nor a simple-to-complex search is performed
- genetic algorithms can move very abruptly, replacing a parent hypothesis by an offspring that may be radically different from it
- so this method is less likely to fall into some local minimum
- practical difficulty: crowding
  - some individuals that fit better than others reproduce quickly, so that copies and very similar offspring take over a large fraction of the population
  - ⇒ reduced diversity of the population
  - ⇒ slower progress of the genetic algorithm
## Genetic Programming

- individuals in the evolving population are computer programs rather than bit strings
- has shown good results, despite the vast hypothesis space H
- representing programs:
  - typical representations correspond to parse trees
  - each function call is a node
  - its arguments are the descendants of that node
- fitness is determined by executing the program on the training data
- crossover is performed by exchanging a randomly chosen subtree between parents
## Genetic Programming

Figure: the crossover operation applied to two parent program trees built from the nodes +, ^, sin, x, y, and 2; the subtrees rooted at randomly chosen crossover points are exchanged to produce the two offspring trees.
## Models of Evolution and Learning

- observations:
  - individual organisms learn to adapt significantly during their lifetime
  - biological and social processes allow a species to adapt over a time frame of many generations
- interesting question: what is the relationship between learning during the lifetime of a single individual and the species-level learning afforded by evolution?
## Models of Evolution and Learning

- Lamarckian Evolution:
  - the proposition that evolution over many generations was directly influenced by the experiences of individual organisms, through a direct influence on the genetic makeup of their offspring
  - Lamarckian processes can sometimes improve the effectiveness of genetic algorithms
- Baldwin Effect:
  - a species in a changing environment is under evolutionary pressure that favors individuals with the ability to learn
  - such individuals perform a small local search to maximize their fitness
  - thus, they support a more diverse gene pool, relying on individual learning to overcome missing or not quite optimized traits
## Summary

- genetic algorithms provide a method for concept learning based on simulated evolution
- the evolution of populations is simulated by taking the most fit individuals over to a new generation
- some individuals remain unchanged, others are the base for genetic operator application
- hypotheses are commonly represented as bit strings
- the search through the hypothesis space cannot be characterized easily, because hypotheses are created by crossover and mutation