
A Study of

Genetic Algorithms for
Parameter Optimization

Mac Newbold


Many algorithms have constant values that
affect the way they work

Sometimes we choose them arbitrarily or
based on some experimentation

Their interactions are often not well understood

Use Genetic Algorithm to optimize the
parameters to an algorithm


Utah Network Testbed (www.emulab.net)

Map a “virtual” topology graph to the
physical topology graph

Hard, 30+ degrees of freedom


“assign” uses Simulated Annealing (an AI algorithm)

19 constants control its behavior

1 boolean, 4 integers, 15 floating point

1 integer and 3 floats are scaling factors

15 parameters need to be optimized

Genetic Algorithm

Evolution, “survival of the fittest”

Genetic Algorithm control


Calls object methods on a replaceable object:

returns a random object

calculate fitness of object

crossover (returns 2 objects)

mutate the object

Show the object

Very Flexible Framework
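The replaceable-object interface described above might look like the following sketch (hypothetical Python; the class, method names, and selection scheme are illustrative, not the framework's actual API). The driver never looks inside the objects it evolves, so swapping in a different problem means swapping in a different object:

```python
import random

class Bits:
    """Toy replaceable object: a fixed-length bit string. The GA driver
    below relies only on these methods, so any problem-specific object
    (such as a "Params" object) can be swapped in."""
    LENGTH = 16

    def __init__(self, genes=None):
        # "returns a random object" when no genes are supplied
        if genes is None:
            genes = [random.randint(0, 1) for _ in range(self.LENGTH)]
        self.genes = genes

    def fitness(self):
        # "calculate fitness of object": here, the count of 1 bits
        return sum(self.genes)

    def crossover(self, other):
        # "crossover (returns 2 objects)": single-point crossover
        point = random.randrange(1, self.LENGTH)
        return (Bits(self.genes[:point] + other.genes[point:]),
                Bits(other.genes[:point] + self.genes[point:]))

    def mutate(self):
        # flip one randomly chosen bit
        i = random.randrange(self.LENGTH)
        self.genes[i] ^= 1

    def show(self):
        # "Show the object"
        print("".join(map(str, self.genes)))


def evolve(cls, pop_size=50, crossover_rate=0.5, mutation_rate=0.3,
           threshold=None, max_generations=500):
    """Generic GA driver: it only calls the object methods above."""
    population = [cls() for _ in range(pop_size)]
    for _ in range(max_generations):
        population.sort(key=lambda o: o.fitness(), reverse=True)
        if threshold is not None and population[0].fitness() >= threshold:
            break
        survivors = population[:pop_size // 2]   # keep the fittest half
        children = []
        while len(survivors) + len(children) < pop_size:
            if random.random() < crossover_rate:
                a, b = random.sample(survivors, 2)
                children.extend(a.crossover(b))
            else:
                children.append(cls())           # fresh random immigrant
        for child in children:
            if random.random() < mutation_rate:
                child.mutate()
        population = survivors + children[:pop_size - len(survivors)]
    return max(population, key=lambda o: o.fitness())
```

For example, `evolve(Bits, threshold=Bits.LENGTH)` searches for the all-ones string; only the object class changes when the problem changes.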

Parameter Optimization

“Params” object

Specialized for “assign”

Contains the 15 variables we want to tune

One extra value caches fitness calculations

Ensures that values “make sense” using
domain-specific constraints

Uniform crossover

Random mutation

Performance based fitness measure
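A minimal sketch of such a “Params” object, assuming three made-up parameters in place of the real 15 (the names, ranges, and the sample constraint are illustrative only):

```python
import random

# Hypothetical parameter ranges; the real object holds the 15 "assign"
# constants and their actual domain-specific constraints.
RANGES = {
    "init_temp":    (1.0, 1000.0),
    "cooling_rate": (0.01, 0.999),
    "min_temp":     (0.0001, 1.0),
}

class Params:
    def __init__(self, values=None):
        if values is None:
            # random object: draw each parameter uniformly from its range
            values = {k: random.uniform(lo, hi)
                      for k, (lo, hi) in RANGES.items()}
        self.values = values
        self._fitness = None          # one extra slot caches the fitness
        self.enforce_constraints()

    def enforce_constraints(self):
        # make values "make sense": clamp into range, then apply a
        # sample domain rule (the annealer must be able to cool down)
        for k, (lo, hi) in RANGES.items():
            self.values[k] = min(max(self.values[k], lo), hi)
        if self.values["min_temp"] >= self.values["init_temp"]:
            self.values["min_temp"] = self.values["init_temp"] / 2

    def crossover(self, other):
        # uniform crossover: each gene independently comes from either
        # parent with probability 1/2; the second child gets the rest
        a, b = {}, {}
        for k in RANGES:
            if random.random() < 0.5:
                a[k], b[k] = self.values[k], other.values[k]
            else:
                a[k], b[k] = other.values[k], self.values[k]
        return Params(a), Params(b)

    def mutate(self):
        # random mutation: re-draw one parameter from its range
        k = random.choice(list(RANGES))
        self.values[k] = random.uniform(*RANGES[k])
        self.enforce_constraints()
        self._fitness = None          # cached score is now stale
```

Uniform crossover suits independent parameters better than single-point crossover, since it mixes genes without regard to their order in the object.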

Fitness Function

For “assign”, we care about running time

Choice of constants has huge effect

Fitness calculation:

Run “assign” on a set of N problems, using the
object’s parameters

Allow S seconds for each run

X=Sum of execution times

Fitness = (S*N) - X


S*N = maximum possible total time

Higher scores are better

Could take a long time, so cache result

G.A. Results

Tested the genetic algorithm with a “random” object

Same as “Params” object, except for fitness

Random fitness, updated after cross/mutate

1000 member population

Crossover rate of 0.50

Mutation rate of 0.30

Threshold = 999.995/1000

Took 7 generations, about 5 seconds
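A sketch of this sanity check, using the population size, crossover rate, mutation rate, and threshold quoted above. The selection scheme (keep the fittest half) is an assumption, so generation counts and timings will differ from the reported run:

```python
import random

class RandomObj:
    """Sanity-test object: same shape as "Params", but its fitness is
    just a random number in [0, 1), re-drawn after cross/mutate."""
    def __init__(self):
        self.fitness = random.random()

    def crossover(self, other):
        # children get fresh random scores
        return RandomObj(), RandomObj()

    def mutate(self):
        self.fitness = random.random()    # re-drawn after mutation

def sanity_check(pop_size=1000, crossover_rate=0.50, mutation_rate=0.30,
                 threshold=999.995 / 1000, max_generations=20000):
    population = [RandomObj() for _ in range(pop_size)]
    for gen in range(1, max_generations + 1):
        population.sort(key=lambda o: o.fitness, reverse=True)
        if population[0].fitness >= threshold:
            return gen, population[0].fitness
        survivors = population[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            if random.random() < crossover_rate:
                a, b = random.sample(survivors, 2)
                children.extend(a.crossover(b))
            else:
                children.append(RandomObj())
        for c in children:
            if random.random() < mutation_rate:
                c.mutate()
        population = survivors + children[:pop_size - len(survivors)]
    return max_generations, max(o.fitness for o in population)
```

Because fitness is pure noise, this exercises the GA machinery itself (selection, crossover, mutation, termination) independently of any real scoring.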


What’s Next

Finish setting up actual scoring using
“assign” runtimes…