How parameters have been chosen in metaheuristics: A survey




Eddy Janneth Mesa Delgado

Applied Computation. Systems School. Faculty of Mines.
National University of Colombia, Medellín.
Cra. 80 x Cl. 65, Barrio Robledo.
ejmesad@unal.edu.co





Abstract: Metaheuristics and heuristics use parameters to control the search and increase their flexibility. A heuristic method's performance depends on the set of parameters chosen. This paper surveys the theoretical approaches in the literature to parameterizing metaheuristics and gives an example of adaptive parameters for the differential evolution metaheuristic. Future work is also presented.



Keywords: Global Optimization, Heuristic Programming, Nonlinear Programming, Parameters.



1. INTRODUCTION

Direct methods are used to solve optimization problems, although they can only guarantee a good solution. The first direct methods developed were systematic searches over the solutions. They evolved into more and more effective methods, up to the arrival of artificial intelligence. Those new methods were called heuristics and metaheuristics (Himmelblau, 1972).



Heuristic means “to find” or “to discover by trial and error”; metaheuristic means a better way “to find” (Yang, 2010). Besides intelligence, metaheuristics differ from other kinds of direct methods because they use parameters to adjust the search and the strategy to enhance themselves. Parameters let algorithms get good answers over a larger span of problems than the initial direct methods, and with less effort (Zanakis and Evans, 1981; Zilinskas and Törn, 1989); e.g., evolution strategies are used to get an initial solution for an unknown problem. However, parameters are a weakness too, because they need to be tuned and controlled for every problem, and a wrong set of parameters can make a metaheuristic fail (Angeline, 1995; Eiben, et al., 1999; Kramer, 2008; Yang, 2010).


The aim of this paper is to summarize the information available on the parameterization of metaheuristics: how parameters are controlled and tuned and, in some cases, their effect on performance. To do that, three keywords (parameters, metaheuristics, and heuristics) were initially used to search the SCOPUS database. After that, a manual filter was applied, keeping papers that contain:

- Applied metaheuristics with adaptive and self-adaptive parameters.
- Presentation and comparison of new methods to adapt or automate parameters.
- Classifications and formalization of methods to control and tune parameters.

With these criteria, 49 papers were chosen.


This paper is organized as follows: parameters are introduced in Section 2. The parameter classifications found in the literature are described in Section 3. In Section 4, the first part presents the relevance of the parameters in heuristics to categorize the parameter choice, and the second part uses the technique for choosing the parameters to indicate the type of setting. An example of adaptation and self-adaptation is shown in Section 5. Finally, general conclusions and future work are presented in Section 6.


2. PARAMETERS



Stochastic methods do not follow the same pathway on each run of the algorithm. Also, the intelligence embedded in metaheuristics changes dynamically according to the characteristics found during the search. Parameters give metaheuristics robustness and flexibility (Talbi, 2009).


Choosing parameters is not a simple task, because a wrong set of parameters can make a metaheuristic converge prematurely, or not converge at all (Bäck, 1996; Talbi, 2009).

Metaheuristics need enough parameters to be flexible, but each parameter adds complexity to the tuning and control of the method, and to the changes necessary to optimize a new problem. Each metaheuristic is a completely different world: it has different parameters, and they influence the metaheuristic in different ways. There is no unique right way to choose parameters in metaheuristics. In the literature, two points of view are related, but they apply only to Evolutionary Computing (EC).

Traditionally, EC includes genetic algorithms (GA), evolution strategies (ES), and evolutionary programming (EP). However, other metaheuristics meet the condition proposed by Angeline (1995) for an EC metaheuristic: a method that iterates until a specific stopping criterion is met. So, we could extend this classification to other metaheuristics and heuristics in general, even though they do not use the same operators as EC (Fogel, D. B., et al., 1991). Specifically, most other heuristics do not have formalized operators; they use a complete rule with the parameters inside, so it may not be a direct analogy (Zapfel, et al., 2010). In the next two sections, the formal parameter classification proposed in the literature is discussed, followed by an example of the adaptation of another heuristic.


3. PARAMETERS CLASSIFICATION



Each metaheuristic has a parameter set to control the search. Although there is no consensus on a unique classification for metaheuristic parameters, one approximation has been developed for EC (Angeline, 1995; Hinterding, et al., 1997; Kramer, 2008). According to the definition provided by Angeline (1995), this classification could be extended to other paradigms.

The most complete classification is the one proposed by Kramer (2008), who links the points of view of Angeline (1995) and Hinterding et al. (1997).


3.1. Exogenous

These parameters affect the metaheuristic's performance, but they are external to it; for example, constraint changes or problems with piecewise functions.


3.2. Endogenous

These are internal to the method and can be changed by the user or by the method itself. Even though endogenous parameters are our focus, we cannot forget the exogenous ones, because they will affect the choice.


Population level: At this level, the parameters are global for the optimization. EC operators that use this type of parameter control the next generation. For example: population size, stopping criteria, etc.


Individual level: This kind of parameter only affects each individual; for example, the step size of each individual.


Component level: At this level, the parameters affect part of the individual, like a gene of a chromosome in Genetic Algorithms (GA).


Also, it is important to notice that the authors propose this classification, but they do not claim a right or unique manner to adapt and automate each level of parameters. The techniques are addressed by the next classification.



4. PARAMETERS SETTING


In the last section, parameters were classified according to the level at which they fit in the metaheuristic. In this case, the parameters are chosen in two stages: the first happens before using the metaheuristic and is called “tuning”; the second is called “control”; and each has different ways of selecting parameters.


4.1. Tuning

Parameters are tuned before using the metaheuristic. Those initial parameters can be chosen at three different levels, according to Eiben, et al. (1999).


Manually: The initial parameters can be chosen ad hoc by an expert. It is a valid manner, but lately it is not the most recommended (Kramer, 2008; Talbi, 2009).

Design of experiments (DOE): It implies designing tests to show the behavior, using or defining a metric to analyze the results, and making a decision about a value or a range of values for each parameter (Kramer, 2008; Talbi, 2009).
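
For illustration, a minimal Python sketch of a DOE-style tuning run: a full-factorial design over candidate values of the DE parameters $F$ and $CR$ (see Section 5), ranked by the mean best objective value over repeated runs. The runner function and the candidate grids are assumptions for the example, not values from the cited works.

import itertools
import random

def tune_full_factorial(run_metaheuristic, f_values, cr_values, repetitions=10):
    # Full-factorial DOE: test every (F, CR) pair, score each pair by the
    # chosen metric (here, mean best objective over repeated runs).
    results = {}
    for f, cr in itertools.product(f_values, cr_values):
        scores = [run_metaheuristic(f, cr) for _ in range(repetitions)]
        results[(f, cr)] = sum(scores) / repetitions
    return min(results, key=results.get)  # best pair for minimization

# Usage with a dummy runner standing in for a real metaheuristic run:
best_f, best_cr = tune_full_factorial(
    lambda f, cr: random.random() * (1 + abs(f - 0.5)),
    f_values=[0.3, 0.5, 0.7, 0.9], cr_values=[0.1, 0.5, 0.9])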


Metaevolutionary: It means that the parameters are chosen by another metaheuristic; a good example in the literature is the bacterial chemotaxis method proposed in (Müller, et al., 2000). Additionally, Kramer (2008) extends this procedure to control, as in hyper-metaheuristics (Hamadi, et al., 2011).


4.2. Control

Parameters can change while the metaheuristic is running; the control process is this change in “real time”. Following Hinterding et al. (1997) and Eiben et al. (1999), Kramer (2008) describes and discusses three methods to control parameters.


Deterministic: It can be static or dynamic. Static means there is no change at all; dynamic means the parameter changes according to a specific rule, like a dynamic penalty that changes with the distance to the feasible zone (Kramer, 2008; Mesa, 2010).
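
A minimal Python sketch of the dynamic deterministic case: a penalty weight that grows with the generation counter, so constraint violations cost more as the search advances. The linear schedule w0 * generation is an assumed illustration, not the exact rule from the cited works.

def dynamic_penalty(f_value, violation, generation, w0=1.0):
    # Deterministic dynamic control: the weight depends only on the
    # generation counter, never on feedback from the search itself.
    weight = w0 * generation  # assumed linear schedule
    return f_value + weight * violation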


Adaptive: In this case, a parameter changes according to a rule of the form “if this happens, then do that”. A classic example of adaptive parameters is the 1/5th success rule for mutation used in ES (Bäck, 1996; Kramer, 2008).
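
A minimal Python sketch of the 1/5th success rule, assuming the commonly quoted adjustment factor 0.85 and a caller that counts improving mutations over an observation window:

def one_fifth_rule(sigma, successes, trials, factor=0.85):
    # Adaptive control: widen the mutation step when more than 1/5 of
    # recent mutations improved the parent, narrow it otherwise.
    rate = successes / trials
    if rate > 0.2:
        return sigma / factor  # too easy: take bigger steps
    if rate < 0.2:
        return sigma * factor  # too hard: take smaller steps
    return sigma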


Self-adaptive: In this type of control, the parameters evolve together with the problem; for example, the self-adaptation of the mutation step size along with the mutation direction (Kramer, 2008; Schwefel, 1995).
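
For comparison, a minimal Python sketch of self-adaptive mutation in the spirit of Schwefel (1995), simplified to a single global step size instead of one per direction: sigma is stored in the individual and mutated log-normally before it is used, so useful step sizes survive selection along with good solutions. The learning rate tau = 1/sqrt(n) is the usual textbook recommendation.

import math
import random

def self_adaptive_mutation(x, sigma, tau=None):
    # Self-adaptive control: the parameter evolves with the individual.
    n = len(x)
    tau = tau if tau is not None else 1.0 / math.sqrt(n)
    new_sigma = sigma * math.exp(tau * random.gauss(0.0, 1.0))  # mutate sigma first
    new_x = [xi + new_sigma * random.gauss(0.0, 1.0) for xi in x]  # then use it
    return new_x, new_sigma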


5. AN EXAMPLE OF ADAPTIVE AND SELF-ADAPTIVE PARAMETERS FOR DIFFERENTIAL EVOLUTION (DE)


Besides the metaheuristics covered by EC, there are at least twenty different metaheuristics, with different approaches and hybrids. For the sake of brevity, only the adaptation of the Differential Evolution (DE) parameters is reviewed, and just one approach to self-adaptation is presented.


DE is a metaheuristic proposed by Price (1996). It was classified as an evolutionary metaheuristic, but DE does not use formal operators (crossover, mutation) like other EC algorithms. This metaheuristic has certain characteristics that make it a good example to study: it is conceptually near to EC; it has one complete rule in which the same operators as in EC (mutation and crossover) can be identified; it has few parameters; it is widely known; and it has a self-adaptive parameter approach.


This metaheuristic (DE) has different versions; five classic versions are presented in (Mezura-Montes, et al., 2006). Brest, et al. (2006) use the version called DE/rand/1/bin to propose their idea of self-adaptive parameters.


DE parameters for the DE/rand/1/bin version are:

- Size of the population, $NP$ (population level)
- Scale factor, $F$ (individual level)
- Crossover parameter, $CR$ (component level)
- Minimal border, $x^{min}$
- Maximal border, $x^{max}$
- Number of generations, $G$ (population level)
- Dimensions, $D$
- Trial vector, $u$ (individual level)
- Randomly chosen index, $j_{rand}$ (a random integer in $\{1, \dots, D\}$)
- Random number, $rand_j \in [0, 1]$


Figure 1 shows the pseudocode for DE, where $x_i$ is a solution vector and $f(x_i)$ is the value of the goal function. Line 04 is the mutation operator (individual level). Line 06 is the complete heuristic rule. Lines 07-09 handle the borders: when a component falls outside the feasible zone, it is forced back into a feasible solution (exogenous level).



01 Start
02 for $g = 1$ until $G$
03     for $i = 1$ until $NP$
04         $v_i = x_{r_1} + F \cdot (x_{r_2} - x_{r_3})$
05         for $j = 1$ until $D$
06             $u_{i,j} = v_{i,j}$ if $rand_j \le CR$ or $j = j_{rand}$, else $u_{i,j} = x_{i,j}$
07             if $u_{i,j} < x_j^{min}$ or $u_{i,j} > x_j^{max}$ then
08                 $u_{i,j} = x_j^{min} + rand_j \cdot (x_j^{max} - x_j^{min})$
09             end if
10         end for
11         if $f(u_i) \le f(x_i)$ then $x_i = u_i$
12     end for
13 end for
14 end algorithm

Fig. 1. Pseudocode of DE (DE/rand/1/bin).
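
A minimal runnable Python sketch of Fig. 1 (the sphere objective and all numeric settings are placeholders; the border repair resamples uniformly inside the feasible box, one common reading of lines 07-09):

import random

def de_rand_1_bin(f, d, x_min, x_max, np_=30, g_max=200, F=0.5, CR=0.1):
    # DE/rand/1/bin following the pseudocode of Fig. 1.
    pop = [[random.uniform(x_min, x_max) for _ in range(d)] for _ in range(np_)]
    for _ in range(g_max):                                      # generations (line 02)
        for i in range(np_):                                    # population (line 03)
            r1, r2, r3 = random.sample([k for k in range(np_) if k != i], 3)
            j_rand = random.randrange(d)
            u = []
            for j in range(d):                                  # components (line 05)
                if random.random() <= CR or j == j_rand:        # complete rule (line 06)
                    v = pop[r1][j] + F * (pop[r2][j] - pop[r3][j])  # mutation (line 04)
                else:
                    v = pop[i][j]
                if v < x_min or v > x_max:                      # border repair (lines 07-09)
                    v = random.uniform(x_min, x_max)
                u.append(v)
            if f(u) <= f(pop[i]):                               # greedy selection
                pop[i] = u
    return min(pop, key=f)

# Usage: minimize the sphere function in 5 dimensions.
best = de_rand_1_bin(lambda x: sum(xi * xi for xi in x), d=5, x_min=-5.0, x_max=5.0)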

The original algorithm tuned its parameters manually, based on a few tests; the authors recommended values for $NP$, $F$, and $CR$, while the other parameters are random or given by the problem. These parameters are controlled deterministically (statically).


Later, one of the approaches to controlling the parameters was proposed by Ali and Törn (2004): they fixed $NP$ and $CR$, and adapted $F$ using:

$$F = \begin{cases} \max\left(l_{min},\ 1 - \left|\dfrac{f_{max}}{f_{min}}\right|\right) & \text{if } \left|\dfrac{f_{max}}{f_{min}}\right| < 1 \\ \max\left(l_{min},\ 1 - \left|\dfrac{f_{min}}{f_{max}}\right|\right) & \text{otherwise} \end{cases} \quad (1)$$


where $f_{min}$ and $f_{max}$ are the minimum and maximum objective values found in the current generation, and $l_{min}$ is the lowest feasible value for $F$; in this case, $l_{min} = 0.4$.
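
A minimal Python sketch of rule (1), assuming nonzero objective values (a real implementation would guard against division by zero):

def adapt_scale_factor(f_min, f_max, l_min=0.4):
    # Ali and Törn (2004): adapt F each generation from the extreme
    # objective values, never letting it fall below l_min.
    ratio = abs(f_max / f_min)
    if ratio < 1.0:
        return max(l_min, 1.0 - ratio)
    return max(l_min, 1.0 - abs(f_min / f_max))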


Brest, et al. (2006) proposed self-adaptive strategies for $F$ and $CR$; in this case, the values change for each individual. Initially, a random parent vector of size $NP$ is selected. For $F$, the bounds $F_l = 0.1$ and $F_u = 0.9$ are proposed:





$$F_{i,G+1} = \begin{cases} F_l + rand_1 \cdot F_u & \text{if } rand_2 < \tau_1 \\ F_{i,G} & \text{otherwise} \end{cases} \quad (2)$$

where $rand_1$ and $rand_2$ are uniform random numbers in $[0, 1]$ and $\tau_1 = 0.1$. The biggest advantage presented is the possibility of obtaining the best result without running the metaheuristic many times to find the best parameters for each problem.
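
A minimal Python sketch of this self-adaptive update for one individual, using the probabilities tau_1 = tau_2 = 0.1 reported by Brest, et al. (2006); CR is regenerated uniformly in [0, 1] under the analogous rule:

import random

def jde_update(F_i, CR_i, F_l=0.1, F_u=0.9, tau1=0.1, tau2=0.1):
    # Each individual carries its own F and CR, occasionally resampled
    # before it produces its trial vector, as in Eq. (2).
    if random.random() < tau1:
        F_i = F_l + random.random() * F_u  # new F in [0.1, 1.0]
    if random.random() < tau2:
        CR_i = random.random()             # new CR in [0, 1]
    return F_i, CR_i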



6. CONCLUSIONS

There is no general categorization for parameter selection in metaheuristics. EC has a good framework for parameter setting. It is important to notice that these metaheuristics have a high degree of formalization.


In the previous section, the level at which each of the different parameters fits was indicated in parentheses. Although it is not exactly as in EC, it could be possible to extend the classification. The possibility of having a formal framework to set the parameters is important, because it could help to determine which parameters need more control than others before long tests. Anyway, this review just gives an initial idea of the possibilities; there is a lot of work to do in this direction.




REFERENCES


Ali, M., and Törn, A. (2004). Population set-based global optimization algorithms: Some modifications and numerical studies. Computers & Operations Research, 31(10), 1703-1725.

Angeline, P. (1995). Adaptive and Self-Adaptive Evolutionary Computations. Computational Intelligence (pp. 152-163). IEEE.

Brest, J., Greiner, S., Boskovic, B., Mernik, M., and Zumer, V. (2006). Self-Adapting Control Parameters in Differential Evolution: A Comparative Study on Numerical Benchmark Problems. IEEE Transactions on Evolutionary Computation, 10(6), 646-657.

Bäck, T. (1996). Evolutionary Algorithms in Theory and Practice. Oxford University Press.

Eiben, E., Hinterding, R., and Michalewicz, Z. (1999). Parameter control in evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 3(2), 124-141.

Fogel, D. B., Fogel, L. J., and Atmar, J. W. (1991). Meta-evolutionary programming. Conference Record of the Twenty-Fifth Asilomar Conference on Signals, Systems and Computers, pp. 540-545. IEEE.

Hamadi, Y., Monfroy, E., and Saubion, F. (2011). What Is Autonomous Search? In: Hybrid Optimization, Eds. M. Milano and P. Van Hentenryck, pp. 357-392. Springer.

Himmelblau, D. (1972). Applied Nonlinear Programming (p. 498). McGraw-Hill.

Hinterding, R., Michalewicz, Z., and Eiben, A. E. (1997). Adaptation in evolutionary computation: a survey. Proceedings of the 1997 IEEE International Conference on Evolutionary Computation (ICEC '97) (pp. 65-69). IEEE.

Kramer, O. (2008). Self-Adaptive Heuristics for Evolutionary Computation (First Ed., p. 175). Berlin: Springer.

Mesa, E. (2010). Supernova: un algoritmo novedoso de optimización global. National University. Retrieved from http://www.bdigital.unal.edu.co/2035/.

Mezura-Montes, E., Velázquez-Reyes, J., and Coello Coello, C. A. (2006). A comparative study of differential evolution variants for global optimization. Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation (pp. 485-492). ACM.

Müller, S., Airaghi, S., Marchetto, J., and Koumoutsakos, P. (2000). Optimization algorithms based on a model of bacterial chemotaxis. Proc. 6th Int. Conf. Simulation of Adaptive Behavior: From Animals to Animats, SAB 2000 Proc. Suppl. (pp. 375-384). Citeseer.

Price, K. V. (1996). Differential evolution: a fast and simple numerical optimizer. Proceedings of North American Fuzzy Information Processing (pp. 524-527). IEEE. doi:10.1109/NAFIPS.1996.534790.

Schwefel, H. P. (1995). Evolution and Optimum Seeking (p. 435). N.Y.: Wiley.

Talbi, E. (2009). Metaheuristics (p. 618). Wiley.

Yang, X.-S. (2010). Engineering Optimization: An Introduction with Metaheuristics Applications. Hoboken: Wiley.

Zanakis, S. H., and Evans, J. R. (1981). Heuristic “Optimization”: Why, When, and How to Use It. Interfaces, 11(5), 84-91.

Zapfel, G., Braune, R., and Bogl, M. (2010). Metaheuristic Search Concepts (1st ed., p. 316). Berlin: Springer.

Zilinskas, A., and Törn, A. (1989). Global Optimization. In G. Goos and J. Hartmanis (Eds.), Lecture Notes in Computer Science vol. 350 (p. 255). Berlin: Springer.