11 Global Optimization in Practice: State-of-the-Art and Perspectives

János D. Pintér
Pintér Consulting Services, Inc.
129 Glenforest Drive, Halifax, NS, Canada B3M 1J2
jdpinter@hfx.eastlink.ca
www.pinterconsulting.com
Summary. Global optimization – the theory and methods of finding the best possible solution in multi-extremal models – has become a subject of interest in recent decades. Key theoretical results and basic algorithmic approaches have been followed by software implementations that are now used to handle a growing range of applications. This work discusses some practical aspects of global optimization. Within this framework, we highlight viable solution approaches, modeling environments, software implementations, numerical examples, and real-world applications.
Key words: Nonlinear systems analysis and management; global optimization strategies; modeling environments and global solver implementations; numerical examples; current applications and future perspectives.
11.1 Introduction
Nonlinearity plays a fundamental role in the development of natural and man-made objects, formations and processes. Consequently, nonlinear descriptive models are of key relevance across the range of quantitative scientific studies. For related discussions that illustrate this point, consult for instance Bracken and McCormick (1968), Rich (1973), Mandelbrot (1983), Murray (1983), Casti (1990), Hansen and Jørgensen (1991), Schroeder (1991), Bazaraa, Sherali and Shetty (1993), Stewart (1995), Grossmann (1996), Pardalos, Shalloway and Xue (1996), Pintér (1996a), Aris (1999), Bertsekas (1999), Corliss and Kearfott (1999), Floudas et al. (1999), Gershenfeld (1999), Papalambros and Wilde (2000), Chong and Zak (2001), Edgar, Himmelblau and Lasdon (2001), Gao, Ogden and Stavroulakis (2001), Jacob (2001), Pardalos and Resende (2002), Schittkowski (2002), Tawarmalani and Sahinidis (2002), Wolfram (2002), Diwekar (2003), Stojanovic (2003), Zabinsky (2003), Bornemann, Laurie, Wagon and Waldvogel (2004), Fritzson (2004), Neumaier (2004), Bartholomew-Biggs (2005), Hillier and Lieberman (2005), Lopez (2005), Nowak (2005), Pintér (2006a, 2007), Kampas and Pintér (2007), as well as many other topical works.
366 Advances in Mechanics & Mathematics, Vol. III., 2006, Gao and Sherali (Eds.)
Decision support (control, management, or optimization) models that incorporate an underlying nonlinear system description frequently have multiple local and global optima. The objective of global optimization (GO) is to find the “absolutely best” solution of nonlinear optimization models under such circumstances.
We shall consider the general continuous global optimization (CGO) model defined by the following ingredients:

x    decision vector, an element of the real Euclidean n-space R^n;
l, u    explicit, finite n-vector bounds of x that define a “box” in R^n;
f(x)    continuous objective function, f: R^n → R;
g(x)    m-vector of continuous constraint functions, g: R^n → R^m.
Applying this notation, the CGO model is stated as

min f(x)    (11.1)
x ∈ D := {x: l ≤ x ≤ u; g(x) ≤ 0}.    (11.2)
In (11.2) all vector inequalities are interpreted component-wise (l, x, u are n-component vectors and the zero denotes an m-component vector). The set of the additional constraints g could be empty, thereby leading to box-constrained GO models.
Let us also note that formally more general optimization models that also include = and ≥ constraint relations and/or explicit lower bounds on the constraint function values can be simply reduced to the model form (11.1)-(11.2).
The CGO model is very general: in fact, it evidently subsumes linear programming and convex nonlinear programming models, under corresponding additional specifications. Furthermore, CGO also subsumes (formally) the entire class of pure and mixed integer programming problems. To see this, notice that all bounded integer variables can be represented by a corresponding set of binary variables, and then every binary variable y ∈ {0,1} can be equivalently represented by its continuous extension y ∈ [0,1] and the non-convex constraint y(1 – y) ≤ 0. Of course, this reformulation approach may not be best or even suitable for “all” mixed integer optimization models: however, it certainly shows the generality of the CGO model framework.
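The binary-to-continuous reformulation described above can be checked directly; the following minimal sketch (the function name and tolerance are our own illustrative choices) verifies that the bound constraints combined with y(1 – y) ≤ 0 admit only the binary values.

```python
# With the bound 0 <= y <= 1, the nonconvex constraint y*(1 - y) <= 0
# is satisfied only at the binary values y = 0 and y = 1.

def feasible(y, tol=1e-9):
    """Feasibility test for 0 <= y <= 1 combined with y*(1 - y) <= 0."""
    return 0.0 <= y <= 1.0 and y * (1.0 - y) <= tol

# Interior points violate the nonconvex constraint; the endpoints satisfy it.
print(feasible(0.0), feasible(1.0), feasible(0.5))  # True True False
```

Note that the feasible set {0, 1} is disconnected, which is exactly why the reformulated model is a genuinely global (non-convex) optimization problem.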
Without going into details, note finally that models with multiple (partially conflicting) objectives are also often reduced to suitably parameterized collections of CGO (or simpler optimization) models: this remark also hints at the interchangeability of the objective f and one of the (active) constraints from g.
Let us observe next that if D is non-empty, then the above stated basic analytical assumptions guarantee that the optimal solution set X* in the CGO model is non-empty. This result directly follows by the classical theorem of Weierstrass that states the existence of the global minimizer point – or, in general, a set of such points – of a continuous function over a non-empty, bounded and closed (compact) set.
For reasons of numerical tractability, the following additional requirements are also often postulated:

- D is a full-dimensional subset (“body”) in R^n;
- the set of globally optimal solutions to (11.1)-(11.2) is at most countable;
- f and g (component-wise) are Lipschitz-continuous functions on [l, u].
Without going into technical details, notice that the first of these assumptions (the set D is the closure of its non-empty interior) makes algorithmic search easier – or at all possible – within D. The second assumption supports theoretical convergence results: note that in most well-posed practical GO problems the set of global optimizers consists either of a single point x* or at most of several points. The third assumption is a sufficient condition for estimating f* = f(x*) on the basis of a finite set of generated feasible search points. (Recall that the real-valued function h is Lipschitz-continuous on its domain of definition D ⊂ R^n, if |h(x1) – h(x2)| ≤ L‖x1 – x2‖ holds for all pairs x1 ∈ D, x2 ∈ D; here L = L(D, h) is a suitable Lipschitz-constant of h on the set D.) We emphasize that the exact knowledge of the smallest suitable Lipschitz-constant for each model function is not required, and in practice such information is typically unavailable. At the same time, all models defined by continuously differentiable functions f and g belong to the CGO or even to the Lipschitz model class.
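Although the smallest suitable Lipschitz-constant is typically unknown, sampled difference quotients always provide a lower bound on it. The following minimal sketch illustrates this (the sample size and the test function are our own illustrative choices):

```python
import math

def lipschitz_lower_bound(h, a, b, n=2000):
    """Lower bound on the smallest Lipschitz constant of h on [a, b],
    computed from difference quotients |h(x1) - h(x2)| / |x1 - x2|
    over consecutive points of a uniform sample."""
    xs = [a + (b - a) * i / n for i in range(n + 1)]
    return max(abs(h(xs[i + 1]) - h(xs[i])) / (xs[i + 1] - xs[i])
               for i in range(n))

# For h(x) = sin(x) on [0, 2*pi] the smallest Lipschitz constant is 1
# (the maximum of |cos(x)|); the sampled bound approaches it from below.
est = lipschitz_lower_bound(math.sin, 0.0, 2.0 * math.pi)
print(round(est, 3))
```

Any such sampled estimate is only a lower bound; a valid overestimate (as needed, e.g., by Lipschitz branch-and-bound methods) must come from structural knowledge of the model functions.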
The notes presented above imply that the CGO model-class covers a very broad range of optimization problems. As a consequence of this generality, it includes also many model instances that are difficult to solve numerically. For illustration, a merely one-dimensional, box-constrained GO model – based on the formulation (11.3) – is shown in Figure 11.1.

min x·cos(x)·sin(x²)    0 ≤ x ≤ 10.    (11.3)
Model complexity often increases dramatically – in fact, it can grow exponentially – as the model size expressed by the number of variables and constraints grows. To illustrate this point, Figure 11.2 shows the objective function in the model (11.4) that is simply generalized from (11.3) as

min x·cos(y)·sin(x²) + y·cos(x)·sin(y²)    0 ≤ x ≤ 10, 0 ≤ y ≤ 10.    (11.4)
Figure 11.1 The objective function in model (11.3).

Figure 11.2 The objective function in model (11.4).
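The multiextremality visible in Figure 11.1 can also be demonstrated numerically. The sketch below samples a one-dimensional objective of the kind used in model (11.3) – taken here, as an assumption, to be f(x) = x·cos(x)·sin(x²) – on a fine grid and counts the sampled local minima:

```python
import math

def f(x):
    # One-dimensional multiextremal test function of the kind in model (11.3);
    # the exact formula is an illustrative assumption.
    return x * math.cos(x) * math.sin(x * x)

# Scan [0, 10] on a fine grid and count interior local minima of the sampled
# values: points whose value is strictly below both grid neighbors.
n = 5000
xs = [10.0 * i / n for i in range(n + 1)]
fs = [f(x) for x in xs]
minima = sum(1 for i in range(1, n) if fs[i] < fs[i - 1] and fs[i] < fs[i + 1])
print(minima)  # well over a dozen local minima on a mere 10-unit interval
```

The oscillation frequency of the sin(x²) term grows with x, so the density of local optima increases toward the right end of the box; this is what makes a purely local method unreliable here.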
The presented two (low-dimensional, and only box-constrained) models already indicate that GO models – for instance, further extensions of model (11.3), perhaps with added complicated nonlinear constraints – could become truly difficult to handle numerically. One should also point out here that a direct analytical solution approach is viable only in very special cases, since in general (under further structural assumptions) one should investigate all Kuhn-Tucker points – minimizers, maximizers, and saddle points – of the CGO model. (Think of carrying out this analysis for the model depicted in Figure 11.2 – or for its 100-dimensional extension.)
Arguably, not all GO models are as difficult as indicated by Figures 11.1 and 11.2. At the same time, we typically do not have the possibility to directly inspect, visualize, or estimate the overall numerical difficulty of a complicated nonlinear (global) optimization model. A practically important case is when one needs to optimize the parameters of a model that has been developed by someone else. The model may be confidential, or just visibly complex; it could even be presented to the optimization engine as a compiled (object, library, or similar) software module. In such situations, direct model inspection and structure verification are not possible.
In other practically relevant cases, the evaluation of the optimization model functions may require the numerical solution of a system of embedded differential and/or algebraic equations, the evaluation of special functions or integrals, the execution of other deterministic computational procedures or stochastic simulation modules, and so on.
Traditional nonlinear optimization methods discussed in most topical textbooks – such as e.g. Bazaraa, Sherali, and Shetty (1993), Bertsekas (1999), Chong and Zak (2001), Hillier and Lieberman (2005) – search only for local optima. This generally followed approach is based on the tacit assumption that a “sufficiently good” initial solution (that is located in the region of attraction of the “true” global solution) is available. Figures 11.1 and 11.2 – and the practical situations mentioned above – suggest that this may not always be a realistic assumption.
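The “wrong valley” effect can be sketched with a crude derivative-free descent: started from a poor initial point it stalls at a local solution, while a multi-start version of the very same routine finds a far better one. (The objective and the descent routine below are our own illustrative assumptions, not any specific method from the works cited.)

```python
import math, random

def f(x):
    # Multiextremal objective in the spirit of model (11.3) (an assumption).
    return x * math.cos(x) * math.sin(x * x)

def local_descent(x, lo=0.0, hi=10.0, step=0.1):
    """Crude derivative-free local descent: move to a better neighbor
    while one exists; otherwise halve the step until it vanishes."""
    while step > 1e-8:
        moved = False
        for cand in (x - step, x + step):
            if lo <= cand <= hi and f(cand) < f(x):
                x, moved = cand, True
        if not moved:
            step *= 0.5
    return x

random.seed(1)
x_single = local_descent(1.0)                        # one "wrong valley" start
starts = [random.uniform(0.0, 10.0) for _ in range(50)]
x_multi = min((local_descent(s) for s in starts), key=f)
print(f(x_single) > f(x_multi))  # multistart finds a (much) better minimum
```

Even this naive multi-start scheme illustrates the point made above: a genuinely global search scope matters more than the sophistication of the local refinement step.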
Nonlinear models with less “dramatic” difficulty, but in (perhaps much) higher dimensions may also require global optimization. For instance, in advanced engineering design, optimization models with hundreds, thousands, or more variables and constraints are analyzed and need to be solved. In similar cases, even an approximately completed, but genuinely global scope search strategy may (and typically will) yield better results than the most sophisticated local search approach “started from the wrong valley”. This fact has motivated research to develop practical GO strategies.
11.2 Global Optimization Strategies
As of today, well over a hundred textbooks and an increasing number of websites are devoted (partly or completely) to global optimization. Added to this massive amount of information is a very substantial body of literature on combinatorial optimization (CO), the latter being – at least in theory – a “subset of GO”. The most important global optimization model types and (mostly exact, but also several prominent heuristic) solution approaches are discussed in detail by the Handbook of Global Optimization volumes, edited by Horst and Pardalos (1995), and by Pardalos and Romeijn (2002). We also refer to the topical website of Neumaier (2006), with numerous links to other useful information sources. The concise review of GO strategies presented here draws on these sources, as well as on the more detailed expositions in (Pintér, 2001a, 2002b).
Let us point out that some of the methods listed below are more often used in solving CGO models, while others have been mostly applied so far to handle CO models. Since CGO formally includes CO, it should not be surprising that approaches suitable for certain specific CO model-classes can (or could) be put to good use to solve CGO models.
Instead of a more detailed (but still not unambiguous) classification, here we simply classify GO methods into two primary categories: exact and heuristic. Exact methods possess theoretically established – deterministic or stochastic – global convergence properties. That is, if such a method could be carried out completely as an infinite iterative process, then the generated limit point(s) would belong to the set of global solutions X*. (For a single global solution x*, this would be the only limit point.) In the case of stochastic GO methods, the above statement is valid “only” with probability one.
In practice – after a finite number of algorithmic search steps – one can only expect a numerically validated or estimated (deterministic or stochastic) lower bound for the global optimum value z* = f(x*), as well as a best feasible or near-feasible global solution estimate.
We emphasize that to produce such estimates is not a trivial task, even for implementations of theoretically well-established algorithms. As a cautionary note, one can conjecture that there is no GO method – and never will be one – that can solve “all” CGO models with a certain number of variables to an arbitrarily given precision (in terms of the argument x*), within a given time frame, or within a preset model function evaluation count. To support this statement, please recall Figures 11.1 and 11.2: both of the objective functions displayed could be made arbitrarily more difficult, simply by changing the frequencies and amplitudes of the embedded trigonometric terms. We do not attempt to display such “monster” functions, since even the best visualization
software will soon become inadequate: think for instance of a function like 1000·x·cos(1000x)·sin(1000x²). For a more practically motivated example, one can also think of solving a difficult system of nonlinear equations: here – after a prefixed finite number of model function evaluations – we may not have an “acceptable” approximate numerical solution.
Heuristic methods do not possess similar convergence guarantees to those of exact methods. At the same time, they may provide good quality solutions in many difficult GO problems – assuming that the method in question suits well the specific model type (structure) solved. Here a different cautionary note is in order. Since such methods are often based on some generic meta-heuristics, overly optimistic claims regarding the “universal” efficiency of their implementations are often not supported by results in solving truly difficult – especially nonlinearly constrained – GO models. In addition, heuristic meta-strategies are often more difficult to adjust to new model types than some of the solver implementations based on exact algorithms. Exact stochastic methods based on direct sampling are a good example for the latter category, since these can be applied to “all” GO models directly, without a need for essential code adjustments and tuning. This is in contrast e.g. to most population-based search methods in which the actual steps of generating new trial solutions may depend significantly on the structure of the model-instance solved.
11.2.1 Exact Methods
“Naïve” approaches (grid search, pure random search): these are obviously
convergent, but in general “hopeless” as the problem size grows.
Branch-and-bound methods: these include interval arithmetic based strategies, as well as customized approaches for Lipschitz global optimization and for certain classes of difference of convex functions (D.C.) models. Such methods can also be applied to constraint satisfaction problems and to (general) pure and mixed integer programming.
Homotopy (path following, deformation, continuation, trajectory, and related other) methods: these are aimed at finding the set of global solutions in smooth GO models.
Implicit enumeration techniques: examples are vertex enumeration in concave minimization models, and generic dynamic programming in the context of combinatorial optimization.
Stochastically convergent sequential sampling methods: these include adaptive random searches, single- and multi-start methods, Bayesian search strategies, and their combinations.
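The simplest of the convergent methods listed above, pure random search, can be sketched in a few lines: its best-so-far record is monotone and approaches the global optimum with probability one, although – as the “naïve” label suggests – progress slows down markedly. (The objective below is an illustrative assumption in the spirit of model (11.3).)

```python
import math, random

def f(x):
    # Illustrative one-dimensional multiextremal objective (an assumption).
    return x * math.cos(x) * math.sin(x * x)

random.seed(0)
best = float("inf")
trace = []
for k in range(20000):
    x = random.uniform(0.0, 10.0)       # uniform sample over the box [0, 10]
    best = min(best, f(x))              # monotone best-so-far record
    if k + 1 in (100, 1000, 20000):
        trace.append(best)

# The record only improves with the sample size; the gain per additional
# sample shrinks, which is why pure random search is "hopeless" at scale.
print(trace)
```

In higher dimensions the number of samples needed to reach a comparable accuracy grows exponentially with n, which is the precise sense of the “hopeless” verdict above.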
For detailed expositions related to deterministic GO techniques – in addition to the Handbooks mentioned earlier – consult for example Horst and Tuy (1996), Kearfott (1996), Pintér (1996a), Tawarmalani and Sahinidis (2002), Neumaier (2004), and Nowak (2005). On stochastic GO strategies, consult e.g. Zhigljavsky (1991), Boender and Romeijn (1995), Pintér (1996a), and Zabinsky (2003).
11.2.2 Heuristic Methods
Ant colony optimization is based on individual search steps and “ant-like” interaction (communication) between search agents.

Basin-hopping strategies are based on a sequence of perturbed local searches, in an effort to find improving optima.

Convex underestimation attempts are based on a limited sampling effort that is used to estimate a postulated (approximate) convex objective function model.

Evolutionary search methods model the behavioral linkage among the adaptively changing set of candidate solutions (“parents” and their “children”, in a sequence of “generations”).

Genetic algorithms emulate specific genetic operations (selection, crossover, mutation) as these are observed in nature, similarly to evolutionary methods.

Greedy adaptive search strategies – a meta-heuristics often used in combinatorial optimization – construct “quick and promising” initial solutions which are then refined by a suitable local optimization procedure.

Memetic algorithms are inspired by analogies to cultural (as opposed to natural) evolution.

Neural networks are based on a model of the parallel architecture of the brain.

Response surface methods (directed sampling techniques) are often used in handling expensive “black box” optimization models by postulating and then gradually adapting a surrogate function model.

Scatter search is similar in its algorithmic structure to ant colony, genetic, and evolutionary searches, but without their “biological inspiration”.

Simulated annealing methods are based on the analogy of cooling crystal structures that will attain a (low-energy level, stable) physical equilibrium state.

Tabu search forbids or penalizes search moves which take the solution in the next few iterations to points in the solution space that have been previously visited. (Tabu search as outlined here has been typically applied in the context of combinatorial optimization.)
Tunneling strategies, filled function methods, and other similar methods attempt to sequentially find an improving sequence of local optima, by gradually modifying the objective function to escape from the solutions found.
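As a minimal illustration of one of the heuristics listed above, here is a bare-bones simulated annealing sketch: worse moves are accepted with probability exp(-delta/T) while the temperature T is cooled geometrically. The step distribution, cooling schedule, and objective are all illustrative assumptions, not a production-quality implementation.

```python
import math, random

def f(x):
    # Illustrative multiextremal objective (an assumption).
    return x * math.cos(x) * math.sin(x * x)

def simulated_annealing(lo=0.0, hi=10.0, iters=20000, t0=2.0, seed=3):
    """Bare-bones simulated annealing on [lo, hi]: Gaussian proposal steps,
    Metropolis acceptance, geometric cooling, best-so-far bookkeeping."""
    rng = random.Random(seed)
    x = rng.uniform(lo, hi)
    best_x, best_f = x, f(x)
    t = t0
    for _ in range(iters):
        y = min(hi, max(lo, x + rng.gauss(0.0, 0.3)))  # clipped proposal
        delta = f(y) - f(x)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x = y
            if f(x) < best_f:
                best_x, best_f = x, f(x)
        t *= 0.9995                                    # geometric cooling
    return best_x, best_f

best_x, best_f = simulated_annealing()
print(best_x, best_f)
```

Note the trade-off implicit in the schedule: cooling too fast freezes the search in a local valley, cooling too slowly wastes the evaluation budget, which echoes the tuning caveat raised earlier about heuristic meta-strategies.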
In addition to the earlier mentioned topical GO books, we refer here to several works that discuss mostly combinatorial (but also some continuous) global optimization models and heuristic strategies. For detailed discussions of theory and applications, consult e.g. Michalewicz (1996), Osman and Kelly (1996), Glover and Laguna (1997), Voss, Martello, Osman, and Roucairol (1999), Jacob (2001), Ferreira (2002), Rothlauf (2002), and Jones and Pevzner (2004). It is worth pointing out that Rudolph (1997) discusses the typically missing theoretical foundations for evolutionary algorithms, including stochastic convergence studies. (The underlying key convergence results for adaptive stochastic search methods are discussed also in Pintér (1996a).) The topical chapters in Pardalos and Resende (2002) also offer expositions related to both exact and heuristic GO approaches.
To conclude this very concise review, let us emphasize again that numerical GO can be tremendously difficult. Therefore it can be good practice to try several – perhaps even radically different – search approaches to tackle a given problem, whenever this is possible. To do this, one needs ready-to-use model development and optimization software tools.
11.3 Nonlinear Optimization in Modeling Environments
Advances in modeling techniques, solver engine implementations, and computer technology have led to a rapidly growing interest in modeling environments. For detailed discussions consult e.g. the topical Annals of Operations Research volumes edited by Maros and Mitra (1995), Maros, Mitra, and Sciomachen (1997), Vladimirou, Maros, and Mitra (2000), Coullard, Fourer, and Owen (2001), as well as the volumes edited by Voss and Woodruff (2002), and by Kallrath (2004). Additional useful information is provided by the websites of Fourer (2006), Mittelmann (2006), and Neumaier (2006), with numerous further links.
Prominent examples of widely used modeling systems that are focused on optimization include AIMMS (Paragon Decision Technology, 2006), AMPL (Fourer, Gay, and Kernighan, 1993), the Excel Premium Solver Platform (Frontline Systems, 2006), GAMS (Brooke, Kendrick, and Meeraus, 1988), ILOG (2004), the LINDO Solver Suite (LINDO Systems, 2006), MPL (Maximal Software, 2006), and TOMLAB (2006). (Please note that the literature references cited may not always reflect the current status of the modeling systems discussed here: for the latest information, contact the developers and/or visit their websites.)
There exists also a large variety of core compiler platform-based solver systems with more or less built-in model development functionality: in principle, such solvers can be linked to the modeling languages listed above. At the other end of the spectrum, there is also notable development in relation to integrated scientific and technical computing (ISTC) systems such as Maple (Maplesoft, 2006), Mathematica (Wolfram Research, 2006), Mathcad (Mathsoft, 2006), and MATLAB (The MathWorks, 2006).
From among the many hundreds of books discussing ISTC systems, we mention here as examples the works of Birkeland (1997), Bhatti (2000), Parlar (2000), Wright (2002), Wilson, Turcotte, and Halpern (2003), Moler (2004), Wolfram (2003), Trott (2004), and Lopez (2005). The ISTC systems offer a growing range of optimization-related features, either as built-in functions or as add-on products.
The modeling environments listed above are aimed at meeting the needs of (arguably, different types of) users. User categories include educational users (instructors and students); research scientists, engineers, consultants, and other practitioners (possibly, but not necessarily equipped with an in-depth optimization related background); optimization experts, software application developers, and other “power users”. (Observe that the user categories listed are not necessarily disjoint.)
The pros and cons of the individual software products – in terms of their hardware and software demands, ease of usage, model prototyping options, detailed code development and maintenance features, optimization model checking and processing tools, availability of solver options and other auxiliary tools, program execution speed, overall level of system integration, quality of related documentation and support, customization options, and communication with end users – make the corresponding modeling and solver approaches more or less attractive for the various user groups.
Given the almost overwhelming amount of topical information, in short, which are the currently available platform and solver engine choices for the GO researcher or practitioner? The decade-old software review (Pintér, 1996b; available also at the website of Mittelmann, 2006) listed a few dozen individual software products, including several websites with further software collections. Neumaier’s (2006) webpage currently lists more than 100 software development projects. Both of these websites include general purpose solvers, as well as application-specific products. (It is noted that quite a few of the links in these software listings are now obsolete, or have been changed.)
The user’s preference obviously depends on many factors. A key question is whether one prefers to use “free” (non-commercial, research, or even open source) code, or looks for a “ready-to-use” professionally supported commercial product. There is a significant body of freely available solvers, although the quality of solvers and their documentation arguably varies. (Of course, this remark could well apply also to commercial products.)
Instead of trying to impose personal judgment on any of the products mentioned in this work, the reader is encouraged to do some web browsing and experimentation, as his/her time and resources allow. Both Mittelmann (2006) and Neumaier (2006) provide more extensive information on non-commercial – as opposed to commercial – systems. Also motivated by this tendency, we shall mention here several software products that are part of commercial systems, typically as an add-on option, but in some cases as a built-in option. Needless to say, although this author (being also a professional software developer) may have opinions, the alphabetical listing presented below is strictly matter-of-fact. We list only currently available products that are explicitly targeted towards global optimization, as advertised by the websites of the listed companies. For this reason, nonlinear (local) solvers are – as a rule – not listed here; furthermore, we will not list modeling environments that currently have no global solver options.
AIMMS, by Paragon Decision Technology (www.aimms.com): the BARON and LGO global solver engines are offered with this modeling system as add-on options.
Excel Premium Solver Platform (PSP), by Frontline Systems (www.solver.com): the developers of the PSP offer a global pre-solver option to be used with several of their local optimization engines; these currently include LSGRG, LSSQP, and KNITRO. Frontline Systems also offers – as genuine global solvers – an Interval Global Solver, an Evolutionary Solver, LGO, and OptQuest.
GAMS, by the GAMS Development Corporation (www.gams.com): currently, BARON, DICOPT, LGO, MSNLP, OQNLP, and SBB are offered as solver options for global optimization.
LINDO, by LINDO Systems (www.lindo.com): both the LINGO modeling environment and What’sBest! (the company’s spreadsheet solver) have built-in global solver functionality.
Maple, by Maplesoft (www.maplesoft.com), offers the Global Optimization Toolbox as an add-on product.
Mathematica, by Wolfram Research (www.wolfram.com), has a built-in function (called NMinimize) for numerical global optimization. In addition, there are several third-party GO packages that can be directly linked to Mathematica: these are Global Optimization, MathOptimizer, and MathOptimizer Professional.
MPL, by Maximal Software (www.maximal-usa.com): the LGO solver engine is offered as an add-on.
TOMLAB, by TOMLAB Optimization AB (www.tomopt.com), is an optimization platform for solving MATLAB models. The TOMLAB global solvers include CGO, LGO, MINLP, and OQNLP. Note that MATLAB’s own Genetic Algorithm and Direct Search Toolboxes also have heuristic global solver capabilities.
To illustrate the functionality and usage of global optimization software, next we review the key features of the LGO solver engine, and then apply its Maple platform-specific implementation in several numerical examples.
11.4 The LGO Solver Suite and its Implementations
11.4.1 LGO: Key Features
The Lipschitz Global Optimizer (LGO) solver suite has been developed and used for more than a decade. The top-level design of LGO is based on the seamless combination of theoretically convergent global and efficient local search strategies. Currently, LGO offers the following solver options:

- Adaptive partition and search (branch-and-bound) based global search (BB)
- Adaptive global random search (single-start) (GARS)
- Adaptive global random search (multi-start) (MS)
- Constrained local search by the generalized reduced gradient (GRG) method (LS).
In a typical LGO optimization run, the user selects one of the global (BB, GARS, MS) solver options; this search phase is then automatically followed by the LS option. It is also possible to apply only the LS solver option, making use of an automatically set (default) or a user-supplied initial solution.
The global search methodology implemented in LGO is based on the detailed exposition in (Pintér, 1996a), with many added numerical features. The well-known GRG method is discussed in numerous articles and textbooks; consult for instance Edgar, Himmelblau, and Lasdon (2001). Therefore only a very brief overview of the LGO component algorithms is provided here.
BB, GARS, and MS are all based on globally convergent search methods. Specifically, in Lipschitz-continuous models – with suitable Lipschitz-constant (over)estimates for all model functions – BB theoretically generates a sequence of search points that will converge to the global solution point. If there is a countable set of such optimal points, then a convergent search point sequence will be generated in association with each of these.
In a GO model with a continuous structure (but without postulating access to Lipschitz information), both GARS and MS are globally convergent, with probability one (w.p. 1). In other words, the sequence of points that is associated with the generated sequence of global optimum estimates will converge to a point which belongs to X*, with probability one. (Again, if several such convergent point sequences are generated by the stochastic search procedure, then each of these sequences has a corresponding limit point in X*, w.p. 1.)
The LS method (GRG) is aimed at finding a locally optimal solution that satisfies the Karush-Kuhn-Tucker system of necessary local optimality conditions, assuming standard model smoothness and regularity conditions.
In all three global search modes the model functions are aggregated by an exact penalty (merit) function. By contrast, in the local search phase all model functions are considered and handled individually. The global search phases incorporate both deterministic and stochastic sampling procedures: the latter support the usage of statistical bound estimation methods, under basic continuity assumptions.
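The aggregation idea can be sketched with an l1-type exact penalty function; the precise merit function used inside LGO is not specified here, so the form below (objective plus a multiple of the total constraint violation) is an assumption for illustration only.

```python
# Illustrative exact (l1) penalty aggregation:
#   merit(x) = f(x) + rho * sum of violations of g_i(x) <= 0.

def merit(f, gs, rho):
    """Aggregate objective f and inequality constraints g(x) <= 0 into a
    single merit function with penalty multiplier rho."""
    def m(x):
        return f(x) + rho * sum(max(0.0, g(x)) for g in gs)
    return m

# Example: minimize f(x) = (x - 2)^2 subject to g(x) = x - 1 <= 0;
# the constrained optimum is x* = 1.
f = lambda x: (x - 2.0) ** 2
g = lambda x: x - 1.0
m = merit(f, [g], rho=10.0)

# At x* = 1 the penalty vanishes; infeasible points are penalized in
# proportion to their violation.
print(m(1.0), m(1.5))  # 1.0 5.25
```

The key property of an exact penalty (as opposed to a quadratic one) is that for a sufficiently large finite rho the unconstrained minimizer of the merit function coincides with the constrained optimum, so a single aggregated function can drive the global search phase.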
All LGO component algorithms are derivative-free. In the global search phase, BB, GARS, and MS use only direct sampling information based on generated points and corresponding model function values. In the LS phase, central differences are used to approximate function gradients (under a postulated locally smooth model structure). This direct search approach reflects our objective to handle also models defined by merely computable, continuous functions, including completely “black box” systems.
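The central-difference gradient approximation mentioned above can be sketched as follows; the step size and the sample function are illustrative choices, not LGO's actual internal settings.

```python
def central_diff_gradient(f, x, h=1e-5):
    """Approximate the gradient of f at point x (a list of coordinates)
    by central differences: (f(x + h*e_i) - f(x - h*e_i)) / (2h)."""
    grad = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2.0 * h))
    return grad

# Example: f(x, y) = x^2 + 3y has gradient (2x, 3).
f = lambda v: v[0] ** 2 + 3.0 * v[1]
print(central_diff_gradient(f, [2.0, 0.5]))  # approximately [4.0, 3.0]
```

Central differences cost two extra function evaluations per coordinate but are second-order accurate in h, which is why they are preferred over one-sided differences when only function values of a smooth (or locally smooth) model are available.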
In numerical practice – with finite runs, and user-defined or default option settings – the LGO global solver options generate a global solution estimate that is subsequently refined by the local search mode. If the LS mode is used without a preceding global search phase, then LGO serves as a general purpose local solver engine. The expected practical outcome of using LGO to solve a model (barring numerical problems which could impede any numerical method) is a global search based feasible solution that meets at least the local optimality conditions. Extensive numerical tests and a range of practical applications demonstrate that LGO can locate the global solution not only in the usual academic test problems, but also in more complicated, sizeable GO models: this point will be illustrated later on in Sections 5 and 6. (At the same time, keep in mind the caveats mentioned earlier regarding the performance of any global solver: nothing will “always” work satisfactorily, under resource limitations.)
11.4.2 LGO Implementations

The current platform-specific implementations include the following:
Advances in Mechanics & Mathematics, Vol. III., 2006, Gao and Sherali (Eds.)
- LGO with a text input/output interface, for C and Fortran compiler platforms
- LGO Integrated Development Environment with a Microsoft Windows style menu interface, for C and Fortran compiler platforms
- AIMMS /LGO solver engine
- GAMS /LGO solver engine
- Global Optimization Toolbox for Maple
- MathOptimizer Professional, with an LGO solver engine link to Mathematica
- MPL /LGO solver engine
- TOMLAB /LGO, for MATLAB users.
Technical descriptions of these software implementations – including detailed numerical tests and a range of applications – have appeared elsewhere. For implementation details and illustrative results, consult Pintér (1996a, 1997, 2001a,b, 2002a,b, 2003b, 2005), as well as Pintér and Kampas (2003), Pintér, Holmström, Göran and Edvall (2004), and Pintér, Linder, and Chin (2006).
The compiler-based LGO solver suite can be used in stand-alone mode, and also as a solver option in various modeling environments. In its core – text input/output based – implementation version, LGO reads an input text file that contains application-specific (model descriptor) information, as well as a few key solver options (global solver type, precision settings, resource and time limits). During the program run, LGO makes calls to an application-specific model function file that returns function values for the algorithmically chosen sequence of arguments. Upon completing the LGO run, automatically generated summary and detailed report files are available. As can be expected, this LGO version has the lowest hardware demands; it also runs fastest, and it can be directly embedded into various decision support systems, including proprietary user applications. The same core LGO system is also available in directly callable form, without reading and writing text files: this version is frequently used as a built-in solver module in other (general purpose or customized modeling) systems.
LGO can also be equipped with a readily available (implemented) Microsoft Windows style menu interface option. This enhanced version is referred to as the LGO Integrated Development Environment (IDE). The LGO IDE supports model development, compilation, linking, execution, and the inspection of results, together with built-in basic help facilities.
In the two LGO implementations mentioned above, models can be connected to LGO using one of several programming languages available on personal computers and workstations. Currently supported platforms include, in principle, “all” professional Fortran 77/90/95 and C/C++ compilers. Examples of supported compilers include Compaq, Intel, Lahey, and Salford Fortran, as well as g77 and g95, and Borland and Microsoft C/C++. Other customized versions (to use with other compilers or software applications) can also be made available upon request.
In the optimization modeling language (AIMMS, GAMS, and MPL) or ISTC (Maple, Mathematica, and TOMLAB) environments, the core LGO solver engine is seamlessly linked to the corresponding modeling platform, as a dynamically callable or shared library, or as an executable program. The key advantage of using LGO within a modeling or ISTC environment is the combination of modeling system specific features – model prototyping and detailed development, model consistency checking, integrated documentation, visualization, and other platform-specific features – with a numerical performance comparable to that of the stand-alone LGO solver suite.
For peer reviews of several of the listed implementations, the reader is referred to Benson and Sun (2000) on the core LGO solver suite, Cogan (2003) on MathOptimizer Professional, and Castillo (2005), Henrion (2006) and Wass (2006) on the Global Optimization Toolbox for Maple. Let us also mention here that LGO serves to illustrate global optimization software – in connection with a demo version of the MPL modeling system – in the prominent O.R. textbook by Hillier and Lieberman (2005).
11.5 Illustrative Examples

In order to present some small-scale, yet non-trivial numerical examples, in this section we illustrate the functionality of the LGO software as it is implemented in the Global Optimization Toolbox (GOT) for Maple.
Maple (Maplesoft, 2006) enables the development of fully interactive documents called worksheets. Maple worksheets can incorporate technical model description, combined with computing, programming, and visualization features. Maple includes several thousand built-in (directly callable) functions that support a broad range of the modeling and computational needs of scientists and engineers. Maple also offers a detailed on-line help and documentation system with ready-to-use examples, topical tutorials, manuals and web links. Further system components are a built-in mathematical dictionary, debugging tools, and automated (ANSI C, Fortran 77, Java, Visual Basic and MATLAB) code generation. Document production features include HTML, MathML, TeX, and RTF converters. These capabilities accelerate and expand the scope of the optimization model development and solution process. Maple – similarly to other modeling environments – is portable across all major hardware platforms and operating systems (including Windows, Macintosh, Linux, and Unix).
Without going into further details on Maple itself, we refer to the web site www.maplesoft.com that offers in-depth topical information, including product demos and downloadable technical materials.
The core of the Global Optimization Toolbox (GOT) for Maple is a customized implementation of the LGO solver suite (Maplesoft, 2004) that – as an add-on product, upon installation – can be fully integrated with Maple. The advantage of this approach is that, in principle, the GOT can readily handle “all” (continuous) model functions that can be defined in Maple, including new (programmable) functions.
We do not wish to go into programming details here, and assume that the key ideas shown by the illustrative Maple code snippets will be easily understandable to all readers with some programming experience. Maple commands will be typeset in Courier bold font, following the so-called classic Maple input format. The input commands are typically followed by Maple output lines, unless the latter are suppressed by using the symbol “:” instead of “;” at the end of an input line.
In the numerical experiments described below, an AMD Athlon 64 (3200+, 2 GHz) processor based desktop computer has been used that runs under Windows XP Professional (Version 2002, Service Pack 2).
11.5.1 Getting Started with the Global Optimization Toolbox

To illustrate the basic usage of the Toolbox, let us revisit model (11.3). The Maple command

> with(GlobalOptimization);

makes possible the direct invocation of the subsequently issued, GOT related, commands. Then the next Maple command numerically solves model (11.3): the response line below the command displays the approximate optimum value and the corresponding solution argument.
> GlobalSolve(cos(x)*sin(x^2-x), x=1..10);

[-.990613849411236758, [x = 9.28788130421885682]]

The detailed runtime information indicates that the total number of function evaluations is 1262; the associated runtime is a small fraction of a second.
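Readers without Maple can reproduce this result, in spirit, with a two-phase stand-in: a dense global sampling phase followed by local refinement. The Python sketch below is our own code, not the GOT; the sample count and bracket width are ad hoc choices.

```python
import math

def f(x):
    return math.cos(x) * math.sin(x * x - x)

# Global phase: dense sampling of the box [1, 10].
xs = [1.0 + 9.0 * i / 2000 for i in range(2001)]
x0 = min(xs, key=f)

# Local phase: golden-section refinement in a bracket around the
# best sampled point (the bracket width 0.05 is an ad hoc choice).
a, b = max(1.0, x0 - 0.05), min(10.0, x0 + 0.05)
invphi = (math.sqrt(5.0) - 1.0) / 2.0
for _ in range(60):
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    if f(c) < f(d):
        b = d
    else:
        a = c
x_star = 0.5 * (a + b)
# x_star ~ 9.28788 with f(x_star) ~ -0.990614, matching GlobalSolve.
```

The sampling density matters: with too few global samples, the refinement phase would polish one of the nearby inferior local minima instead.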
Recall here Figure 11.1, which – after careful inspection – indicates that this is indeed the (approximate) global solution. There are several local solutions that are fairly close to the global one: two of these solutions are

[-.979663995439954860, [x = 3.34051270473064265]], and
[-.969554320487729716, [x = 6.52971402762202757]].
Similarly, the next statement returns an approximate global solution in the visibly non-trivial model (11.4):

> GlobalSolve(cos(x)*sin(y^2-x)+cos(y)*sin(x^2-y),
  x=1..10, y=1..10);

[-1.95734692335253380,
[x = 3.27384194476651214, y = 6.02334184076140478]].
The result shown above has been obtained using GOT default settings: the total number of function evaluations in this case is 2587, and the runtime is still practically zero. Recall now also Figure 11.2 and the discussion related to the possible numerical difficulty of GO models. The solution found by the GOT is obviously global search based, but without a rigorous deterministic guarantee of its quality. Let us emphasize that obtaining such guarantees – e.g., by using interval arithmetic based solution techniques – can be a very resource-demanding exercise, especially in more complex and/or higher-dimensional models, and that it may not be possible at all in “black box” situations.
A straightforward way to attempt finding a better quality solution is to increase the allocated global search effort. Theoretically – using an “infinite” global search effort – this will lead to an arbitrarily close numerical estimate of the global optimum value. In the next statement we set the global search effort to 1000000 steps (this limit is applied only approximately, due to the possible activation of other stopping criteria):

> GlobalSolve(cos(x)*sin(y^2-x)+cos(y)*sin(x^2-y),
  x=1..10, y=1..10, evaluationlimit=1000000,
  noimprovementlimit=1000000);

[-1.98122769882222882,
[x = 9.28788128193757068, y = 9.28788127177065270]].
As can be seen, we have found an improved solution, at the expense of a significantly increased global search effort. (Now the total number of function evaluations is 942439, and the runtime is approximately 5 seconds.) In general, more search effort can always be added, in order to verify or perhaps improve the incumbent numerical solution.
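The effect of a larger evaluation budget can again be mimicked outside Maple. The Python sketch below is ours, not the GOT: it spends roughly 10^6 grid samples on model (11.4) and then polishes the best sampled point with a simple pattern (compass) search, recovering the improved optimum.

```python
import math

def f(x, y):
    return (math.cos(x) * math.sin(y * y - x)
            + math.cos(y) * math.sin(x * x - y))

# Global phase: about 10^6 samples on a uniform grid over [1,10] x [1,10].
pts = [1.0 + 9.0 * i / 999 for i in range(1000)]
x0, y0 = min(((x, y) for x in pts for y in pts), key=lambda p: f(*p))

# Local phase: compass (pattern) search from the best sampled point.
step = 9.0 / 999
while step > 1e-9:
    cand = min(((x0 + dx, y0 + dy)
                for dx in (-step, 0.0, step)
                for dy in (-step, 0.0, step)), key=lambda p: f(*p))
    if f(*cand) < f(x0, y0):
        x0, y0 = cand
    else:
        step *= 0.5
# (x0, y0) is close to (9.28788, 9.28788) with f close to -1.98123,
# in line with the GlobalSolve run above.
```

With a coarser grid (say, a few hundred points per axis) the scan can land in the basin of the -1.9573 local optimum instead, which illustrates why the default effort of the GOT run found only that solution.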
Comparing now the solution obtained to that of model (11.3) – and observing the obvious formal connection between the two models – one can deduce that we have now found a close numerical approximation of the true global solution. Simple modeling insight also tells us that the global solution in model (11.4) is bounded from below by -2. Hence, even without Figures 11.1 and 11.2 we would know that the solution estimates produced above must be fairly close to the best possible solution.
The presented examples illustrate several important points:

- Global optimization models can be truly difficult to solve numerically, even in (very) low dimensions.
- It is not always possible to “guess” the level of difficulty. One cannot always (or at all) generate model visualizations similar to Figures 11.1 and 11.2, even in chosen variable subspaces, since doing so could be too expensive numerically – even if we have access to suitable graphics facilities.
- Insight and model-specific expertise can help significantly, and these should be used whenever possible.
- There is no solver that will handle all possible instances from the general CGO model class within an arbitrary prefixed amount of search effort. In practice, one needs to select and recommend default solver parameters and options that “work well in most cases, based on an acceptable amount of effort”. Considering the fact that practically motivated modeling studies are often supported by noisy and/or scarce data, this pragmatic approach is often justified.
- The default solver settings should return a global search based, high-quality feasible solution (arguably, models (11.3) and (11.4) can be considered difficult instances for their low dimensionality). Furthermore, it should be easy to modify the default solver settings and to repeat runs, if this is deemed necessary.
The GOT software implementation automatically sets default parameter values for its operations, partly based on the model to solve. These settings are suitable in most cases, but the user can always assign (i.e., override) them. Specifically, one can select the following options and parameter values:
- Minimization or maximization model
- Search method (BB+LS, GARS+LS, MS+LS, or stand-alone LS)
- Initial solution vector setting (used by the LS operational mode, if available)
- Constraint penalty multiplier: this is used by BB, GARS and MS, in an aggregated merit function (recall that the LS method handles all model functions individually)
- Maximal number of merit function evaluations in the selected global search mode
- Maximal number of merit function evaluations in the global search mode, without merit function value improvement
- Acceptable target value for the merit function, to trigger an “operational switch” from global to local search mode
- Feasibility tolerance used in LS mode
- Karush-Kuhn-Tucker local optimality tolerance in LS mode
- Solution (computation) time limit.
For further information regarding the GOT, consult the product web page (Maplesoft, 2004), the article (Pintér, Linder, and Chin, 2006) and the related Maple help system entries. The product page also includes links to detailed interactive demos, as well as to downloadable application examples.
11.5.2 Handling (General) Constrained Global Optimization Models

Systems of nonlinear equations play a fundamental role in quantitative studies, since equations are often used to characterize the equilibrium states and optimality conditions of physical, chemical, biological, or other systems.
In the next example we formulate and solve a system of equations. At the same time, we also illustrate the use of a general model development style that is easy to follow (in Maple, but – mutatis mutandis – also in other modeling systems). Consider the equations

> eq1 := exp(x-y)+sin(2*x)-cos(y+z)=0:          (11.5)
  eq2 := 4*x-exp(z-y)+5*sin(6*x-y)+3*cos(3*x*y)=0:
  eq3 := x*y*z-10=0:

To solve this system of equations, let us define the optimization model components as shown below (notice the dummy objective function).

> constraints := {eq1,eq2,eq3}:
> bounds := x=-2..2, y=-1..3, z=2..4:
> objective := 0:
Then the next Maple command is aimed at generating a numerical solution to (11.5), if such a solution exists.

> solution := GlobalSolve(objective, constraints, bounds);

solution := [0.,
[x=1.32345978290539557, y=2.78220763578413344, z=2.71581206431678090]].
This solution satisfies all three equations with less than 10^(-9) error, as verified by the next statement:

> eval(constraints, solution[2]);

{-0.1*10^(-9) = 0, -0.6*10^(-9) = 0, 0 = 0}
Without going into details, let us note that multiple solutions to (11.5) can be found, e.g., by iteratively adding constraints that exclude the solution(s) found previously. Furthermore, if a system of equations has no solutions, then using the GOT we could obtain an approximate solution that has globally minimal error over the box search region, in a given norm: consult Pintér (1996a) for details.
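The “dummy objective” device carries over to any environment: a root of the system is a global minimizer of the total constraint violation over the box. The Python sketch below (ours, not the GOT) first checks the quoted GlobalSolve result against the residuals of (11.5), then shows a violation-minimizing local descent recovering the root from a perturbed start.

```python
import math

# Residual forms of eq1-eq3 from model (11.5); each vanishes at a root.
def residuals(x, y, z):
    r1 = math.exp(x - y) + math.sin(2 * x) - math.cos(y + z)
    r2 = (4 * x - math.exp(z - y) + 5 * math.sin(6 * x - y)
          + 3 * math.cos(3 * x * y))
    r3 = x * y * z - 10.0
    return r1, r2, r3

def violation(p):
    # Total squared constraint violation: zero exactly at a root.
    return sum(r * r for r in residuals(*p))

# The GlobalSolve result quoted above; its residuals are all near zero.
sol = (1.32345978290539557, 2.78220763578413344, 2.71581206431678090)

# Minimizing `violation` is the dummy-objective device in disguise;
# a plain compass search recovers the root from a nearby perturbed
# start (a global search phase would supply such starting points).
p = [1.30, 2.80, 2.70]
step = 0.02
while step > 1e-9:
    cand = min((tuple(p[j] + (s if j == i else 0.0) for j in range(3))
                for i in range(3) for s in (-step, step)), key=violation)
    if violation(cand) < violation(p):
        p = list(cand)
    else:
        step *= 0.5
# p is now close to sol, with near-zero residuals.
```

The local descent alone would not suffice from an arbitrary start, given the oscillatory term in eq2; this is precisely where the global scope search of the GOT is needed.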
Next, we will illustrate the usage of the GOT in interactive mode. The statement shown below directly leads to the Global Optimization Assistant dialog; see Figure 11.3.

> solution := Interactive(objective, constraints, bounds);

Using the dialog, one can also directly edit (modify) the model formulation if necessary.
The figure shows that the default (MS+LS) GOT solver mode returns the solution presented above. Let us point out here that none of the local solver options indicated in the Global Optimization Assistant (see the radio buttons under Solver) is able to find a feasible solution to this model. This finding is not unexpected: rather, it shows the need for a global scope search approach to handle this model and many other similar problems.
Following the numerical solution step, one can press the Plot button (shown in the lower right corner in Figure 11.3). This will invoke the Global Optimization Plotter dialog shown in Figure 11.4. In the given subspace (x, y) – which can be selected by the GOT user – the surface plot shows the identically zero objective function. Furthermore, on its surface level one can see the constraint curves and the location of the global solution found: in the original color figure this is a light green dot close to the boundary, as indicated by the numerical values found above. Notice also the option to select alternative subspaces (defined by variable pairs) for visualization.
Figure 11.3 Global Optimization Assistant dialog for model (11.5).

Figure 11.4 Global Optimization Plotter dialog for model (11.5).
The figures can be flexibly manipulated (rotated), thereby offering the possibility of detailed inspection of model functions. Such inspection can help users to increase their understanding of the model.
11.5.3 Optimization Models with Embedded Computable Functions

It was pointed out earlier (in Section 11.1) that in advanced decision models some model functions may require the execution of various computational procedures. One of the advantages of using an ISTC system such as Maple is that the functionality needed to perform these operations is often readily available (or directly programmable). To illustrate this point, in the next example we will find the globally optimized argument value of an objective function defined by Bessel functions. As is known, the function BesselJ(v, x) satisfies Bessel's differential equation

x^2 y'' + x y' + (x^2 - v^2) y = 0.          (11.6)
In (11.6), x is the function argument, and the real value v is the order (or index parameter) of the function. The evaluation of BesselJ requires the solution function of the differential equation (11.6), for the given value of v, and then the calculation of the corresponding function value for argument x. For example, BesselJ(0,2) ~ 0.2238907791; consult Maple's help system for further details. Consider now the optimization model defined and solved below:
> objective := BesselJ(2,x)*BesselJ(3,y)-          (11.7)
  BesselJ(5,y)*BesselJ(7,x):
> bounds := x=-10..20, y=-15..10:
> solution := GlobalSolve(objective, bounds);

solution := [-.211783151218360000,
[x = -3.06210564091438720, y = -4.20467390983796196]].
The corresponding external solver runtime is about 4 seconds. The next figure visualizes the box-constrained optimization model (11.7). Here a simple inspection and rotation of Figure 11.5 helps to verify that the global solution is indeed found. Of course, this would not be directly possible in general (higher-dimensional or more complicated) models: recall the related earlier discussion and recommendations from Section 11.5.1. However, we could also study other subspaces in higher-dimensional models.
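This model, too, can be checked outside Maple with modest effort. The Python sketch below (ours, not the GOT) evaluates BesselJ through its standard integral representation and applies the same grid-plus-polish scheme as before; the grid density and step sizes are ad hoc choices.

```python
import math

def bessel_j(n, x, m=60):
    # J_n(x) = (1/pi) * integral_0^pi cos(n*t - x*sin(t)) dt,
    # computed with the trapezoidal rule (very accurate here, since
    # the aliasing error involves Bessel functions of order ~2*m).
    h = math.pi / m
    s = 0.5 * (1.0 + math.cos(n * math.pi))
    for k in range(1, m):
        t = k * h
        s += math.cos(n * t - x * math.sin(t))
    return s * h / math.pi

def objective(x, y):
    return (bessel_j(2, x) * bessel_j(3, y)
            - bessel_j(5, y) * bessel_j(7, x))

# Global phase: grid scan over the box [-10, 20] x [-15, 10].
xs = [-10.0 + 30.0 * i / 100 for i in range(101)]
ys = [-15.0 + 25.0 * i / 100 for i in range(101)]
x0, y0 = min(((x, y) for x in xs for y in ys), key=lambda p: objective(*p))

# Local phase: compass search polish of the best grid point.
step = 0.2
while step > 1e-7:
    cand = min(((x0 + dx, y0 + dy)
                for dx in (-step, 0.0, step)
                for dy in (-step, 0.0, step)), key=lambda p: objective(*p))
    if objective(*cand) < objective(x0, y0):
        x0, y0 = cand
    else:
        step *= 0.5
# (x0, y0) is close to (-3.06211, -4.20467), objective ~ -0.21178.
```

Note that the nearly symmetric local optimum near (+3.06, -4.20) is only about 0.001 shallower, so a coarse global phase can easily be misled; this echoes the general warnings of Section 11.5.1.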
Figure 11.5 Optimization model objective defined by Bessel functions.
11.6 Global Optimization: Applications and Perspectives

In recent decades, global optimization has gradually become an established discipline that is now taught worldwide at leading academic institutions. GO methods and software are also increasingly applied in various research contexts, including industrial and consulting practice. The currently available professional software implementations are routinely used to solve models with tens, hundreds, and sometimes even thousands of variables and constraints.
Recall again the caveats mentioned earlier regarding the potential numerical difficulty of model instances: if one is interested in a guaranteed high-quality solution, then the necessary runtimes could become hours (or days, or more) – even on today's high-performance computers. One can expect further speed-up due to both algorithmic improvements and progress in hardware/software technology, but the theoretically exponential “curse of dimensionality” associated with the subject of GO will always be there.
In the most general terms, global optimization technology is well suited to analyze and solve complex, sophisticated models in advanced (acoustic, aerospace, chemical, control, electrical, environmental, and other) engineering, biotechnology, econometrics and financial modeling, medical and pharmaceutical studies, process industries, telecommunications and other areas.
For detailed discussions of examples and case studies consult, e.g., Grossmann (1996), Pardalos, Shalloway and Xue (1996), Pintér (1996a), Corliss and Kearfott (1999), Papalambros and Wilde (2000), Edgar, Himmelblau and Lasdon (2001), Gao, Ogden and Stavroulakis (2001), Schittkowski (2002), Tawarmalani and Sahinidis (2002), Zabinsky (2003), Neumaier (2006), Nowak (2005), Pintér (2006a), as well as other topical works.
Recent numerical studies and applications in which LGO implementations have been used are described, e.g., in the following works:

- cancer therapy planning (Tervo, Kolmonen, Lyyra-Laitinen, Pintér and Lahtinen, 2003)
- combined finite element modeling and optimization in sonar equipment design (Pintér and Purcell, 2003)
- laser equipment design (Isenor, Pintér, and Cada, 2003)
- model calibration (Pintér, 2003a, 2006b)
- numerical performance analysis on a collection of test and “real-world” models (Pintér, 2003b, 2006b)
- physical object configuration analysis and design (Kampas and Pintér, 2006)
- potential energy models in computational chemistry (Pintér, 2000, 2001b), (Stortelder, de Swart and Pintér, 2001)
- circle packing models and their industrial applications (Kampas and Pintér, 2004), (Pintér and Kampas, 2005a,b), (Castillo, Kampas and Pintér, 2005).
The forthcoming volumes (Kampas and Pintér, 2007; Pintér, 2007) also discuss a large variety of GO applications, with extensive further references.
11.7 Conclusions

Global optimization is a subject of growing practical interest, as indicated by recent software implementations and by an increasing range of applications. In this work we have discussed some of these developments, with an emphasis on practical aspects.
In spite of remarkable progress, global optimization remains a field of extreme numerical challenges, not only when considering “all possible” GO models, but also in practical attempts to handle complex, sizeable problems within an acceptable timeframe. The present discussion advocates a practical solution approach that combines theoretically rigorous global search strategies with efficient local search methodology, in integrated, flexible solver suites. The illustrative examples presented here – as well as the applications referred to above – indicate the practical viability of such an approach.
The practice of global optimization is expected to grow dynamically. We welcome feedback regarding current and future development directions, new test challenges, and new application areas.
Acknowledgements

First of all, I wish to thank David Gao and Hanif Sherali for their kind invitation to the CDGO 2005 conference (Blacksburg, VA, August 2005), as well as for the invitation to contribute to the present volume dedicated to Gilbert Strang on the occasion of his 70th birthday. Thanks are due to an anonymous reviewer for his/her careful reading of the manuscript, and for the suggested corrections and modifications.
I also wish to thank my past and present developer partners and colleagues – including AMPL LLC, Frontline Systems, the GAMS Development Corporation, Frank Kampas, Lahey Computer Systems, LINDO Systems, Maplesoft, Mathsoft, Maximal Software, Paragon Decision Technology, The Mathworks, TOMLAB AB, and Wolfram Research – for cooperation, quality software and related documentation, and technical support.
In addition to professional contributions and in-kind support offered by developer partners, the research work summarized and reviewed in this paper has received partial financial support in recent years from the following organizations: DRDC Atlantic Region, Canada (Contract W7707-01-0746), the Dutch Technology Foundation (STW Grant CWI55.3638), the Hungarian Scientific Research Fund (OTKA Grant T 034350), Maplesoft, the National Research Council of Canada (NRC IRAP Project 362093), the University of Kuopio, and Wolfram Research.
Special thanks are due to our growing clientele, and to all reviewers and testers of our various software implementations, for valuable feedback, comments and suggestions.
References

Aris, R. (1999) Mathematical Modeling: A Chemical Engineer's Perspective. Academic Press, San Diego, CA.

Bartholomew-Biggs, M. (2005) Nonlinear Optimization with Financial Applications. Kluwer Academic Publishers, Dordrecht.

Bazaraa, M.S., Sherali, H.D. and Shetty, C.M. (1993) Nonlinear Programming: Theory and Algorithms. Wiley, New York.
Bhatti, M.A. (2000) Practical Optimization Methods with Mathematica Applications. Springer-Verlag, New York.

Benson, H.P., and Sun, E. (2000) LGO – versatile tool for global optimization. OR/MS Today 27 (5), 52-55. See www.lionhrtpub.com/orms/orms-10-00/swr.html.

Bertsekas, D.P. (1999) Nonlinear Programming. (2nd Edition) Athena Scientific, Cambridge, MA.

Birkeland, B. (1997) Mathematics with Mathcad. Studentlitteratur / Chartwell Bratt, Lund.

Boender, C.G.E. and Romeijn, H.E. (1995) Stochastic methods. In: Horst and Pardalos, Eds. Handbook of Global Optimization. Volume 1, pp. 829-869. Kluwer Academic Publishers, Dordrecht.

Bornemann, F., Laurie, D., Wagon, S. and Waldvogel, J. (2004) The SIAM 100-Digit Challenge. A Study in High-Accuracy Numerical Computing. SIAM, Philadelphia, PA.

Bracken, J. and McCormick, G.P. (1968) Selected Applications of Nonlinear Programming. Wiley, New York.

Brooke, A., Kendrick, D. and Meeraus, A. (1988) GAMS: A User's Guide. The Scientific Press, Redwood City, CA. (Revised versions are available from the GAMS Corporation.) See also www.gams.com.
Casti, J.L. (1990) Searching for Certainty. Morrow & Co., New York.

Castillo, I. (2005) Maple and the Global Optimization Toolbox. OR/MS Today 32 (6), 56-60. See also www.lionhrtpub.com/orms/orms-12-05/frswr.html.

Castillo, I., Kampas, F.J., and Pintér, J.D. (2005) Solving circle packing problems by global optimization: numerical results and industrial applications. (Submitted for publication.)

Cogan, B. (2003) How to get the best out of optimization software. Scientific Computing World 71 (2003), 67-68. See also www.scientific-computing.com/scwjulaug03review_optimisation.html.

Corliss, G.F. and Kearfott, R.B. (1999) Rigorous global search: industrial applications. In: Csendes, T., Ed. Developments in Reliable Computing, pp. 1-16. Kluwer Academic Publishers, Dordrecht.
Coullard, C., Fourer, R. and Owen, J.H., Eds. (2001) Annals of Operations Research Volume 104: Special Issue on Modeling Languages and Systems. Kluwer Academic Publishers, Dordrecht.

Chong, E.K.P. and Zak, S.H. (2001) An Introduction to Optimization. (2nd Edition) Wiley, New York.

Diwekar, U. (2003) Introduction to Applied Optimization. Kluwer Academic Publishers, Dordrecht.

Edgar, T.F., Himmelblau, D.M., and Lasdon, L.S. (2001) Optimization of Chemical Processes. (2nd Edition) McGraw-Hill, New York.

Ferreira, C. (2002) Gene Expression Programming. Angra do Heroísmo, Portugal.

Floudas, C.A., Pardalos, P.M., Adjiman, C., Esposito, W.R., Gumus, Z.H., Harding, S.T., Klepeis, J.L., Meyer, C.A., and Schweiger, C.A. (1999) Handbook of Test Problems in Local and Global Optimization. Kluwer Academic Publishers, Dordrecht.
Fourer, R. (2006) Nonlinear Programming Frequently Asked Questions. Optimization Technology Center of Northwestern University and Argonne National Laboratory. See www-unix.mcs.anl.gov/otc/Guide/faq/nonlinear-programming-faq.html.
Fourer, R., Gay, D.M. and Kernighan, B.W. (1993) AMPL – A Modeling Language for Mathematical Programming. The Scientific Press, Redwood City, CA. (Reprinted by Boyd and Fraser, Danvers, MA, 1996.) See also www.ampl.com.

Frontline Systems (2006) Premium Solver Platform – Solver Engines. User Guide. Frontline Systems, Inc., Incline Village, NV. See www.solver.com.

Fritzson, P. (2004) Principles of Object-Oriented Modeling and Simulation with Modelica 2.1. IEEE Press, Wiley-Interscience, Piscataway, NJ.
Gao, D.Y., Ogden, R.W. and Stavroulakis, G.E., Eds. (2001) Nonsmooth/Nonconvex Mechanics: Modeling, Analysis and Numerical Methods. Kluwer Academic Publishers, Dordrecht.

Gershenfeld, N. (1999) The Nature of Mathematical Modeling. Cambridge University Press, Cambridge.

Glover, F. and Laguna, M. (1997) Tabu Search. Kluwer Academic Publishers, Dordrecht.

Grossmann, I.E., Ed. (1996) Global Optimization in Engineering Design. Kluwer Academic Publishers, Dordrecht.

Hansen, P.E. and Jørgensen, S.E., Eds. (1991) Introduction to Environmental Management. Elsevier, Amsterdam.

Henrion, D. (2006) A review of the Global Optimization Toolbox for Maple. IEEE Control Systems Magazine 26 (October 2006 issue), 106-110.
Hillier, F.J. and Lieberman, G.J. (2005) Introduction to Operations Research. (8th Edition) McGraw-Hill, New York.

Horst, R. and Tuy, H. (1996) Global Optimization – Deterministic Approaches. (3rd Edition) Springer, Berlin.

Horst, R. and Pardalos, P.M., Eds. (1995) Handbook of Global Optimization. Volume 1. Kluwer Academic Publishers, Dordrecht.

ILOG (2004) ILOG OPL Studio and Solver Suite. www.ilog.com.
Isenor, G., Pintér, J.D. and Cada, M. (2003) A global optimization approach to laser design. Optimization and Engineering 4, 177-196.

Jacob, C. (2001) Illustrating Evolutionary Computation with Mathematica. Morgan Kaufmann Publishers, San Francisco.

Jones, N.C. and Pevzner, P.A. (2004) An Introduction to Bioinformatics Algorithms. The MIT Press, Cambridge, MA.

Kallrath, J., Ed. (2004) Modeling Languages in Mathematical Optimization. Kluwer Academic Publishers, Dordrecht.

Kampas, F.J. and Pintér, J.D. (2004) Generalized circle packings: model formulations and numerical results. Proceedings of the International Mathematica Symposium (Banff, AB, Canada, August 2004).
Kampas, F.J. and Pintér, J.D. (2006) Configuration analysis and design by using optimization tools in Mathematica. The Mathematica Journal 10 (1), 128-154.

Kampas, F.J. and Pintér, J.D. (2007) Advanced Optimization: Scientific, Engineering, and Economic Applications with Mathematica Examples. Elsevier, Amsterdam. (To appear.)

Kearfott, R.B. (1996) Rigorous Global Search: Continuous Problems. Kluwer Academic Publishers, Dordrecht.
Lahey Computer Systems (
200
6
)
Fortran 9
5
User's Guide
. Lahey Computer Systems, Inc., Incline
Village, NV.
www.lahey.com
.
LINDO Systems (1996)
Solver Suite.
LINDO Systems, Inc., Chicago, IL.
See also
www.lindo.com
.
Lopez
, R.J.
(2005)
Advanced Enginee
ring Mathematics
with Maple
.
(
Electronic
book
edition
.
)
Maple
soft
, Inc., Waterloo, ON.
See
www.maplesoft.com/products/ebooks/AEM/
.
Mandelbrot, B.B. (1983)
The Fractal Geometry of Nature.
Freeman & Co., New York.
Advances in Mechanics & Mathematics, Vol. III., 2006, Gao and Sherali (Eds.)
Maplesoft (2004) Global Optimization Toolbox for Maple. Maplesoft, Inc., Waterloo, ON. See www.maplesoft.com/products/toolboxes/globaloptimization/.
Maplesoft (2006) Maple. Maplesoft, Inc., Waterloo, ON. www.maplesoft.com.
Maros, I. and Mitra, G., Eds. (1995) Annals of Operations Research Volume 58: Applied Mathematical Programming and Modeling II (APMOD 93). J.C. Baltzer AG, Science Publishers, Basel.
Maros, I., Mitra, G., and Sciomachen, A., Eds. (1997) Annals of Operations Research Volume 81: Applied Mathematical Programming and Modeling III (APMOD 95). J.C. Baltzer AG, Science Publishers, Basel.
Mathsoft (2006) Mathcad. Mathsoft Engineering & Education, Inc., Cambridge, MA.
Maximal Software (2006) MPL Modeling System. Maximal Software, Inc., Arlington, VA. www.maximal-usa.com.
Michalewicz, Z. (1996) Genetic Algorithms + Data Structures = Evolution Programs. (3rd Edition) Springer, New York.
Mittelmann, H.D. (2006) Decision Tree for Optimization Software. See plato.la.asu.edu/guide.html. (This website was started and maintained jointly for several years with Peter Spellucci.)
Moler, C.B. (2004) Numerical Computing with Matlab. SIAM, Philadelphia.
Murray, J.D. (1983) Mathematical Biology. Springer-Verlag, Berlin.
Neumaier, A. (2004) Complete search in continuous global optimization and constraint satisfaction. In: Iserles, A., Ed. Acta Numerica 2004, pp. 271–369. Cambridge University Press, Cambridge.
Neumaier, A. (2006) Global Optimization. www.mat.univie.ac.at/~neum/glopt.html.
Nowak, I. (2005) Relaxation and Decomposition Methods for Mixed Integer Nonlinear Programming. Birkhäuser, Basel.
Osman, I.H. and Kelly, J.P., Eds. (1996) Meta-Heuristics: Theory and Applications. Kluwer Academic Publishers, Dordrecht.
Papalambros, P.Y. and Wilde, D.J. (2000) Principles of Optimal Design. Cambridge University Press, Cambridge.
Paragon Decision Technology (2006) AIMMS. Paragon Decision Technology BV, Haarlem, The Netherlands. See www.aimms.com.
Pardalos, P.M., Shalloway, D. and Xue, G., Eds. (1996) Global Minimization of Nonconvex Energy Functions: Molecular Conformation and Protein Folding. DIMACS Series, Vol. 23, American Mathematical Society, Providence, RI.
Pardalos, P.M. and Resende, M.G.C., Eds. (2002) Handbook of Applied Optimization. Oxford University Press, Oxford.
Pardalos, P.M. and Romeijn, H.E., Eds. (2002) Handbook of Global Optimization. Volume 2. Kluwer Academic Publishers, Dordrecht.
Parlar, M. (2000) Interactive Operations Research with Maple. Birkhäuser, Boston.
Pintér, J.D. (1996a) Global Optimization in Action. Kluwer Academic Publishers, Dordrecht.
Pintér, J.D. (1996b) Continuous global optimization software: A brief review. Optima 52, 1–8. (Web version is available at plato.la.asu.edu/gom.html.)
Pintér, J.D. (1997) LGO – A Program System for Continuous and Lipschitz Optimization. In: Bomze, I.M., Csendes, T., Horst, R. and Pardalos, P.M., Eds. Developments in Global Optimization, pp. 183–197. Kluwer Academic Publishers, Dordrecht.
Pintér, J.D. (2000) Extremal energy models and global optimization. In: Laguna, M. and González-Velarde, J.-L., Eds. Computing Tools for Modeling, Optimization and Simulation, pp. 145–160. Kluwer Academic Publishers, Dordrecht.
Pintér, J.D. (2001a) Computational Global Optimization in Nonlinear Systems. Lionheart Publishing Inc., Atlanta, GA.
Pintér, J.D. (2001b) Globally optimized spherical point arrangements: model variants and illustrative results. Annals of Operations Research 104, 213–230.
Pintér, J.D. (2002a) MathOptimizer – An Advanced Modeling and Optimization System for Mathematica Users. User Guide. Pintér Consulting Services, Inc., Halifax, NS. For a summary, see also www.wolfram.com/products/applications/mathoptimizer/.
Pintér, J.D. (2002b) Global optimization: software, test problems, and applications. In: Pardalos and Romeijn, Eds. Handbook of Global Optimization. Volume 2, pp. 515–569. Kluwer Academic Publishers, Dordrecht.
Pintér, J.D. (2003a) Globally optimized calibration of nonlinear models: techniques, software, and applications. Optimization Methods and Software 18, 335–355.
Pintér, J.D. (2003b) GAMS/LGO nonlinear solver suite: key features, usage, and numerical performance. Available at www.gams.com/solvers/lgo.
Pintér, J.D. (2005) LGO – A Model Development System for Continuous Global Optimization. User's Guide. (Current revision) Pintér Consulting Services, Inc., Halifax, NS. For summary information, see www.pinterconsulting.com.
Pintér, J.D., Ed. (2006a) Global Optimization: Scientific and Engineering Case Studies. Springer Science + Business Media, New York.
Pintér, J.D. (2006b) Global Optimization with Maple: An Introduction with Illustrative Examples. An electronic book published and distributed by Pintér Consulting Services Inc., Halifax, NS, Canada and Maplesoft, a division of Waterloo Maple Inc., Waterloo, ON, Canada.
Pintér, J.D. (2007) Applied Nonlinear Optimization in Modeling Environments. CRC Press, Boca Raton, FL. (To appear)
Pintér, J.D., Holmström, K., Göran, A.O., and Edvall, M.M. (2004) User's Guide for TOMLAB /LGO. TOMLAB Optimization AB, Västerås, Sweden. See www.tomopt.com/docs/TOMLAB_LGO.pdf.
Pintér, J.D. and Kampas, F.J. (2003) MathOptimizer Professional – An Advanced Modeling and Optimization System for Mathematica Users with an External Solver Link. User Guide. Pintér Consulting Services, Inc., Halifax, NS, Canada. For a summary, see also www.wolfram.com/products/applications/mathoptpro/.
Pintér, J.D. and Kampas, F.J. (2005a) Model development and optimization with Mathematica. In: Golden, B., Raghavan, S. and Wasil, E., Eds. Proceedings of the 2005 INFORMS Computing Society Conference (Annapolis, MD, January 2005), pp. 285–302. Springer Science + Business Media, New York.
Pintér, J.D. and Kampas, F.J. (2005b) Nonlinear optimization in Mathematica with MathOptimizer Professional. Mathematica in Education and Research 10, 1–18.
Pintér, J.D., Linder, D., and Chin, P. (2006) Global Optimization Toolbox for Maple: an introduction with illustrative applications. Optimization Methods and Software 21 (4), 565–582.
Pintér, J.D. and Purcell, C.J. (2003) Optimization of finite element models with MathOptimizer and ModelMaker. Presented at the 2003 Mathematica Developer Conference, Champaign, IL. Available at library.wolfram.com/infocenter/Articles/5347/.
Rich, L.G. (1973) Environmental Systems Engineering. McGraw-Hill, Tokyo.
Rothlauf, F. (2002) Representations for Genetic and Evolutionary Algorithms. Physica-Verlag, Heidelberg.
Rudolph, G. (1997) Convergence Properties of Evolutionary Algorithms. Verlag Dr. Kovac, Hamburg.
Schittkowski, K. (2002) Numerical Data Fitting in Dynamical Systems. Kluwer Academic Publishers, Dordrecht.
Schroeder, M. (1991) Fractals, Chaos, Power Laws. Freeman & Co., New York.
Stewart, I. (1995) Nature's Numbers. Basic Books / Harper and Collins, New York.
Stojanovic, S. (2003) Computational Financial Mathematics Using Mathematica. Birkhäuser, Boston.
Stortelder, W.J.H., de Swart, J.J.B., and Pintér, J.D. (2001) Finding elliptic Fekete point sets: two numerical solution approaches. Journal of Computational and Applied Mathematics 130, 205–216.
Tawarmalani, M. and Sahinidis, N.V. (2002) Convexification and Global Optimization in Continuous and Mixed-Integer Nonlinear Programming. Kluwer Academic Publishers, Dordrecht.
Tervo, J., Kolmonen, P., Lyyra-Laitinen, T., Pintér, J.D., and Lahtinen, T. (2003) An optimization-based approach to the multiple static delivery technique in radiation therapy. Annals of Operations Research 119, 205–227.
TOMLAB Optimization (2006) TOMLAB. TOMLAB Optimization AB, Västerås, Sweden. See www.tomopt.com.
The MathWorks (2006) MATLAB. The MathWorks, Inc., Natick, MA. See www.mathworks.com.
Trott, M. (2004) The Mathematica GuideBooks, Volumes 1–4. Springer Science + Business Media, New York.
Vladimirou, H., Maros, I. and Mitra, G., Eds. (2000) Annals of Operations Research Volume 99: Applied Mathematical Programming and Modeling IV (APMOD 98). J.C. Baltzer AG, Science Publishers, Basel, Switzerland.
Voss, S., Martello, S., Osman, I.H. and Roucairol, C., Eds. (1999) Meta-Heuristics: Advances and Trends in Local Search Paradigms for Optimization. Kluwer Academic Publishers, Dordrecht.
Voss, S. and Woodruff, D.L., Eds. (2002) Optimization Software Class Libraries. Kluwer Academic Publishers, Dordrecht.
Wass, J. (2006) Global Optimization with Maple – An add-on toolkit for the experienced scientist. Scientific Computing, June 2006 issue.
Wilson, H.B., Turcotte, L.H., and Halpern, D. (2003) Advanced Mathematics and Mechanics Applications Using MATLAB. (3rd Edition) Chapman and Hall / CRC Press, Boca Raton, FL.
Wolfram, S. (2003) The Mathematica Book. (4th Edition) Wolfram Media, Champaign, IL, and Cambridge University Press, Cambridge.
Wolfram Research (2006) Mathematica. Wolfram Research, Inc., Champaign, IL. www.wolfram.com.
Wright, F. (2002) Computing with Maple. Chapman and Hall / CRC Press, Boca Raton, FL.
Zabinsky, Z.B. (2003) Stochastic Adaptive Search for Global Optimization. Kluwer Academic Publishers, Dordrecht.
Zhigljavsky, A.A. (1991) Theory of Global Random Search. Kluwer Academic Publishers, Dordrecht.