Global Optimization Software
Doron Pearl
Jonathan Li
Olesya Peshko
Xie Feng
What is global optimization?
Global optimization aims to find the best solution of a constrained optimization problem that may also have multiple local optima.
General global optimization problem (GOP)
Given a bounded, robust set D in the real n-space Rn and a continuous function f: D → R, find
global min f(x), subject to the constraint x ∈ D
Note: a robust set is the closure of its nonempty interior.
First, we have to tell you..
No single optimization package can solve all global optimization problems efficiently.
Two General Classes In Global Optimization
Deterministic
- Grid search
- Branch and bound
Stochastic
- Simulated annealing
- Tabu search
- Genetic algorithms
- Statistical algorithms
Deterministic class and software
In fact, the deterministic class can be further divided into two sub-classes:
- Explicit function required, such as BARON
- Explicit function not required, such as LGO (Lipschitz Global Optimization)
Remark:
1. Among current deterministic solvers, the first class contains more solvers than the second.
2. Even though LGO is regarded as a deterministic method, its solution is not always guaranteed to be the deterministic global optimum.
3. Several more solvers in the first class will not be discussed in detail, but they are included in the comparison in later slides.
LGO Lipschitz Global Optimization
min f(x)
subject to x ∈ D = {x ∈ D0 : fj(x) ≤ 0, j = 1, …, J}

D0 ⊆ Rn represents a 'simple' explicit constraint set: frequently, it is a finite n-dimensional interval or simplex, or Rn itself.
Furthermore, the objective function and constraint functions are Lipschitz-continuous on D0; that is, they satisfy the relation
|fj(x1) − fj(x2)| ≤ Lj ||x1 − x2||
LGO Lipschitz Global Optimization
Three Key Components in the approach:
Lipschitz Continuous Function
Adaptive Partition Strategy
Branch and Bound
Lipschitz Continuous Function
With the Lipschitz continuity property, we can make the following observations about the function:
The ’slope’ is bounded with respect to the system input variables x.
If a function is Lipschitz continuous on a compact domain, a bound on the function is guaranteed to exist.
On the other hand, without this property, one cannot provide a lower bound from sample points and their function values alone, after any finite number of function evaluations on D.
Remark: it is not necessary to compute L in global optimization, but its existence is a necessary condition for obtaining a lower bound.
|fj(x1) − fj(x2)| ≤ Lj ||x1 − x2||
Lipschitz Continuous Function
For a Lipschitz-continuous function, the more sample points we have, the more accurate the approximation of the lower bound we can obtain.
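This tightening can be seen in a small sketch (illustrative Python; the function name is ours, not LGO's). For samples x_i, every Lipschitz function satisfies f(x) ≥ max_i (f(x_i) − L·|x − x_i|), so minimizing this envelope of cones over the interval yields a valid lower bound that improves as samples are added:

```python
import numpy as np

def lipschitz_lower_bound(f, a, b, L, n_samples):
    """Lower-bound min f on [a, b] from sampled values and a Lipschitz
    constant L: f(x) >= f(x_i) - L*|x - x_i| for every sample x_i, so
    the pointwise max of these cones is a valid lower envelope of f."""
    xs = np.linspace(a, b, n_samples)
    fs = f(xs)
    grid = np.linspace(a, b, 4001)                      # evaluation grid
    envelope = np.max(fs[:, None] - L * np.abs(grid[None, :] - xs[:, None]),
                      axis=0)
    return envelope.min()                               # min of the envelope

f = lambda x: np.sin(3 * x) + 0.5 * x                   # |f'| <= 3.5
for n in (3, 9, 33):
    print(n, round(lipschitz_lower_bound(f, -2.0, 2.0, 3.5, n), 4))
```

With the nested sample sets above, the reported lower bounds are nondecreasing in the number of samples, exactly as the slide claims.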
Adaptive partition strategy
Usually implemented on a relaxed feasible set, such as:
- Interval set: a ≤ x ≤ b (x, a, b are vectors). The strategy is to partition the interval into sub-intervals by bisection; in higher dimensions the interval can be regarded as a box.
- Simplex set: the strategy is to partition the simplex into sub-simplices, cutting off one vertex at a time.
- Convex cone set: the strategy is to partition the cone into sub-cones.
Remark: as you may see, a partition usually should:
- create linear bound constraints for each part
- fulfill "exhaustive search"
The choice of partition strategy usually depends on the quality of the relaxation, such as its tightness.
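For the interval/box case, the bisection rule fits in a few lines (an illustrative sketch; the function name is ours, not LGO's):

```python
def bisect_box(lo, hi):
    """Split the box {x : lo[i] <= x[i] <= hi[i]} into two halves along
    its longest edge, the standard rule that keeps the search exhaustive
    (every nested sequence of boxes shrinks to a single point)."""
    k = max(range(len(lo)), key=lambda i: hi[i] - lo[i])   # longest edge
    mid = 0.5 * (lo[k] + hi[k])
    left_hi, right_lo = list(hi), list(lo)
    left_hi[k], right_lo[k] = mid, mid
    return (list(lo), left_hi), (right_lo, list(hi))

left, right = bisect_box([0.0, 0.0], [4.0, 2.0])
```

Note that both children are again described purely by linear bound constraints, as the remark above requires.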
Example of computing L when having relaxed bound constraints

f(x) = Σ_{k=1..n} [ (1/2) p_k x_k^2 + q_k x_k + r_k ],  with p_k > 0 and q_k, r_k, a_k, b_k given (k = 1, 2, …, n)
P = {x ∈ Rn : a_k ≤ x_k ≤ b_k, k = 1, …, n}

then
L = [ Σ_{k∈I1} (p_k a_k + q_k)^2 + Σ_{k∈I2} (p_k b_k + q_k)^2 ]^(1/2)

where
I1 = {k : −q_k / p_k ≥ (1/2)(a_k + b_k)}
I2 = {k : −q_k / p_k < (1/2)(a_k + b_k)}
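This closed form can be checked numerically: the gradient of f has components p_k x_k + q_k, and over the box its norm is maximized coordinate-wise at whichever endpoint lies farther from the root −q_k/p_k. A sketch (illustrative Python; the random test data is ours):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
p = rng.uniform(0.5, 2.0, n)                  # p_k > 0 (convex case)
q = rng.uniform(-1.0, 1.0, n)
a = rng.uniform(-2.0, -1.0, n)
b = rng.uniform(1.0, 2.0, n)

# Coordinate k contributes max over [a_k, b_k] of |p_k x + q_k|, attained
# at a_k when the root -q_k/p_k lies at or beyond the midpoint, else at b_k.
use_a = -q / p >= 0.5 * (a + b)               # the index set I1
g = np.where(use_a, p * a + q, p * b + q)
L = np.sqrt(np.sum(g ** 2))                   # closed-form Lipschitz constant

# Empirical check: ||grad f(x)|| = ||p*x + q|| never exceeds L on the box.
x = rng.uniform(a, b, size=(100000, n))
print(L, np.linalg.norm(p * x + q, axis=1).max())
```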
Branch and Bound
Branching means that the algorithm partitions the feasible region in some fashion.
Bounding means that, during the search, we estimate the objective value using an upper bound and a lower bound.
Upper bound: in each feasible region, a local optimum found there gives an upper bound, as does the function value at any randomly sampled point.
Lower bound: usually obtained from some approximation.
Branch and Bound
The algorithm proceeds as follows (summarizing the flowchart):
1. Set up a domain D' with simple explicit constraints.
2. Pick sample points x1, x2, … and calculate f(x1), f(x2), ….
3. Do a local search; record the local optimum found as the upper bound.
4. Compute the lower bound for the bounded area.
5. If the upper bound equals the lower bound, stop.
6. Otherwise, partition the domain D.
7. Compute the lower bound for each partition; do a local search in it, update the upper bound, and record it.
8. Discard every partition whose lower bound exceeds the latest upper bound; if no partitions remain, stop. Otherwise, return to step 6 for the remaining partitions.
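For the one-dimensional Lipschitz case, the whole loop can be sketched compactly (illustrative only, not LGO's implementation): each interval carries the lower bound (f(x1) + f(x2) − L(x2 − x1))/2 implied by the Lipschitz condition, the best function value seen so far is the incumbent upper bound, and intervals whose lower bound cannot beat the incumbent are pruned.

```python
import heapq, math

def lipschitz_bb(f, a, b, L, tol=1e-4):
    """Best-first branch-and-bound for min f on [a, b], given a valid
    Lipschitz constant L. Returns an upper bound within ~tol of the
    global minimum."""
    fa, fb = f(a), f(b)
    upper = min(fa, fb)                           # incumbent upper bound
    heap = [((fa + fb - L * (b - a)) / 2, a, b, fa, fb)]
    while heap:
        lower, x1, x2, f1, f2 = heapq.heappop(heap)
        if upper - lower <= tol:                  # bounds have met: stop
            break
        m = 0.5 * (x1 + x2)                       # branch: bisect
        fm = f(m)
        upper = min(upper, fm)                    # update incumbent
        for lo, hi, flo, fhi in ((x1, m, f1, fm), (m, x2, fm, f2)):
            lb = (flo + fhi - L * (hi - lo)) / 2
            if lb < upper - tol:                  # prune hopeless parts
                heapq.heappush(heap, (lb, lo, hi, flo, fhi))
    return upper

ub = lipschitz_bb(lambda x: math.sin(3 * x) + 0.5 * x, -2.0, 2.0, L=3.5)
```

Here L = 3.5 is a valid constant because |f'(x)| = |3 cos(3x) + 0.5| ≤ 3.5 everywhere.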
Three approaches in LGO
LGO integrates a suite of robust and efficient global and local scope solvers. These include:
adaptive partition and search (branch-and-bound)
adaptive global random search (single & multi-start)
constrained local optimization (reduced gradient method)
Remark: the random-search options are also commonly used to handle black-box functions.
General global optimization model in LGO
x is a real n-vector (to describe feasible decisions)
a, b are finite, component-wise vector bounds imposed on x
f(x) is a continuous function (to describe the model objective)
g(x) is a continuous vector function (to describe the model constraints; the inequality sign is interpreted component-wise).
min f(x)
subject to
g(x) ≤ 0
a ≤ x ≤ b
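This canonical form maps directly onto generic NLP interfaces. For instance (illustrative only: the two-variable model is made up, and SciPy's local solver stands in for LGO, so this corresponds to a single local search rather than a global one):

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

# Canonical model form: min f(x) s.t. g(x) <= 0, a <= x <= b.
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2      # objective
g = lambda x: np.array([x[0] + x[1] - 1.0])              # g(x) <= 0
a, b = [-2.0, -2.0], [2.0, 2.0]                          # bounds on x

res = minimize(f, x0=[0.0, 0.0],
               bounds=list(zip(a, b)),
               constraints=NonlinearConstraint(g, -np.inf, 0.0))
print(res.x, res.fun)
```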
LGO interface
Library: LGO solver suite for C and Fortran compilers, with a text I/O interface, or embedded in a Windows GUI
Spreadsheets: Excel Premium Solver Platform/LGO solver engine, in cooperation with Frontline Systems
Modeling language: GAMS/LGO solver engine, in cooperation with the GAMS Development Corporation
Integrated technical computing systems:
- AIMMS/LGO solver engine, in cooperation with Paragon Decision Technologies
- Global Optimization Toolbox for Maple, in cooperation with Maplesoft
- MPL/LGO solver engine, in cooperation with Maximal Software
- MathOptimizer for Mathematica, a native Mathematica product
- MathOptimizer Professional (LGO for Mathematica), in cooperation with Dr. Frank Kampas
- TOMLAB/LGO for Matlab, in cooperation with TOMLAB Optimization
LGO
LGO has been used to solve models with up to one thousand variables and constraints.
These packages were developed by J. D. Pinter, who, since earning his PhD in optimization (Moscow State University, 1982), has become an internationally known expert in the field. One of his textbooks won an international award (the INFORMS Computing Society Prize for Research Excellence).
Further detail will be discussed in later slides.
LGO testing
In our numerical experiments described here, we have used LGO to solve a set of GAMS models based on the Handbook of Test Problems in Local and Global Optimization by Floudas et al.(1999). For brevity, we shall refer to the model collection studied as HTPLGO. The set of models considered is available from GLOBALLib (GAMS Global World, 2003).
GLOBALLib is a collection of nonlinear models that provides GO solver developers with a large and varied set of theoretical and practical test models.
The entire test set consists of 117 models, with up to 142 variables, 109 constraints, 729 non-zero and 567 nonlinear non-zero model terms.
LGO test result
Figure 3. Efficiency profiles: all LGO solver modes are applied to GLOBALLib models.
Operational modes (for brevity, opmode):
- opmode 0: local search from a given nominal solution, without a preceding global search phase (LS)
- opmode 1: global branch-and-bound search plus local search (BB+LS)
- opmode 2: global adaptive random search plus local search (GARS+LS)
- opmode 3: global multi-start random search plus local search (MS+LS)
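The MS+LS idea can be sketched in a few lines (illustrative only, not LGO's code; it uses SciPy's bounded local solver and the standard six-hump camel test function):

```python
import numpy as np
from scipy.optimize import minimize

def multistart_ls(f, bounds, n_starts=50, seed=0):
    """MS+LS sketch: run a bounded local search from each of n_starts
    uniformly sampled start points; return the best local solution."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    best = None
    for _ in range(n_starts):
        res = minimize(f, rng.uniform(lo, hi), bounds=bounds)
        if best is None or res.fun < best.fun:
            best = res                            # new incumbent
    return best

# Six-hump camel function: several local optima, global minimum -1.0316
# at (0.0898, -0.7126) and its mirror image.
camel = lambda x: ((4 - 2.1 * x[0] ** 2 + x[0] ** 4 / 3) * x[0] ** 2
                   + x[0] * x[1] + (-4 + 4 * x[1] ** 2) * x[1] ** 2)
best = multistart_ls(camel, [(-3.0, 3.0), (-2.0, 2.0)])
```

As the later slides note, such a procedure carries only a statistical guarantee: with more starts the chance of missing the global basin shrinks, but never reaches zero.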
Using LGO
There are usually five stages in using LGO: problem definition, problem compilation, model parameters, model solution, and result analysis.
- Problem definition: define the functions.
- Problem compilation: link to the object and library files.
- Model parameters: set up the lower bound, upper bound, number of constraints, etc.
- Model solution: there is an automatic mode and an interactive mode.
  Automatic mode: the program determines which of the four modules to use, based on the input file.
  Interactive mode: the user determines which modules to use, in which order, and the maximum search effort.
- Result analysis.
Price
GAMS/LGO: commercial $1,600; academic $320
Premium Solver Platform: $1,495
TOMLAB/LGO: commercial $1,600; academic $600
Some important facts
Continuity of the functions (objective and constraints) defining the global optimization model is sufficient to use the LGO software.
Naturally, in such cases only a statistical guarantee can be given for the global lower bound estimate. The lower bound generated by LGO is statistical in all cases, since it is based partially on pseudo-random sampling.
LGO can deliver a deterministic global optimum only when both L and the boundary are known deterministically.
Comparison of complete global optimization solvers
Solvers being compared: we present test results for the global optimization systems BARON, COCOS, GlobSol, ICOS, LGO/GAMS, LINGO, OQNLP, and Premium Solver, and, for comparison, the local solver MINOS. All tests were made on the COCONUT benchmarking suite.
Outline of the test set: the test set, drawn from three libraries, consists of 1322 models varying in dimension (number of variables) between 1 and over 1000, coded in the modeling language AMPL.
Library 1: GAMS Global library; real-life global optimization problems with industrial relevance, though currently most problems on this site come without computational results.
Library 2: CUTE library; global (and some local) optimization problems with nonempty feasible domains.
Library 3: EPFL library; pure constraint satisfaction problems (constant objective function), almost all of them feasible.
Comparison of complete global optimization solvers(2)
Models excluded from the libraries:
1. Certain models that are difficult for testing, where the difficulty is unrelated to the solvers.
2. Models containing functions not supported by the ampl2dag converter.
3. Models in Library 3 that actually contain an objective function.
4. Models showing strange behavior, possibly caused by bugs in the converter.
5. Models for which no solver can find the optimal solution.
Brief overview of special characteristic of other solvers
GlobSol and Premium Solver exploit interval methods.
ICOS is a pure constraint solver; it currently cannot handle models with an objective function.
COCOS contains many modules that can be combined to yield various combination strategies for global optimization.
Characteristic comparison
Important related details
All solvers are tested with the default options suggested by the providers of the codes.
The timeout limit used was (scaled to a 1000 MHz machine) around 180 seconds of CPU time for models of size 1, 900 seconds for models of size 2, and 1800 seconds for models of size 3.
The solvers LGO and GlobSol require a bounded search region, so we bounded each variable between −1000 and 1000, except in a few cases where this led to a loss of the global optimum.
The reliability of claimed results is the most poorly documented aspect of current global optimization software.
Reliability
Performance
Note: different solvers have different stopping criteria, which should also be considered. For example, BARON and LINGO stop when the time limit is reached, while LGO and OQNLP stop based on certain statistics.
Final Remark
In a few cases, GlobSol and Premium Solver found solutions where BARON failed, which suggests that BARON would benefit from some of the advanced interval techniques implemented in GlobSol and Premium Solver.
However, GlobSol and Premium Solver are much less efficient than BARON in both time and solving capacity. To a large extent this may be because both GlobSol and Premium Solver strive for mathematical rigor, resulting in significant slowdown due to the need for rigorously validated techniques.
Reference
http://myweb.dal.ca/jdpinter/index.html, Janos D. Pinter's (LGO's creator) website
Janos D. Pinter, Global Optimization in Action: Continuous and Lipschitz Optimization. Algorithms, Implementations and Applications
Reiner Horst, Panos M. Pardalos and Nguyen V. Thoai, Introduction to Global Optimization
Arnold Neumaier, Oleg Shcherbina, Waltraud Huyer and Tamas Vinko, A Comparison of Complete Global Optimization Solvers, Mathematical Programming
http://www.mat.univie.ac.at/~neum/glopt.html, website maintained by Arnold Neumaier
P.S. If you would like to look at the two books above, ask Prof. Tamas; he is generous to anyone who wants to learn.