
AMO - Advanced Modeling and Optimization Volume 1, Number 1, 1999

The TOMLAB NLPLIB Toolbox for Nonlinear Programming

Kenneth Holmström and Mattias Björkman

Center for Mathematical Modeling
Department of Mathematics and Physics
Mälardalen University, Västerås, Sweden

Abstract

The paper presents the toolbox NLPLIB TB (NonLinear Programming LIBrary), a set of Matlab solvers, test problems, and graphical and computational utilities for unconstrained and constrained optimization, unconstrained and constrained nonlinear least squares, box-bounded global optimization, global mixed-integer nonlinear programming, and exponential sum model fitting.

NLPLIB TB, like the toolbox OPERA TB for linear and discrete optimization, is a part of TOMLAB, an environment in Matlab for research and teaching in optimization. TOMLAB currently solves small and medium size dense problems.

Presently, NLPLIB TB implements a number of solver algorithms, and it is possible to call solvers in the Matlab Optimization Toolbox. MEX-file interfaces are prepared for seven Fortran and C solvers, and others are easily added using the same type of interface routines. Currently, MEX-file interfaces have been developed for MINOS, NPSOL, NPOPT, NLSSOL, LPOPT, QPOPT and LSSOL. There are four ways to solve a problem: by a direct call to the solver routine, by a call to a multi-solver driver routine, or interactively, using either the Graphical User Interface (GUI) or a menu system. The GUI may also be used as a preprocessor to generate Matlab code for stand-alone runs. If analytical derivatives are not available, automatic differentiation is easy using an interface to ADMAT/ADMIT TB. Furthermore, five types of numerical differentiation methods are included in NLPLIB TB.

NLPLIB TB implements a large set of standard test problems. Furthermore, using MEX-file interfaces, problems in the CUTE test problem data base and problems defined in the AMPL modeling language can be solved.

TOMLAB and NLPLIB TB have been used to solve several applied optimization problems. New types of algorithms are implemented for the nonlinear least squares problem of approximating sums of exponential functions to empirical data, and for global optimization. We present some preliminary test results, which show very good performance for the NLPLIB TB solvers.

Keywords: Nonlinear Programming, MATLAB, CUTE, AMPL, Graphical User Interface, Software Engineering, Mathematical Software, Optimization Algorithms, Exponential Sum Fitting, Nonlinear Least Squares.

AMS Subject Classification

1. Financed by the Mälardalen University Research Board, project Applied Optimization and Modeling (TOM).
2. E-mail: hkh@mdh.se; URL: http://www.ima.mdh.se/tom
3. E-mail: mbk@mdh.se

The TOMLAB NLPLIB Toolbox for Nonlinear Programming

Introduction

The toolbox NLPLIB TB (NonLinear Programming LIBrary Toolbox) is part of TOMLAB, an environment in Matlab for research and teaching in optimization. NLPLIB TB is a set of Matlab m-files which solves nonlinear optimization problems and nonlinear parameter estimation problems in operations research and mathematical programming. The focus is on dense problems. The toolbox runs in Matlab and works on both PC (NT, Windows) and UNIX systems (SUN, HP).

Currently, NLPLIB TB consists of m-file code implementing algorithms, utilities and predefined problems, all well documented in the User's Guide. The User's Guide includes descriptions and examples of how to define and solve optimization problems, as well as detailed descriptions of the routines.

The optimization problem to be solved is either selected using an interactive menu program or directly defined in a call to a multi-solver driver routine. The problem is solved using either an NLPLIB TB solver, a solver in the Matlab Optimization Toolbox, or a MEX-file interface to call a Fortran or C optimization code.

NLPLIB TB has interactive menu programs for unconstrained and constrained optimization, unconstrained and constrained nonlinear least squares, quadratic programming, box-bounded global optimization and global mixed-integer nonlinear programming.

NLPLIB TB includes a graphical user interface (GUI) where all types of predefined problems can be solved. Using the GUI, the user has total control of all optimization parameters and variables.

TOMLAB MEX-file interfaces for both PC and UNIX have been developed for the commercial optimization code MINOS. In TOMLAB, MINOS is used to solve nonlinear programs in NLPLIB TB and linear programs in OPERA TB. TOMLAB MEX-file interfaces working on both PC and UNIX have also been developed for the commercial codes from the Systems Optimization Laboratory, Department of Operations Research, Stanford University, California: NPSOL, NPOPT (an updated version of NPSOL), NLSSOL, QPOPT, LSSOL and LPOPT. The aim is to expand this list in the near future.

NLPLIB TB implements a large set of predefined test problems. It is easy to try to solve any of these problems using any of the solvers present. The user can easily expand the set of test problems with his own problems.

NLPLIB TB was designed with the aim to simplify the solution of practical optimization problems. After defining a new problem in the NLPLIB TB format, it is then possible to try to solve the problem using any available solver or method.

For two-dimensional nonlinear unconstrained problems, the menu programs support graphical display of the selected optimization problem as a mesh or contour plot. The search directions, together with marks of the trial step lengths, are displayed on the contour plot. For higher-dimensional problems, the contour plot is displayed in a two-dimensional subspace. Plots showing the estimated convergence rate and the sequence of function values are included. The GUI has the same graphical options as the menu programs.

For nonlinear least squares problems, a routine to plot the data against the starting model and the fitted model is included. Also included are new algorithms for the nonlinear parameter estimation problem of fitting sums of exponential functions to empirical data.

The remainder of the paper is organized as follows. First, the different optimization algorithms and solvers in NLPLIB TB are discussed. Then some other important utility routines are discussed, e.g. the different types of differentiation. Some information about the MEX-file interfaces, the low level routines and the test problems follows, after which the most frequently used menu and plot options are described. Finally, we present three areas where TOMLAB and NLPLIB TB have been a valuable tool: constrained nonlinear least squares, the special case of exponential sum fitting problems, and box-bounded global optimization. Some test results are presented for these application areas, showing good performance for the NLPLIB TB solvers, and we end with some conclusions.


Optimization Algorithms and Solvers

In this section we discuss the optimization problems that NLPLIB TB is able to solve. The optimization solvers in NLPLIB TB are listed in the table below. The solver for unconstrained optimization, ucSolve, and the nonlinear least squares solvers lsSolve and clsSolve are all written as prototype routines, i.e. each routine implements several optimization algorithms in one code. This simplifies maintenance and further algorithm development.

Table: Optimization solvers in NLPLIB TB

Function   Description
ucSolve    A prototype routine for unconstrained optimization with simple bounds on the variables. Implements Newton, four quasi-Newton and three conjugate gradient methods.
glbSolve   A routine for box-bounded global optimization.
gblSolve   Stand-alone version of glbSolve. Runs independently of NLPLIB TB.
glcSolve   A routine for global mixed-integer nonlinear programming.
gclSolve   Stand-alone version of glcSolve. Runs independently of NLPLIB TB.
lsSolve    A prototype algorithm for nonlinear least squares with simple bounds. Implements Gauss-Newton and hybrid quasi-Newton and Gauss-Newton methods.
clsSolve   A prototype algorithm for constrained nonlinear least squares. Currently handles simple bounds and linear equality and inequality constraints using an active set strategy. Implements Gauss-Newton and hybrid quasi-Newton and Gauss-Newton methods.
conSolve   Constrained nonlinear minimization solver using two different sequential quadratic programming methods.
nlpSolve   Constrained nonlinear minimization solver using filter SQP.
sTrustR    Solver for constrained convex optimization of partially separable functions, using a structural trust region algorithm.
qpBiggs    Solves a quadratic program.
qpSolve    Solves a quadratic program.
qpe        Solves a quadratic program restricted to equality constraints, using a null space method.
qplm       Solves a quadratic program restricted to equality constraints, using Lagrange's method.

The routine ucSolve implements a prototype algorithm for unconstrained optimization with simple bounds on the variables (uc), i.e. it solves the problem

    min_x  f(x)
    s.t.   x_L ≤ x ≤ x_U,

where x, x_L, x_U ∈ R^n and f(x) ∈ R. ucSolve includes several of the most popular search step methods for unconstrained optimization. Bound constraints are treated as described in Gill et al. The search step methods for unconstrained optimization included in ucSolve are: the Newton method, the quasi-Newton BFGS and inverse BFGS methods, the quasi-Newton DFP and inverse DFP methods, the Fletcher-Reeves and Polak-Ribiere conjugate-gradient methods, and the Fletcher conjugate-descent method. For the Newton and the quasi-Newton methods, the code uses a subspace minimization technique to handle rank problems, see Lindström. The quasi-Newton codes also use safeguarding techniques to avoid rank problems in the updated matrix.
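The safeguarding mentioned above can be illustrated with a minimal sketch. The toolbox itself is written in Matlab; the following Python version (our own function name and tolerance, not NLPLIB TB code) shows a BFGS update that is simply skipped when the curvature condition fails, which keeps the Hessian approximation positive definite:

```python
import numpy as np

def bfgs_update(H, s, y, eps=1e-10):
    """Safeguarded BFGS update of the Hessian approximation H.

    Skips the update when the curvature condition s'y > 0 fails
    (within tolerance), the kind of safeguarding a quasi-Newton
    code uses to avoid rank problems in the updated matrix."""
    sy = float(s @ y)
    if sy <= eps * np.linalg.norm(s) * np.linalg.norm(y):
        return H  # reject the update: curvature condition violated
    Hs = H @ s
    return H - np.outer(Hs, Hs) / float(s @ Hs) + np.outer(y, y) / sy

# One quasi-Newton step on the quadratic f(x) = 0.5 x'Ax, A = diag(2, 10):
A = np.diag([2.0, 10.0])
x = np.array([1.0, 1.0])
H = np.eye(2)                      # initial Hessian approximation
g = A @ x                          # gradient of the quadratic at x
x_new = x - np.linalg.solve(H, g)  # quasi-Newton step, unit step length
H = bfgs_update(H, x_new - x, A @ x_new - g)
```

After an accepted update, H satisfies the secant condition H s = y exactly, while a step with non-positive curvature leaves H untouched.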

The constrained nonlinear optimization problem (con) is defined as

    min_x  f(x)
    s.t.   x_L ≤ x ≤ x_U,
           b_L ≤ A x ≤ b_U,
           c_L ≤ c(x) ≤ c_U,

where x, x_L, x_U ∈ R^n, f(x) ∈ R, A ∈ R^{m1×n}, b_L, b_U ∈ R^{m1} and c_L, c_U, c(x) ∈ R^{m2}. For general constrained nonlinear optimization, a sequential quadratic programming (SQP) method by Schittkowski is the main method implemented in the routine conSolve. conSolve also includes the Han-Powell SQP method. A third SQP type algorithm is the filter SQP by Fletcher and Leyffer, implemented in nlpSolve.

Another constrained solver in NLPLIB TB is sTrustR, implementing a structural trust region algorithm combined with an initial trust region radius algorithm. The code is based on published algorithms and treats partially separable functions. If an analytical Hessian is not used, safeguarded BFGS or DFP updates are used for the quasi-Newton approximation. Currently, sTrustR only solves problems where the feasible region defined by the constraints is convex.

A quadratic program (qp) is defined as

    min_x  f(x) = (1/2) x^T F x + c^T x
    s.t.   x_L ≤ x ≤ x_U,
           b_L ≤ A x ≤ b_U,

where c, x, x_L, x_U ∈ R^n, F ∈ R^{n×n}, A ∈ R^{m1×n} and b_L, b_U ∈ R^{m1}. Quadratic programs are solved with a standard active-set method implemented in the routine qpSolve. qpSolve explicitly treats both inequality and equality constraints, as well as lower and upper bounds on the variables (simple bounds). For indefinite quadratic programs, it uses directions of negative curvature in the process of finding a local minimum.

NLPLIB TB includes two algorithms for solving quadratic programs restricted to equality constraints (EQP): a null space method (qpe) and Lagrange's method (qplm).
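The null-space idea behind a routine like qpe can be sketched as follows. This is an illustrative Python reimplementation under standard assumptions (null space of A taken from the SVD, reduced Hessian positive definite), not the toolbox code:

```python
import numpy as np

def eqp_nullspace(F, c, A, b):
    """Solve min 0.5 x'Fx + c'x  s.t.  Ax = b  by a null-space method:
    split x into a particular solution of the constraints plus a
    component Z v in the null space of A, then minimize over v."""
    # Particular solution of Ax = b (minimum-norm, via least squares).
    x_p = np.linalg.lstsq(A, b, rcond=None)[0]
    # Orthonormal basis Z for the null space of A from the full SVD.
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > 1e-12))
    Z = Vt[rank:].T                       # columns span null(A), A @ Z = 0
    # Reduced problem: min_v 0.5 v'(Z'FZ)v + (Z'(F x_p + c))'v
    v = np.linalg.solve(Z.T @ F @ Z, -Z.T @ (F @ x_p + c))
    return x_p + Z @ v

# min x1^2 + x2^2  s.t.  x1 + x2 = 1   has the solution (0.5, 0.5).
F = 2.0 * np.eye(2)
c = np.zeros(2)
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
print(eqp_nullspace(F, c, A, b))   # -> approximately [0.5 0.5]
```

The null-space approach never forms Lagrange multipliers explicitly; Lagrange's method (as in qplm) instead solves the full KKT system for x and the multipliers together.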

The nonlinear least squares problem (ls) is defined as

    min_x  f(x) = (1/2) r(x)^T r(x)
    s.t.   x_L ≤ x ≤ x_U,

where x, x_L, x_U ∈ R^n and r(x) ∈ R^N.

In NLPLIB TB, the prototype nonlinear least squares algorithm lsSolve treats problems with bound constraints in a similar way as the routine ucSolve.

The prototype routine lsSolve includes four optimization methods for nonlinear least squares problems: the Gauss-Newton method, the Al-Baali-Fletcher and the Fletcher-Xu hybrid methods, and the Huschens TSSM method. If rank problems occur, the prototype algorithm uses subspace minimization. The line search algorithm used is the same as for unconstrained problems.
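A bare-bones Gauss-Newton iteration, the basic search-step method that lsSolve builds on, might look like this in Python. The real solver adds a line search and the subspace safeguards mentioned above, both omitted here; the function name is our own:

```python
import numpy as np

def gauss_newton(r, J, x0, iters=20):
    """Plain Gauss-Newton iteration for min 0.5 r(x)'r(x):
    repeatedly solve the linearized least squares problem J p = -r
    and take the full step (no line search in this sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        Jx, rx = J(x), r(x)
        p = np.linalg.lstsq(Jx, -rx, rcond=None)[0]
        x = x + p
    return x

# Fit y = exp(b*t) to data generated with b = -0.5 (one parameter).
t = np.linspace(0.0, 2.0, 5)
y = np.exp(-0.5 * t)
r = lambda x: np.exp(x[0] * t) - y             # residual vector
J = lambda x: (t * np.exp(x[0] * t))[:, None]  # Jacobian, shape (5, 1)
print(gauss_newton(r, J, [0.0]))               # -> close to [-0.5]
```

On zero-residual problems like this one, Gauss-Newton converges rapidly near the solution; the hybrid methods in lsSolve switch to quasi-Newton information when the residual stays large.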

The constrained nonlinear least squares problem (cls) is defined as

    min_x  f(x) = (1/2) r(x)^T r(x)
    s.t.   x_L ≤ x ≤ x_U,
           b_L ≤ A x ≤ b_U,
           c_L ≤ c(x) ≤ c_U,

where x, x_L, x_U ∈ R^n, r(x) ∈ R^N, A ∈ R^{m1×n}, b_L, b_U ∈ R^{m1} and c_L, c_U, c(x) ∈ R^{m2}.


The constrained nonlinear least squares solver clsSolve is based on lsSolve and its search step methods. Currently, clsSolve treats linear equality and inequality constraints using an active-set strategy.

The routine glbSolve implements an algorithm for box-bounded global optimization (glb), i.e. problems that have finite simple bounds on all the variables. glbSolve implements the DIRECT algorithm, which is a modification of the standard Lipschitzian approach that eliminates the need to specify a Lipschitz constant. In glbSolve, no derivative information is used.

For global mixed-integer nonlinear programming (glc), glcSolve implements an extended version of DIRECT (Jones) that handles problems with both nonlinear and integer constraints.

For global optimization problems with expensive function evaluations, the routine ego implements the Efficient Global Optimization (EGO) algorithm. The idea of the EGO algorithm is to first fit a response surface to data collected by evaluating the objective function at a few points. Then EGO balances between finding the minimum of the surface and improving the approximation by sampling where the prediction error may be high.
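The flavor of the DIRECT idea behind glbSolve can be conveyed by a deliberately simplified one-dimensional toy in Python. Real DIRECT selects all *potentially optimal* rectangles via a convex hull in (size, value) space; this sketch (entirely our own construction) just trisects the interval with the best midpoint value and the widest interval, keeping the same explore/exploit balance without any Lipschitz constant:

```python
def direct_1d(f, a, b, evals=60):
    """Toy 1-D DIRECT-style search: keep a partition of [a, b] into
    intervals sampled at their midpoints, repeatedly trisect the
    interval with the lowest midpoint value (exploit) and the widest
    interval (explore).  The centre child reuses the parent's sample."""
    intervals = [(a, b, f((a + b) / 2.0))]
    n_evals = 1
    while n_evals + 2 <= evals:
        best = min(intervals, key=lambda iv: iv[2])
        widest = max(intervals, key=lambda iv: iv[1] - iv[0])
        for iv in {best, widest}:          # may coincide
            if n_evals + 2 > evals:
                break                      # evaluation budget exhausted
            intervals.remove(iv)
            lo, hi, fm = iv
            third = (hi - lo) / 3.0
            intervals.append((lo, lo + third, f(lo + third / 2.0)))
            intervals.append((lo + third, hi - third, fm))
            intervals.append((hi - third, hi, f(hi - third / 2.0)))
            n_evals += 2
    lo, hi, fm = min(intervals, key=lambda iv: iv[2])
    return (lo + hi) / 2.0, fm

x_star, f_star = direct_1d(lambda x: (x - 0.7) ** 2, 0.0, 2.0)
print(x_star)   # converges toward the minimiser at 0.7
```

As in DIRECT proper, no derivatives are used, and intervals are never discarded, so the search cannot permanently miss a region of the box.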

Other Routines in NLPLIB TB

There are seven menu programs, one for each type of optimization problem (probType), defined in the table below. Included in the table is also the Graphical User Interface (GUI) for nonlinear programming, which has the same functionality in one routine as all the menu programs.

Table: Menu programs

Function   Description
nlplib     Graphical user interface (GUI) for nonlinear optimization. Handles all types of nonlinear optimization problems.
ucOpt      Menu for unconstrained optimization.
glbOpt     Menu for box-bounded global optimization.
glcOpt     Menu for global mixed-integer nonlinear programming.
qpOpt      Menu for quadratic programming.
conOpt     Menu for constrained optimization.
lsOpt      Menu for nonlinear least squares problems.
clsOpt     Menu for constrained nonlinear least squares problems.

The menu programs described in the table call the corresponding driver routine with the same probType: any of ucRun, glbRun, glcRun, qpRun, conRun, lsRun or clsRun.

The utility functions needed by the solvers are displayed in a table below. The function itrr implements the initial trust region radius algorithm by Sartenaer.

The line search algorithm LineSearch, used by the solvers conSolve, lsSolve, clsSolve and ucSolve, is a modified version of an algorithm by Fletcher. The use of quadratic and cubic interpolation (the intpol routines) is possible in the line search algorithm.
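The quadratic interpolation step used inside such a line search is a one-line formula: fit a parabola to phi(0), phi'(0) and a trial value phi(alpha0), and step to its minimizer. A small Python sketch (illustrative formula, not the toolbox code; the name is ours):

```python
def quad_interp_step(phi0, dphi0, alpha0, phi_a0):
    """Minimiser of the quadratic through phi(0), phi'(0) and
    phi(alpha0), the quadratic interpolation step of a Fletcher-style
    line search.  Assumes the fitted curvature c is positive."""
    # q(a) = phi0 + dphi0*a + c*a^2, with c chosen to match phi(alpha0).
    c = (phi_a0 - phi0 - dphi0 * alpha0) / alpha0 ** 2
    return -dphi0 / (2.0 * c)

# For phi(a) = (a - 0.3)^2 the quadratic model is exact, so one
# interpolation from a trial at a = 1 recovers the minimiser 0.3.
phi = lambda a: (a - 0.3) ** 2
print(quad_interp_step(phi(0.0), -0.6, 1.0, phi(1.0)))  # -> 0.3
```

Cubic interpolation works the same way with one more piece of information (typically phi'(alpha0)), giving a better model when the function is far from quadratic.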

The routine preSolve runs a presolve analysis on a system of linear equalities, linear inequalities and simple bounds. An algorithm by Gondzio is implemented in preSolve.

Instead of analytical derivatives, it is easy to use either any of five types of numerical differentiation, or automatic differentiation using an interface to the toolbox ADMAT TB. For information on how to obtain a copy of ADMAT TB, see the URL http://simon.cs.cornell.edu/home/verma/AD.

The default numerical differentiation type is an implementation of the FD algorithm: the classical approach with forward or backward differences, together with an automatic step selection procedure. If the Spline Toolbox is installed, gradient, Jacobian, constraint gradient and Hessian approximations can be computed in three different ways, using either of the routines csapi, csaps or spaps. Numerical differentiation is automatically used for the gradient, Jacobian, constraint gradient and Hessian if the corresponding user routine is not present. First order derivatives can also be estimated by use of complex variables. This approach avoids the subtractive cancellation error inherent in the classical derivative approximation.

Table: Utility routines for the optimization solvers

Function    Description
itrr        Initial trust region radius algorithm.
LineSearch  Line search algorithm by Fletcher.
intpol      Find the minimum of a quadratic interpolation. Used by LineSearch.
intpol      Find the minimum of a cubic interpolation. Used by LineSearch.
preSolve    Presolve analysis on linear constraints and simple bounds.
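The complex-variable estimate mentioned above is the complex-step method: f'(x) is approximated by Im f(x + ih)/h. Because no subtraction of nearby function values occurs, there is no cancellation, and h can be taken extremely small. A minimal Python sketch:

```python
import cmath

def complex_step(f, x, h=1e-20):
    """First-order derivative by the complex-step method:
    f'(x) ~ Im(f(x + i*h)) / h.  No subtractive cancellation,
    so h can be tiny without amplifying rounding error."""
    return f(complex(x, h)).imag / h

# The derivative of sin at 0.5 is cos(0.5).  The complex step matches
# it essentially to machine precision; a forward difference does not.
exact = cmath.cos(0.5).real
cs = complex_step(cmath.sin, 0.5)
fd = (cmath.sin(0.5 + 1e-8).real - cmath.sin(0.5).real) / 1e-8
print(abs(cs - exact), abs(fd - exact))
```

The method requires the function to be evaluable with complex arguments and analytic at x, which is why it complements rather than replaces the finite-difference options.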

MEX-file Interfaces

In NLPLIB TB there are currently seven MEX-file interfaces developed, to the commercial solvers MINOS, NPSOL, NPOPT, NLSSOL, QPOPT, LPOPT and LSSOL. As standard, MINOS has a very advanced input/output handling, but it is also possible to switch it off and use MINOS as a silent subroutine. The other routines can also be made totally silent. The MEX-file interfaces are all written in C, and compiled and linked using the WATCOM C/C++ compiler, after converting the Fortran code to C using f2c.

Low Level Routines and Test Problems

We define the low level routines as the routines that compute the objective function value, the gradient vector, the Hessian matrix (second derivative matrix), the residual vector for NLLS problems, the Jacobian matrix for NLLS problems, the vector of constraint functions, the matrix of constraint normals, and the second part of the second derivative of the Lagrangian function. The last three routines are only needed for constrained problems. Only the routines relevant for a certain type of optimization problem need to be coded; there are dummy routines for the others. If routines that compute derivatives are undefined (the function name variables are set as empty), NLPLIB TB automatically uses numerical differentiation.

All information about a problem is stored in a structure variable Prob, described in detail in the User's Guide. This structure variable is an argument to all low level routines. If the user needs to supply information to the low level routines, this information should be put in the vector field element Prob.uP, or as arbitrary subfields of Prob.USER. In this way, information needed to evaluate the low level routines is easily retrieved. Normally, the user also writes a setup routine for the initialization process of problems, together with the low level routines. It is also possible to avoid this setup routine and directly solve problems only defining the low level routines (the NLPLIB TB Quick Run option).
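A toy analogue of this design can be sketched in Python (the actual toolbox is a Matlab structure; field names below such as `user` merely stand in for Prob.uP/Prob.USER). It shows the two conventions described above: all problem information travels in one variable, and a missing derivative routine triggers numerical differentiation automatically:

```python
from dataclasses import dataclass, field
from typing import Callable, Optional, Sequence

@dataclass
class Prob:
    """Hypothetical analogue of the NLPLIB TB problem structure:
    objective f, optional gradient g, starting point x0, and a
    free-form field for user data passed through to the callbacks."""
    f: Callable[[Sequence[float]], float]
    g: Optional[Callable[[Sequence[float]], list]] = None
    x0: Sequence[float] = ()
    user: dict = field(default_factory=dict)

def gradient(prob: Prob, x, h=1e-6):
    """Driver-side rule: use the analytic gradient if supplied,
    otherwise fall back to forward differences automatically."""
    if prob.g is not None:
        return list(prob.g(x))
    fx = prob.f(x)
    return [(prob.f([xj + (h if k == j else 0.0)
                     for k, xj in enumerate(x)]) - fx) / h
            for j in range(len(x))]

p = Prob(f=lambda x: x[0] ** 2 + 3.0 * x[1], x0=[1.0, 0.0])
print(gradient(p, [1.0, 0.0]))   # numerical gradient, close to [2, 3]
```

Because every solver receives the same structure, the problem is coded once and any solver or differentiation method can then be applied to it.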

Different solvers have very different demands on how the necessary information should be supplied, i.e. the function to optimize, the gradient vector, the Hessian matrix. To be able to code the problem only once, and then use this formulation to run all types of solvers, it was necessary to develop interface routines that return the information in the format needed by the actual solver.

The table below describes the low level test functions and the corresponding problem setup routines needed for the predefined constrained optimization (con) problems. For the predefined unconstrained optimization (uc) problems and the quadratic programming (qp) problems, similar routines are needed.

Table: Generally constrained nonlinear (con) test problems

Function   Description
con_prob   Initialization of con test problems.
con_f      Compute the objective function f(x) for con test problems.
con_g      Compute the gradient g(x) for con test problems.
con_H      Compute the Hessian matrix H(x) of f(x) for con test problems.
con_c      Compute the constraint residuals c(x) for con test problems.
con_dc     Compute the derivative of the constraint residuals for con test problems.
con_fm     Compute the merit function.
con_gm     Compute the gradient of the merit function.

The problem of fitting positive sums of positively weighted exponential functions to empirical data may be formulated either as a nonlinear least squares problem or as a separable nonlinear least squares problem. Several empirical data series are predefined, and artificial data series may also be generated. Algorithms to find starting values for different numbers of exponential terms are implemented. The table below describes the relevant routines.

Table: Exponential fitting test problems

Function    Description
exp_ArtP    Generate artificial exponential sum problems.
expInit     Find starting values for the exponential parameters.
exp_prob    Defines an exponential fitting type of problem, with data series (t, y). The file includes data from several different empirical test series.
Helax_prob  Defines medical research problems supplied by Helax AB, Uppsala, where an exponential model is fitted to data. The actual data series (t, y) are stored in one large file each and are not distributed. A sample of five similar files is part of exp_prob.
exp_r       Compute the residual vector r_i(x), i = 1, ..., m, x ∈ R^n.
exp_J       Compute the Jacobian matrix of elements ∂r_i/∂x_j, i = 1, ..., m, j = 1, ..., n.
exp_d2r     Compute the second part of the second derivative for the nonlinear least squares exponential fitting problem.
exp_c       Compute the constraints on the exponential parameters.
exp_dc      Compute the matrix of constraint normals for the constrained exponential model fitting problem.
exp_d2c     Compute the second part of the second derivative matrix of the Lagrangian function for the constrained exponential model fitting problem. This is a zero matrix, because the constraints are linear.
exp_q       Find starting values for the exponential parameters.
exp_p       Find the optimal number of exponential terms, p.

The table below describes the low level routines and the initialization routines needed for the predefined constrained nonlinear least squares (cls) test problems. Similar routines are needed for the nonlinear least squares (ls) test problems (no constraint routines needed).

A further table, shown below, describes the low level test functions and the corresponding problem setup routines needed for the unconstrained and constrained optimization problems from the CUTE data base.

Table: Constrained nonlinear least squares (cls) test problems

Function   Description
cls_prob   Initialization of cls test problems.
cls_r      Compute the residual vector r_i(x), i = 1, ..., m, x ∈ R^n, for cls test problems.
cls_J      Compute the Jacobian matrix J_ij(x) = ∂r_i/∂x_j, i = 1, ..., m, j = 1, ..., n, for cls test problems.
cls_c      Compute the vector of constraint functions c(x) for cls test problems.
cls_dc     Compute the matrix of constraint normals ∂c(x)/∂x for cls test problems.
cls_d2c    Compute the second part of the second derivative of the Lagrangian function for cls test problems.
ls_f       General routine to compute the objective function value f(x) = (1/2) r(x)^T r(x) for nonlinear least squares types of problems.
ls_g       General routine to compute the gradient g(x) = J(x)^T r(x) for nonlinear least squares types of problems.
ls_H       General routine to compute the Hessian approximation H(x) ≈ J(x)^T J(x) for nonlinear least squares types of problems.

There are several options in the menu programs to display graphical information for the selected problem. For two-dimensional nonlinear unconstrained problems, the menu programs support graphical display of the selected optimization problem as mesh or contour plots. On the contour plot, the iteration steps are displayed. For higher-dimensional problems, iteration steps are displayed in two-dimensional subspaces. Special plots for nonlinear least squares problems, plotting the model against the data, are also available, as well as plots of the line search problem, plots of circles approximating points in the plane for the Circle Fitting Problem, etc.

Table: Test problems from the CUTE data base

Function   Description
ctools     Interface routine to constrained CUTE test problems.
utools     Interface routine to unconstrained CUTE test problems.
cto_prob   Initialization of constrained CUTE test problems.
ctl_prob   Initialization of large constrained CUTE test problems.
cto_f      Compute the objective function f(x) for constrained CUTE test problems.
cto_g      Compute the gradient g(x) for constrained CUTE test problems.
cto_H      Compute the Hessian H(x) of f(x) for constrained CUTE test problems.
cto_c      Compute the vector of constraint functions c(x) for constrained CUTE test problems.
cto_dc     Compute the matrix of constraint normals for constrained CUTE test problems.
cto_d2c    Compute the second part of the second derivative of the Lagrangian function for constrained CUTE test problems.
uto_prob   Initialization of unconstrained CUTE test problems.
utl_prob   Initialization of large unconstrained CUTE test problems.
uto_f      Compute the objective function f(x) for unconstrained CUTE test problems.
uto_g      Compute the gradient g(x) for unconstrained CUTE test problems.
uto_H      Compute the Hessian H(x) of f(x) for unconstrained CUTE test problems.

The Menu Systems

This section gives a brief description of the options when running the menu routines ucOpt, glbOpt, glcOpt, qpOpt, conOpt, lsOpt and clsOpt. The Graphical User Interface (GUI) has the same functionality as the menu programs, and is presented in detail elsewhere. The following is a list of most of the menu choices:

- Name of the problem setup file, and the problem to be solved.
- Should the problem be solved using default parameters, or should problem dependent questions be asked?
- Optimization algorithm and solver.
- Optimization solver submethod.
- Set optimization parameters of the following types:
  - Choice of whether to use automatic differentiation.
  - Method for how to approximate derivatives.
  - The line search accuracy.
  - The maximal number of iterations.
  - Starting values and lower and upper bounds for the unknown variables x.
  - Choice of whether to use quadratic or cubic interpolation in the line search algorithm.
  - A best guess of the lower bound on the optimal objective function value, used by the line search algorithm.
  - The tolerance on the convergence for the iterative sequence of the variables x; a convergence tolerance on the objective function value f(x), on the norm of the gradient vector g(x), and on the norm of the directed derivative p^T g(x), where p = x_{k+1} - x_k.
  - The maximal violation for the bounds and the linear and nonlinear constraints.
  - The rank test tolerance, which determines the pseudo rank used in the subspace minimization technique. The subspace minimization technique is part of the determination of the search direction in some of the NLPLIB TB internal solvers.
  - Choice of whether to use a separable nonlinear least squares algorithm or not.
  - Print levels, and pause/no pause after each iteration.
- Optimize: start an optimization with the selected optimization solver.
- Re-optimize, with the latest solution as starting value. Useful to determine if the optimal point is really found.
- Plot options:
  - Draw a contour plot of f(x), and also draw the search directions p_i. Mark the line search step length trials for each search direction.
  - Draw a mesh plot of f(x).
  - Plot the data against the starting model and the fitted model, for parameter estimation problems.
  - Draw other types of graphics, e.g. the objective function value for each iteration, or the estimated linear convergence rate for each iteration.

Every parameter has an initial default value. The user selects new values or simply uses the default values.

One of the menu options is to draw a contour plot of f(x) together with the search steps. On each search step there are marks for each trial value where the line search algorithm had to evaluate the objective function. It is possible to follow the full iterative sequence on two-dimensional problems. We have run the prototype unconstrained solver ucSolve using two different methods. In the first figure below, the result of optimizing the classical Rosenbrock banana function using Newton's method is displayed. There are a few steps where the line search has shortened the step. In contrast to this, see the behavior of the Fletcher-Reeves conjugate-gradient method in the second figure. This method, not using second derivative information, has a much more chaotic path to the solution. Such graphs can be illustrative for students in a first course in optimization.

[Figure: Rosenbrock's banana function, with search directions and marks for the line search trials, running ucSolve using Newton's method.]

[Figure: Rosenbrock's banana function, with search directions and marks for the line search trials, running ucSolve using the Fletcher-Reeves conjugate-gradient method.]

Examples and Applications

In this section we will present results showing that the NLPLIB TB routines perform well on standard test problems, as well as on real life applications. For nonlinear least squares problems and exponential sum model fitting problems, comparisons between the NLPLIB TB solvers and commercial solvers are made.

Nonlinear Least Squares Problems

The NLPLIB TB nonlinear least squares solvers lsSolve and clsSolve are used in several of our research projects, e.g. estimation of formation constants in chemical equilibrium analysis, analysis of plasmid stability in fermentation processes, and fitting of exponential sums to empirical data in radiotherapy planning. Examples of some exponential sum model fitting problems will be given in the next subsection.

Here we will present results from a comparison of our routines to NLSSOL, NPOPT and the Optimization Toolbox routine leastsq. By our routines we mean clsSolve with the ordinary Gauss-Newton method (GN) and clsSolve with the Fletcher-Xu hybrid method (FX). A standard test set for nonlinear least squares problems is that of Moré, Garbow and Hillström. The solvers are applied to the problems of this test set, and the results are presented in the table below. Each entry consists of three integers, which give the number of iterations, residual evaluations and Jacobian evaluations required to solve the problem. A star indicates that the maximum number of allowed iterations is reached, and a bar indicates that the optimal function value has not been reached at all. The routine leastsq takes as input the maximum number of allowed function evaluations instead of iterations, so this limit may be reached before the iteration limit, and the average values could therefore be a bit misleading. This test is promising, because it shows that our routines perform even better than the commercial solvers, but we must keep in mind that this is not a full evaluation.

For a comparison of clsSolve and other solvers on linearly constrained nonlinear least squares problems, see the thesis of Björkman.

Exponential Sum Fitting Problems

In earlier work, algorithms for fitting exponential sums

    D(r) = Σ_{i=1}^{p} a_i exp(-b_i r)

to numerical data are presented and compared for efficiency and robustness. The numerical examples stem from parameter estimation in dose calculation for radiotherapy planning. The doses are simulated by emitting an ionizing photon beam into water, and measuring the absorption at different depths and at different radii from the beam center. The absorbed dose is normally separated into primary dose, from particles excited by the photon beam, and scattered dose, from the subsequent particle interactions.
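The separable formulation mentioned earlier exploits the structure of this model: for fixed rates b, the optimal weights a solve a *linear* least squares problem, leaving only b as nonlinear unknowns. A small Python sketch of that idea (our own helper name, not a toolbox routine):

```python
import numpy as np

def separable_residual(b, t, y):
    """Residual of the exponential sum model with the linear weights
    eliminated: for fixed rates b, build the basis matrix
    Phi[k, i] = exp(-b_i * t_k), solve for the weights a by linear
    least squares, and return the resulting residual and weights."""
    Phi = np.exp(-np.outer(t, b))
    a = np.linalg.lstsq(Phi, y, rcond=None)[0]
    return Phi @ a - y, a

# Data from a two-term sum with weights a = (2, 1) and rates b = (0.5, 3).
t = np.linspace(0.0, 4.0, 40)
y = 2.0 * np.exp(-0.5 * t) + 1.0 * np.exp(-3.0 * t)
res, a = separable_residual(np.array([0.5, 3.0]), t, y)
print(a)   # recovers the weights -> approximately [2. 1.]
```

A nonlinear least squares solver then only has to search over the p rates instead of all 2p parameters, which is one reason separable algorithms need far fewer iterations on these problems.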

In the table below we present results from a comparison of the same type as presented in the previous subsection, for the Helax problems described above. The table entries consist of three integers, which give the numbers of iterations, residual evaluations and Jacobian evaluations required to solve the problem. We restrict the detailed presentation to the first fifteen and the last two problems, but the average values and the number of failures are based on all the problems. The dagger indicates that the separable nonlinear least squares algorithm (algorithm II) is run. The efficiency of the clsSolve Fletcher-Xu method with the separable algorithm is obvious: only a small fraction of the number of iterations, residual evaluations and Jacobian evaluations for the best of the other solvers is required to solve the problems.

Worth mentioning is that NLPLIB TB includes very well performing routines for computing starting values for exponential sum model fitting problems; see the thesis by Petersson. This is extremely important when solving problems in real life applications, and these good initial values are used in the test above.

Global Optimization Problems

As mentioned earlier, NLPLIB TB also includes routines for solving global optimization problems. The box-bounded constrained version and the general mixed-integer constrained version of the DIRECT algorithm are implemented in glbSolve and glcSolve, respectively. In the table below we show how glbSolve performs on the seven standard test functions from Dixon and Szegő and two test functions from Yao. The table entries give the number of function evaluations needed for convergence. Here, convergence is defined in terms of the percent error relative to the known optimal function value. Let f_global denote the known optimal function value and let f_min denote the best

The TOMLAB NLPLIB Toolb ox for Nonlinear Programming

Table: Comparison of algorithmic performance for the Moré, Garbow, and Hillstrom test set for nonlinear least squares problems. For each MGH test problem, the entries report iterations, residual evaluations, and Jacobian evaluations for clsSolve (GN), clsSolve (FX), leastsq (LM), leastsq (GN), NLSSOL, and NPOPT, followed by Average and Failures rows.


Table: Comparison of algorithmic performance for the Helax exponential sum fitting problems. The y indicates the use of a separable nonlinear least squares algorithm. For each Helax XP problem, the entries report iterations, residual evaluations, and Jacobian evaluations for clsSolve (GN), clsSolve (FX) y, leastsq (LM), leastsq (GN), NLSSOL, and NPOPT, followed by Average and Failures rows.

function value found at some point in the search; then the percent error is defined by

    E = 100 (f_min - f_global) / |f_global|.
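As a concrete reading of this criterion, the helper below (hypothetical, not part of glbSolve) computes E and tests it against a tolerance:

```python
def percent_error(f_min, f_global):
    # E = 100 * (f_min - f_global) / |f_global|, assuming f_global != 0
    return 100.0 * (f_min - f_global) / abs(f_global)

def converged(f_min, f_global, tol):
    # Convergence is declared when the percent error drops below tol
    return percent_error(f_min, f_global) < tol
```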

Table: Number of function evaluations needed by glbSolve for convergence on the Dixon-Szegő and Yao test functions, for two levels of the percent error E. Test functions: three Shekel variants, two Hartman variants, Branin, Goldstein-Price, Six-hump camel, and Shubert.

The test function Shekel's Foxholes from the First International Contest on Evolutionary Optimization (ICEO), in two dimensions, illustrates how effective glbSolve can be on problems with several local, non-global minima. A contour plot of the function, with dots indicating points where the function value has been computed, is shown in the figure below. After a limited number of iterations and function evaluations, glbSolve has found its best function value at a point x close to the global minimum. As the contour plot shows, most of the sampled points are concentrated


Figure: Contour plot of the Shekel's Foxholes function (n = 2) with sampled points, using glbSolve. Both axes, x1 and x2, range from 0 to 10.

in the area around the global minimum, in the upper right corner of the bounding box, and very few function evaluations are wasted on sampling around the many local minima.

Conclusions

NLPLIB TB is a powerful tool for applied optimization research and algorithmic development in nonlinear programming. NLPLIB TB is also suitable for computer-based learning of optimization and for computer exercises.

NLPLIB TB, together with TOMLAB, is a flexible tool with a graphical user interface, menu programs, and multi-solver driver routines. The possibility to very easily use both automatic and numerical differentiation makes TOMLAB especially useful in practical applications where derivative information may be hard to obtain. The global optimization routines are very suitable for the common case of parameter estimation in simulation models. The robust constrained nonlinear least squares routines are efficient tools for the solution of applied parameter estimation problems. With the open design and the interfaces to the Fortran solvers, CUTE, and AMPL, there are many possible ways to utilize the system.

Acknowledgements

We would like to thank Prof. Philip E. Gill for giving us access to the commercial codes NPSOL, NPOPT, NLSSOL, LPOPT, QPOPT, and LSSOL. We are also grateful to Prof. Michael Saunders for sending us the MINOS distribution. Prof. David M. Gay helped us make the AMPL interface work. We would also like to thank Prof. Yangquan Chen for much help in making the CUTE interface work on PC systems. Many others around the world have given us interesting suggestions, found bugs, and discussed the system, and we hereby thank all people who have contributed.

Thanks to Dr. Donald R. Jones, who with the DIRECT and EGO algorithms has inspired us to include routines for global optimization problems. He has furthermore been very helpful in discussions on details of the algorithms.


The graduate students in the Applied Optimization Group (TOM) have made important contributions. Jöran Petersson has made important contributions to the routines for the exponential fitting problem as part of the exponential fitting project. Erik Dotzauer developed the first version of the solver nlpSolve. Many students have also contributed bits and pieces, which we hereby acknowledge.

References

M. Al-Baali and R. Fletcher, Variational methods for nonlinear least squares, J. Oper. Res. Soc.

M. Björkman, Nonlinear Least Squares with Inequality Constraints, Bachelor Thesis, Department of Mathematics and Physics, Mälardalen University, Sweden.

I. Bongartz, A. R. Conn, N. Gould, and P. L. Toint, CUTE: Constrained and Unconstrained Testing Environment, tech. rep., IBM T. J. Watson Research Center, Yorktown Heights, NY.

I. Bongartz, A. R. Conn, N. I. M. Gould, and P. L. Toint, CUTE: Constrained and Unconstrained Testing Environment, ACM Transactions on Mathematical Software.

M. A. Branch and A. Grace, Optimization Toolbox User's Guide, The MathWorks, Inc., Prime Park Way, Natick, MA.

A. R. Conn, N. Gould, A. Sartenaer, and P. L. Toint, Convergence properties of minimization algorithms for convex constraints using a structured trust region, SIAM Journal on Scientific and Statistical Computing.

L. C. W. Dixon and G. P. Szegő, The global optimisation problem: An introduction, in Toward Global Optimization, L. Dixon and G. Szegő, eds., North-Holland Publishing Company, New York.

E. Dotzauer and K. Holmström, The TOMLAB Graphical User Interface for Nonlinear Programming, Advanced Modeling and Optimization.

S. I. Feldman, D. M. Gay, M. W. Maimone, and N. L. Schryer, A Fortran-to-C converter, Computing Science Technical Report, AT&T Bell Laboratories.

R. Fletcher, Practical Methods of Optimization, John Wiley and Sons, New York, 2nd ed.

R. Fletcher and S. Leyffer, Nonlinear programming without a penalty function, Tech. Rep. NA, University of Dundee.

R. Fletcher and C. Xu, Hybrid methods for nonlinear least squares, IMA Journal of Numerical Analysis.

P. E. Gill, W. Murray, M. A. Saunders, and M. H. Wright, User's Guide for NPSOL: A Fortran package for nonlinear programming, Report SOL, Department of Operations Research, Stanford University, Stanford, CA.

P. E. Gill, W. Murray, and M. H. Wright, Practical Optimization, Academic Press, London.

J. Gondzio, Presolve analysis of linear programs prior to applying an interior point method, INFORMS Journal on Computing.


K. Holmström, TOMLAB: A General Purpose Open Matlab Environment for Research and Teaching in Optimization, Technical Report IMa-TOM, Department of Mathematics and Physics, Mälardalen University, Sweden. Presented at the International Symposium on Mathematical Programming, Lausanne, Switzerland.

K. Holmström, TOMLAB: An Environment for Solving Optimization Problems in Matlab, in Proceedings for the Nordic Matlab Conference, M. Olsson, ed., Stockholm, Sweden, Computer Solutions Europe AB.

K. Holmström, TOMLAB: An Optimization Development Environment in Matlab, Annals of Operations Research, Modeling Languages and Approaches. Submitted.

K. Holmström, The TOMLAB Optimization Environment in Matlab, Advanced Modeling and Optimization.

K. Holmström, A. Ahnesjö, and J. Petersson, Algorithms for exponential sum fitting in radiotherapy planning. To be submitted.

K. Holmström, M. Björkman, and E. Dotzauer, The TOMLAB OPERA Toolbox for Linear and Discrete Optimization, Advanced Modeling and Optimization.

K. Holmström, M. Björkman, and E. Dotzauer, TOMLAB User's Guide, Technical Report IMa-TOM, Department of Mathematics and Physics, Mälardalen University, Sweden.

J. Huschens, On the use of product structure in secant methods for nonlinear least squares problems, SIAM Journal on Optimization.

D. R. Jones, DIRECT, in Encyclopedia of Optimization. To be published.

D. R. Jones, C. D. Perttunen, and B. E. Stuckman, Lipschitzian optimization without the Lipschitz constant, Journal of Optimization Theory and Applications.

D. R. Jones, M. Schonlau, and W. J. Welch, Efficient global optimization of expensive black-box functions, Journal of Global Optimization.

P. Lindström, Algorithms for Nonlinear Least Squares, Particularly Problems with Constraints, PhD thesis, Inst. of Information Processing, University of Umeå, Sweden.

D. G. Luenberger, Linear and Nonlinear Programming, Addison-Wesley Publishing Company, Reading, Massachusetts, 2nd ed.

J. J. Moré, B. S. Garbow, and K. E. Hillstrom, Testing unconstrained optimization software, ACM Trans. Math. Software.

B. A. Murtagh and M. A. Saunders, MINOS User's Guide, Technical Report SOL, Systems Optimization Laboratory, Department of Operations Research, Stanford University, Stanford, California.

J. Petersson, Algorithms for Fitting Two Classes of Exponential Sums to Empirical Data, Licentiate Thesis, Division of Optimization and Systems Theory, Royal Institute of Technology, Stockholm, and Mälardalen University, Sweden.

A. Ruhe and P.-Å. Wedin, Algorithms for Separable Nonlinear Least Squares Problems, SIAM Review.

A. Sartenaer, Automatic determination of an initial trust region in nonlinear programming, Technical Report, Department of Mathematics, Facultés Universitaires N.D. de la Paix, Bruxelles, Belgium.


K. Schittkowski, On the Convergence of a Sequential Quadratic Programming Method with an Augmented Lagrangian Line Search Function, technical report, Systems Optimization Laboratory, Stanford University, Stanford, CA.

W. Squire and G. Trapp, Using complex variables to estimate derivatives of real functions, SIAM Review.

Y. Yao, Dynamic tunneling algorithm for global optimization, IEEE Transactions on Systems, Man and Cybernetics.