Parallel Algebraic Modeling for Stochastic Optimization


Joey Huchette and Miles Lubin
Operations Research Center
Massachusetts Institute of Technology
Cambridge, MA 02139
Email: [email protected], [email protected]

Cosmin G. Petra
Mathematics and Computer Science Division
Argonne National Laboratory
9700 S. Cass Avenue, Argonne, IL 60439
Email: [email protected]

Preprint number ANL/MCS-P5181-0814

Abstract—We present scalable algebraic modeling software, StochJuMP, for stochastic optimization as applied to power grid economic dispatch. It enables the user to express the problem in a high-level algebraic format with minimal boilerplate. StochJuMP allows efficient parallel model instantiation across nodes and efficient data localization. Computational results are presented showing that the model construction is efficient, requiring less than one percent of solve time. StochJuMP is configured with the parallel interior-point solver PIPS-IPM but is sufficiently generic to allow straightforward adaptation to other solvers.

Keywords—optimization, parallel programming, high performance computing, mathematical model, power system modeling, scalability

I. INTRODUCTION

Algebraic modeling languages (AMLs) for optimization are widely used by both academics and practitioners for specifying optimization problems in a human-readable, mathematical form, which is then automatically translated by the AML to the low-level formats required by efficient implementations of optimization algorithms, known as solvers. For example, a classical family of optimization problems is linear programming (LP), that is, minimization of a linear function of decision variables subject to linear equality and inequality constraints. As input, LP solvers expect vectors c, l, u, L, and U and a sparse constraint matrix A defining the LP problem

$$
\begin{aligned}
\min_{x} \quad & c^T x \\
\text{s.t.} \quad & L \le A x \le U, \\
& l \le x \le u,
\end{aligned}
$$

where inequalities are componentwise.

In practice, it is tedious and error-prone to explicitly form a sparse constraint matrix and corresponding input vectors; this process often involves concatenating different classes of decision variables, each indexed over multidimensional symbolic sets or numeric ranges, into a single x decision vector. Instead, AMLs, which may be considered domain-specific languages in the vocabulary of computer science, handle these transformations automatically.

Notable AMLs include AMPL [1], GAMS [2], YALMIP [3], Pyomo [4], and CVX [5]. While commercial AMLs (AMPL and GAMS) offer standalone environments, open-source AMLs are typically embedded in high-level dynamic languages; CVX and YALMIP are both toolboxes for MATLAB, and Pyomo is a package for Python. Indeed, dynamic languages provide convenient platforms for AMLs, and domain-specific languages in general, because they make development easier (no need for a customized parser) and promote ease of use by being accessible from a language that is already familiar to users and has its own tools for processing and formatting data.

Historically, speed has been a trade-off for using AMLs embedded in high-level dynamic languages, primarily because of their extensive use of operator overloading within an interpreted language. These AMLs may be orders of magnitude slower at generating the optimization model before passing it to a solver; in some reasonable use cases, this model generation time may perversely exceed the time spent in the solver.

The Julia programming language [6] presented an opportunity to address this performance gap. Lubin and Dunning developed JuMP [7], an AML embedded in Julia. By exploiting Julia's metaprogramming and just-in-time compilation functionality, they achieved performance competitive with that of commercial AMLs (AMPL and Gurobi/C++) and an improvement of one to two orders of magnitude over open-source AMLs (Pyomo and PuLP [8]) in model generation time for a benchmark of LP problems.

In this paper, we present an extension to JuMP, called StochJuMP, for modeling a class of stochastic optimization problems, namely, two-stage stochastic optimization with recourse, a popular paradigm for optimization under uncertainty [9]. These problems are computationally challenging and at the largest scale require the use of specialized solvers employing distributed-memory parallel computing. These solvers require structural information not provided by standard AMLs, and because of memory limits, it may not be feasible to build up the entire instance in serial and then distribute the pieces. Instead, StochJuMP generates the model in parallel according to the data distribution across processes required by the specialized solvers. We present numerical results demonstrating that StochJuMP can effectively generate these structured models in a small fraction (≈ 1.5%) of the solution time. To our knowledge, these results are among the first reports of employing Julia on a moderately sized MPI-based cluster.

II. BLOCK-ANGULAR STRUCTURE

The structured optimization problems considered in this paper are known in the optimization community as dual block angular problems, which means that they have a separable objective function and block angular, or half-arrow shaped, constraints. In particular, we consider dual block angular problems with quadratic objective function and affine constraints, which can be mathematically expressed as

$$
\begin{aligned}
\min \quad & \tfrac{1}{2} x_0^T Q_0 x_0 + c_0^T x_0 + \sum_{i=1}^{N} \left( \tfrac{1}{2} x_i^T Q_i x_i + c_i^T x_i \right) \\
\text{s.t.} \quad & A x_0 = b_0, \\
& T_1 x_0 + W_1 x_1 = b_1, \\
& T_2 x_0 + W_2 x_2 = b_2, \\
& \;\;\vdots \\
& T_N x_0 + W_N x_N = b_N, \\
& x_0 \ge 0, \; x_1 \ge 0, \; x_2 \ge 0, \; \dots, \; x_N \ge 0.
\end{aligned} \tag{1}
$$

The decision variables are the vectors x_i, i = 0, 1, ..., N. The data of the problem consist of the coefficient vectors c_i, right-hand sides b_i, symmetric matrices Q_i, and rectangular matrices A, T_i, W_i (i = 1, ..., N).

Problems of the form (1) arise in stochastic optimization with recourse. Stochastic programming problems with recourse (in (2) we show two-stage quadratic problems [10]) provide optimal decisions to be made now that minimize the expected cost in the presence of future uncertain conditions:

$$
\begin{aligned}
\min_{x_0} \quad & \tfrac{1}{2} x_0^T Q_0 x_0 + c_0^T x_0 + \mathbb{E}_\xi[G(x_0, \xi)] \\
\text{s.t.} \quad & A x_0 = b_0, \quad x_0 \ge 0.
\end{aligned} \tag{2}
$$

The recourse function G(x_0, ξ) is defined by

$$
\begin{aligned}
G(x_0, \xi) = \min_{y} \quad & \tfrac{1}{2} y^T Q_\xi y + c_\xi^T y \\
\text{s.t.} \quad & T_\xi x_0 + W_\xi y = b_\xi, \quad y \ge 0,
\end{aligned} \tag{3}
$$

and the expected value E[·], which is assumed to be well defined, is taken with respect to the random variable ξ, which contains the data (Q_ξ, c_ξ, T_ξ, W_ξ, b_ξ). Some of the elements in ξ can be deterministic. The matrix Q_ξ is assumed to be symmetric for all possible ξ. W_ξ, the recourse matrix, and T_ξ, the technology matrix, are random matrices. The symmetric matrix Q_0, the matrix A, and the linear coefficients c_0 are deterministic. The variable x_0 is called the first-stage decision, which is a decision to be made now. The second-stage decision y is a recourse or corrective decision that one makes in the future after some random event occurs.

The recourse problems (2) take the form of the dual block angular problems (1) when the probability distribution of ξ has finite support. When ξ does not have finite support, problems (1) arise from the use of the sample average approximation (SAA) method, in which a sample (ξ_1, ξ_2, ..., ξ_N) is used to estimate $\mathbb{E}_\xi[G(x_0, \xi)] \approx \frac{1}{N} \sum_{i=1}^{N} G(x_0, \xi_i)$ and the minimization and expectation operators are commuted [11]. The matrices Q_i, i = 0, 1, ..., N, are required to be positive semidefinite. In addition, the interior-point algorithm used by the solver requires the matrices A, W_1, ..., W_N to have full row rank.

III. PIPS-IPM

Solving practically sized instances of (1) requires the use of memory-distributed computing environments and specialized optimization solvers capable of efficiently exploiting the dual block angular structure. One such solver, PIPS-IPM, developed in the past few years by some of the authors of this paper, achieves data parallelism inside the numerical Mehrotra predictor-corrector optimization algorithm [12] by distributing the data and computations required by the optimization across computational nodes. The data distribution is done by partitioning the scenarios and assigning a partition to a computational node. The first-stage data are replicated on all nodes to avoid communication overhead. The computations follow a similar distribution scheme: the intrascenario computations (factorizations of sparse matrices, solves with the factors, matrix-matrix and matrix-vector products, and vector-vector operations) are performed in parallel, independently for each scenario, while the computations corresponding to the first stage (factorizations of dense matrices, solves with the factors, and matrix-vector and vector-vector operations) are replicated across all computational nodes [13].

Several algorithmic and implementation developments were targeted at reducing the time to solution and increasing the parallel efficiency of PIPS-IPM. In particular, the augmented incomplete factorization approach [14] considerably reduced the time to solution by making better use of the cores inside each node and by fully exploiting the sparsity of the scenario data. Additional implementation developments, such as hardware-tuned communication and the adoption of GPUs in the dense first-stage calculations [15], enabled real-time solutions of stochastic economic dispatch problems with very good parallel scaling on a variety of HPC platforms: Cray XK7, Cray XC30, IBM BG/P, and IBM BG/Q [15]. However, the algebraic specification and instantiation of the economic dispatch problems were done serially and required considerably longer time than the time spent in the optimization process. Removing this capability limitation and being able to process and instantiate the optimization models in times that are negligible, that is, a few percent of total solution time, are the main motivations behind the present work.

IV. STOCHJUMP

StochJuMP is an extension built on top of the JuMP algebraic modeling language [7]. Extending AMLs to deal with structured problems such as multistage stochastic optimization presents additional challenges for expressing problem structure, both technically and conceptually.
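As a concrete illustration of the low-level solver input format discussed in the introduction (the vectors c, l, u, L, U and a sparse matrix A that an LP solver consumes), the sketch below feeds such data directly to SciPy's `linprog` as a stand-in solver. This is not code from the paper; the problem data are invented, and the splitting of the two-sided row bounds into one-sided inequalities is exactly the kind of bookkeeping an AML automates.

```python
# A tiny LP in raw solver-input form:  min c^T x  s.t.  L <= Ax <= U,  l <= x <= u.
import numpy as np
from scipy.optimize import linprog
from scipy.sparse import csr_matrix

c = np.array([1.0, 2.0])            # objective coefficients
A = csr_matrix([[1.0, 1.0]])        # sparse constraint matrix (one row)
L, U = np.array([1.0]), np.array([3.0])   # row bounds: 1 <= x1 + x2 <= 3
l, u = np.zeros(2), np.full(2, 10.0)      # variable bounds: 0 <= x <= 10

# linprog accepts one-sided rows only, so L <= Ax <= U is rewritten as
# Ax <= U together with -Ax <= -L.
res = linprog(
    c,
    A_ub=np.vstack([A.toarray(), -A.toarray()]),
    b_ub=np.concatenate([U, -L]),
    bounds=list(zip(l, u)),
    method="highs",
)
print(res.x)  # optimal solution: all weight on the cheaper variable x1
```

Assembling these arrays by hand for anything beyond a toy model is precisely the tedious, error-prone process the paper describes AMLs as eliminating.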
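The half-arrow shape of the constraints in (1) can be seen by assembling the block matrix explicitly. The following sketch (illustrative only; the block sizes and random block data are invented for display, and the code is not from the paper) builds the full constraint matrix for N scenarios from the blocks A, T_i, and W_i:

```python
# Assemble the dual block angular ("half-arrow") constraint matrix of (1):
#   row 0:  [ A    0    0   ...  0  ]
#   row i:  [ T_i  0 .. W_i ..   0  ]   (W_i on the i-th diagonal slot)
import numpy as np
import scipy.sparse as sp

N = 3                          # number of scenarios (small, for display)
n0, ni, m0, mi = 2, 2, 1, 2    # first-/second-stage variable and row counts

A = sp.random(m0, n0, density=1.0, random_state=0)
T = [sp.random(mi, n0, density=1.0, random_state=i + 1) for i in range(N)]
W = [sp.identity(mi) for _ in range(N)]   # recourse blocks on the diagonal

blocks = [[A] + [None] * N]               # None marks an all-zero block
for i in range(N):
    row = [T[i]] + [None] * N
    row[i + 1] = W[i]
    blocks.append(row)
K = sp.bmat(blocks, format="csr")
print(K.shape)   # (m0 + N*mi, n0 + N*ni)
```

The `None` blocks are what make the structure exploitable: scenario i couples only x_0 and x_i, which is what allows PIPS-IPM to distribute scenarios across nodes.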
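The SAA estimate used in Section II can be sketched in a few lines. In this hedged illustration (not the paper's model), G is a toy one-dimensional recourse cost, the cheapest nonnegative corrective action y covering the shortfall ξ - x_0, and the expectation over ξ is replaced by an average over a finite sample:

```python
# Sample average approximation:  E_xi[G(x0, xi)]  ≈  (1/N) * sum_i G(x0, xi_i)
import numpy as np

def G(x0, xi):
    """Toy recourse cost: cheapest y >= 0 covering the shortfall xi - x0."""
    return max(xi - x0, 0.0)

rng = np.random.default_rng(42)
x0 = 1.0
N = 100_000
sample = rng.normal(loc=1.0, scale=1.0, size=N)   # xi ~ Normal(1, 1)

# Vectorized form of averaging G(x0, xi_i) over the sample
saa_estimate = np.maximum(sample - x0, 0.0).mean()
print(saa_estimate)
```

For this toy choice the true value E[max(ξ - x_0, 0)] equals 1/sqrt(2π) ≈ 0.399, so the printed estimate should land close to it; each sampled ξ_i plays the role of one scenario block in problem (1).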