Solving Well-Structured MINLP Problems

Total Pages: 16

File Type: pdf, Size: 1020 KB

Solving Well-Structured MINLP Problems

UNIVERSITÉ PARIS 13, LABORATOIRE D'INFORMATIQUE DE PARIS NORD
HABILITATION À DIRIGER DES RECHERCHES, spécialité SCIENCES (Informatique)
par Claudia D'Ambrosio

Solving well-structured MINLP problems

Jury:
Emilio Carrizosa, Universidad de Sevilla (Rapporteur)
Sourour Elloumi, École Nationale Supérieure de Techniques Avancées (Examinateur)
Ignacio Grossmann, Carnegie Mellon University (Rapporteur)
Jean-Bernard Lasserre, CNRS-LAAS (Examinateur)
Ruth Misener, Imperial College London (Rapporteur)
Frédéric Roupin, Université Paris 13 (Parrain)
Roberto Wolfler Calvo, Université Paris 13 (Parrain)

Contents

Acknowledgments
Keywords
Preface

Part I: Curriculum Vitae
Part II: Research Summary

1 Introduction to MINLP Problems and Methods
  1.1 Modeling problems by Mathematical Programming
    1.1.1 Reformulations
    1.1.2 Relaxations
    1.1.3 Approximations
    1.1.4 Solution methods
  1.2 Mixed Integer Linear Programming Technology
  1.3 Non Linear Programming Technology
  1.4 Mixed Integer Non Linear Programming Technology
2 Exact Methods for MINLP Problems
  2.1 Methods based on strengthening of classic relaxations
    2.1.1 Euclidean Steiner Tree
    2.1.2 The pooling problem and its variant with binary variables
  2.2 On Separable Functions
    2.2.1 SC-MINLP
    2.2.2 Polyopt
3 Heuristic Methods
  3.1 General methods
    3.1.1 Feasibility Pump
    3.1.2 Optimistic approximation
  3.2 Tailored methods
    3.2.1 Non linear Knapsack Problem
4 Other Contributions
  4.1 Feasibility issues on Hydro Unit Commitment
  4.2 MILP models for the selection of a small set of well-distributed points
  4.3 Well placement in reservoir engineering
  4.4 A heuristic for multiobjective convex MINLPs
  4.5 Bilevel programming and smart grids
  4.6 Feasibility Pump for aircraft deconfliction problems
  4.7 On the Product Knapsack Problem
  4.8 Learning to Solve Hydro Unit Commitment Problems in France
5 Conclusions and Future Research

Bibliography

A Selected Publications

Acknowledgments

The road that brought me here was long. The more I think about when my passion for Optimization started, the more I am convinced that it dates back to when I was a third-year bachelor student at the University of Bologna. One of the optional classes I chose and attended was Practical Tools for Optimization, taught by Professor Andrea Lodi: besides a natural inclination for the topic, Andrea's enthusiasm was so captivating that for my Master's I chose all the classes on Optimization that were offered and a Master's thesis on mixed integer non linear optimization, a topic I immediately became addicted to. In 2006 I started my Ph.D. at the University of Bologna, where the O.R. group welcomed me warmly from the very beginning. I deeply thank all the members of this wonderful group, with a special thought for Andrea who, as Ph.D. advisor, was and is an inspiration to me. Back in 2007 I had the chance to spend my summer working as an intern at the IBM T.J. Watson Research Center: Jon Lee and Andreas Wächter were a great manager and tutor, respectively, and all the colleagues at IBM helped to make it a great experience. Then in 2010 I got a post-doc position at the University of Wisconsin-Madison and had the chance to be supervised by Jeff Linderoth and Jim Luedtke. I learned a lot in six months, and the festive environment made me enjoy my Mid-West experience very much.
Thanks also to all the students of the group, who were lovely officemates. In October 2011 I moved to France to join the CNRS family and, in particular, I was affiliated to the SYSMO team at LIX, École Polytechnique. Among the SYSMO members, I would like to thank, in particular, Leo Liberti, who is a precious colleague and friend; Evelyne Rayssac, who has helped me a lot since the very first moment to survive the crazy bureaucracy; the former and current Ph.D. students Claire Lizon, Youcef Sahraoui, Luca Mencarelli, Vy Khac Ky, Gustavo Dias, Oliver Wang, Kostas Tavlaridis-Gyparakis, and Gabriele Iommazzo; and the former and current post-docs Sonia Toubaline, Pierre-Louis Poirion, and Dimitri Thomopulos. Moreover, I would like to thank Fabio Roda and Sylvie Tonda, who have worked hard to convince industries of the importance of Optimization. Earlier this year the SYSMO team gave way to the DASCIM team, and I thank Michalis Vazirgiannis, who took the lead of such a challenging project. I hope we have several years of fruitful collaborations ahead. I started discussing this HDR with Roberto Wolfler-Calvo and Frédéric Roupin several years ago, maybe five years ago. They immediately accepted to be my tutors and were kind enough to wait five years to make this happen. Thanks a lot for your help and support; the fact that you did not put pressure on me to finish when I needed to focus on something else, but did when I needed to finalize the manuscript, was really precious to me. I warmly thank also the other members of the HDR jury, namely the "rapporteurs" Emilio Carrizosa, Ignacio Grossmann, and Ruth Misener, and the "examinateurs" Sourour Elloumi and Jean-Bernard Lasserre. It is really an honor to have them in my HDR jury.
I made a huge step forward with the writing of this HDR at the end of 2015, when I spent three months in Montréal. Andrea Lodi invited me to visit him and gave me the opportunity to work in the dynamic environment of GERAD and CIRRELT. I would like to thank him together with all the GERAD and CIRRELT members, in particular Miguel Anjos and Zhao Sun, fantastic collaborators, and Marta Pascoal, with whom I shared the office, lunches, and lovely times. I also would like to thank Sonia Vanier and the équipe SAMM of Université Paris 1, who were so kind to host me when I had to "hide" to focus on finalizing this manuscript: the nice desk with a view of Paris, the enjoyable coffee breaks with plenty of chocolate and laughs, and your hospitality helped me a lot to bring the writing of this manuscript to an end. During the last ten years I had the chance to meet and collaborate with several colleagues, each of whom taught me a lot and made me enjoy my work. I thank all the co-authors of the works presented in this manuscript and of all my on-going projects: it is also thanks to you that I am still passionate about research! Last but not least, I need to devote a special thought to my parents, my sister, my friends, and, in particular, to Alessandro, who supported me constantly and with a unique patience during the last nine years. I am really lucky and grateful to have such a generous and understanding husband.

Paris, 2018
Claudia D'Ambrosio

Keywords

Mixed integer non linear programming; non convex problems; well-structured problems; real-world applications; modeling.

Preface

This manuscript summarizes the main research topics I have been working on since my Ph.D. at the University of Bologna (Italy). The broader common topic is mixed integer non linear programming (MINLP) problems, i.e., optimization problems in which both continuous and integer decision variables are present and, in general, non linear functions define the objective to minimize and the constraints to satisfy.
In this context, I have been working both on theoretical aspects and real-world applications. All the research performed aims at solving a class of MINLP problems by exploring and exploiting the specific properties that characterize such a class. The manuscript is composed of two parts. Part I includes a complete curriculum vitæ and the most relevant publications. Part II is divided as follows. In Chapter 1, an introduction to MINLP problems, and to the classical methods to solve them and their most widely used subclasses, is presented. The research concerning exact methods for classes of MINLPs is described in Chapter 2. In particular, the chapter is composed of two main parts: i. methods based on the strengthening of classical relaxations and ii. methods based on non standard relaxations. In Chapter 3 we discuss heuristic methods for generic MINLP problems, i.e., Feasibility Pump and Optimistic Approximation for non convex MINLPs, and for specific problems like the non linear knapsack and its generalizations. The methods for non convex MINLP show special properties for subclasses of MINLPs. For example, the heuristics for the non linear knapsack problem exploit the fact that the only non linear functions, i.e., the profit and weight functions, are separable and non decreasing. Other contributions are summarized in Chapter 4. Finally, conclusions and future research are discussed in Chapter 5.

Part I: Curriculum Vitae

Curriculum Vitae
Claudia D'Ambrosio
June 11, 2018

CNRS Chargée de Recherche and Chargée d'Enseignement at:
LIX, Laboratoire d'Informatique de l'École Polytechnique
91128 Palaiseau CEDEX, France
Office: LIX 1066 (Bâtiment Alan Turing, first floor)
E-mail address: [email protected]
Office phone n.: +33.1.77.57.80.06

Personal Information and Education

From January 2006 to April 2009: PhD student in "Automatic Control and Operations Research" at the Dipartimento di Elettronica, Informatica e Sistemistica (DEIS), University of Bologna (Italy).
Thesis: "Application-oriented Mixed Integer Non-Linear Programming", advisor Prof. Andrea Lodi.

21st March 2005: Master's degree in Computer Science Engineering at the University of Bologna (Italy). Thesis: "Algorithms for the Water Network Design Problem", advisor Prof. Paolo Toth, co-advisors Prof. Andrea Lodi and Dr.
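The MINLP class described in the preface above (continuous and integer decision variables, with nonlinear objective and constraints) can be made concrete with a toy instance. A minimal, purely illustrative sketch, not a problem from the manuscript, solved by brute force over the integer variable and a grid on the continuous one:

```python
import math

def solve_toy_minlp():
    """Brute-force a tiny nonconvex MINLP (hypothetical example):
        minimize   (x - 1.3)**2 + y        # nonlinear objective
        subject to x * y >= 1              # nonconvex constraint
        with 0 <= x <= 4 (continuous) and y in {1, 2, 3} (integer).
    Enumerate the integer variable and grid-search the continuous one."""
    best = (math.inf, None, None)
    for y in (1, 2, 3):
        for i in range(4001):
            x = 4.0 * i / 4000
            if x * y >= 1:                 # feasibility check
                obj = (x - 1.3) ** 2 + y
                if obj < best[0]:
                    best = (obj, x, y)
    return best                            # (objective, x, y)
```

Real MINLP solvers replace this enumeration with branch and bound over relaxations, but the tiny example fixes the vocabulary used throughout the research summary.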
Recommended publications
  • A Computerized Poultry Farm Optimization System
A COMPUTERIZED POULTRY FARM OPTIMIZATION SYSTEM

BY TURYARUGAYO THOMPSON, 16/U/12083/PS, 216008822
SUPERVISOR: MR. SERUNJOGI AMBROSE

A DISSERTATION SUBMITTED TO THE SCHOOL OF STATISTICS AND PLANNING IN PARTIAL FULFILMENT FOR THE AWARD OF A DEGREE OF BACHELOR OF STATISTICS AT MAKERERE UNIVERSITY, JUNE 2019

DEDICATION

I dedicate this project to the ALMIGHTY GOD for giving me a healthy life; secondly to my parents, Mr. and Mrs. Twamuhabwa Wilson, my brother Mr. Turyasingura Thomas, and my aunt Mrs. Olive Kamuli, for being supportive to me financially and always encouraging me to keep moving on. I lastly dedicate it to my course mates, most especially the B. Stat computing class.

ACKNOWLEDGEMENT

I would like to thank the Almighty God for enabling me to finish my final year project. I would also like to express my special appreciation to my supervisor Mr. Sserunjogi Ambrose for providing me with suggestions and encouragement, and for helping me to coordinate my project and write this document. A special thanks goes to my parents, Mr. and Mrs. Twamuhabwa Wilson, my brother Mr. Turyasingura Thomas, and my aunt Mrs. Olive Kamuli, for being supportive to me financially and spiritually towards the accomplishment of this dissertation. Lastly, many thanks go to my closest friend Mr. Kyagera Sulaiman and my fellow students, most especially Akandinda Noble, Mulapada Seth Augustine, Kakuba Caleb Kanyesigye, Nuwabasa Moses, and Kyomuhendo Evarce, among others, who invested their effort and were supportive and kind to me during the time of working on my final year project.
  • Reformulations in Mathematical Programming: A Computational Approach
Reformulations in Mathematical Programming: A Computational Approach

Leo Liberti, Sonia Cafieri, and Fabien Tarissan
LIX, École Polytechnique, Palaiseau, 91128 France
{liberti,cafieri,tarissan}@lix.polytechnique.fr

Summary. Mathematical programming is a language for describing optimization problems; it is based on parameters, decision variables, and objective function(s) subject to various types of constraints. The present treatment is concerned with the case when objective(s) and constraints are algebraic mathematical expressions of the parameters and decision variables, and therefore excludes optimization of black-box functions. A reformulation of a mathematical program P is a mathematical program Q obtained from P via symbolic transformations applied to the sets of variables, objectives and constraints. We present a survey of existing reformulations interpreted along these lines, some example applications, and describe the implementation of a software framework for reformulation and optimization.

1 Introduction

Optimization and decision problems are usually defined by their input and a mathematical description of the required output: a mathematical entity with an associated value, or whether a given entity has a specified mathematical property or not. Mathematical programming is a language designed to express almost all practically interesting optimization and decision problems. Mathematical programming formulations can be categorized according to various properties, and rather efficient solution algorithms exist for many of the categories.
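One classic reformulation of the kind surveyed above, shown here only as an illustration (it is a standard textbook transformation, not necessarily one from this survey), replaces the nonlinear product z = x*y of a binary y and a bounded continuous x in [0, u] by four linear constraints:

```python
def product_linearization_holds(x, y, z, u, tol=1e-9):
    """Check the standard linear reformulation of z = x*y for binary y
    and continuous x in [0, u]:
        z <= u*y,   z <= x,   z >= x - u*(1 - y),   z >= 0.
    Returns True iff (x, y, z) satisfies all four constraints."""
    return (z <= u * y + tol and z <= x + tol
            and z >= x - u * (1 - y) - tol and z >= -tol)

def check_equivalence(u=5.0, steps=50):
    """On a grid of x values and y in {0, 1}, the linear system is
    satisfied by z exactly when z == x*y."""
    for i in range(steps + 1):
        x = u * i / steps
        for y in (0, 1):
            # the true product value must be feasible...
            assert product_linearization_holds(x, y, x * y, u)
            # ...and any clearly different z value must be cut off
            for z in (x * y + 0.5, x * y - 0.5):
                if 0 <= z <= u:
                    assert not product_linearization_holds(x, y, z, u)
    return True
```

The symbolic transformation is exactly the kind of operation a reformulation framework automates: replacing a nonlinear term by auxiliary variables and linear constraints without changing the feasible set.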
  • HiGHS: High-Performance Open-Source Software for Linear Optimization
HiGHS: High-performance open-source software for linear optimization

Julian Hall, School of Mathematics, University of Edinburgh
[email protected]
CO@Work 2020, 21 September 2020

The team. What's in a name? HiGHS: Hall, ivet Galabova, Huangfu and Schork. Team HiGHS: Julian Hall, Reader (1990–date); Ivet Galabova, PhD (2016–date); Qi Huangfu, PhD (2009–2013), FICO Xpress (2013–2018); Michael Feldmeier, PhD (2018–date).

Solvers. Linear programming (LP): dual simplex (Huangfu and Hall), with serial techniques exploiting sparsity and parallel techniques exploiting multicore architectures; interior point (Schork), highly accurate due to its iterative linear system solver, with crossover to a basic solution. Mixed-integer programming (MIP): prototype solver.

Features and interfaces. Features: model management (load/add/delete/modify problem data), presolve, crash. Interfaces: languages C++ (HiGHS class), C, C#, FORTRAN, Julia, Python, plus model load from .mps and .lp files; applications GAMS, JuliaOpt, OSI, SCIP, SciPy; future AMPL, MATLAB, Mosel, PuLp, R, and further suggestions.

Access. Open-source (MIT license); GitHub: ERGO-Code/HiGHS; COIN-OR: successor to Clp? No third-party code required. Runs under Linux, Windows and Mac. Build requires CMake 3.15. Parallel code uses OpenMP. Documentation: http://www.HiGHS.dev/
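The SciPy interface listed above is reachable through scipy.optimize.linprog, whose "highs" method dispatches to the HiGHS solvers; a minimal sketch with made-up problem data:

```python
from scipy.optimize import linprog

# minimize -x0 - 2*x1  subject to  x0 + x1 <= 4,  x0 <= 3,  x0, x1 >= 0
res = linprog(c=[-1, -2],
              A_ub=[[1, 1], [1, 0]],
              b_ub=[4, 3],
              bounds=[(0, None), (0, None)],
              method="highs")          # dispatch to the HiGHS solvers
print(res.status, res.fun, res.x)      # status 0 indicates success
```

For this instance the optimum is x = (0, 4) with objective value -8; the `res` object also exposes slack values and dual information.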
  • T-Optimal Designs for Multi-Factor Polynomial Regression Models Via a Semidefinite Relaxation Method
T-optimal designs for multi-factor polynomial regression models via a semidefinite relaxation method

Yuguang Yue (Department of Biostatistics, University of California, Los Angeles), Lieven Vandenberghe (Department of Electrical Engineering, University of California, Los Angeles), Weng Kee Wong (Department of Biostatistics, University of California, Los Angeles)

arXiv:1807.08213v2 [stat.CO] 2 Feb 2020

Abstract. We consider T-optimal experiment design problems for discriminating multi-factor polynomial regression models where the design space is defined by polynomial inequalities and the regression parameters are constrained to given convex sets. Our proposed optimality criterion is formulated as a convex optimization problem with a moment cone constraint. When the regression models have one factor, an exact semidefinite representation of the moment cone constraint can be applied to obtain an equivalent semidefinite program. When there are two or more factors in the models, we apply a moment relaxation technique and approximate the moment cone constraint by a hierarchy of semidefinite-representable outer approximations. When the relaxation hierarchy converges, an optimal discrimination design can be recovered from the optimal moment matrix, and its optimality is confirmed by an equivalence theorem. The methodology is illustrated with several examples.

Keywords: Continuous design; Convex optimization; Equivalence theorem; Moment relaxation; Semidefinite programming

1 Introduction

In many scientific investigations, the underlying statistical model that drives the outcome of interest is not known. In practice, researchers may be able to identify a few plausible models for their problem and an early goal is to
  • Global Optimization: from Theory to Implementation
Global Optimization: from Theory to Implementation

Leo Liberti, DEI, Politecnico di Milano, Piazza L. da Vinci 32, 20133 Milano, Italy
Nelson Maculan, COPPE, Universidade Federal do Rio de Janeiro, P.O. Box 68511, 21941-972 Rio de Janeiro, Brazil

To Anne-Marie

Preface

The idea for this book was born on the coast of Montenegro, in October 2003, when we were invited to the Sym-Op-Is Serbian Conference on Operations Research. During those days we talked about many optimization problems, going from discussion to implementation in a matter of minutes, reaping good profits from the whole "hands-on" process, and having a lot of fun in the meanwhile. All the wrong ideas were weeded out almost immediately by failed computational experiments, so we wasted little time on those. Unfortunately, translating ideas into programs is not always fast and easy, and moreover the amount of literature about the implementation of global optimization algorithms is scarce. The scope of this book is that of moving a few steps towards the systematization of the path that goes from the invention to the implementation and testing of a global optimization algorithm. The works contained in this book have been written by various researchers working at academic or industrial institutions; some very well known, some less famous but expert nonetheless in the discipline of actually getting global optimization to work. The papers in this book underline two main developments in the implementation side of global optimization: firstly, the introduction of symbolic manipulation algorithms and automatic techniques for carrying out algebraic transformations; and secondly, the relatively wide availability of extremely efficient global optimization heuristics and metaheuristics that target large-scale nonconvex constrained optimization problems directly.
  • MINLP Solver Software
MINLP Solver Software

Michael R. Bussieck and Stefan Vigerske
GAMS Development Corp., 1217 Potomac St, NW Washington, DC 20007, USA
[email protected], [email protected]
March 10, 2014

Abstract. In this article we give a brief overview of the state-of-the-art in software for the solution of mixed integer nonlinear programs (MINLP). We establish several groupings with respect to various features and give concise individual descriptions for each solver. The provided information may guide the selection of a best solver for a particular MINLP problem.

Keywords: mixed integer nonlinear programming, solver, software, MINLP, MIQCP

1 Introduction

The general form of an MINLP is

    minimize   f(x, y)
    subject to g(x, y) ≤ 0          (P)
               x ∈ X
               y ∈ Y integer

The function f : R^(n+s) → R is a possibly nonlinear objective function and g : R^(n+s) → R^m a possibly nonlinear constraint function. Most algorithms require the functions f and g to be continuous and differentiable, but some may even allow for singular discontinuities. The variables x and y are the decision variables, where y is required to be integer valued. The sets X ⊆ R^n and Y ⊆ R^s are bounding-box-type restrictions on the variables. In addition to integer requirements on variables, other kinds of discrete constraints are commonly used. These are, e.g., special-ordered-set constraints (only one (SOS type 1) or two consecutive (SOS type 2) variables in an (ordered) set are allowed to be nonzero) [8], semicontinuous variables (the variable is allowed to take either the value zero or a value above some bound), semiinteger variables (like semicontinuous variables, but with an additional integer restriction), and indicator variables (a binary variable indicates whether a certain set of constraints has to be enforced).
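The semicontinuous construction described above (x is either zero or between given bounds) is typically enforced with an indicator binary y via lo*y <= x <= hi*y. A small hypothetical sketch of the feasibility test such a formulation encodes:

```python
def semicontinuous_constraints(lo, hi):
    """Model a semicontinuous variable x (x == 0 or lo <= x <= hi)
    with a binary indicator y:  lo*y <= x <= hi*y,  y in {0, 1}.
    Returns a feasibility predicate for a candidate (x, y) pair.
    (Illustrative helper; bounds and names are made up.)"""
    def feasible(x, y, tol=1e-9):
        return y in (0, 1) and lo * y - tol <= x <= hi * y + tol
    return feasible

# x must be 0 (when y == 0) or in [2, 10] (when y == 1)
feas = semicontinuous_constraints(lo=2.0, hi=10.0)
```

Note that y = 0 forces x = 0 (both bounds collapse to zero), while y = 1 restores the interval [lo, hi]; this is exactly the disjunction the survey's solvers handle natively.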
  • HiGHS: A High-Performance Linear Optimizer (Turning Gradware into Software)
HiGHS: a high-performance linear optimizer. Turning gradware into software

Julian Hall, School of Mathematics, University of Edinburgh
ICCOPT, Berlin, 6 August 2019

High performance linear optimization. Linear programming (LP): minimize c^T x subject to Ax = b, x ≥ 0. Convex quadratic programming (QP): minimize (1/2) x^T Q x + c^T x subject to Ax = b, x ≥ 0, with Q positive semi-definite. High performance: serial techniques exploiting sparsity in A; parallel techniques exploiting multicore architectures.

The team. What's in a name? HiGHS: Hall, ivet Galabova, Huangfu and Schork. Team HiGHS: Julian Hall, Reader (1990–date); Ivet Galabova, PhD (2016–date); Qi Huangfu, PhD (2009–2013), FICO Xpress (2013–2018), MSc (2018–date); Lukas Schork, PhD (2015–2018); Michael Feldmeier, PhD (2018–date); Joshua Fogg, PhD (2019–date).

Past (2011–2014). Overview: written in C++ to study parallel simplex; dual simplex with standard algorithmic enhancements; efficient numerical linear algebra; no interface or utilities. Concept: high performance serial solver (hsol); exploit limited task and data parallelism in standard dual RSM iterations (sip); exploit greater task and data parallelism via minor iterations of dual SSM (pami) (Huangfu and Hall).

Dual simplex algorithm. Assume ĉ_N ≥ 0; seek b̂ ≥ 0. Scan b̂_i < 0 for a row p to leave the basis B; scan ĉ_j / â_pj < 0 for a column q to enter from N. Update:
  • 67 SOFTWARE Michael Joswig and Benjamin Lorenz
67 SOFTWARE Michael Joswig and Benjamin Lorenz INTRODUCTION This chapter is intended as a guide through the ever-growing jungle of geometry software. Software comes in many guises. There are the fully fledged systems consisting of a hundred thousand lines of code that meet professional standards in software design and user support. But there are also the tiny code fragments scattered over the Internet that can nonetheless be valuable for research purposes. And, of course, the very many individual programs and packages in between. Likewise today we find a wide group of users of geometry software. On the one hand, there are researchers in geometry, teachers, and their students. On the other hand, geometry software has found its way into numerous applications in the sciences as well as industry. Because it seems impossible to cover every possible aspect, we focus on software packages that are interesting from the researcher's point of view, and, to a lesser extent, from the student's. This bias has a few implications. Most of the packages listed are designed to run on UNIX/Linux machines. Moreover, the researcher's genuine desire to understand produces a natural inclination toward open source software. This is clearly reflected in the selections below. Major exceptions to these rules of thumb will be mentioned explicitly. In order to keep the (already long) list of references as short as possible, in most cases only the Web address of each software package is listed rather than manuals and printed descriptions, which can be found elsewhere. At least for the freely available packages, this allows one to access the software directly.
  • HiGHS: A High-Performance Linear Optimizer
HiGHS: A High-performance Linear Optimizer

Julian Hall, School of Mathematics, University of Edinburgh
[email protected]
INFORMS Annual Meeting, Seattle, 21 October 2019

High performance linear optimization. Linear programming (LP): minimize c^T x subject to Ax = b, x ≥ 0. Convex quadratic programming (QP): minimize (1/2) x^T Q x + c^T x subject to Ax = b, x ≥ 0, with Q positive semi-definite. High performance: serial techniques exploiting sparsity in A; parallel techniques exploiting multicore architectures.

The team. What's in a name? HiGHS: Hall, ivet Galabova, Huangfu and Schork. Team HiGHS: Julian Hall, Reader (1990–date); Ivet Galabova, PhD (2016–date); Qi Huangfu, PhD (2009–2013), FICO Xpress (2013–2018); Lukas Schork, PhD (2015–2018); Michael Feldmeier, PhD (2018–date); Joshua Fogg, PhD (2019–date).

Dual simplex algorithm. Assume ĉ_N ≥ 0; seek b̂ ≥ 0. Scan b̂_i < 0 for a row p to leave the basis B; scan ĉ_j / â_pj < 0 for a column q to enter from N. Update: exchange p and q between B and N; update b̂ := b̂ − α_P â_q, where α_P = b̂_p / â_pq; update ĉ_N^T := ĉ_N^T + α_D â_p^T, where α_D = −ĉ_q / â_pq. Computation: form the pivotal row via B^T π_p = e_p (BTRAN) and â_p^T = π_p^T N (PRICE); form the pivotal column via B â_q = a_q (FTRAN); represent B^{-1} (INVERT) and update B^{-1} exploiting B̄ = B + (a_q − B e_p) e_p^T (UPDATE).

Multiple iteration parallelism. Perform standard dual simplex
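Numerically, the BTRAN, PRICE and FTRAN steps above are two solves against the basis matrix plus a matrix product. A dense NumPy sketch with illustrative data (a production simplex code instead maintains and updates a sparse factorization of B):

```python
import numpy as np

# A 3x3 nonsingular basis matrix B and two nonbasic columns N (made-up data)
B = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 3.0]])
N = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 4.0]])

p = 0                                   # index of the leaving row within the basis
e_p = np.zeros(3)
e_p[p] = 1.0

# BTRAN: solve B^T pi_p = e_p; then PRICE: pivotal row a_p^T = pi_p^T N
pi_p = np.linalg.solve(B.T, e_p)
pivotal_row = pi_p @ N

# FTRAN: solve B a_q = a_q_orig for the pivotal column (taking q = 1 here)
a_q_orig = N[:, 1]
pivotal_col = np.linalg.solve(B, a_q_orig)
```

The dense `np.linalg.solve` calls stand in for the factorized triangular solves a simplex implementation would perform; the update formula B̄ = B + (a_q − B e_p) e_p^T is what makes refactorizing only occasionally (INVERT) and patching in between (UPDATE) cheap.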
  • Three High Performance Simplex Solvers
Three high performance simplex solvers

Julian Hall (School of Mathematics, University of Edinburgh), Qi Huangfu (FICO), Ivet Galabova (University of Edinburgh), and others
Argonne National Laboratory, 16 May 2017

Overview. Background; dual vs primal simplex method; three high performance simplex solvers: EMSOL (1992–2004), PIPS-S (2011–date), hsol (2011–date); the future.

Solving LP problems: characterizing a basis. Minimize f = c^T x subject to Ax = b, x ≥ 0. A vertex of the feasible region K ⊂ R^n has m basic components, i ∈ B, given by Ax = b, and n − m zero nonbasic components, j ∈ N, where B ∪ N partitions {1, ..., n}. The equations are partitioned according to B ∪ N as B x_B + N x_N = b, with nonsingular basis matrix B. The solution set of Ax = b is characterized by x_B = b̂ − B^{-1} N x_N, where b̂ = B^{-1} b.

Solving LP problems: optimality conditions. The objective is partitioned according to B ∪ N as f = c_B^T x_B + c_N^T x_N = f̂ + ĉ_N^T x_N, where f̂ = c_B^T b̂ and ĉ_N^T = c_N^T − c_B^T B^{-1} N is the vector of reduced costs. The partition yields an optimal solution if there is primal feasibility (b̂ ≥ 0) and dual feasibility (ĉ_N ≥ 0).
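The reduced-cost vector and the two feasibility conditions above can be checked numerically for a tiny LP; a dense NumPy sketch with made-up data:

```python
import numpy as np

def reduced_costs(c, A, basis):
    """Given min c^T x s.t. Ax = b, x >= 0 and a list of basic column
    indices, return the reduced costs c_N^T - c_B^T B^{-1} N of the
    nonbasic columns (dense, illustrative computation)."""
    n = A.shape[1]
    nonbasis = [j for j in range(n) if j not in basis]
    B, N = A[:, basis], A[:, nonbasis]
    y = np.linalg.solve(B.T, c[basis])      # simplex multipliers c_B^T B^{-1}
    return c[nonbasis] - N.T @ y, nonbasis

# Tiny LP: min -x0 - 2*x1  s.t.  x0 + x1 + s0 = 4,  x0 + s1 = 3  (slacks s0, s1)
c = np.array([-1.0, -2.0, 0.0, 0.0])
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 0.0, 0.0, 1.0]])
rc, nb = reduced_costs(c, A, basis=[1, 3])  # basis {x1, s1}: x1 = 4, s1 = 3
```

For this basis, b̂ = (4, 3) ≥ 0 and every reduced cost is nonnegative, so both feasibility conditions hold and the vertex x = (0, 4) is optimal.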
  • Independent Evaluation of Optimization Software Including in Competitions
Independent Evaluation of Optimization Software including in Competitions

Hans D. Mittelmann, School of Mathematical and Statistical Sciences, Arizona State University
ARPA-E Grid Optimization Workshop, Arlington, VA, 13 November 2014

Outline. Who am I? (in all humility, sorry) Why am I here? (actually, they asked me) What can I do? (that depends) Questions? or Remarks?

Short CV. Education: University of Mainz, M.A., 1966-1971; Technical University of Darmstadt, Ph.D., 1971-1973; Technical University of Darmstadt, Habilitation, 1973-1976. Employment history: University of Mainz, Computing Center, 1971-1973; Technical University of Darmstadt, Asst. Prof., 1974-1977; University of Dortmund, Assoc. Prof., 1977-1982; Arizona State University, Professor, 1982-. A sabbatical stay at Stanford (CS dept.) led to the move to the US.

What else is of interest? A standard research/teaching career in Computational Mathematics (140 papers etc.); an early interest in optimization (1976-), but initial research activity in PDEs and finite elements. Side
  • Reformulations in Mathematical Programming: A Computational Approach
Reformulations in Mathematical Programming: A Computational Approach

Leo Liberti, Sonia Cafieri, and Fabien Tarissan
LIX, École Polytechnique, Palaiseau, 91128 France
{liberti,cafieri,tarissan}@lix.polytechnique.fr

Summary. Mathematical programming is a language for describing optimization problems; it is based on parameters, decision variables, and objective function(s) subject to various types of constraints. The present treatment is concerned with the case when objective(s) and constraints are algebraic mathematical expressions of the parameters and decision variables, and therefore excludes optimization of black-box functions. A reformulation of a mathematical program P is a mathematical program Q obtained from P via symbolic transformations applied to the sets of variables, objectives and constraints. We present a survey of existing reformulations interpreted along these lines, some example applications, and describe the implementation of a software framework for reformulation and optimization.

1 Introduction

Optimization and decision problems are usually defined by their input and a mathematical description of the required output: a mathematical entity with an associated value, or whether a given entity has a specified mathematical property or not. These mathematical entities and properties are expressed in the language of set theory, most often in the ZFC axiomatic system [62]. The scope of set theory language in ZFC is to describe all possible mathematical entities, and its limits are given by Gödel's incompleteness theorem. Optimization and decision problems are special in the sense that they are closely linked to a particular algorithmic process designed to solve them: more precisely, although the algorithm is not directly mentioned in the problem definition, the main reason why problems are formally cast is that a solution to the problem is desired.