Metaheuristic Optimization

23. Linear Programming

Thomas Weise (汤卫思) · [email protected] · http://iao.hfuu.edu.cn
Hefei University, South Campus 2 (合肥学院 南艳湖校区/南2区)
Faculty of Computer Science and Technology (计算机科学与技术系)
Institute of Applied Optimization (应用优化研究所)
230601 Shushan District, Hefei, Anhui, China (中国 安徽省 合肥市 蜀山区 230601)
Econ. & Tech. Devel. Zone, Jinxiu Dadao 99 (经济技术开发区 锦绣大道99号)

Outline

1 Introduction
2 A First Example using GAMS
3 A Second Example using GAMS
4 Summary

Linear Programming

  • Metaheuristics can be used to tackle arbitrary problems.
  • But many problems in the real world are rather simple, e.g., sums of products of decision variables with constants.
  • The more knowledge we have about a problem, the better we can solve it...
  • ...because we can design specialized algorithms, i.e., algorithms which do not need to handle any complexity not present in the problem and instead make maximum use of the knowledge we have about the nature of the problem.
  • The best algorithms for many classical problems, such as the Traveling Salesman Problem or the Maximum Satisfiability Problem, are metaheuristics that make extensive use of the knowledge about the nature of these problems.
  • However, the vast majority of industry applications in optimization do not use metaheuristics. They use Linear Programming tools such as CPLEX and GAMS.

Linear Programming

Definition (Linear Programming)
Linear programming (LP) is a method to find the optimal solution of a problem whose (potentially multiple) constraints and (single) objective are linear relationships.

  • The objective function f is a simple linear function of the n real-valued decision variables x1, x2, ..., xn ∈ R, i.e., we can write f : Rⁿ ↦ R.
  • There can be any number m of inequality constraints gi and any number of equality constraints hj (see Lesson 16: Constraint Handling), each of which is linear.
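Spelled out, such a problem takes the following general form (standard LP notation, added here for illustration rather than taken from the slides), where all of the a, b, c, d, and e are constant coefficients:

maximize (or minimize)  f(x1, ..., xn) = c1·x1 + c2·x2 + ... + cn·xn
subject to              gi: ai,1·x1 + ai,2·x2 + ... + ai,n·xn ≤ bi   for i = 1, ..., m
                        hj: dj,1·x1 + dj,2·x2 + ... + dj,n·xn = ej   for each equality constraint hj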
Example

Let us define a simple linear programming problem with two decision variables and an objective function.

  • Assume we want to maximize the objective function f(x1, x2) = 10 − 0.6·x1 − 0.4·x2 with the two decision variables x1 and x2.
  • This function would have its optimum somewhere at x1 → ∞, x2 → ∞, because it is unconstrained: there are no limits on x1 and x2, they can become arbitrarily big.
  • Usually, the values of variables are limited in some way. Let's say that both x1 and x2 must be positive; the first such constraint is g1(x1, x2): x2 ≥ 0.

[Figure: the plane f(x1, x2) = 10 − 0.6·x1 − 0.4·x2 plotted over x1, x2 ∈ [0, 10], together with the boundary of the constraint g1(x1, x2): x2 ≥ 0]
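As a small preview of how this example could be expressed for an LP solver, here is a minimal GAMS sketch (an illustrative listing under the assumptions stated above, not code from the lecture; the model name tiny and the objective variable z are chosen arbitrarily):

* Sketch only: maximize f(x1,x2) = 10 - 0.6*x1 - 0.4*x2
* with x1 >= 0 and x2 >= 0 expressed as positive variables.
Positive Variables x1, x2;
Variable z "objective value f(x1,x2)";
Equation obj "defines z as the objective function";
obj .. z =e= 10 - 0.6*x1 - 0.4*x2;
Model tiny / all /;
Solve tiny using lp maximizing z;
Display x1.l, x2.l, z.l;

The positivity constraints are not written as explicit equations but expressed by declaring x1 and x2 as positive variables; after the solve, the optimal variable levels and the objective value are available as x1.l, x2.l, and z.l.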
