Metaheuristics Algorithms

Total pages: 16
File type: PDF, size: 1020 KB

Outline
► Introduction to Metaheuristics
► Local Search
► GRASP
► Tabu Search
► Iterated Local Search
► BRKGA
► Hybrid Metaheuristics
► Matheuristics
► SimHeuristics
► Learnheuristics
Helena Ramalhinho

Solution Methods
► Exact Methods (Integer Programming):
§ Branch-and-bound
§ Branch-and-cut
§ Column generation
§ Cut and price
§ Dynamic programming
§ Lagrangean relaxation
§ Linear relaxation
§ Surrogate relaxation
§ Etc.
► Local Search Methods (Metaheuristics):
§ Local improvement
§ Tabu Search
§ Iterated Local Search
§ Simulated Annealing
§ Genetic Algorithms
§ Evolutionary Algorithms
§ Ant Colony Optimization
§ Scatter Search
§ Memetic Algorithms
§ Etc.
► What metaheuristics offer:
§ Good solutions for complex and large-scale problems
§ Short running times
§ Easily adapted to variants of the problem
► What exact methods offer:
§ Mathematically proved optimal solutions
§ Important information on the characteristics and properties of the problem

Algorithms
► Heuristics
§ Any approximate method built on the structural properties or the characteristics of the problem solution, with reduced complexity with respect to exact methods, providing, in general, good-quality feasible solutions without a formal guarantee of solution quality.
► Metaheuristics
§ The process of finding a good solution (eventually the optimum) consists in applying at each step a subordinate (guiding) heuristic, which has to be designed for each particular problem. (C. Ribeiro [1996])
§ They trade off solution cost against computing time.
§ They have been designed to attack complex optimization problems where classical heuristics and optimization methods have (so far) failed to be effective and efficient.

© Helena Ramalhinho-Lourenço (2019)
Four attributes of heuristics and metaheuristics
► Accuracy: close-to-optimal solutions.
► Speed: small computational time.
► Simplicity: no parameter adjustment; easy to program.
► Flexibility: easy to adapt to other real problems.

Metaheuristics for real problems
► Metaheuristics:
§ Highly effective on hard problems.
§ Modularity.
§ Easy implementation.
§ Robust.
§ Able to give good solutions in a short time.
► Real problems:
§ Complex problems.
§ Rapid changes in reality.
§ Need for quick implementation.
§ Short update cycles.
§ Different aspects in different sectors.
§ Need for quick answers and multiple scenarios.

Cordeau JF, Gendreau M, Laporte G, Potvin JY, Semet F (2002) A guide to vehicle routing heuristics. J Oper Res Soc 53:512–522.

Local optimization algorithms
► Given a solution, a local optimization algorithm attempts to improve it by making local modifications at each iteration.
► Neighborhood:
§ N: A → 2^A, a mapping from each solution to a subset of feasible solutions.
§ Define a local modification, the move; the neighborhood of a solution x is the subset of feasible solutions obtained by applying this move to x.
§ A solution x is a local optimum if c(x) ≤ c(y) for all y ∈ N(x).
► Search strategy:
§ The name of the metaheuristic usually comes from the type of search.
► Local search:
§ 1. Get an initial solution x (the current solution), e.g. with a constructive heuristic.
§ 2. While there is an untested neighbor of x:
* 2.1 Let x' be an untested neighbor of x.
* 2.2 If c(x') < c(x), set x = x' (x' becomes the new current solution).
§ 3. Return x (a local optimal solution).
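The three-step local search above can be sketched in code; `cost` and `neighbors` are illustrative placeholders for the problem-specific c(·) and N(·), not part of the slides.

```python
def local_search(x, cost, neighbors):
    """First-improvement local search: accept the first neighbor that is
    strictly better, rescan from it, and stop when no neighbor improves
    on the current solution (step 3 returns a local optimum)."""
    improved = True
    while improved:
        improved = False
        for y in neighbors(x):        # step 2: scan untested neighbors
            if cost(y) < cost(x):     # step 2.2: improving move found
                x = y                 # y becomes the new current solution
                improved = True
                break                 # rescan from the new solution
    return x                          # local optimal solution

# Toy illustration: minimize f(x) = x^2 over the integers, moves x -> x +/- 1.
print(local_search(7, cost=lambda v: v * v,
                   neighbors=lambda v: [v - 1, v + 1]))  # 0
```

Note that the quality of the returned solution depends entirely on the initial solution and the neighborhood: the loop only guarantees local, not global, optimality.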
Designing a local optimization algorithm
► Design choices:
§ Obtain an initial solution:
* Heuristic
* Random solution
* Constructive method
§ Define the neighborhood:
* Specific for each problem.
§ Decide how to search the neighborhood:
* Complete search (best improvement)
* First improvement

Traveling Salesman Problem
► Formulation:
§ E: the set of edges; each edge e has a cost c(e).
§ A: any subset of edges forming a Hamiltonian cycle.
§ c(x): the total cost of the edges in x.
[Figure: example graph with edge costs.]

Nearest neighbor algorithm
► Nearest neighbor heuristic for the TSP (builds an initial solution):
§ 1. Start at an arbitrary vertex; make it the current vertex.
§ 2. Find the shortest edge connecting the current vertex to an unvisited vertex V.
§ 3. Set the current vertex to V.
§ 4. Mark V as visited.
§ 5. If all vertices have been visited, terminate; otherwise go to step 2.
[Figure: example graph with edge costs.]
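The five steps can be sketched as follows; the Euclidean point set and helper names are illustrative assumptions, since the slides work on an abstract edge-cost graph.

```python
import math

def nearest_neighbor_tour(points, start=0):
    """Nearest neighbor heuristic for the TSP: start at an arbitrary
    vertex, repeatedly move to the closest unvisited vertex (ties broken
    by lowest index), and close the cycle when every vertex is visited."""
    def dist(a, b):
        return math.dist(points[a], points[b])

    unvisited = [v for v in range(len(points)) if v != start]
    tour = [start]
    while unvisited:                                          # step 5 loop
        current = tour[-1]
        nxt = min(unvisited, key=lambda v: dist(current, v))  # step 2
        tour.append(nxt)                                      # step 3
        unvisited.remove(nxt)                                 # step 4
    length = sum(dist(tour[i], tour[i + 1]) for i in range(len(tour) - 1))
    length += dist(tour[-1], start)                           # return to start
    return tour + [start], length

# Four corners of a unit square; here the heuristic happens to find the optimum.
tour, length = nearest_neighbor_tour([(0, 0), (0, 1), (1, 1), (1, 0)])
print(tour, length)  # [0, 1, 2, 3, 0] 4.0
```

On larger instances the heuristic is not optimal in general, which is exactly why it is used only to build the initial solution for a local search.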
Nearest neighbor example
► Applying the nearest neighbor heuristic step by step on the example graph:
[Figures: successive iterations of the nearest neighbor heuristic on the example graph.]
► Nearest neighbor solution: total distance = 43.
► Optimal solution: total distance = 35.

Neighborhoods for the TSP
► 2-opt move: remove two edges of the tour and reconnect the two resulting paths the other way.
► 3-opt move: remove three edges of the tour and reconnect the resulting paths.

Example: local search applied to the TSP
► Setup:
§ Construction heuristic: nearest neighbor.
§ Local search: 2-opt.
§ Interactive applet: http://www-e.uni-magdeburg.de/mertens/TSP/TSP.html
► Nearest neighbor trace:
§ The closest vertex to Berlin is Paris.
§ The closest vertex to Paris is Londres.
§ The closest vertex to Londres is Sevilla.
§ The closest vertex to Sevilla is Roma.
§ The closest vertex to Roma is Moscow.
§ From Moscow the tour returns to the initial vertex.
► Nearest neighbor solution:
§ Route: Berlin, Paris, Londres, Sevilla, Roma, Moscow, Berlin
§ Length: 65
► 2-opt neighborhood of the current solution: 9 neighbors.
§ The first four neighbors have costs 69, 69, 78, and 67.
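Generating a 2-opt neighborhood like the one being enumerated can be sketched as follows. For a closed tour on n cities there are n(n−3)/2 non-adjacent edge pairs, i.e. 9 neighbors for a six-city route, matching the count on the slides; the function below is an illustrative sketch, not the applet's code.

```python
def two_opt_neighbors(tour):
    """All 2-opt neighbors of a closed tour (given as an open city list).

    A 2-opt move removes two non-adjacent edges of the cycle and
    reconnects the two paths the other way, which amounts to reversing
    the segment tour[i+1..j]."""
    n = len(tour)
    neighbors = []
    for i in range(n - 1):
        for j in range(i + 2, n):
            if i == 0 and j == n - 1:
                continue  # these two edges are adjacent in the cycle
            neighbors.append(tour[:i + 1] + tour[i + 1:j + 1][::-1] + tour[j + 1:])
    return neighbors

route = ["Berlin", "Paris", "Londres", "Sevilla", "Roma", "Moscow"]
print(len(two_opt_neighbors(route)))  # 9
```

Each neighbor visits the same cities in a different order, so a local search only has to re-evaluate the tour length of the 9 candidates.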
§ The remaining neighbors (5–9) have costs 63, 73, 76, 78, and 69.
► Best neighbor: neighbor 5, with cost 63.
► It becomes the new current solution.
§ Repeat until a local optimum is found.
§ Local optimal solution: the current solution is no worse than any solution in its neighborhood.

Facility Location Models
► An important problem in logistics:
§ Where to locate new facilities (retailers, warehouses, factories).
§ Very complex problems.
► Warehouse location problem: locate a set of warehouses in a distribution network.
► Cost of locating a warehouse at a particular site:
§ Fixed cost vs. variable cost: the cost of opening the facility vs. the transportation cost.

Location example – local search
[Figures: successive local search iterations on a warehouse location instance. Example of M. Resende, AT&T Labs Research.]
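For the warehouse location setting, a minimal local search with a "flip one site" move can be sketched as follows; the cost structure (fixed opening costs plus cheapest assignment per customer) follows the slide, but the data and the flip neighborhood are illustrative assumptions, not Resende's actual example.

```python
def total_cost(open_sites, fixed, assign):
    """Fixed cost of the open warehouses plus, for every customer,
    the cheapest transportation cost to an open warehouse."""
    if not open_sites:
        return float("inf")  # every customer must be served
    cost = sum(fixed[f] for f in open_sites)
    for row in assign:  # row[f] = transport cost from this customer to site f
        cost += min(row[f] for f in open_sites)
    return cost

def flip_local_search(fixed, assign):
    """Local search whose move opens or closes a single warehouse."""
    current = set(range(len(fixed)))          # start with all sites open
    best = total_cost(current, fixed, assign)
    improved = True
    while improved:
        improved = False
        for f in range(len(fixed)):
            neighbor = current ^ {f}          # flip site f
            c = total_cost(neighbor, fixed, assign)
            if c < best:
                current, best = neighbor, c
                improved = True
    return current, best

# Two candidate sites, three customers: closing the expensive site 0 wins.
print(flip_local_search(fixed=[30, 10], assign=[[5, 1], [4, 2], [6, 3]]))
# ({1}, 16)
```

The trade-off the slide names is visible in `total_cost`: opening more sites raises the fixed cost but lowers the transportation cost, and the local search balances the two.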
Vehicle Routing
► A set of customers at known geographical locations has to be supplied by a fleet of vehicles from a single depot.
► Each customer has a specific demand.
► Each route starts and finishes at the depot.
► The objective is to find the set of routes whose total length or cost is minimal.
§ Alternatively, minimize the number of routes.
► Local search for vehicle routing:
§ Careful with the vehicle capacity!
§ Moves:
* 2-opt, as in the TSP.
* Interchange customers between routes.

The Job-Shop Scheduling Problem
► A set of n jobs, J1, ..., Jn.
► A set of m machines, M1, ..., Mm.
► Schedule the processing of each job by the machines.
► The objective …
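The capacity caveat for interchange moves can be made concrete with a feasibility check run before accepting a swap; the customer names, demands, and capacity below are made-up illustrative values.

```python
def can_interchange(route_a, route_b, i, j, demand, capacity):
    """Check that swapping customer route_a[i] with route_b[j]
    leaves both routes within the vehicle capacity."""
    load_a = sum(demand[c] for c in route_a)
    load_b = sum(demand[c] for c in route_b)
    new_a = load_a - demand[route_a[i]] + demand[route_b[j]]
    new_b = load_b - demand[route_b[j]] + demand[route_a[i]]
    return new_a <= capacity and new_b <= capacity

demand = {"c1": 4, "c2": 3, "c3": 5, "c4": 2}
# Swapping c1 (demand 4) with c3 (demand 5) keeps both routes at <= 8.
print(can_interchange(["c1", "c2"], ["c3", "c4"], 0, 0, demand, capacity=8))  # True
```

In a full vehicle-routing local search, only moves that pass this test would be evaluated for cost improvement, which is what distinguishes the problem from the uncapacitated TSP.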