Paradiseo: from a Modular Framework for Evolutionary Computation to the Automated Design of Metaheuristics---22 Years of Paradiseo


Paradiseo: From a Modular Framework for Evolutionary Computation to the Automated Design of Metaheuristics---22 Years of Paradiseo

Johann Dreo (1), Arnaud Liefooghe (2), Sébastien Verel (3), Marc Schoenauer (4), Juan J. Merelo (5), Alexandre Quemy (6), Benjamin Bouvier (7), and Jan Gmys (8)

(1) Systems Biology Group, Department of Computational Biology, USR 3756, Institut Pasteur and CNRS, Paris, France. [email protected] (corresponding author)
(2) Univ. Lille, CNRS, Inria, Centrale Lille, UMR 9189 CRIStAL, F-59000 Lille. [email protected]
(3) Univ. Littoral Côte d'Opale, Calais, France. [email protected]
(4) TAU, Inria, CNRS & UPSaclay, LISN, Saclay, France. [email protected]
(5) University of Granada, Granada, Spain. [email protected]
(6) Poznan University of Technology, Poznan, Poland. [email protected]
(7) [email protected]
(8) Inria, Lille, France. [email protected]

arXiv:2105.00420v1 [cs.NE] 2 May 2021

Abstract

The success of metaheuristic optimization methods has led to the development of a large variety of algorithm paradigms. However, no algorithm clearly dominates all its competitors on all problems. Instead, the underlying variety of landscapes of optimization problems calls for a variety of algorithms to solve them efficiently. It is thus of prior importance to have access to mature and flexible software frameworks which allow for an efficient exploration of the algorithm design space. Such frameworks should be flexible enough to accommodate any kind of metaheuristic, and open enough to connect with higher-level optimization, monitoring and evaluation software. This article summarizes the features of the Paradiseo framework, a comprehensive C++ free software which targets the development of modular metaheuristics. Paradiseo provides a highly modular architecture, a large set of components, speed of execution and automated algorithm design features, which are key to modern approaches to metaheuristics development.

1 Introduction

In the research domain of metaheuristics for black-box optimization, a very large variety of algorithms has been developed since the first Evolution Strategies appeared in 1965 [31]. Starting from nature-inspired computing methods and following recent mathematical approaches, numerous applications have shown the efficiency of those randomized search heuristics. However, following Wagner et al. [33], we observe that the metaheuristic research domain lacks mature software, while it is crippled with short-lived research prototypes on over-specific feature sets. We believe this state hinders the adoption of those technologies in the industrial world and is an obstacle to breakthrough innovations.

Therefore, the development of a full-featured and mature metaheuristic optimization framework is of prior importance, for both the scientific and the applied communities. In this article, we summarize our efforts towards this goal, in the guise of the Paradiseo project.

The Paradiseo framework is a 22-year-old effort which aims at developing a flexible architecture for the generic design of metaheuristics for hard optimization problems. It is implemented in C++, a very mature object-oriented programming language, which is probably one of the fastest, if not the fastest, object-oriented programming platforms on the market [19, 30, 28]. It is also highly portable and benefits from very extensive tooling as well as an active community. Paradiseo is released as free and open-source software, under the LGPL-v2 and CeCILL licenses (depending on the module). Its development is open and the source code is freely available on the Inria (footnote 1) and GitHub (footnote 2) code repositories.

1 https://gitlab.inria.fr/paradiseo/paradiseo
2 https://github.com/jdreo/paradiseo

1.1 History

The "Evolving Objects" (EOlib, then simply EO) framework was started in 1999 by the Geneura team at the University of Granada, headed by Juan Julián Merelo. The development team was then reinforced by Maarten Keijzer, who designed the current modular architecture, and Marc Schoenauer [21]. Later came Jeroen Eggermont, who, among other things, did a lot of work on genetic programming, Olivier König, who did a lot of useful additions and cleaning of the code, and Jochen Küpper.

The Inria Dolphin team, headed by El-Ghazali Talbi, contributed a lot starting from around 2003, on their own module collection called Paradiseo. Thomas Legrand worked on particle swarm optimization, and the regretted Sébastien Cahon and Nouredine Melab worked on parallelization modules [7, 6, 4, 5]. Arnaud Liefooghe and Jérémie Humeau worked a lot on the multi-objective module [24] and on the local search one, along with Sébastien Verel [18]. In the same team, C. F. (footnote 3) and Jean-Charles Boisson made significant contributions.

The (then) EO project was taken over by Johann Dreo, who worked with the help of Caner Candan on adding the EDO module. Johann and Benjamin Bouvier have also designed an MPI parallelization module, while Alexandre Quemy also worked on parallelization code.

In 2012, the two projects (EO and Paradiseo) were merged into a single one by Johann Dreo, Sébastien Verel and Arnaud Liefooghe, who have been acting as maintainers ever since. In 2020, automated algorithm design features and a binding toward the IOHprofiler validation tool were added by Johann Dreo.

Along the life of the project, several spin-off software have been developed, among which a port of the EO module in Java [1], another one in ActiveX (footnote 4); GUIDE, a graphical user interface for assembling algorithms [10] (footnote 5); and EASEA, a high-level declarative language for evolutionary algorithm specification [9], which later became independent [26] of the specific library.

3 Redacted at the author's demand.
4 http://geneura.ugr.es/~jmerelo/DegaX/
5 Which also supported ECJ.

1.2 Related Frameworks

The 1998 version of the hitch-hiker's guide to evolutionary computation (the frequently asked questions of the comp.ai.genetic Usenet newsgroup (footnote 6)) already lists 57 software packages related to the implementation of evolutionary algorithms (among which EOlib, the ancestor of Paradiseo). Most of those software packages are now unmaintained or impossible to find. There has been, however, a constant flow of new frameworks, libraries or solvers every year, for decades. We were able to find at least 47 of them readily available on the web (footnote 7). Among those projects, only 8 met all the following criteria:

1. open-source framework aiming at designing algorithms (footnote 8),
2. being active since 2015,
3. having more than 15 contributors.

The features of those main frameworks are compared in Table 1, where the number of lines of code was computed with the cloc tool (footnote 9). Note that for HeuristicLab, the code for the GUI modules was excluded from the count. The GPGPU module of Paradiseo [27] is not counted either, as it is not maintained anymore. The number of contributors has been retrieved from the code repositories' commit histories, which underestimates the number of people involved in the case of Paradiseo; the extracted number is however kept, for fairness in comparison with the other frameworks, which might face a similar bias.

Table 1: Main software frameworks for evolutionary computation and metaheuristics. "kloc" stands for thousands of lines of code; feature flags: Y = yes, N = no, ~ = partial, ? = unknown.

Framework     Language  Updated  License   Contributors  kloc  Features
ParadisEO     C++       2021     LGPLv2    33            82    Y Y Y Y Y Y ~ Y Y Y Y
jMetal        Java      2021     MIT       29            60    Y N Y N Y N N Y N ? N
ECF           C++       2017     MIT       19            15    Y N Y N Y N N N N Y N
ECJ           Java      2021     AFLv3     33            54    Y Y Y N Y Y Y Y N Y N
DEAP          Python    2020     LGPLv3    45            9     Y N N N Y Y N Y N Y N
CIlib         Scala     2021     Apachev2  17            4     Y N N N N N N N N ? N
HeuristicLab  C#        2021     GPLv3     20            150   Y N Y Y Y Y N Y ~ Y N
Clojush       Clojure   2020     EPLv1     17            19    Y N N N N N N N N N N

Among the software packages close in features to Paradiseo, ECF has not been updated in 4 years. ECJ and jMetal are close competitors, albeit programmed in Java, which is expected to run near 2.6 times slower than programs in C++ (footnote 10), a key drawback for automated algorithm design (see Section 4.3). HeuristicLab suffers from the same drawback, but provides a graphical user interface for the run and analysis of solvers.

6 A discussion forum which was popular before the World Wide Web and social networks. http://coast.cs.purdue.edu/pub/doc/EC/FAQ/www/Q20.htm
7 Paradiseo, jMetal, ECF, OpenBeagle, Jenetics, ECJ, DEAP, CIlib, GP.NET, DGPF, JGAP, Watchmaker, GenPro, GAlib, HeuristicLab, PyBrain, JCLEC, GPE, JGAlib, pycma, PyEvolve, GPLAB, Clojush, µGP, pySTEP, Pyvolution, PISA, EvoJ, Galapagos, branecloud, JAGA, PMDGP, GPC++, PonyGE, Platypus, DCTG-GP, Desdeo, PonyGE2, EvoGrad, HyperSpark, Nevergrad, Pagmo2, LEAP, Operon, EMILI, pso-de-framework, MOACO.
8 Libraries of solvers, like Pagmo2 or Nevergrad, do not match this criterion.
9 Version 1.82 of https://github.com/AlDanial/cloc

Paradiseo relies on four main design patterns: Functor, Strategy, Generic Type and Factory. Figure 1 shows a high-level view of the global design pattern.
Recommended publications
  • Metaheuristics ``In the Large''
    Metaheuristics “In the Large”. Jerry Swan, Steven Adriaensen, Alexander E. I. Brownlee, Kevin Hammond, Colin G. Johnson, Ahmed Kheiri, Faustyna Krawiec, J. J. Merelo, Leandro L. Minku, Ender Özcan, Gisele L. Pappa, Pablo García-Sánchez, Kenneth Sörensen, Stefan Voß, Markus Wagner, David R. White. arXiv:2011.09821v4 [cs.NE] 3 Jun 2021

    Abstract: Following decades of sustained improvement, metaheuristics are one of the great success stories of optimization research. However, in order for research in metaheuristics to avoid fragmentation and a lack of reproducibility, there is a pressing need for stronger scientific and computational infrastructure to support the development, analysis and comparison of new approaches. To this end, we present the vision and progress of the “Metaheuristics ‘In the Large’” project. The conceptual underpinnings of the project are: truly extensible algorithm templates that support reuse without modification, white-box problem descriptions that provide generic support for the injection of domain-specific knowledge, and remotely accessible frameworks, components and problems that will enhance reproducibility and accelerate the field’s progress. We argue that, via principled choice of infrastructure support, the field can pursue a higher level of scientific enquiry. We describe our vision and report on progress, showing how the adoption of common protocols for all metaheuristics can help liberate the potential of the field, easing the exploration of the design space of metaheuristics.

    Keywords: Evolutionary Computation, Operational Research, Heuristic design, Heuristic methods, Architecture, Frameworks, Interoperability

    1. Introduction: Optimization problems have myriad real-world applications [42] and have motivated a wealth of research since before the advent of the digital computer [25].
  • Particle Swarm Optimization
    PARTICLE SWARM OPTIMIZATION. Thesis submitted to the School of Engineering of the University of Dayton in partial fulfillment of the requirements for the degree of Master of Science in Electrical Engineering, by SaiPrasanth Devarakonda. University of Dayton, Dayton, Ohio, May 2012. Approved by: Raul Ordonez, Ph.D. (advisor, committee chairman), John Loomis, Ph.D. (committee member), Robert Penno, Ph.D. (committee member); John G. Weber, Ph.D. (associate dean), Tony E. Saliba, Ph.D. (dean, School of Engineering).

    ABSTRACT: The particle swarm algorithm is a computational method to optimize a problem iteratively. As the neighborhood determines the sufficiency and frequency of information flow, the static and dynamic neighborhoods are discussed. The characteristics of the different methods for the selection of the algorithm for a particular problem are summarized. The performance of particle swarm optimization with dynamic neighborhood is investigated by three different methods. In the present work, two more benchmark functions are tested using the algorithm. Conclusions are drawn by testing the different benchmark functions that reflect the performance of the PSO with dynamic neighborhood, and all the benchmark functions are analyzed by both synchronous and asynchronous PSO algorithms.

    The thesis is dedicated to the author's grandmother, Jogi Lakshmi Narasamma, and thanks his advisor, Dr. Raul Ordonez, for mentorship, guidance and personal support during his graduate studies and thesis work.
  • Metaheuristic Optimization Frameworks: a Survey and Benchmarking
    Soft Computing, DOI 10.1007/s00500-011-0754-8, ORIGINAL PAPER. Metaheuristic optimization frameworks: a survey and benchmarking. José Antonio Parejo, Antonio Ruiz-Cortés, Sebastián Lozano, Pablo Fernandez. © Springer-Verlag 2011

    Abstract: This paper performs an unprecedented comparative study of metaheuristic optimization frameworks. As criteria for comparison, a set of 271 features grouped in 30 characteristics and 6 areas has been selected. These features include the different metaheuristic techniques covered, mechanisms for solution encoding, constraint handling, neighborhood specification, hybridization, parallel and distributed computation, software engineering best practices, documentation and user interface, etc. A metric has been defined for each feature so that the scores obtained by a framework are averaged within each group of features, leading to a final average score for each framework. Out of 33 frameworks, ten have been selected from the literature using well-defined filtering criteria, and the results of the comparison are analyzed with the aim of identifying improvement areas and gaps in specific frameworks and the whole set.

    … and affordable time and cost. However, heuristics are usually based on specific characteristics of the problem at hand, which makes their design and development a complex task. In order to solve this drawback, metaheuristics appear as a significant advance (Glover 1977); they are problem-agnostic algorithms that can be adapted to incorporate the problem-specific knowledge. Metaheuristics have been remarkably developed in recent decades (Voß 2001), becoming popular and being applied to many problems in diverse areas (Glover and Kochenberger 2002; Back et al. 1997). However, when new problems are considered, metaheuristics should be implemented and tested, implying costs and risks. As a solution, the object-oriented paradigm has become a successful mechanism used to ease the burden of application development and particularly, on adapting a given …
  • Parameter Meta-Optimization of Metaheuristic Optimization Algorithms
    Fachhochschul master's degree program Software Engineering, 4232 Hagenberg, Austria. Parameter Meta-Optimization of Metaheuristic Optimization Algorithms. Diploma thesis submitted for the academic degree Master of Science in Engineering by Christoph Neumüller, BSc. Reviewer: Prof. (FH) DI Dr. Stefan Wagner. September 2011.

    Declaration (translated from German): I hereby declare in lieu of oath that I wrote this thesis independently and without outside help, that I used no sources or aids other than those indicated, and that I marked as such all passages taken from other sources. Hagenberg, September 4, 2011. Christoph Neumüller, BSc.

    Contents: Abstract; Kurzfassung; 1 Introduction (1.1 Motivation and Goal; 1.2 Structure and Content); 2 Theoretical Foundations (2.1 Metaheuristic Optimization: Trajectory-Based Metaheuristics, Population-Based Metaheuristics, Optimization Problems, Operators; 2.2 Parameter Optimization: Parameter Control, Parameter Tuning; 2.3 Related Work in Meta-Optimization); 3 Technical Foundations (3.1 HeuristicLab: Key Concepts, Algorithm Model; 3.2 HeuristicLab Hive: Components); 4 Requirements; 5 Implementation (5.1 Solution Encoding: Parameter Trees in HeuristicLab, Parameter Configuration Trees, Search Ranges, Symbolic Expression Grammars; 5.2 Fitness Function: Handling of Infeasible Solutions; 5.3 Operators: Solution Creator …)
  • Rapid and Flexible User-Defined Low-Level Hybridization for Metaheuristics Algorithm in Software Framework
    Journal of Software Engineering and Applications, 2012, 5, 873-882. http://dx.doi.org/10.4236/jsea.2012.511102. Published Online November 2012 (http://www.SciRP.org/journal/jsea). Rapid and Flexible User-Defined Low-Level Hybridization for Metaheuristics Algorithm in Software Framework. S. Masrom, Siti Z. Z. Abidin, N. Omar. Faculty of Computer and Mathematical Sciences, Universiti Teknologi MARA, Shah Alam, Malaysia. Email: *[email protected], [email protected], [email protected]. Received September 23rd, 2012; revised October 21st, 2012; accepted October 30th, 2012.

    ABSTRACT: The metaheuristics algorithm is increasingly important in solving many kinds of real-life optimization problems, but the implementation involves programming difficulties. As a result, many researchers have relied on software frameworks to accelerate the development life cycle. However, the available software frameworks were mostly designed for rapid development rather than flexible programming. Therefore, in order to extend software functions, this approach involves modifying software libraries, which requires the programmers to have in-depth understanding of the internal working structure of the software and the programming language. Besides, it has restricted programmers from implementing flexible user-defined low-level hybridization. This paper presents the concepts and formal definition of metaheuristics and their low-level hybridization. In addition, the weaknesses of current programming approaches supported by available software frameworks for metaheuristics are discussed. Responding to the deficiencies, this paper introduces a rapid and flexible software framework with a scripting language environment. This approach is more flexible for programmers to create a variety of user-defined low-level hybridizations, rather than being bound to the built-in metaheuristics strategies in software libraries.
  • Integrating HeuristicLab with Compilers and Interpreters for Non-Functional Code Optimization
    Integrating HeuristicLab with Compilers and Interpreters for Non-Functional Code Optimization. Daniel Dorfmeister (Software Competence Center Hagenberg, Hagenberg, Austria; [email protected]), Oliver Krauss (Johannes Kepler University Linz, Linz, Austria; University of Applied Sciences Upper Austria, Hagenberg, Austria; [email protected])

    ABSTRACT: Modern compilers and interpreters provide code optimizations during compile and run time, simplifying the development process for the developer and resulting in optimized software. These optimizations are often based on formal proof, or alternatively stochastic optimizations have recovery paths as backup. The Genetic Compiler Optimization Environment (GCE) uses a novel approach, which utilizes genetic improvement to optimize the run-time performance of code with stochastic machine learning techniques. In this paper, we propose an architecture to integrate GCE, which directly integrates with low-level interpreters and compilers, with HeuristicLab, a high-level optimization framework that features a wide range of heuristic and evolutionary algorithms, and a graphical user interface to control and monitor the machine learning process.

    1 INTRODUCTION: The Genetic Compiler Optimization Environment (GCE) [5, 6] integrates genetic improvement (GI) [9, 10] with execution environments, i.e., the Truffle interpreter [23] and Graal compiler [13]. Truffle is a language prototyping and interpreter framework that interprets abstract syntax trees (ASTs). Truffle already provides implementations of multiple programming languages such as Python, Ruby, JavaScript and C. Truffle languages are compiled, optimized and executed via the Graal compiler directly in the JVM. Thus, GCE provides an option to generate, modify and evaluate code directly at the interpreter and compiler level and enables research in that area. HeuristicLab [20-22] is an optimization framework that provides a multitude of meta-heuristic algorithms, research problems from …
  • HeuristicLab References
    Facts: HeuristicLab provides a feature-rich software environment for heuristic optimization researchers and practitioners. It is based on a generic and flexible model layer and offers a graphical algorithm designer that enables the user to create, apply, and analyze heuristic optimization methods. A powerful experimenter allows HeuristicLab users to design and perform parameter tests, even in parallel. The results of these tests can be stored and analyzed easily in several configurable charts. HeuristicLab is available under the GPL license and is currently used in education, research, and industry projects. (dev.heuristiclab.com)

    System requirements: Microsoft Windows XP / Vista / 7 / 8; Microsoft .NET Framework 4.0 (full version). Download: http://dev.heuristiclab.com

    Contact: Heuristic and Evolutionary Algorithms Laboratory (HEAL), University of Applied Sciences Upper Austria, Softwarepark 11, 4232 Hagenberg, Austria. Phone: +43 7236 3888 2030. Web: http://heal.heuristiclab.com. E-Mail: [email protected]. FH OÖ Forschungs & Entwicklungs GmbH, Franz-Fritsch-Str. 11/Top 3, 4600 Wels/Austria. Phone: +43 (0)7242 44808-43. Fax: +43 (0)7242 44808-77. E-Mail: [email protected]. www.fh-ooe.at

    Features: Rich user experience: a comfortable and feature-rich graphical user interface enables non-programmers to use and apply HeuristicLab. The development of HeuristicLab started in 2002, when a group of researchers in the heuristic optimization domain decided to build a software system for exploring new research. Algorithm developers can use a number of included well-known metaheuristics, a large library of operators, a graphical algorithm designer and an experiment designer to create and …
  • Parameter Identification for Simulation Models by Heuristic Optimization
    Parameter Identification for Simulation Models by Heuristic Optimization. Michael Kommenda, Stephan Winkler. FH OÖ Forschungs & Entwicklungs GmbH, Softwarepark 13, A-4232 Hagenberg, AUSTRIA

    ABSTRACT: In this publication we describe a generic parameter identification approach that couples the heuristic optimization framework HeuristicLab with simulation models implemented in MATLAB or Scilab. This approach enables the reuse of already available optimization algorithms in HeuristicLab, such as evolution strategies, gradient-based optimization algorithms, or evolutionary algorithms, and of simulation models implemented in the targeted simulation environment. Hence, the configuration effort is minimized and the only necessary step to perform the parameter identification is the definition of an objective function that calculates the quality of a set of parameters proposed by the optimization algorithm; this quality is here calculated by comparing originally measured values and those produced by the simulation model using the proposed parameters. The suitability of this parameter identification approach is demonstrated using an exemplary use-case, where the mass and the two friction coefficients of an electric cart system are identified by applying two different optimization algorithms, namely the Broyden-Fletcher-Goldfarb-Shanno algorithm and the covariance matrix adaptation evolution strategy. Using the here described approach, a multitude of optimization algorithms becomes applicable for parameter identification.

    1 INTRODUCTION: Simulation models are used for describing processes and systems of the real world as closely as possible, analyzing and predicting system behavior, and testing alternatives for the original system's design or parametrization. Dynamical technical systems are often modeled using differential equation systems; usually, first appropriate equation systems are defined, and afterwards their parameters are adjusted so that the resulting set of equations resembles the modeled system as closely as possible.
  • Algorithm and Experiment Design with HeuristicLab
    19.06.2011. Algorithm and Experiment Design with HeuristicLab: An Open Source Optimization Environment for Research and Education. S. Wagner, G. Kronberger. Heuristic and Evolutionary Algorithms Laboratory (HEAL), School of Informatics, Communications and Media, Campus Hagenberg, Upper Austria University of Applied Sciences. (ICCGI 2011, http://dev.heuristiclab.com)

    Instructor biographies:
    • Stefan Wagner: MSc in computer science (2004) and PhD in technical sciences (2009), Johannes Kepler University Linz, Austria; associate professor (2005-2009) and full professor for complex software systems (since 2009), Upper Austria University of Applied Sciences; co-founder of the HEAL research group; project manager and chief architect of HeuristicLab. http://heal.heuristiclab.com/team/wagner
    • Gabriel Kronberger: MSc in computer science (2005) and PhD in technical sciences (2010), Johannes Kepler University Linz, Austria; research assistant (since 2005), Upper Austria University of Applied Sciences; member of the HEAL research group; architect of HeuristicLab. http://heal.heuristiclab.com/team/kronberger

    Agenda: objectives of the tutorial; introduction; where to get HeuristicLab; plugin infrastructure; graphical user interface; available algorithms & problems; demonstration; some additional features; planned features; team; suggested readings; bibliography; questions & answers.
  • PhD Thesis, Technical University of Denmark, Institute of Mathematical Modelling
    JOHANNES KEPLER UNIVERSITÄT LINZ (JKU), Technisch-Naturwissenschaftliche Fakultät. Adaptive Heuristic Approaches for Dynamic Vehicle Routing - Algorithmic and Practical Aspects. Dissertation submitted for the academic degree of Doktor in the doctoral program in technical sciences by Stefan Vonolfen, MSc. Carried out at the Institut für Formale Modelle und Verifikation. Reviewers: Priv.-Doz. Dr. Michael Affenzeller (advisor), Univ.-Prof. Dr. Karl Dörner. Linz, June 2014.

    Acknowledgments: The work presented in this thesis would not have been possible without the many fruitful discussions with my colleagues from the research group Heuristic and Evolutionary Algorithms Laboratory (HEAL) and without the HeuristicLab optimization environment as a software infrastructure (the web pages of all members of HEAL, as well as further information about HeuristicLab, can be found at http://www.heuristiclab.com). Particularly, I would like to thank Michael Affenzeller for his guidance concerning the algorithmic aspects of this thesis as my supervisor, as well as for providing a very supportive working environment as the research group head. I would also like to thank Stefan Wagner for triggering my research interest in metaheuristic algorithms during the supervision of my bachelor and master theses. The discussions with my colleagues Andreas Beham and Erik Pitzer about vehicle routing, fitness landscape analysis, and algorithm selection led to many ideas presented in this thesis. Michael Kommenda provided important insights on genetic programming and Monika Kofler contributed knowledge about storage assignment. Stephan Hutterer was working on the generation of policies for smart grids and pointed out many links to related literature. I would also like to thank Prof. Karl Dörner from the institute for production and logistics management at the JKU for giving me the possibility to present my work at the ORP3 workshop as well as during a seminar at his institute.
  • Simulation Optimization with HeuristicLab
    SIMULATION OPTIMIZATION WITH HEURISTICLAB. Andreas Beham, Michael Affenzeller, Stefan Wagner, Gabriel K. Kronberger. Upper Austria University of Applied Sciences, Campus Hagenberg, School of Informatics, Communication and Media, Heuristic and Evolutionary Algorithms Laboratory, Softwarepark 11, A-4232 Hagenberg, Austria. E-mail: [email protected], [email protected], [email protected], [email protected]

    ABSTRACT: Simulation optimization today is an important branch in the field of heuristic optimization problems. Several simulators include built-in optimization, and several companies have emerged that offer optimization strategies for different simulators. Often the optimization strategy is a secret and only sparse information is known about its inner workings. In this paper we want to demonstrate how the general and open optimization environment HeuristicLab, in its latest version, can be used to optimize simulation models. Keywords: simulation-based optimization, evolutionary algorithms, metaoptimization.

    1. INTRODUCTION: In Ólafsson and Kim (2002), simulation optimization is defined as “the process of finding the best values of some decision variables for a system where the performance is evaluated based on the output of a simulation model of this system”. … structure and the parameters of the designed optimization strategy, and can be executed in the HeuristicLab optimization environment. After the execution has terminated or was aborted, the results, or respectively intermediate results, are saved along with the workbench. So in addition to the structure and parameters, the results can also be saved in a single document, reopened at any time and examined.

    2. DESCRIPTION OF THE FRAMEWORK: The user interface to an evolution strategy (ES) is shown in Figure 1. The methods defined by this interface are common to many optimization strategies, and reusing them is one idea of the optimization environment.
  • Heuristic and Evolutionary Algorithms Lab (HEAL), Softwarepark 11, A-4232 Hagenberg
    Funded by (gefördert durch). Contact: HOPL - Heuristic Optimization in Production and Logistics. Dr. Michael Affenzeller, FH OOE - School of Informatics, Communications and Media, Heuristic and Evolutionary Algorithms Lab (HEAL), Softwarepark 11, A-4232 Hagenberg. E-mail: [email protected]. Web: http://heal.heuristiclab.com, http://dev.heuristiclab.com. GECCO 2016, Industrial Applications & Evolutionary Computation in Practice Day.

    Research group HEAL: 5 professors, 7 PhD students, plus interns, master and bachelor students. Research focus: problem modeling; process optimization; data-based structure identification; supply chain and logistics optimization; algorithm development and analysis. Scientific partners and industry partners (excerpt).

    Metaheuristics: intelligent search strategies that can be applied to different problems and explore interesting regions of the search space. They trade off computation time against solution quality, yielding good solutions for very complex problems, but must be tuned to applications. Challenges: choice of appropriate metaheuristics, and hybridization - finding needles in haystacks.

    Research focus keywords: VNS, ALPS, PSO, structure identification, machine learning, GP, data mining, modeling and simulation, regression, ES, time-series, classification, SASEGASA, neural networks, SEGA, TS, GA, SA, statistics, production planning and logistics optimization, operations research.