Particle Swarm Optimization


PARTICLE SWARM OPTIMIZATION

Thesis submitted to the School of Engineering of the University of Dayton in partial fulfillment of the requirements for the degree of Master of Science in Electrical Engineering.

By SaiPrasanth Devarakonda
University of Dayton, Dayton, Ohio
May 2012

APPROVED BY:

• Raul Ordonez, Ph.D., Advisor Committee Chairman; Associate Professor, Electrical & Computer Engineering
• John Loomis, Ph.D., Committee Member; Associate Professor, Electrical & Computer Engineering
• Robert Penno, Ph.D., Committee Member; Associate Professor, Electrical & Computer Engineering
• John G. Weber, Ph.D., Associate Dean, School of Engineering
• Tony E. Saliba, Ph.D., Dean, School of Engineering & Wilke Distinguished Professor

ABSTRACT

Name: Devarakonda, SaiPrasanth. University of Dayton. Advisor: Dr. Raul Ordonez.

The particle swarm algorithm is a computational method that optimizes a problem iteratively. Because the neighborhood determines how sufficiently and how frequently information flows among the particles, both static and dynamic neighborhoods are discussed. The characteristics of the different methods for selecting the algorithm for a particular problem are summarized. The performance of particle swarm optimization with a dynamic neighborhood is investigated using three different methods. In the present work, two additional benchmark functions are tested with the algorithm. Conclusions about the performance of PSO with a dynamic neighborhood are drawn from tests on the different benchmark functions, and all benchmark functions are analyzed with both synchronous and asynchronous PSO algorithms.

DEDICATION

This thesis is dedicated to my grandmother Jogi Lakshmi Narasamma.

ACKNOWLEDGMENTS

I would like to thank my advisor Dr. Raul Ordonez for being my mentor and guide, for supporting me personally during my graduate studies and the thesis work, and for offering me excellent ideas. I also wish to express my deepest gratitude to Dr. Veysel Gazi, who, along with my advisor, offered me his help while I worked on my thesis. I would also like to thank Dr. John Loomis and Dr. Robert Penno for serving as committee members. I would like to express my appreciation to my brother, who has helped with my work. I would like to thank everyone in the Electrical Department for making me feel comfortable during my two and a half years of study at the University of Dayton. Finally, I thank my family for their support and love in all activities during my time in the graduate program.

TABLE OF CONTENTS

ABSTRACT
DEDICATION
ACKNOWLEDGMENTS
LIST OF FIGURES
LIST OF TABLES

1. INTRODUCTION
   1.1 Particle Swarm Optimization Algorithm
       1.1.1 Particle Swarm Optimization with Constriction Factor
   1.2 Hybrid Particle Swarm Optimization Algorithms
   1.3 Parallel and Distributed Implementation
   1.4 Multi-Objective Optimization
   1.5 Stability and Convergence Analysis
   1.6 Application Areas
       1.6.1 Neural Network Training
       1.6.2 Dynamic Tracking
       1.6.3 Multi-Agent Search
       1.6.4 Wireless Sensor Networks
       1.6.5 Optimal Design of Power Grids
       1.6.6 PSO for Multi-User Detection in CDMA
2. NEIGHBORHOOD TOPOLOGIES
   2.1 Static Neighborhood
   2.2 Dynamic Neighborhood
       2.2.1 Nearest Neighbors in Search Space
       2.2.2 Nearest Neighbors in Function Space
       2.2.3 Random Neighborhood
   2.3 Synchronous and Asynchronous PSO
       2.3.1 Synchronous PSO
       2.3.2 Asynchronous PSO
3. RESULTS - I
4. RESULTS - II
5. RESULTS - III
   5.1 Synchronous PSO
   5.2 Asynchronous PSO
6. CONCLUSIONS
   6.1 Design Guidelines
   6.2 Future Work
BIBLIOGRAPHY

APPENDICES:
A. MATLAB code for the synchronous PSO algorithm with dynamic neighborhood for the DejongF4 function
B. MATLAB code for the asynchronous PSO algorithm with dynamic neighborhood for the DejongF4 function
C. MATLAB code for the synchronous PSO algorithm with the number of particles as a parameter for the DejongF4 function
D. MATLAB code for the asynchronous PSO algorithm with dynamic neighborhood for the DejongF4 function

LIST OF FIGURES

2.1 Static neighborhood topologies.
2.2 Nearest neighbors in search space.
2.3 Nearest neighbors in function space.
3.1 Contour plots of all six benchmark functions.
3.2 Distance between particles in search space against average global value for the Sphere function.
3.3 Distance between particles in search space against average global value for the Griewank function.
3.4 Distance between particles in search space against average global value for the Rastrigin function.
3.5 Distance between particles in search space against average global value for the Rosenbrock function.
3.6 Distance between particles in search space against average global value for the Ackley function.
3.7 Distance between particles in search space against average global value for the DejongF4 function.
3.8 Distance between particles in function space against average global value for the Sphere function.
3.9 Distance between particles in function space against average global value for the Griewank function.
3.10 Distance between particles in function space against average global value for the Rastrigin function.
3.11 Distance between particles in function space against average global value for the Rosenbrock function.
3.12 Distance between particles in function space against average global value for the Ackley function.
3.13 Distance between particles in function space against average global value for the DejongF4 function.
3.14 Probability of particles being neighbors against mean global best value for the Sphere function.
3.15 Probability of particles being neighbors against mean global best value for the Griewank function.
3.16 Probability of particles being neighbors against mean global best value for the Rastrigin function.
3.17 Probability of particles being neighbors against mean global best value for the Rosenbrock function.
3.18 Probability of particles being neighbors against mean global best value for the Ackley function.
3.19 Probability of particles being neighbors against mean global best value for the DejongF4 function.
3.20 Distance between particles in search space against average global best value for synchronous PSO.
3.21 Distance between particles in search space against average global best value for asynchronous PSO.
4.1 Distance between particles in search space against average global value for the Sphere function.
4.2 Distance between particles in search space against average global value for the Griewank function.
4.3 Distance between particles in search space against average global value for the Rastrigin function.
4.4 Distance between particles in search space against average global value for the Rosenbrock function.
4.5 Distance between particles in search space against average global value for the Ackley function.
4.6 Distance between particles in search space against average global value for the DejongF4 function.
4.7 Distance between particles in function space against average global value for the Sphere function.
4.8 Distance between particles in function space against average global value for the Griewank function.
4.9 Distance between particles in function space against average global value for the Rastrigin function.
4.10 Distance between particles in function space against average global value for the Rosenbrock function.
4.11 Distance between particles in function space against average global value for the Ackley function.
4.12 Distance between particles in function space against average global value for the DejongF4 function.
4.13 Distance between particles in random neighborhood against average global value for the Sphere function.
4.14 Distance between particles in random neighborhood against average global value for the Griewank function.
4.15 Distance between particles in random neighborhood against average global value for the Rastrigin function.
4.16 Distance between particles in random neighborhood against average global value for the Rosenbrock function.
4.17 Probability of particles being neighbors against mean global best value for the Ackley function.
4.18 Probability of particles being neighbors against mean global best value for the DejongF4 function.
4.19 Neighborhood size expressed as percentage of function space against average global best value for synchronous PSO.
4.20 Neighborhood size expressed as percentage of function space against average global best value for asynchronous PSO.
5.1 Average global best value versus number of neighbors for the Sphere function.
5.2 Average global best value versus number of neighbors for the Griewank function.
5.3 Average global best value versus number of neighbors for the Rastrigin function.
5.4 Average global best value versus number of neighbors for the Rosenbrock function.
5.5 Average global best value versus number of neighbors for the Ackley function.
5.6 Average global best value versus number of neighbors for the DejongF4 function.
5.7 Average global best value versus number of neighbors for the Sphere function.
5.8 Average global best value versus number of neighbors for the Griewank function.
5.9 Average global best value versus number of neighbors for the Rastrigin function.
5.10 Average global best value versus number of neighbors for the Rosenbrock function.
5.11 Average global best value versus number of neighbors for the Ackley function.
5.12 Average global best value versus number of neighbors for the DejongF4 function.
6.1 Comparison of synchronous PSO and asynchronous PSO in a static neighborhood.
6.2 Comparison of synchronous PSO and asynchronous PSO in a dynamic neighborhood.

LIST OF TABLES

3.1 Results for neighborhood determination based on nearest neighbors in the search space.
3.2 Results for neighborhood determination based on nearest neighbors in the search space.
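The thesis's own implementations (Appendices A-D) are in MATLAB and are not reproduced in this preview. For orientation only, the sketch below shows a minimal synchronous, global-best PSO on the Sphere benchmark in Python. The inertia-weight parameters (w = 0.729, c1 = c2 = 1.49445) are conventional constriction-equivalent defaults from the PSO literature, not values taken from the thesis, and the fully connected (global-best) neighborhood stands in for the static and dynamic neighborhoods the thesis actually studies.

```python
import numpy as np

def sphere(x):
    """Sphere benchmark: f(x) = sum_i x_i^2, minimum 0 at the origin."""
    return float(np.sum(x ** 2))

def pso(f, dim=2, n=30, iters=200, lo=-5.12, hi=5.12,
        w=0.729, c1=1.49445, c2=1.49445, seed=0):
    """Minimal synchronous, global-best PSO (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n, dim))            # particle positions
    v = np.zeros((n, dim))                       # particle velocities
    pbest = x.copy()                             # personal best positions
    pbest_f = np.array([f(p) for p in x])        # personal best values
    gbest = pbest[pbest_f.argmin()].copy()       # global best position

    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        # Velocity update: inertia + cognitive (own best) + social (global best).
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        # Synchronous PSO: bests are refreshed only after ALL particles moved.
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, float(pbest_f.min())

print(pso(sphere))
```

An asynchronous variant would instead update the personal and global bests inside a per-particle loop, so particles updated later in the same iteration already see the newer global best; that ordering difference is exactly the synchronous/asynchronous distinction the thesis compares.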
Recommended publications
  • Metaheuristics "In the Large"
    Metaheuristics "In the Large". Jerry Swan, Steven Adriaensen, Alexander E. I. Brownlee, Kevin Hammond, Colin G. Johnson, Ahmed Kheiri, Faustyna Krawiec, J. J. Merelo, Leandro L. Minku, Ender Özcan, Gisele L. Pappa, Pablo García-Sánchez, Kenneth Sörensen, Stefan Voß, Markus Wagner, David R. White.

    Abstract: Following decades of sustained improvement, metaheuristics are one of the great success stories of optimization research. However, in order for research in metaheuristics to avoid fragmentation and a lack of reproducibility, there is a pressing need for stronger scientific and computational infrastructure to support the development, analysis and comparison of new approaches. To this end, we present the vision and progress of the "Metaheuristics 'In the Large'" project. The conceptual underpinnings of the project are: truly extensible algorithm templates that support reuse without modification, white box problem descriptions that provide generic support for the injection of domain specific knowledge, and remotely accessible frameworks, components and problems that will enhance reproducibility and accelerate the field's progress. We argue that, via principled choice of infrastructure support, the field can pursue a higher level of scientific enquiry. We describe our vision and report on progress, showing how the adoption of common protocols for all metaheuristics can help liberate the potential of the field, easing the exploration of the design space of metaheuristics.

    Keywords: Evolutionary Computation, Operational Research, Heuristic design, Heuristic methods, Architecture, Frameworks, Interoperability

    1. Introduction: Optimization problems have myriad real world applications [42] and have motivated a wealth of research since before the advent of the digital computer [25]. ...
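    The abstract's two conceptual underpinnings, extensible algorithm templates and white-box problem descriptions, can be pictured with a small sketch. This is not the project's actual API; it is a generic Python illustration in which a reusable search template consumes a problem described purely by injected components (the `Problem` fields and the `hill_climb` template are names invented for this example).

```python
import random
from dataclasses import dataclass
from typing import Callable, Generic, TypeVar

S = TypeVar("S")  # solution representation

@dataclass
class Problem(Generic[S]):
    """White-box problem description: domain knowledge is injected as
    components instead of being hard-coded into the algorithm."""
    initial: Callable[[], S]        # build a starting solution
    evaluate: Callable[[S], float]  # objective value (lower is better)
    neighbour: Callable[[S], S]     # domain-specific move operator

def hill_climb(problem: Problem[S], iters: int = 1000) -> S:
    """Algorithm template: reusable across problems without modification."""
    best = problem.initial()
    best_f = problem.evaluate(best)
    for _ in range(iters):
        cand = problem.neighbour(best)
        cand_f = problem.evaluate(cand)
        if cand_f < best_f:
            best, best_f = cand, cand_f
    return best

# The same template solves any problem expressed through the three components.
continuous = Problem(
    initial=lambda: [random.uniform(-5, 5) for _ in range(3)],
    evaluate=lambda xs: sum(x * x for x in xs),
    neighbour=lambda xs: [x + random.gauss(0, 0.1) for x in xs],
)
print(hill_climb(continuous))
```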
  • Metaheuristic Optimization Frameworks: a Survey and Benchmarking
    Soft Computing, DOI 10.1007/s00500-011-0754-8 (Original Paper). Metaheuristic optimization frameworks: a survey and benchmarking. José Antonio Parejo, Antonio Ruiz-Cortés, Sebastián Lozano, Pablo Fernandez. © Springer-Verlag 2011.

    Abstract: This paper performs an unprecedented comparative study of metaheuristic optimization frameworks. As criteria for comparison, a set of 271 features grouped in 30 characteristics and 6 areas has been selected. These features include the different metaheuristic techniques covered, mechanisms for solution encoding, constraint handling, neighborhood specification, hybridization, parallel and distributed computation, software engineering best practices, documentation and user interface, etc. A metric has been defined for each feature so that the scores obtained by a framework are averaged within each group of features, leading to a final average score for each framework. Out of 33 frameworks, ten have been selected from the literature using well-defined filtering criteria, and the results of the comparison are analyzed with the aim of identifying improvement areas and gaps in specific frameworks and in the whole set.

    From the introduction: ... and affordable time and cost. However, heuristics are usually based on specific characteristics of the problem at hand, which makes their design and development a complex task. In order to solve this drawback, metaheuristics appear as a significant advance (Glover 1977); they are problem-agnostic algorithms that can be adapted to incorporate the problem-specific knowledge. Metaheuristics have been remarkably developed in recent decades (Voß 2001), becoming popular and being applied to many problems in diverse areas (Glover and Kochenberger 2002; Back et al. 1997). However, when new problems are considered, metaheuristics should be implemented and tested, implying costs and risks. As a solution, the object-oriented paradigm has become a successful mechanism used to ease the burden of application development and, particularly, of adapting a given ...
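    The survey's scoring metric, as the abstract describes it, is a two-level average: feature scores are averaged within each group of features, and the group averages are then averaged into a final score per framework. A tiny sketch of that computation (all group names and scores hypothetical):

```python
# Hypothetical feature scores in [0, 1] for one framework, grouped by characteristic.
framework_scores = {
    "techniques_covered": [1.0, 0.5, 0.0, 1.0],
    "hybridization":      [0.5, 0.0],
    "documentation":      [1.0, 1.0, 0.5],
}

# Average within each group of features ...
group_means = {g: sum(v) / len(v) for g, v in framework_scores.items()}
# ... then average the group means into the framework's final score.
final_score = sum(group_means.values()) / len(group_means)
print(group_means, round(final_score, 3))
```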
  • Parameter Meta-Optimization of Metaheuristic Optimization Algorithms
    University of Applied Sciences master's programme Software Engineering, 4232 Hagenberg, Austria. Parameter Meta-Optimization of Metaheuristic Optimization Algorithms. Diploma thesis submitted for the academic degree Master of Science in Engineering by Christoph Neumüller, BSc. Examiner: Prof. (FH) DI Dr. Stefan Wagner. September 2011.

    Declaration: I hereby declare in lieu of oath that I have written this thesis independently and without outside help, that I have used no sources or aids other than those indicated, and that I have marked passages taken from other sources as such. Hagenberg, September 4, 2011. Christoph Neumüller, BSc.

    Contents:
    Declaration
    Abstract
    Kurzfassung (German abstract)
    1 Introduction
      1.1 Motivation and Goal
      1.2 Structure and Content
    2 Theoretical Foundations
      2.1 Metaheuristic Optimization
        2.1.1 Trajectory-Based Metaheuristics
        2.1.2 Population-Based Metaheuristics
        2.1.3 Optimization Problems
        2.1.4 Operators
      2.2 Parameter Optimization
        2.2.1 Parameter Control
        2.2.2 Parameter Tuning
      2.3 Related Work in Meta-Optimization
    3 Technical Foundations
      3.1 HeuristicLab
        3.1.1 Key Concepts
        3.1.2 Algorithm Model
      3.2 HeuristicLab Hive
        3.2.1 Components
    4 Requirements
    5 Implementation
      5.1 Solution Encoding
        5.1.1 Parameter Trees in HeuristicLab
        5.1.2 Parameter Configuration Trees
        5.1.3 Search Ranges
        5.1.4 Symbolic Expression Grammars
      5.2 Fitness Function
        5.2.1 Handling of Infeasible Solutions
      5.3 Operators
        5.3.1 Solution Creator ...
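    Parameter meta-optimization, the thesis's subject, has a simple core shape: an outer optimizer searches the parameter space of an inner metaheuristic, and the fitness of a parameter setting is the inner algorithm's performance averaged over repeated runs. The sketch below shows that loop in miniature; it is not HeuristicLab's implementation, and the single step-size parameter, the trial counts and the log-scale sampling are all choices invented for the example.

```python
import random

def inner_search(step_size: float, iters: int = 200) -> float:
    """Toy inner metaheuristic: a hill climb on a 1-D sphere function.
    Returns the best objective value found (lower is better)."""
    x = random.uniform(-5, 5)
    best = x * x
    for _ in range(iters):
        cand = x + random.gauss(0, step_size)
        if cand * cand < best:
            x, best = cand, cand * cand
    return best

def meta_optimize(trials: int = 50, repeats: int = 10) -> float:
    """Outer loop: tune the inner algorithm's step_size parameter.
    A setting's fitness is the mean result over repeated runs, since
    the inner algorithm is stochastic."""
    best_param, best_fitness = None, float("inf")
    for _ in range(trials):
        step = 10 ** random.uniform(-3, 1)  # sample step sizes on a log scale
        fitness = sum(inner_search(step) for _ in range(repeats)) / repeats
        if fitness < best_fitness:
            best_param, best_fitness = step, fitness
    return best_param

print(meta_optimize())
```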
  • Rapid and Flexible User-Defined Low-Level Hybridization for Metaheuristics Algorithm in Software Framework
    Journal of Software Engineering and Applications, 2012, 5, 873-882. http://dx.doi.org/10.4236/jsea.2012.511102. Published online November 2012 (http://www.SciRP.org/journal/jsea). Rapid and Flexible User-Defined Low-Level Hybridization for Metaheuristics Algorithm in Software Framework. S. Masrom, Siti Z. Z. Abidin, N. Omar. Faculty of Computer and Mathematical Sciences, Universiti Teknologi MARA, Shah Alam, Malaysia. Received September 23rd, 2012; revised October 21st, 2012; accepted October 30th, 2012.

    Abstract: The metaheuristics algorithm is increasingly important in solving many kinds of real-life optimization problems, but the implementation involves programming difficulties. As a result, many researchers have relied on software frameworks to accelerate the development life cycle. However, the available software frameworks were mostly designed for rapid development rather than flexible programming. Therefore, in order to extend software functions, this approach involves modifying software libraries, which requires the programmers to have in-depth understanding of the internal working structure of the software and the programming language. Besides, it has restricted programmers from implementing flexible user-defined low-level hybridization. This paper presents the concepts and formal definition of metaheuristics and its low-level hybridization. In addition, the weaknesses of current programming approaches supported by available software frameworks for metaheuristics are discussed. Responding to the deficiencies, this paper introduces a rapid and flexible software framework with a scripting language environment. This approach is more flexible for programmers to create a variety of user-defined low-level hybridizations, rather than being bound to the built-in metaheuristics strategies in software libraries.
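    Low-level hybridization, in the paper's sense, embeds one metaheuristic inside another's iteration loop rather than chaining whole algorithms sequentially. The sketch below is not the paper's framework or scripting language; it is a generic Python illustration in which a short hill-climbing local search refines each offspring inside a toy genetic algorithm's generation loop, with all operators and constants invented for the example.

```python
import random

def evaluate(xs):
    """Toy objective: sphere function (lower is better)."""
    return sum(x * x for x in xs)

def local_search(xs, steps=20, sigma=0.05):
    """Embedded metaheuristic: a short hill climb used as a refinement step."""
    best, best_f = xs, evaluate(xs)
    for _ in range(steps):
        cand = [x + random.gauss(0, sigma) for x in best]
        cand_f = evaluate(cand)
        if cand_f < best_f:
            best, best_f = cand, cand_f
    return best

def hybrid_ga(pop_size=20, dim=3, generations=50):
    """Low-level hybrid: local search runs INSIDE the generation loop,
    rather than before or after the GA as in a high-level hybrid."""
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=evaluate)
        parents = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 + random.gauss(0, 0.1) for x, y in zip(a, b)]
            children.append(local_search(child))  # the low-level hybrid step
        pop = parents + children
    return min(pop, key=evaluate)

print(evaluate(hybrid_ga()))
```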
  • Integrating HeuristicLab with Compilers and Interpreters for Non-Functional Code Optimization
    Integrating HeuristicLab with Compilers and Interpreters for Non-Functional Code Optimization. Daniel Dorfmeister (Software Competence Center Hagenberg, Hagenberg, Austria) and Oliver Krauss (Johannes Kepler University Linz, Linz, Austria; University of Applied Sciences Upper Austria, Hagenberg, Austria).

    Abstract: Modern compilers and interpreters provide code optimizations during compile and run time, simplifying the development process for the developer and resulting in optimized software. These optimizations are often based on formal proof, or alternatively stochastic optimizations have recovery paths as backup. The Genetic Compiler Optimization Environment (GCE) uses a novel approach, which utilizes genetic improvement to optimize the run-time performance of code with stochastic machine learning techniques. In this paper, we propose an architecture to integrate GCE, which directly integrates with low-level interpreters and compilers, with HeuristicLab, a high-level optimization framework that features a wide range of heuristic and evolutionary algorithms, and a graphical user interface to control and monitor the machine learning process.

    1 Introduction: Genetic Compiler Optimization Environment (GCE) [5, 6] integrates genetic improvement (GI) [9, 10] with execution environments, i.e., the Truffle interpreter [23] and Graal compiler [13]. Truffle is a language prototyping and interpreter framework that interprets abstract syntax trees (ASTs). Truffle already provides implementations of multiple programming languages such as Python, Ruby, JavaScript and C. Truffle languages are compiled, optimized and executed via the Graal compiler directly in the JVM. Thus, GCE provides an option to generate, modify and evaluate code directly at the interpreter and compiler level and enables research in that area. HeuristicLab [20-22] is an optimization framework that provides a multitude of meta-heuristic algorithms, research problems from ...
  • HeuristicLab References
    HeuristicLab: A Paradigm-Independent and Extensible Environment for Heuristic Optimization.

    Facts: HeuristicLab provides a feature-rich software environment for heuristic optimization researchers and practitioners. It is based on a generic and flexible model layer and offers a graphical algorithm designer that enables the user to create, apply, and analyze heuristic optimization methods. A powerful experimenter allows HeuristicLab users to design and perform parameter tests, even in parallel. The results of these tests can be stored and analyzed easily in several configurable charts. HeuristicLab is available under the GPL license and is currently used in education, research, and industry projects.

    System requirements: Microsoft Windows XP / Vista / 7 / 8; Microsoft .NET Framework 4.0 (full version).
    Download: http://dev.heuristiclab.com
    Contact: Heuristic and Evolutionary Algorithms Laboratory (HEAL), University of Applied Sciences Upper Austria, Softwarepark 11, 4232 Hagenberg, Austria. Phone: +43 7236 3888 2030. Web: http://heal.heuristiclab.com
    Publisher: FH OÖ Forschungs & Entwicklungs GmbH, Franz-Fritsch-Str. 11/Top 3, 4600 Wels/Austria. Phone: +43 (0)7242 44808-43, Fax: +43 (0)7242 44808-77. www.fh-ooe.at

    Features: The development of HeuristicLab started in 2002, when a group of researchers in the heuristic optimization domain decided to build a software system for exploring new research ... Algorithm developers can use a number of included well-known metaheuristics, a large library of operators, a graphical algorithm designer and an experiment designer to create and ... Rich user experience: a comfortable and feature-rich graphical user interface enables non-programmers to use and apply HeuristicLab.
  • Parameter Identification for Simulation Models by Heuristic Optimization (Michael Kommenda, Stephan Winkler)
    Parameter Identification for Simulation Models by Heuristic Optimization. Michael Kommenda, Stephan Winkler. FH OÖ Forschungs & Entwicklungs GmbH, Softwarepark 13, A-4232 Hagenberg, Austria.

    Abstract: In this publication we describe a generic parameter identification approach that couples the heuristic optimization framework HeuristicLab with simulation models implemented in MATLAB or Scilab. This approach enables the reuse of optimization algorithms already available in HeuristicLab, such as evolution strategies, gradient-based optimization algorithms, or evolutionary algorithms, together with simulation models implemented in the targeted simulation environment. Hence, the configuration effort is minimized, and the only step necessary to perform the parameter identification is the definition of an objective function that calculates the quality of a set of parameters proposed by the optimization algorithm; this quality is calculated by comparing originally measured values with those produced by the simulation model using the proposed parameters. The suitability of this parameter identification approach is demonstrated on an exemplary use case, where the mass and the two friction coefficients of an electric cart system are identified by applying two different optimization algorithms, namely the Broyden-Fletcher-Goldfarb-Shanno algorithm and the covariance matrix adaptation evolution strategy. Using the approach described here, a multitude of optimization algorithms becomes applicable for parameter identification.

    1 Introduction: Simulation models are used for describing processes and systems of the real world as closely as possible, analyzing and predicting system behavior, and testing alternatives for the original system's design or parametrization. Dynamical, technical systems are often modeled using differential equation systems; usually, appropriate equation systems are defined first, and afterwards their parameters are adjusted so that the resulting set of equations resembles the modeled system as closely as possible.
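    The heart of the approach is the objective function: simulate with the proposed parameters, then score those parameters by the deviation between simulated and measured values. Below is a minimal self-contained sketch of that pattern; the exponential-decay "model", the synthetic "measurements", and the use of SciPy's BFGS as the optimizer are all stand-ins (the paper couples HeuristicLab to MATLAB/Scilab models of an electric cart instead).

```python
import numpy as np
from scipy.optimize import minimize

def simulate(params, t):
    """Stand-in for the coupled simulation model; here a toy exponential
    decay x(t) = a * exp(-b * t), with (a, b) the parameters to identify."""
    a, b = params
    return a * np.exp(-b * t)

# Synthetic "measured" data: known true parameters (2.0, 0.7) plus noise.
t = np.linspace(0.0, 5.0, 50)
measured = simulate((2.0, 0.7), t) + np.random.default_rng(1).normal(0.0, 0.02, t.size)

def objective(params):
    """Quality of a proposed parameter set: sum of squared errors between
    measured values and the simulation output for those parameters."""
    return float(np.sum((simulate(params, t) - measured) ** 2))

# Any optimizer can now minimize `objective`; BFGS is one of the two
# algorithms the paper applies (used here via SciPy's implementation).
result = minimize(objective, x0=[1.0, 1.0], method="BFGS")
print(result.x)  # should land near the true parameters (2.0, 0.7)
```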
  • Algorithm and Experiment Design with HeuristicLab
    Algorithm and Experiment Design with HeuristicLab: An Open Source Optimization Environment for Research and Education. S. Wagner, G. Kronberger. Heuristic and Evolutionary Algorithms Laboratory (HEAL), School of Informatics, Communications and Media, Campus Hagenberg, Upper Austria University of Applied Sciences. Tutorial slides, ICCGI 2011, June 19, 2011. http://dev.heuristiclab.com

    Instructor biographies:
    • Stefan Wagner: MSc in computer science (2004) and PhD in technical sciences (2009), Johannes Kepler University Linz, Austria; associate professor (2005-2009) and full professor for complex software systems (since 2009), Upper Austria University of Applied Sciences; co-founder of the HEAL research group; project manager and chief architect of HeuristicLab. http://heal.heuristiclab.com/team/wagner
    • Gabriel Kronberger: MSc in computer science (2005) and PhD in technical sciences (2010), Johannes Kepler University Linz, Austria; research assistant (since 2005), Upper Austria University of Applied Sciences; member of the HEAL research group; architect of HeuristicLab. http://heal.heuristiclab.com/team/kronberger

    Agenda: objectives of the tutorial; introduction; where to get HeuristicLab; plugin infrastructure; graphical user interface; available algorithms & problems; demonstration; some additional features; planned features; team; suggested readings; bibliography; questions & answers.
  • Adaptive Heuristic Approaches for Dynamic Vehicle Routing - Algorithmic and Practical Aspects (PhD thesis, Johannes Kepler University Linz)
    Johannes Kepler Universität Linz (JKU), Faculty of Engineering and Natural Sciences. Adaptive Heuristic Approaches for Dynamic Vehicle Routing - Algorithmic and Practical Aspects. Dissertation submitted for the academic degree of Doctor of Technical Sciences. Submitted by: Stefan Vonolfen, MSc. Carried out at: Institute for Formal Models and Verification. Assessors: Priv.-Doz. Dr. Michael Affenzeller (advisor), Univ.-Prof. Dr. Karl Dörner. Linz, June 2014.

    Acknowledgments: The work presented in this thesis would not have been possible without the many fruitful discussions with my colleagues from the research group Heuristic and Evolutionary Algorithms Laboratory (HEAL) and without the HeuristicLab optimization environment as a software infrastructure (the web pages of all members of HEAL as well as further information about HeuristicLab can be found at http://www.heuristiclab.com). Particularly, I would like to thank Michael Affenzeller for his guidance concerning the algorithmic aspects of this thesis as my supervisor, as well as for providing a very supportive working environment as the research group head. I would also like to thank Stefan Wagner for triggering my research interest in metaheuristic algorithms during the supervision of my bachelor and master theses. The discussions with my colleagues Andreas Beham and Erik Pitzer about vehicle routing, fitness landscape analysis, and algorithm selection led to many ideas presented in this thesis. Michael Kommenda provided important insights on genetic programming, and Monika Kofler contributed knowledge about storage assignment. Stephan Hutterer was working on the generation of policies for smart grids and pointed out many links to related literature. I would also like to thank Prof. Karl Dörner from the institute for production and logistics management at the JKU for giving me the possibility to present my work at the ORP3 workshop as well as during a seminar at his institute.
  • Simulation Optimization with HeuristicLab
    Simulation Optimization with HeuristicLab. Andreas Beham, Michael Affenzeller, Stefan Wagner, Gabriel K. Kronberger. Upper Austria University of Applied Sciences, Campus Hagenberg, School of Informatics, Communication and Media, Heuristic and Evolutionary Algorithms Laboratory, Softwarepark 11, A-4232 Hagenberg, Austria.

    Abstract: Simulation optimization today is an important branch in the field of heuristic optimization problems. Several simulators include built-in optimization, and several companies have emerged that offer optimization strategies for different simulators. Often the optimization strategy is a secret and only sparse information is known about its inner workings. In this paper we want to demonstrate how the general and open optimization environment HeuristicLab, in its latest version, can be used to optimize simulation models.

    Keywords: simulation-based optimization, evolutionary algorithms, metaoptimization

    1. Introduction: In Ólafsson and Kim (2002) simulation optimization is defined as "the process of finding the best values of some decision variables for a system where the performance is evaluated based on the output of a simulation model of this system". ... structure and the parameters of the designed optimization strategy and can be executed in the HeuristicLab optimization environment. After the execution has terminated or was aborted, the results or, respectively, intermediate results are saved along with the workbench. So in addition to the structure and parameters, the results can also be saved in a single document, reopened at any time and examined.

    2. Description of the Framework: The user interface to an evolution strategy (ES) is shown in Figure 1. The methods defined by this interface are common to many optimization strategies, and reusing them is one idea of the optimization environment.
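    The Ólafsson and Kim definition quoted above has a direct computational reading: a candidate value of the decision variables is scored by simulation output, and because that output is stochastic, it must be averaged over replications before candidates can be compared. A minimal sketch of this loop (a toy newsvendor-style simulation with invented numbers, not HeuristicLab's mechanism):

```python
import random

def simulate(order_quantity: float) -> float:
    """Toy stochastic simulation: profit for one random demand realization."""
    demand = max(random.gauss(100, 20), 0.0)
    sold = min(order_quantity, demand)
    return 5.0 * sold - 3.0 * order_quantity  # revenue minus purchase cost

def estimate_performance(q: float, replications: int = 200) -> float:
    """Performance is only estimated from noisy simulation output, so it
    is averaged over several replications."""
    return sum(simulate(q) for _ in range(replications)) / replications

# Simulation optimization: search the decision variable against the estimator.
best_q = max(range(50, 201, 5), key=estimate_performance)
print(best_q, estimate_performance(best_q))
```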
  • Heuristic and Evolutionary Algorithms Lab (HEAL) Softwarepark 11, A-4232 Hagenberg
    HOPL: Heuristic Optimization in Production and Logistics. Contact: Dr. Michael Affenzeller, FH OOE School of Informatics, Communications and Media, Heuristic and Evolutionary Algorithms Lab (HEAL), Softwarepark 11, A-4232 Hagenberg. Web: http://heal.heuristiclab.com, http://dev.heuristiclab.com. GECCO 2016 Industrial Applications & Evolutionary Computation in Practice Day.

    Research group HEAL: 5 professors, 7 PhD students, plus interns, master and bachelor students. Research focus: problem modeling, process optimization, data-based structure identification, supply chain and logistics optimization, algorithm development and analysis. The slides also list scientific partners and an excerpt of industry partners.

    Metaheuristics: intelligent search strategies that can be applied to different problems and that explore interesting regions of the (parameter) search space; they trade computation against solution quality and yield good solutions for very complex problems, but must be tuned to the application. Challenges: the choice of appropriate metaheuristics, and hybridization ("finding needles in haystacks").

    Research focus keyword overview (from a slide diagram): VNS, ALPS, PSO, GP, ES, SASEGASA, SEGA, TS, GA, SA; structure identification, machine learning, data mining, modeling and regression, simulation, time-series, classification, neural networks, statistics, production planning and logistics optimization, operations research.
  • Integrated Simulation and Optimization in HeuristicLab
    Integrated Simulation and Optimization in HeuristicLab. Andreas Beham (a, b), Gabriel Kronberger (a), Johannes Karder (a), Michael Kommenda (a, b), Andreas Scheibenpflug (a), Stefan Wagner (a), Michael Affenzeller (a, b). (a) Heuristic and Evolutionary Algorithms Laboratory, School of Informatics, Communication, and Media, University of Applied Sciences Upper Austria, Softwarepark 11, 4232 Hagenberg, Austria. (b) Johannes Kepler University Linz, Institute for Formal Models and Verification, Altenberger Straße 69, 4040 Linz, Austria.

    Abstract: Process simulation has many applications that are closely related to optimization. Finding optimal steering parameters for the simulated processes is an activity in which the simulation model is often used as an evaluation function for an optimization procedure. Combining optimization and simulation has been achieved in the past already; however, optimization procedures implemented in simulation software are often only black-box solvers that are difficult to change, extend or parameterize. Optimization software frameworks, on the other hand, ...

    From the introduction: ... (Affenzeller et al. 2007), its integration into HeuristicLab (Wagner et al. 2014) and further developments to generic and extensible protocols (Beham et al. 2012) with the aim of supporting parallel model evaluation in order to speed up the optimization. In this work the aim is to introduce simulation capabilities into the optimization software environment HeuristicLab. Instead of having to connect the optimization software with the simulation environment, we want to allow writing and evaluating models in HeuristicLab more easily, especially those that are tied to optimization problems.
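    One aim named in the text is supporting parallel model evaluation to speed up optimization, since each simulation run is typically expensive. A minimal Python sketch of that pattern, with an invented stand-in for the simulation model (nothing here reflects HeuristicLab's actual protocols):

```python
import time
from concurrent.futures import ProcessPoolExecutor

def evaluate_model(params):
    """Stand-in for one expensive simulation-model evaluation."""
    time.sleep(0.01)               # pretend the simulator is slow
    x, y = params
    return x * x + y * y           # toy quality measure (lower is better)

if __name__ == "__main__":
    # Candidate parameter sets, e.g. proposed by an optimizer's population.
    candidates = [(x / 10, y / 10) for x in range(-10, 11) for y in range(-10, 11)]
    # Parallel model evaluation: candidates are scored concurrently.
    with ProcessPoolExecutor() as pool:
        scores = list(pool.map(evaluate_model, candidates))
    best_score, best_params = min(zip(scores, candidates))
    print(best_params, best_score)
```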