Global Search Methods for Solving Nonlinear Optimization Problems
GLOBAL SEARCH METHODS FOR SOLVING NONLINEAR OPTIMIZATION PROBLEMS

BY

YI SHANG

B.Engr., University of Science and Technology of China
M.Engr., Academia Sinica

THESIS

Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Computer Science in the Graduate College of the University of Illinois at Urbana-Champaign

Urbana, Illinois

© Copyright by Yi Shang

GLOBAL SEARCH METHODS FOR SOLVING NONLINEAR OPTIMIZATION PROBLEMS

Yi Shang, Ph.D.
Department of Computer Science
University of Illinois at Urbana-Champaign
Benjamin W. Wah, Advisor

In this thesis, we present new methods for solving nonlinear optimization problems. These problems are difficult to solve because the nonlinear constraints form feasible regions that are difficult to find, and the nonlinear objectives contain local minima that trap descent-type search methods. To find good solutions in nonlinear optimization, we focus on two key issues: how to handle nonlinear constraints and how to escape from local minima. We use a Lagrange-multiplier-based formulation to handle nonlinear constraints, and develop Lagrangian methods with dynamic control to provide faster and more robust convergence. We extend the traditional Lagrangian theory for the continuous space to the discrete space, and develop efficient discrete Lagrangian methods. To overcome local minima, we design a new trace-based global-search method that relies on an external traveling trace to pull a search trajectory out of a local optimum in a continuous fashion, without having to restart the search from a new starting point. Good starting points identified in the global search are used in the local search to identify true local optima. By combining these new methods, we develop a prototype called Novel (Nonlinear Optimization Via External Lead) that solves nonlinear constrained and unconstrained problems in a unified framework. We show experimental results in applying Novel to solve nonlinear optimization problems, including (a) the
learning of feedforward neural networks, (b) the design of quadrature-mirror-filter digital filter banks, (c) the satisfiability problem, (d) the maximum satisfiability problem, and (e) the design of multiplierless quadrature-mirror-filter digital filter banks. Our method achieves better solutions than existing methods, or achieves solutions of the same quality at a lower cost.

DEDICATION

To my wife Lei, my parents, and my son Charles.

ACKNOWLEDGEMENTS

This dissertation would not have been possible without the support of many people. With much pleasure, I take the opportunity to thank all the people who have helped me through the course of graduate study.

First, I would like to thank my advisor, Professor Benjamin W. Wah, for his invaluable guidance, advice, and encouragement. His constant confidence in the face of research uncertainties and his persistent questioning of our methods have been, and will always be, inspiring to me.

I would like to express my gratitude to the other members of my dissertation committee, Professors Michael T. Heath, Chung L. Liu, and Sylvian R. Ray, for their careful reading of the dissertation and valuable suggestions. I would like to thank Dr. Bart Selman for providing help and guidance in studying SAT problems. I would like to thank Ms. Jenn Wilson for carefully reading the final draft of this dissertation.

Many thanks are due to my colleagues who have contributed to this dissertation: Dr. Yao-Jen Chang, for laying the foundation of this research work; Ms. Ting Yu, for adapting our methods to digital filter-bank applications; Mr. Tao Wang, for extending our methods to constrained optimization problems; and Mr. Zhe Wu, for extending our methods to more discrete optimization applications and carrying on this work in the future. I would like to thank all my current and former colleagues, Lon-Chan Chu, Kumar Ganapathy, Arthur Ieumwananonthachai, Yao-Jen Chang, Chin-Chi Teng, Peter Wahle, Tao Wang, Chien-Wei Li, Yu Ting, Xiao Su, Jeff Monks, Zhe Wu, and Dong Lin, for their helpful discussions, useful advice, support, and friendship.

I would like to acknowledge the financial support of National Science Foundation Grants MIP, MIP, and MIP, and National Aeronautics and Space Administration Grant NAG.

My family always surrounded me with love and support. I wish to thank my wife, Lei, for her love, understanding, encouragement, and patience. I wish to thank my parents, whose love and support have brought me this far. I want to thank my sister, Ting, and my brother, Ying, for their caring and support throughout my education. I want to thank my parents-in-law for their help in the final stage of my graduate education. I wish to dedicate this thesis to my wife, my parents, and my little boy, Charles.

TABLE OF CONTENTS

INTRODUCTION
  Optimization Problems
    A Taxonomy of Optimization Problems
    Continuous Optimization Problems
    Discrete Optimization Problems
    Decision Problems
  Challenges in Solving Nonlinear Optimization Problems
  Characteristics of Nonlinear Optimization Algorithms
  Goals and Approaches of This Research
    Approaches to Handle Nonlinear Constraints
    Approaches to Overcome Local Minima
    A Unified Framework to Solve Nonlinear Optimization Problems
    Applications of Proposed Methods
  Outline of Thesis
  Contributions of This Research

HANDLING NONLINEAR CONSTRAINTS
  Previous Work
    Nontransformational Approaches
    Transformational Approaches
    Existing Lagrangian Theory and Methods
    Discrete Optimization
  Proposed Methods for Handling Nonlinear Constraints
    Handling Continuous Constraints
    Convergence Speed of Lagrangian Methods
    Dynamic Weight Adaptation
    Discrete Constraint-Handling Method
  Summary

OVERCOMING LOCAL MINIMA
  Previous Work on Global Optimization
    Deterministic Global Optimization Methods
    Probabilistic Global Optimization Methods
    Global Search in Constrained Optimization
  Problem Specification and Transformation
  A New Multilevel Global Search Method
  Trace-Based Global Search
    General Framework
    An Illustrative Example
  Trace-Function Design
    Evaluation of Coverage
    Space-Filling Curves
    Searching from Coarse to Fine
    Numerical Evaluation of Trace-Function Coverage
    Summary of Trace-Function Design
    Speed of Trace Function
  Variable Scaling During Global Search
  Numerical Methods for Solving the Dynamic Systems of Equations
  Novel: A Prototype Implementing Proposed Methods
    Continuous Methods
    Discrete Methods
  Potential for Parallel Processing
  Summary

UNCONSTRAINED OPTIMIZATION IN NEURAL-NETWORK LEARNING
  Artificial Neural Networks
  Feedforward Neural-Network Learning Problems
  Existing Methods for Feedforward Neural-Network Learning
    Learning Using Local Optimization
    Learning Using Global Search
  Experimental Results of Novel
    Benchmark Problems Studied in Our Experiments
    Experimental Results on the Two-Spiral Problem
    Comparison with Other Global Optimization Methods
    Experimental Results on Other Benchmarks
  Summary

CONSTRAINED OPTIMIZATION IN QMF FILTER-BANK DESIGN
  Two-Channel Filter Banks
  Existing Methods in QMF Filter-Bank Design
  Optimization Formulations of QMF Filter-Bank Design
  Evaluations of Performance Metrics
    Closed-Form Formulas of E_r(x), E_s(x), and E_p(x)
    Numerical Evaluation of δ_s(x), δ_p(x), and T_t(x)
  Evaluations of the First-Order Derivatives of Performance Metrics
  Experimental Results
    Performance of Novel
    Comparison with Other Global Optimization Methods
    Performance Trade-offs
  Summary

DISCRETE OPTIMIZATION
  Introduction to Satisfiability Problems
  Previous Work
    Discrete Formulations
    Continuous Formulations
  Applying DLM to Solve SAT Problems
  Implementations of DLM
    Algorithmic Design Considerations
    Three Implementations of the Generic DLM
    Reasons behind the Development of A
  Experimental Results
  Solving Maximum Satisfiability Problems
    Previous Work
    DLM for Solving MAX-SAT Problems
    Experimental Results
  Designing Multiplierless QMF Filter Banks
    Introduction
    DLM Implementation Issues
    Experimental Results
  Summary

CONCLUSIONS AND FUTURE WORK
  Summary of Work Accomplished
  Future Work

APPENDIX A: EXPERIMENTAL RESULTS OF NOVEL ON SOME TEST FUNCTIONS
REFERENCES
VITA

LIST OF TABLES
  Global and local search components used in existing global optimization methods
  Von Neumann computer versus biological
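The Lagrange-multiplier-based formulation summarized in the abstract searches for a saddle point of the Lagrangian, descending in the problem variables while ascending in the multipliers. The following is a minimal generic sketch of that idea on a toy problem, not the thesis's dynamic weight-adaptation method; the function names and step size are illustrative assumptions.

```python
def lagrangian_saddle(grad_f, g, grad_g, x0, lam0=0.0, eta=0.05, steps=5000):
    """First-order saddle-point search for: min f(x) subject to g(x) = 0.

    Performs gradient descent on x and gradient ascent on the Lagrange
    multiplier lam of L(x, lam) = f(x) + lam * g(x).
    """
    x, lam = x0, lam0
    for _ in range(steps):
        x = x - eta * (grad_f(x) + lam * grad_g(x))  # descend in x
        lam = lam + eta * g(x)                       # ascend in lam
    return x, lam

# Toy problem: minimize x^2 subject to x - 1 = 0; the saddle point is
# (x*, lam*) = (1, -2), since 2x + lam = 0 and x = 1 at stationarity.
x_star, lam_star = lagrangian_saddle(
    grad_f=lambda x: 2 * x,    # f(x) = x^2
    g=lambda x: x - 1,         # constraint g(x) = 0
    grad_g=lambda x: 1.0,
    x0=0.0,
)
```

On this convex toy problem the coupled descent/ascent dynamics spiral into the saddle point; the convergence speed and robustness issues that motivate the thesis's dynamic control arise when the relative speeds of the two dynamics are poorly balanced on harder nonlinear problems.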
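The discrete Lagrangian methods (DLM) that the abstract applies to satisfiability problems can be read as greedy descent on a multiplier-weighted count of unsatisfied clauses, with multipliers increased whenever the search reaches a local minimum. The sketch below is a generic rendering of that scheme under DIMACS-style clause encoding; the parameter choices and helper names are assumptions, not the tuned implementations evaluated in the thesis.

```python
import random

def dlm_sat(clauses, n_vars, max_steps=10000, seed=0):
    """Sketch of a discrete Lagrangian method for SAT.

    clauses: list of clauses, each a list of nonzero ints in DIMACS
    style (v means variable v is true, -v means variable v is false).
    Returns a satisfying assignment (1-indexed list of bools) or None.
    """
    rng = random.Random(seed)
    x = [rng.choice([False, True]) for _ in range(n_vars + 1)]  # x[0] unused
    lam = [1] * len(clauses)  # one Lagrange multiplier per clause

    def sat(c):  # is clause c satisfied under the current assignment?
        return any((x[abs(l)] if l > 0 else not x[abs(l)]) for l in c)

    def L():  # discrete Lagrangian: multiplier-weighted unsat count
        return sum(lam[i] for i, c in enumerate(clauses) if not sat(c))

    for _ in range(max_steps):
        cur = L()
        if cur == 0:
            return x  # all clauses satisfied
        # Greedy descent: flip the variable giving the largest decrease in L.
        best_v, best_val = None, cur
        for v in range(1, n_vars + 1):
            x[v] = not x[v]
            val = L()
            x[v] = not x[v]
            if val < best_val:
                best_v, best_val = v, val
        if best_v is not None:
            x[best_v] = not x[best_v]
        else:
            # Local minimum: raise multipliers of unsatisfied clauses,
            # reshaping L so the search can escape without restarting.
            for i, c in enumerate(clauses):
                if not sat(c):
                    lam[i] += 1
    return None
```

For example, `dlm_sat([[1, 2], [-1, 2], [1, -2]], 2)` finds the unique satisfying assignment (both variables true). The multiplier update plays the escape role that, for continuous problems, the thesis assigns to the external trace.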