Matrix Polynomials and Their Lower Rank Approximations

by

Joseph Haraldson

A thesis presented to the University of Waterloo in fulfillment of the thesis requirement for the degree of Doctor of Philosophy in Computer Science.

Waterloo, Ontario, Canada, 2019

© Joseph Haraldson 2019

Examining Committee Membership

The following served on the Examining Committee for this thesis. The decision of the Examining Committee is by majority vote.

External Examiner: Lihong Zhi, Professor, Mathematics Mechanization Research Center, Institute of Systems Science, Academy of Mathematics and System Sciences, Academia Sinica

Supervisors: Mark Giesbrecht, Professor, School of Computer Science, University of Waterloo; George Labahn, Professor, School of Computer Science, University of Waterloo

Internal-External Member: Stephen Vavasis, Professor, Dept. of Combinatorics and Optimization, University of Waterloo

Other Members: Eric Schost, Associate Professor, School of Computer Science, University of Waterloo; Yuying Li, Professor, School of Computer Science, University of Waterloo

Declaration

I hereby declare that I am the sole author of this thesis. This is a true copy of the thesis, including any required final revisions, as accepted by my examiners. I understand that my thesis may be made electronically available to the public.

Abstract

This thesis is a wide-ranging work on computing a "lower-rank" approximation of a matrix polynomial using second-order non-linear optimization techniques. Two notions of rank are investigated. The first is the rank as the number of linearly independent rows or columns, which is the classical definition. The other notion considered is the lowest rank of a matrix polynomial when evaluated at a complex number, or the McCoy rank. Together, these two notions of rank allow one to compute a nearby matrix polynomial where the structure of both the left and right kernels is prescribed, along with the structure of both the infinite and finite eigenvalues.
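As a toy illustration of the two rank notions (not taken from the thesis itself), consider the matrix polynomial P(z) = diag(1, z): its classical rank, i.e. the rank at a generic evaluation point, is 2, while the rank drops to 1 at the finite eigenvalue z = 0, so its McCoy rank is 1. A minimal NumPy sketch, where the helper `eval_poly_matrix` is a name chosen here for illustration:

```python
import numpy as np

def eval_poly_matrix(coeffs, z):
    """Evaluate P(z) = sum_k coeffs[k] * z**k from a list of coefficient matrices."""
    return sum(C * (z ** k) for k, C in enumerate(coeffs))

# P(z) = [[1, 0], [0, z]], given by its coefficient matrices
P = [np.array([[1.0, 0.0], [0.0, 0.0]]),   # z^0 coefficient
     np.array([[0.0, 0.0], [0.0, 1.0]])]   # z^1 coefficient

# Classical rank: the rank of P evaluated at a generic complex point.
rank_generic = np.linalg.matrix_rank(eval_poly_matrix(P, 0.3 + 0.7j))

# The rank drops at the finite eigenvalue z = 0; the minimum evaluated
# rank over all complex points is the McCoy rank.
rank_at_zero = np.linalg.matrix_rank(eval_poly_matrix(P, 0.0))

print(rank_generic, rank_at_zero)  # prints: 2 1
```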
The computational theory of the calculus of matrix-polynomial-valued functions is developed and used in optimization algorithms based on second-order approximations. Special functions studied with a detailed error analysis are the determinant and adjoint of matrix polynomials. The unstructured and structured variants of matrix polynomials are studied in a very general setting in the context of an equality-constrained optimization problem. In their most general instances, these optimization problems are NP-hard to solve approximately in a global setting. In most instances we are able to prove that solutions to our optimization problems exist (possibly at infinity), and we discuss techniques, in conjunction with an implementation, to compute local minimizers to the problem.

Most of the analysis of these problems is local and done through the Karush-Kuhn-Tucker optimality conditions for constrained optimization problems. We show that most formulations of the problems studied satisfy regularity conditions and admit Lagrange multipliers. Furthermore, we show that under some formulations the second-order sufficient condition holds for instances of interest of the optimization problems in question. When Lagrange multipliers do not exist, we discuss why and, if it is reasonable to do so, how to regularize the problem. In several instances closed-form expressions for the derivatives of matrix-polynomial-valued functions are derived to assist in the analysis of the optimality conditions around a solution. From this analysis it is shown that variants of Newton's method will have a local rate of convergence that is quadratic with a suitable initial guess for many problems.

The implementations are demonstrated on some examples from the literature, and several examples are cross-validated with different optimization formulations of the same mathematical problem.
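The local analysis described above can be made concrete on a toy problem that is not from the thesis: applying Newton's method to the KKT system of a small equality-constrained problem, here finding the nearest point to a = (2, 0) on the unit circle. All function names below are chosen for illustration; this is a sketch of the general Newton-on-KKT approach, not the thesis's algorithms:

```python
import numpy as np

# Newton's method on the KKT system of
#   minimize 0.5 * ||x - a||^2   subject to   x.x - 1 = 0
# (nearest point to a on the unit circle).
a = np.array([2.0, 0.0])

def kkt_residual(x, lam):
    # Stationarity: grad f + lam * grad c = (x - a) + 2*lam*x
    # Feasibility:  c(x) = x.x - 1
    return np.concatenate([(x - a) + 2.0 * lam * x, [x @ x - 1.0]])

def kkt_jacobian(x, lam):
    n = x.size
    J = np.zeros((n + 1, n + 1))
    J[:n, :n] = (1.0 + 2.0 * lam) * np.eye(n)  # Hessian of the Lagrangian
    J[:n, n] = 2.0 * x                          # grad c
    J[n, :n] = 2.0 * x                          # grad c (transposed)
    return J

x, lam = np.array([0.8, 0.2]), 0.0              # initial guess near the solution
for _ in range(20):
    step = np.linalg.solve(kkt_jacobian(x, lam), -kkt_residual(x, lam))
    x, lam = x + step[:-1], lam + step[-1]

# Converges quadratically to the KKT point x = (1, 0), lam = 1/2.
print(x, lam)
```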
We conclude with a special application of the theory developed in this thesis: computing a nearby pair of differential polynomials with a non-trivial greatest common divisor, a non-commutative symbolic-numeric computation problem. We formulate this problem as finding a nearby structured matrix polynomial that is rank deficient in the classical sense.

Acknowledgements

I would like to acknowledge the following for their investments in my research:

• The Natural Sciences and Engineering Research Council of Canada,
• The Government of the Province of Ontario,
• The National Science Foundation, United States of America,
• The National Security Agency, United States of America,
• David R. Cheriton, and
• The University of Waterloo.

I would like to thank my supervisors, Dr. Mark Giesbrecht and Dr. George Labahn, and the other members of my examination committee, Dr. Lihong Zhi, Dr. Stephen Vavasis, Dr. Eric Schost and Dr. Yuying Li, for their time and valuable feedback.

I would like to thank Dr. Guenter Krause from the University of Manitoba for providing me the opportunity to discover my interests in mathematics and for providing me with an opportunity to succeed. I would also like to thank Dr. Yang Zhang from the University of Manitoba for giving me the opportunity to work on research problems as an undergraduate student and an introduction to research in general. I would also like to thank Dr. Benqi Guo from the University of Manitoba for introducing me to numerical analysis, despite my reservations at the time when I insisted that I would "never need to use this".

I would like to thank Weixi and the rest of my family. There are many other people who were supportive and inspirational along this journey who are not mentioned. I would like to mention all of them here and thank them for everything.

Dedication

To the advancement of scientific knowledge.
Table of Contents

List of Figures

1 Introduction
  1.1 A Non-Technical Overview
    1.1.1 Structured Matrix Polynomials
    1.1.2 Optimization Problems
    1.1.3 Stark Differences between Scalar and Polynomial Matrices
  1.2 Partitioning of the Thesis
    1.2.1 Overview of Chapters

2 Preliminaries
  2.1 Domain of Computation and Basic Notions
  2.2 Numerical Linear Algebra
  2.3 The Calculus of Vector and Matrix Valued Functions
  2.4 Smooth Continuous Optimization
    2.4.1 Unconstrained Optimization
    2.4.2 Constrained Optimization
  2.5 Basic Results About Matrix Polynomials
  2.6 Polynomial Approximate Greatest Common Divisor
    2.6.1 Exact Polynomial Greatest Common Divisor
    2.6.2 Approximate Greatest Common Divisor Problems

3 Structured Lower Rank Approximations of Matrix Polynomials
  3.1 Introduction
    3.1.1 Outline
    3.1.2 Previous Research
  3.2 Approximate Kernel Computation
    3.2.1 Rank Computation
    3.2.2 Kernel Basis via Block Convolution
    3.2.3 Initial Guesses for Optimization Algorithms
    3.2.4 Summary of Rank Computing Techniques
  3.3 Optimization Formulation Setup
  3.4 Rank Factorizations
    3.4.1 Embedded Rank Factorization
    3.4.2 Lagrange Multipliers and Optimality Conditions
    3.4.3 The Hessian
    3.4.4 Implementation Notes
  3.5 Evaluated Rank Factorization
    3.5.1 Lagrange Multipliers and Optimality Conditions
    3.5.2 The Hessian
    3.5.3 Implementation Notes
  3.6 Explicit Kernel Iterative Algorithm for Lower Rank Approximation
    3.6.1 Minimal System of Equations
    3.6.2 Lagrange Multipliers and Optimality Conditions
    3.6.3 The Jacobian
    3.6.4 The Hessian
    3.6.5 Implementation Notes
  3.7 Implementation and Examples
    3.7.1 Description of Algorithms
    3.7.2 Linear and Affinely Structured Matrix Examples
    3.7.3 Affine Structured Examples II
    3.7.4 Lower Rank Approximation of a 4 × 4 Matrix Polynomial
  3.8 Conclusion

4 Matrix Polynomial Determinants, Adjoints, and their Derivatives
  4.1 Introduction
    4.1.1 Outline
  4.2 Overview of Existing Results and Techniques
  4.3 The First Derivative of the Determinant
    4.3.1 First-Order Perturbation Bounds for the Matrix Polynomial Determinant
  4.4 The First Derivative of the Adjoint
    4.4.1 Computing the First Derivative
    4.4.2 First-Order Perturbation Bounds for the Matrix Polynomial Adjoint
  4.5 Floating Point Algorithms for Matrix Polynomial Adjoint
    4.5.1 Exact Symbolic-Numeric Method
    4.5.2 Floating Point Interpolation Method
    4.5.3 Linear System Solving over R
    4.5.4 Automatic Differentiation
    4.5.5 QZ Decomposition