Optimizing Krylov Subspace Methods for Linear Systems and Least Squares Problems
OPTIMIZING KRYLOV SUBSPACE METHODS FOR LINEAR SYSTEMS AND LEAST SQUARES PROBLEMS

by

MEI YANG

Presented to the Faculty of the Graduate School of The University of Texas at Arlington in Partial Fulfillment of the Requirements for the Degree of DOCTOR OF PHILOSOPHY

THE UNIVERSITY OF TEXAS AT ARLINGTON
August 2018

Copyright © by Mei Yang 2018
All Rights Reserved

To my parents Fangrong Zhang and Wei Yang, and my husband Zicong Zhou and little boy Wenyuan Zhou, who give me a happy life.

ACKNOWLEDGEMENTS

First of all, I would like to thank my academic advisor, Professor Ren-Cang Li. I feel very fortunate and honored to have had the opportunity to work with him. Without Prof. Li's help, I could not have come to the US and finished my doctoral degree. I thank Prof. Li from the bottom of my heart for his constant support, patience, and encouragement, and for putting so much effort and time into guiding me.

I would also like to thank Prof. Guojun Liao, Prof. Chaoqun Liu, and Prof. Li Wang for their interest in my research and for taking the time to serve on my dissertation committee.

I wish to thank everyone in the Department of Mathematics at UTA for being kind, supportive, and encouraging during my doctoral studies.

I thank all my friends from my doctoral study. I want to thank Gul Karaduman, Sat Byul Seo, Tiffany Tsai, and Morgan Choi, who shared nice friendships and unforgettable memories with me over the past several years. I am grateful to have lovely church families who helped me a lot when I was in need, especially when the baby was young. All of them made my intensive student life colorful.

I want to express my deepest gratitude and love for my wonderful parents and my grandparents. Their spiritual support is the biggest motivation for me to pursue my Ph.D. degree. They did not limit my choices but gave me full encouragement and valuable trust to let me pursue my dream.
Finally, I want to express my love to my dearest husband, Zicong Zhou, who has loved, cared for, and supported me throughout the past few years. He is always there with me and helps me in every aspect, from daily cooking to mathematics discussions. He is the best husband I have ever seen. Without his presence, none of my success would have been possible.

August 10, 2018

ABSTRACT

OPTIMIZING KRYLOV SUBSPACE METHODS FOR LINEAR SYSTEMS AND LEAST SQUARES PROBLEMS

Mei Yang, Ph.D.
The University of Texas at Arlington, 2018

Supervising Professor: Ren-Cang Li

The linear system and the linear least squares problem are two fundamental numerical linear algebra problems, and Krylov subspace methods are the most practical and common techniques for building solvers. In this thesis, we focus on optimizing Krylov subspace methods for nonsymmetric linear systems and least squares problems.

For nonsymmetric linear systems Ax = b, one of the best-known Krylov subspace methods is GMRES, which seeks approximate solutions over the Krylov subspace K_k(A, b) (with the initial guess x_0 = 0). Constructing different search spaces and applying a restart strategy are two techniques used to deal with possible slow convergence and to reduce computational cost in GMRES and its variants, such as restarted GMRES and flexible GMRES. In this thesis, we present a numerical method called the heavy ball flexible GMRES method for nonsymmetric linear systems, which is designed to salvage the convergence speed lost by restarted flexible GMRES while keeping the benefits of restarted flexible GMRES in limiting memory usage and controlling orthogonalization cost. The search space is extended in every iteration by applying the heavy ball idea from optimization. Comparisons of restarted flexible GMRES and heavy ball flexible GMRES are shown in numerical experiments to illustrate the efficiency of our method.
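To make the restart strategy concrete, the following is a minimal NumPy sketch of plain restarted GMRES (the baseline that the heavy ball and flexible variants extend). It is an illustration only, not code from this thesis: the function names, the modified Gram-Schmidt Arnoldi loop, and the breakdown tolerance are our own choices, and no heavy ball direction or flexible preconditioning is shown.

```python
import numpy as np

def gmres_cycle(A, b, x0, m):
    """One GMRES(m) cycle: build K_m(A, r0) via Arnoldi, then minimize
    the residual norm over x0 + K_m(A, r0)."""
    r0 = b - A @ x0
    beta = np.linalg.norm(r0)
    n = len(b)
    Q = np.zeros((n, m + 1))        # orthonormal Krylov basis
    H = np.zeros((m + 1, m))        # upper Hessenberg projection of A
    Q[:, 0] = r0 / beta
    for j in range(m):
        w = A @ Q[:, j]
        for i in range(j + 1):      # modified Gram-Schmidt orthogonalization
            H[i, j] = Q[:, i] @ w
            w -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-14:     # lucky breakdown: exact solution found
            m = j + 1
            break
        Q[:, j + 1] = w / H[j + 1, j]
    # Small least squares problem: min_y || beta*e1 - H y ||_2
    e1 = np.zeros(m + 1)
    e1[0] = beta
    y, *_ = np.linalg.lstsq(H[: m + 1, :m], e1, rcond=None)
    return x0 + Q[:, :m] @ y

def restarted_gmres(A, b, m=20, cycles=50, tol=1e-10):
    """GMRES(m): restart from the latest iterate every m inner steps,
    capping memory at m+1 basis vectors."""
    x = np.zeros_like(b)
    for _ in range(cycles):
        x = gmres_cycle(A, b, x, m)
        if np.linalg.norm(b - A @ x) <= tol * np.linalg.norm(b):
            break
    return x
```

Restarting caps the basis at m + 1 vectors, which is exactly the memory/orthogonalization saving the abstract refers to; the price is the lost convergence history that the heavy ball idea tries to recover.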
The linear least squares problem, min_x ||Ax − b||_2, occurs widely in statistics, geodesy, photogrammetry, signal processing, and control. Based on the Golub-Kahan process, LSMR seeks approximate solutions x_k over the Krylov subspace K_k(A^T A, A^T b). We aim to optimize LSMR in two ways. First, we propose the heavy ball minimal residual method, which uses the restart technique to combine LSMR with the heavy ball idea. The restarted Golub-Kahan bidiagonalization process is applied to control the memory usage and the reorthogonalization cost. We add a new direction that carries information from previous iterations into the extended search subspace, which also speeds up convergence. Second, we present a flexible preconditioned iterative method for least squares problems. For lack of an effective preconditioner, LSMR sometimes stagnates as the iterations go on, and applying a good preconditioner in LSMR is critical to speeding up convergence. Usually, it is impossible to tell beforehand whether a given preconditioner is suitable for the problem, so one may try several candidate preconditioners and switch periodically among them. We want to design an algorithm that can switch among preconditioners in the outer iterations instead of restarting LSMR. Our flexible preconditioned LSMR method takes an outer-inner iteration to construct changeable preconditioners, and applies the factorization-free preconditioning technique to the Golub-Kahan process to reduce computational cost. Numerical examples demonstrate the efficiency of our method compared to LSMR.

TABLE OF CONTENTS

ACKNOWLEDGEMENTS
ABSTRACT
LIST OF FIGURES
LIST OF TABLES
LIST OF ALGORITHMS

Chapter
1. INTRODUCTION
   1.1 Problem Statement
   1.2 Krylov Subspace Methods for Square Linear Systems
       1.2.1 The Arnoldi Process
       1.2.2 The Lanczos Process
   1.3 Krylov Subspace Methods for Least Squares Problems
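The Golub-Kahan process underlying LSMR can be sketched in a few lines. The short recurrence below builds orthonormal bases U and V with A V_k = U_{k+1} B_k, where B_k is lower bidiagonal; the columns of V_k span K_k(A^T A, A^T b). This dense NumPy version is a minimal illustration under our own naming, without the reorthogonalization, restarting, or preconditioning discussed in the thesis.

```python
import numpy as np

def golub_kahan(A, b, k):
    """k steps of Golub-Kahan bidiagonalization of A started from b.

    Returns U (m x (k+1)), V (n x k), and the bidiagonal entries
    alphas (diagonal of B_k) and betas (subdiagonal of B_k), so that
    A @ V == U @ B_k with B_k lower bidiagonal ((k+1) x k)."""
    m, n = A.shape
    U = np.zeros((m, k + 1))
    V = np.zeros((n, k))
    alphas, betas = [], []
    U[:, 0] = b / np.linalg.norm(b)           # beta_1 u_1 = b
    v = A.T @ U[:, 0]                         # alpha_1 v_1 = A^T u_1
    for j in range(k):
        alpha = np.linalg.norm(v)
        V[:, j] = v / alpha
        alphas.append(alpha)
        u = A @ V[:, j] - alpha * U[:, j]     # beta_{j+1} u_{j+1}
        beta = np.linalg.norm(u)
        U[:, j + 1] = u / beta
        betas.append(beta)
        v = A.T @ U[:, j + 1] - beta * V[:, j]  # alpha_{j+1} v_{j+1}
    return U, V, alphas, betas
```

LSMR then solves a small bidiagonal least squares subproblem with B_k instead of forming A^T A, which is why only matrix-vector products with A and A^T are needed; that matrix-free structure is what the factorization-free preconditioning above exploits.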
       1.3.1 The Golub-Kahan Process
   1.4 Preconditioning
   1.5 Overview
2. HEAVY BALL FLEXIBLE GMRES METHOD FOR NONSYMMETRIC LINEAR SYSTEMS
   2.1 Variants of GMRES
       2.1.1 Restarted GMRES
       2.1.2 Heavy Ball GMRES
       2.1.3 Flexible GMRES
   2.2 Motivation and Main Idea
   2.3 Heavy Ball FGMRES Method
   2.4 Numerical Experiments
3. HEAVY BALL MINIMAL RESIDUAL METHOD FOR LEAST SQUARES PROBLEMS
   3.1 Motivation and Main Idea
   3.2 Heavy Ball Minimal Residual Method
   3.3 Pseudo-code of Extended LSMR
   3.4 Numerical Experiments
       3.4.1 Reorthogonalization
       3.4.2 Backward Error
       3.4.3 Comparison of HBMR and LSMR
4. FLEXIBLE PRECONDITIONED ITERATIVE METHOD FOR LEAST SQUARES PROBLEMS
   4.1 Motivation and Main Idea
   4.2 Preconditioned Least Squares Problem
   4.3 Flexible Preconditioned Iterative Method
   4.4 Numerical Experiments
5. SUMMARY
   5.1 Contributions
   5.2 Future Work
REFERENCES
BIOGRAPHICAL STATEMENT

LIST OF FIGURES

2.1 FGMRES vs. HBGMRES vs. REGMRES for cavity16
2.2 NRes vs. cycle for cavity10. Top: selective reorthogonalization; Bottom: always reorthogonalization
2.3 NRes vs. cycle for cavity16. Top: selective reorthogonalization; Bottom: always reorthogonalization
2.4 NRes vs. cycle for comsol. Top: selective reorthogonalization; Bottom: always reorthogonalization
2.5 NRes vs. cycle for chipcool0. Top: selective reorthogonalization; Bottom: always reorthogonalization
2.6 NRes vs. cycle for flowmeter5. Top: selective reorthogonalization; Bottom: always reorthogonalization
2.7 NRes vs. cycle for e20r0000. Top: selective reorthogonalization; Bottom: always reorthogonalization
2.8 NRes vs. cycle for e20r0100. Top: selective reorthogonalization; Bottom: always reorthogonalization
2.9 NRes vs. cycle for sherman3.
Top: selective reorthogonalization; Bottom: always reorthogonalization
3.1 NRes vs. iteration for abb313 and ash292
3.2 NRes vs. iteration for ash85 and e05r0000
3.3 NRes vs. iteration for e20r0100 and illc1850
3.4 Results of lp_80bau3b. Top: NRes vs. iteration; Bottom: E1 vs. iteration
3.5 Results of lp_e226. Top: NRes vs. iteration; Bottom: E1 vs. iteration
3.6 Results of lp_pilot_ja. Top: NRes vs. iteration; Bottom: E1 vs. iteration
3.7 Results of lp_brandy. Top: NRes vs. iteration; Bottom: E1 vs. iteration
4.1 NRes vs. iteration for lp_cre_a and lp_cre_b
4.2 NRes vs. iteration for lp_cre_c and lp_greenbea
4.3 NRes vs. iteration for lp_ken_11 and lp_maros
4.4 NRes vs. iteration for lp_pilot and lp_osa_07
4.5 Relative Approximate Backward Error vs. iteration for lp_cre_a and lp_cre_b
4.6 Relative Approximate Backward Error vs. iteration for lp_cre_c and lp_greenbea
4.7 Relative Approximate Backward Error vs. iteration for lp_ken_11 and lp_maros
4.8 Relative Approximate Backward Error vs. iteration for lp_osa_07 and lp_pilot
4.9 Enlarged view of results for lp_cre_b. Top: NRes vs. iteration; Bottom: Relative Approximate Backward Error vs. iteration
4.10 Results of different l-step GMRES solves of the inner iteration for lp_pilot. Left: NRes vs. iteration; Right: Relative Approximate Backward Error vs. iteration

LIST OF TABLES

1.1 Notation
2.1 Flops of GMRES Variants
2.2 Testing Matrices
2.3 Number of Cycles
3.1 Testing Matrices for Reorthogonalization
3.2 Number of Iterations for LSMR
3.3 Testing Matrices
4.1 Flops for LSMR Variants
4.2 Testing Matrices
4.3 Number of Iterations
LIST OF ALGORITHMS

1.1 Arnoldi process
1.2 FOM