Convex Relaxation for Low-Dimensional Representation: Phase Transitions and Limitations

Thesis by Samet Oymak

In Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

California Institute of Technology
Pasadena, California
2015
(Defended June 12, 2014)

© 2015 Samet Oymak
All Rights Reserved

Dedicated to my family

Acknowledgments

To begin with, it was a great pleasure to work with my advisor, Babak Hassibi. Babak has always been a fatherly figure to me and my labmates. Most of our group are thousands of miles away from home, and we are fortunate to have Babak as an advisor who always helps us with our troubles, whether personal or professional. As a graduate student, I found Babak’s intuition for identifying new problems and his motivation for mathematics a great inspiration. He always encouraged me to do independent research and to be persistent with the toughest (mostly mathematical) challenges.

I thank my thesis committee: Professor Maryam Fazel, Professor Joel Tropp, Professor Venkat Chandrasekaran, and Professor P. P. Vaidyanathan. Maryam has literally been a second advisor to me, and I am very thankful for her professional guidance. I would like to thank Joel for the material he taught me in class, for answering my research questions with great patience, and for his helpful feedback on this thesis. It was always great to interact with Venkat and P. P.; thanks to them, Caltech has been a better and more stimulating place for me.

I would like to thank my fellow graduate students and labmates for their help and support. My labmates Amin Khajehnejad, Ramya Korlakai Vinayak, Kishore Jaganathan, and Chris Thrampoulidis were also my research collaborators, with whom I spent countless hours discussing research and meeting deadlines. Kishore, Hyoung Jun Ahn, and Eyal En Gad were my partners in the fiercest racquetball games. Matthew Thill, Wei Mao, and Ravi Teja Sukhavasi were a crucial part of all the fun in the lab.
During my time at Caltech, Shirley Slattery and Tanya Owen were amazingly helpful with information and day-to-day issues. I cannot possibly count how many times I asked Shirley, “Is Babak around?”. I can’t imagine how I could have enjoyed my time here without my dear friends Necmiye Ozay, Murat Acar, Aycan Yurtsever, Selim Hanay, and Jiasi Chen. I suppose I could have met such great people only at Caltech. They are now literally all over the world, but I do hope our paths will cross again.

Finally, I am most grateful to my family for always being there for me. Their continuous support has been a source of my ambition and motivation since primary school. It has been a challenging five years away from them, and I am very happy to know that we are still as connected as ever.

Abstract

There is a growing interest in taking advantage of possible patterns and structures in data so as to extract the desired information and overcome the curse of dimensionality. In a wide range of applications, including computer vision, machine learning, medical imaging, and social networks, the signal that gives rise to the observations can be modeled as approximately sparse, and exploiting this fact can be very beneficial. This has led to immense interest in the problem of efficiently reconstructing a sparse signal from limited linear observations. More recently, low-rank approximation techniques have become prominent tools for problems arising in machine learning, system identification, and quantum tomography.

In sparse and low-rank estimation problems, the challenge is the inherent intractability of the objective function, and one needs efficient methods to capture the low-dimensionality of these models. Convex optimization is often a promising tool for attacking such problems: an intractable problem with a combinatorial objective can often be “relaxed” to obtain a tractable, but almost as powerful, convex optimization problem.
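As a concrete (illustrative) instance of this relaxation idea, the intractable sparsest-solution problem min ||x||_0 s.t. Ax = y can be replaced by the convex lasso program min (1/2)||Ax − y||² + λ||x||_1. The sketch below is our own toy example, not code from the thesis; the problem sizes, λ, and the ISTA solver are assumptions chosen for illustration.

```python
# Toy sketch of the l0 -> l1 relaxation: recover a sparse signal from few
# linear measurements by solving the lasso with ISTA (proximal gradient).
# All parameter choices here are illustrative, not taken from the thesis.
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 40, 100, 3                      # measurements, ambient dim, sparsity
A = rng.standard_normal((n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[rng.choice(p, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true

def ista(A, y, lam, iters=500):
    """Iterative soft-thresholding for min 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - A.T @ (A @ x - y) / L      # gradient step on the smooth part
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # l1 prox
    return x

x_hat = ista(A, y, lam=0.01)
print(np.linalg.norm(x_hat - x_true))      # small: the relaxation succeeds here
```

Even though n = 40 is far below the ambient dimension p = 100, the ℓ1 relaxation recovers the k = 3 sparse signal accurately, which is exactly the phenomenon the dissertation quantifies.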
This dissertation studies convex optimization techniques that can take advantage of low-dimensional representations of the underlying high-dimensional data. We provide provable guarantees that the proposed algorithms succeed under reasonable conditions, and we answer questions of the following flavor:

  • For a given number of measurements, can we reliably estimate the true signal?
  • If so, how good is the reconstruction as a function of the model parameters?

More specifically:

  i) Focusing on linear inverse problems, we generalize the classical error bounds known for the least-squares technique to the lasso formulation, which incorporates the signal model.
  ii) We show that intuitive convex approaches do not perform as well as expected when it comes to signals that have multiple low-dimensional structures simultaneously.
  iii) Finally, we propose convex relaxations for the graph clustering problem and give sharp performance guarantees for a family of graphs arising from the so-called stochastic block model.

We pay particular attention to the following aspects. For i) and ii), we aim to provide a general geometric framework in which the results on sparse and low-rank estimation can be obtained as special cases. For i) and iii), we investigate the precise performance characterization, which yields the right constants in our bounds and the true dependence between the problem parameters.

Contents

Acknowledgments
Abstract
1 Introduction
  1.1 Sparse signal estimation
  1.2 Low-dimensional representation via convex optimization
  1.3 Phase Transitions
  1.4 Literature Survey
  1.5 Contributions
2 Preliminaries
  2.1 Notation
  2.2 Projection and Distance
  2.3 Gaussian width, Statistical dimension and Gaussian distance
  2.4 Denoising via proximal operator
  2.5 Inequalities for Gaussian Processes
3 A General Theory of Noisy Linear Inverse Problems
  3.1 Our Approach
  3.2 Main Results
  3.3 Discussion of the Results
  3.4 Applying Gaussian Min-Max Theorem
  3.5 After Gordon’s Theorem: Analyzing the Key Optimizations
  3.6 The NSE of the C-LASSO
  3.7 Constrained-LASSO Analysis for Arbitrary s
  3.8 ℓ2-LASSO: Regions of Operation
  3.9 The NSE of the ℓ2-LASSO
  3.10 Nonasymptotic results on ℓ2-LASSO
  3.11 Proof of Theorem 3.7
  3.12 ℓ2²-LASSO
  3.13 Converse Results
  3.14 Numerical Results
  3.15 Future Directions
4 Elementary equivalences in compressed sensing
  4.1 A comparison between the Bernoulli and Gaussian ensembles
  4.2 An equivalence between the recovery conditions for sparse signals and low-rank matrices
5 Simultaneously Structured Models
  5.1 Problem Setup
  5.2 Main Results: Theorem Statements
  5.3 Measurement ensembles
  5.4 Upper bounds
  5.5 General Simultaneously Structured Model Recovery
  5.6 Proofs for Section 5.2.2
  5.7 Numerical Experiments
  5.8 Discussion
6 Graph Clustering via Low-Rank and Sparse Decomposition
  6.1 Model
  6.2 Main Results
  6.3 Simulations
  6.4 Discussion and Conclusion
7 Conclusions
  7.1 Generalized Lasso
  7.2 Universality of the Phase Transitions
  7.3 Simultaneously structured signals
  7.4 Structured signal recovery beyond convexity
Bibliography
A Further Proofs for Chapter 3
  A.1 Auxiliary Results
  A.2 Proof of Proposition 2.7
  A.3 The Dual of the LASSO
  A.4 Proofs for Section 3.5
  A.5 Deviation Analysis: Key Lemma
  A.6 Proof of Lemma 3.20
  A.7 Explicit formulas for well-known functions
  A.8 Gaussian Width of the Widened Tangent Cone
B Further Proofs for Chapter 5
  B.1 Properties of Cones
  B.2 Norms in Sparse and Low-rank Model
  B.3 Results on non-convex recovery
C Further Proofs for Chapter 6
  C.1 On the success of the simple program