Submodular and Sparse Optimization Methods for Machine Learning and Communication


Doctoral Dissertation (博士論文)

Submodular and Sparse Optimization Methods for Machine Learning and Communication
(機械学習と通信のための劣モジュラ・スパース最適化手法)

Tasuku Soma (相馬 輔)

February 11, 2016

Abstract

This dissertation considers optimization problems arising from machine learning and communication, and devises efficient algorithms by exploiting submodular and sparse optimization techniques.

The first subject of this dissertation is submodular function maximization over the integer lattice. Maximizing a monotone submodular set function has been extensively studied in machine learning, social science, and related areas, since it encompasses a variety of problems considered in these areas and efficient greedy algorithms are available. However, we often face real scenarios that are beyond these existing models based on submodular set functions. For example, the existing models cannot capture the budget allocation problem, in which we have to decide how much budget should be set aside. In this dissertation, we overcome this fundamental limitation of the existing models by exploiting submodularity over the integer lattice. We demonstrate that our new framework can naturally and concisely capture these difficult real scenarios, and we provide efficient approximation algorithms for various constraints. In addition to theoretical guarantees, we conduct numerical experiments to establish the practical efficiency of our model and algorithms.

Then we move on to matrix completion problems. Roughly speaking, a matrix completion problem asks us to determine the missing entries of a given partial matrix under some criterion. Our first result is a new algorithm for constructing a multicast code for a wireless network using max-rank matrix completion. As the main ingredient of our algorithm, we employ the theory of mixed matrices, matroids, and submodular optimization. Second, we propose a new problem, the low-rank basis problem for a matrix subspace. Originally this problem comes from combinatorial optimization, but it turns out to have various applications, including image separation and data compression. We present an efficient heuristic algorithm for this problem, using matroid theory and sparse optimization.

The last subject of this dissertation is compressed sensing. The main goal of compressed sensing is to recover a high-dimensional sparse signal from a few linear measurements. While the problem setup is quite simple, compressed sensing has been widely applied in machine learning, statistics, and signal processing. In this dissertation, we consider stable signal recovery in compressed sensing and show that the sum-of-squares method yields a new polynomial-time recovery method for Rademacher sensing matrices. Our result sheds light on the power of the sum-of-squares method on nonconvex problems arising from compressed sensing.
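For reference, the greedy algorithms mentioned in the abstract for monotone submodular set functions follow one simple pattern: repeatedly add the element with the largest marginal gain. The sketch below shows this classic cardinality-constrained greedy method (which attains a (1 - 1/e)-approximation for monotone submodular set functions). It is a generic illustration with a toy coverage oracle, not the integer-lattice algorithms developed in this dissertation; the function names here are chosen for illustration only.

    def greedy_max(f, ground_set, k):
        """Greedily pick at most k elements, each time adding the element
        with the largest marginal gain f(S + e) - f(S)."""
        S = set()
        for _ in range(k):
            best_elem, best_gain = None, 0.0
            for e in sorted(ground_set - S):
                gain = f(S | {e}) - f(S)
                if gain > best_gain:
                    best_elem, best_gain = e, gain
            if best_elem is None:  # no remaining element increases the value
                break
            S.add(best_elem)
        return S

    # Toy monotone submodular function: coverage of items by the chosen sets.
    cover_sets = {"a": {0, 1, 2}, "b": {2, 3}, "c": {4, 5, 6}, "d": {1, 6, 7}}

    def coverage(S):
        return len(set().union(*(cover_sets[e] for e in S))) if S else 0

    print(greedy_max(coverage, set(cover_sets), k=2))  # {'a', 'c'}, covering 6 items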
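Likewise, for orientation only, the matrix completion and compressed sensing problems described above are often posed through the following standard convex relaxations: the nuclear norm relaxation for low-rank matrix completion (cf. Section 6.3.1 in the table of contents) and ℓ1 minimization for stable sparse recovery. These are the textbook formulations, not the methods developed in this dissertation, which rely on mixed matrix theory and the sum-of-squares method instead; the symbols Ω, M, A, y, and ε below are generic.

    % Low-rank matrix completion: complete a partial matrix M, observed on the
    % index set \Omega, by minimizing the nuclear norm (sum of singular values).
    \min_{X} \ \|X\|_{*} \quad \text{subject to} \quad X_{ij} = M_{ij} \ \text{for all } (i,j) \in \Omega

    % Stable sparse recovery: given noisy measurements y = Ax + e with
    % \|e\|_2 \le \varepsilon, recover a sparse signal x by \ell_1 minimization.
    \min_{x} \ \|x\|_{1} \quad \text{subject to} \quad \|Ax - y\|_{2} \le \varepsilon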
Acknowledgments

I have no doubt that this dissertation would have been impossible without the support of many people. First of all, I am grateful to my supervisor, Satoru Iwata, who invited me to the world of academic research. He has continuously offered research ideas, suggestions, and advice during my Ph.D. program with great patience. My days in the Ph.D. course would have been completely different without his infinite support.

I also would like to acknowledge that many parts of this dissertation are based on joint work with my collaborators. Especially, I would like to thank Yuichi Yoshida. His endless mathematical ideas and great depth of knowledge of computer science have been truly remarkable and have inspired me a lot. The idea of the first part of this dissertation originates from my first joint project with Naonori Kakimura, Kazuhiro Inaba, and Ken-ichi Kawarabayashi. Their research ideas and our valuable discussions were really helpful, and they brought me into the research field of machine learning. Chapter 8 benefited from many discussions with and suggestions from Yuji Nakatsukasa and André Uschmajew during our joint work. They turned my small mathematical problem into a really fascinating one, invited me into the exciting research area of sparse optimization, and greatly expanded the scope of my research.

I also would like to thank members and former members of our laboratory, especially Kunihiro Sadakane, Akiko Takeda, Yutaro Yamaguchi, Yu Yokoi, Naoki Ito, Shuichi Katsumata, and Kaito Fujii. Their valuable comments in our seminars and their support in my daily life in the laboratory are greatly appreciated.

Furthermore, I would like to thank my parents for their support and love. Spending ten years as a student was not short, but they always encouraged me to continue my research.

Finally, I thank the Japan Society for the Promotion of Science, the JST ERATO Kawarabayashi Large Graph Project, the JST CREST Iwata team, and the University of Tokyo for their financial support.

Contents

1 Introduction
  1.1 Monotone Submodular Function Maximization
  1.2 Matrix Completion
  1.3 Compressed Sensing
  1.4 Organization of This Thesis
    1.4.1 Publications Contained in This Dissertation
  1.5 Notation

2 Submodular Functions, Matroids, and Polymatroids
  2.1 Submodular Functions
    2.1.1 Extensions of Submodular Function
    2.1.2 Submodular Function Minimization
  2.2 Matroids and Polymatroids
    2.2.1 Matroids
    2.2.2 Polymatroids

3 Submodularity over the Integer Lattice and Budget Allocation
  3.1 Monotone Submodular Optimization: Literature Overview
    3.1.1 Monotone Submodular Function Maximization
    3.1.2 Submodular Cover
  3.2 Optimal Budget Allocation: Motivating Problem
  3.3 Beyond Set Functions: Submodularity over the Integer Lattice
    3.3.1 Submodularity in Optimal Budget Allocation
  3.4 Our Framework
    3.4.1 Budget Allocation Problem with a Competitor
    3.4.2 Generalized Sensor Placement Model
    3.4.3 Other Applications
  3.5 Facts on Submodular Function over the Integer Lattice
    3.5.1 Useful Lemmas
    3.5.2 Reducibility of DR-submodular Functions

4 Monotone Submodular Function Maximization over the Integer Lattice
  4.1 Overview of this Chapter
    4.1.1 Technical Contribution
    4.1.2 Related Work
    4.1.3 Preliminaries
  4.2 Cardinality Constraint for DR-Submodular Function
  4.3 Cardinality Constraint for Lattice Submodular Function
  4.4 Polymatroid Constraint for DR-Submodular Function
    4.4.1 Continuous Extension for Polymatroid Constraints
    4.4.2 Continuous Greedy Algorithm for Polymatroid Constraint
    4.4.3 Rounding
  4.5 Knapsack Constraint for DR-Submodular Function
    4.5.1 Multilinear extension for knapsack constraints
    4.5.2 Algorithm
    4.5.3 Subroutine for handling small items
    4.5.4 Complete algorithm
  4.6 Knapsack Constraint for Lattice Submodular Function

5 The Diminishing Return Submodular Cover Problem
  5.1 Algorithm for the DR-submodular Cover
  5.2 Discussion
  5.3 Experiments
    5.3.1 Experimental Setting
    5.3.2 Experimental Results

6 Introduction to Matrix Completion
  6.1 Matrices with Indeterminates and Matrix Subspace
  6.2 Max-Rank Matrix Completion
    6.2.1 Lovász's Randomized Algorithm
    6.2.2 Deterministic Algorithms for Max-Rank Matrix Completion
    6.2.3 Simultaneous Max-Rank Matrix Completion
    6.2.4 Hardness of Max-Rank Matrix Completion
  6.3 Low-Rank Matrix Completion
    6.3.1 Nuclear Norm Relaxation
    6.3.2 Random Matrix and Observation Model

7 Faster Algorithm for Multicasting in Linear Deterministic Relay Network
  7.1 Introduction
    7.1.1 Our Contribution
    7.1.2 Related Work
  7.2 Preliminaries
    7.2.1 Mixed Matrix
    7.2.2 Mixed Matrix Completion
    7.2.3 Cauchy-Binet Formula
  7.3 Flow Model
  7.4 Algorithm
  7.5 Complexity Excluding Unicast Computation
  7.6 Concluding Remarks

8 The Low-Rank Basis Problem for a Matrix Subspace
  8.1 Introduction
    8.1.1 Our Contribution
