A Fast Approximation Algorithm for Tree-Sparse Recovery

Chinmay Hegde, Piotr Indyk, Ludwig Schmidt (authors ordered alphabetically)
Massachusetts Institute of Technology

Abstract—Sparse signals whose nonzeros obey a tree-like structure occur in a range of applications such as image modeling, genetic data analysis, and compressive sensing. An important problem encountered in recovering such signals is that of optimal tree-projection, i.e., finding the closest tree-sparse signal for a given query signal. However, this problem can be computationally very demanding: for optimally projecting a length-$n$ signal onto a tree with sparsity $k$, the best existing algorithms incur a high runtime of $O(nk)$. This can often be impractical.

We suggest an alternative approach to tree-sparse recovery. Our approach is based on a specific approximation algorithm for tree-projection and provably has a near-linear runtime of $O(n \log(kr))$ and a memory cost of $O(n)$, where $r$ is the dynamic range of the signal. We leverage this approach in a fast recovery algorithm for tree-sparse compressive sensing that scales extremely well to high-dimensional datasets. Experimental results on several test cases demonstrate the validity of our approach.
I. INTRODUCTION

Over the last decade, the concept of sparsity has attracted significant attention among researchers in statistical signal processing, information theory, and numerical optimization. Sparsity serves as the foundation of compressive sensing (CS), a new paradigm for digital signal and image acquisition [1, 2]. A key result in CS states that a $k$-sparse signal of length $n$ can be recovered using only $m = O(k \log(n/k))$ linear measurements; when $k \ll n$, this can have significant benefits both in theory and in practice.

However, several classes of real-world signals and images possess additional structure beyond mere sparsity. One example is the class of signals encountered in digital communication: these are often "bursty" and hence can be modeled as sparse signals whose nonzeros are grouped in a small number of blocks. A more sophisticated example is the class of natural images. Here, the dominant coefficients in the wavelet-domain representation can be modeled as a rooted, connected tree [3]. These (and several other) notions of structure can be concisely captured via the notion of a structured sparsity model. In the CS context, structured sparsity can enable signal recovery algorithms that succeed with merely $O(k)$ linear measurements [4].

Our focus in this paper is on tree-structured sparsity. Tree-sparse data are not only interesting from a theoretical perspective but also naturally emerge in a range of applications such as imaging and genomics [3, 5, 6]. Of particular interest to us is the following problem: given an arbitrary signal $x \in \mathbb{R}^n$, find the $k$-sparse tree-structured signal $\hat{x}$ that minimizes the error $\|x - \hat{x}\|_2$. This problem arises in several settings including signal/image compression and denoising [7, 8].

The optimal tree-projection problem has a rich history; see, for example, the papers [9–11] and references therein. The best available (theoretical) performance for this problem is achieved by the dynamic-programming (DP) approach of [12], building upon the algorithm first developed in [11]. For signal length $n$ and target sparsity $k$, the algorithm has a runtime of $O(nk)$. Unfortunately, this means that the algorithm does not easily scale to real-world problem sizes. For example, even a modestly-sized natural image (say, of size $n = 512 \times 512$) can only be expected to be tree-sparse with parameter $k \approx 10^4$. In this case, $nk$ exceeds $2.5 \times 10^9$, and hence the runtime quickly becomes impractical for megapixel-size images.

In this paper, we develop an alternative approach for tree-projection. The core of our approach is a novel approximation algorithm that provably has a near-linear runtime of $O(n \log(kr))$ and a memory cost of $O(n)$, where $r$ is the dynamic range of the signal. Importantly, the memory cost is independent of the sparsity parameter $k$. Therefore, our approach is eminently suitable for applications involving very high-dimensional signals and images, which we demonstrate via several experiments.

Our tree-projection algorithm is approximate: instead of exactly minimizing the error $\|x - \hat{x}\|_2$, we merely return an estimate $\hat{x}$ whose error is at most a constant factor $c_1$ times the optimal achievable error (i.e., that achieved by [12]). Moreover, our signal estimate $\hat{x}$ is tree-sparse with parameter $c_2 k$, therefore giving us a bicriterion approximation guarantee.

At a high level, our algorithm can be viewed as an extension of the complexity-penalized residual sum of squares (CPRSS) formulation proposed by Donoho in [9]. We pose the exact tree-projection as a (nonconvex) sparsity-constrained optimization problem. We perform a Lagrangian relaxation of the sparsity constraint and, similar to Donoho, solve the relaxed problem using a dynamic program (DP) with runtime (as well as memory cost) of $O(n)$. We then iterate this step $O(\log(kr))$ times by conducting a binary search over the Lagrangian relaxation parameter until we arrive at the target sparsity $k$. A careful termination criterion for the binary search gives our desired approximation guarantee.

We combine our approximate tree-projection algorithm with the model-based CS framework proposed in [4]. This produces an extremely fast CS recovery algorithm for tree-sparse signals. We present several experiments on synthetic and real-world signals that demonstrate the benefits of our approach.
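The combination of an iterative recovery loop with a black-box projection step can be sketched generically in Python. The sketch below is illustrative only: the helper names are ours, a plain top-$k$ projector stands in for the tree projector, and the toy check uses an identity matrix rather than a genuine CS measurement matrix.

```python
import numpy as np

def topk(v, k):
    """Stand-in projection oracle: plain k-sparsity (keep the k largest
    magnitudes). For tree-sparse recovery, substitute a tree projector."""
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

def model_cosamp(y, A, project, iters=10):
    """CoSaMP-style recovery where `project` is a black-box
    model-projection oracle (a simplified sketch of the framework in [4])."""
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(iters):
        proxy = A.T @ (y - A @ x)                  # signal proxy from residual
        cand = np.flatnonzero(project(proxy))      # candidate support
        merged = np.union1d(cand, np.flatnonzero(x))
        b = np.zeros(n)                            # least squares restricted
        b[merged] = np.linalg.lstsq(A[:, merged], y, rcond=None)[0]
        x = project(b)                             # prune back to the model
    return x

# Toy smoke test: with A = I, a 2-sparse vector is recovered exactly.
A = np.eye(4)
x_true = np.array([0.0, 2.0, 0.0, -1.0])
x_hat = model_cosamp(A @ x_true, A, lambda v: topk(v, 2))
```

Swapping `topk` for an approximate tree-projection routine turns this generic loop into a tree-sparse recovery scheme; the output is always in the model because the final step of each iteration is a projection.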
II. BACKGROUND

A. Sparsity and Structure

A signal $x \in \mathbb{R}^n$ is said to be $k$-sparse if no more than $k$ of its coefficients are nonzero. The support of $x$, denoted by $\mathrm{supp}(x) \subseteq [n]$, indicates the locations of its nonzero entries.

Suppose that we possess some additional a priori information about the support of our signals of interest. One way to model this information is as follows [4]: denote the set of allowed supports with $\mathbb{M}_k = \{\Omega_1, \Omega_2, \ldots, \Omega_L\}$, where $\Omega_i \subseteq [n]$ and $|\Omega_i| = k$. Often it is useful to work with the closure of $\mathbb{M}_k$ under taking subsets, which we denote with $\mathbb{M}_k^+ = \{\Omega \subseteq [n] \mid \Omega \subseteq S \text{ for some } S \in \mathbb{M}_k\}$. Then, we define a structured sparsity model, $\mathcal{M}_k \subseteq \mathbb{R}^n$, as the set of vectors $\mathcal{M}_k = \{x \in \mathbb{R}^n \mid \mathrm{supp}(x) \in \mathbb{M}_k^+\}$. The number of allowed supports $L$ is called the "size" of the model $\mathcal{M}_k$; typically, $L \ll \binom{n}{k}$.

We define the model-projection problem for $\mathcal{M}_k$ as follows: given $x \in \mathbb{R}^n$, determine an $x^* \in \mathcal{M}_k$ such that $\|x - x^*\|_p$ is minimized for a norm parameter $p$. In general, this problem can be hard, since $\mathcal{M}_k$ is typically non-convex. However, several special choices of models $\mathcal{M}_k$ do admit polynomial-time model-projection methods; see [13] for an overview.

Fig. 1. A binary wavelet tree for a one-dimensional signal. The squares denote the large wavelet coefficients that arise from the discontinuities in the piecewise-smooth signal drawn below. The support of the large coefficients forms a rooted, connected tree.
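For the simplest instance of this definition — plain $k$-sparsity, where $\mathbb{M}_k$ contains all supports of size $k$ — model projection has a closed-form solution: keep the $k$ entries of largest magnitude, which is optimal for every $\ell_p$ norm with $p \ge 1$. A minimal illustration (the helper name is ours):

```python
import numpy as np

def project_k_sparse(x, k):
    """Model projection onto plain k-sparsity: zero out all but the k
    largest-magnitude coefficients (optimal for every l_p norm, p >= 1)."""
    xb = np.zeros_like(x)
    if k > 0:
        idx = np.argpartition(np.abs(x), -k)[-k:]
        xb[idx] = x[idx]
    return xb

x = np.array([0.5, -3.0, 1.0, 0.2])
xb = project_k_sparse(x, 2)        # keeps -3.0 and 1.0
```

For structured models such as tree-sparsity, where the kept coefficients must additionally form an allowed support, the analogous projection is non-trivial; that harder problem is the subject of Section III.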
B. Tree-Sparsity

Our focus in this paper is the tree-structured sparsity model (or simply, the tree-sparsity model). We assume that the $n$ coefficients of a signal $x \in \mathbb{R}^n$ can be arranged as the nodes of a full $d$-ary tree. Then, the tree-sparsity model comprises the set of $k$-sparse signals whose nonzero coefficients form a rooted, connected subtree. It can be shown that the size of this model is upper bounded by $L \le (2e)^k/(k+1)$ [4].

For the rest of the paper, we denote the set of supports corresponding to a subtree rooted at node $i$ with $T_i$. Then $\mathbb{M}_k = \{\Omega \subseteq [n] \mid \Omega \in T_1 \text{ and } |\Omega| = k\}$ is the formal definition of the tree-sparsity model (we assume node 1 to be the root of the entire signal).

The tree-sparsity model can be used for modeling a variety of signal classes. A compelling application of this model emerges while studying the wavelet decomposition of piecewise-smooth signals and images. As noted in Section I, the best existing algorithms for exact projection onto the tree-sparsity model run with a runtime of $O(nk)$. However, this is still impractical for high-dimensional signal processing applications.

C. Compressive Sensing

Suppose that instead of collecting all the coefficients of a $k$-sparse vector $x \in \mathbb{R}^n$, we merely record $m = O(k \log(n/k))$ inner products (measurements) of $x$ with $m$ pre-selected vectors, i.e., we observe an $m$-dimensional vector $y = Ax$, where $A \in \mathbb{R}^{m \times n}$ is the measurement matrix. The central result of compressive sensing (CS) is that under certain assumptions on $A$, $x$ can be exactly recovered from $y$, even though $A$ is rank-deficient (and therefore has a nontrivial nullspace).

Numerous algorithms for signal recovery from compressive measurements have been developed. Of special interest to us are iterative support selection algorithms, e.g., CoSaMP [14]. Such algorithms can be modified to use any arbitrary structured sparsity model when given access to a model-projection oracle [4]. These modified "model-based" recovery algorithms offer considerable benefits both in theory and in practice. For the tree-sparsity model, a modified version of CoSaMP provably recovers tree-sparse signals using $m = O(k)$ measurements, thus matching the information-theoretic lower bound.

III. TREE-SPARSE APPROXIMATION

Recall that the main goal in tree-projection is the following: for a given signal $x \in \mathbb{R}^n$, minimize the quantity $\|x - x_\Omega\|_p$ over $\Omega \in \mathbb{M}_k$ for a given $1 \le p < \infty$. In this paper, we are interested in the following related problem: find an $\hat{\Omega} \in \mathbb{M}_{k'}$ with $k' \le c_2 k$ and

$$\|x - x_{\hat{\Omega}}\|_p \le c_1 \min_{\Omega \in \mathbb{M}_k} \|x - x_\Omega\|_p. \qquad (1)$$
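The Lagrangian relaxation described in Section I can be made concrete for $p = 2$: minimizing $\|x - x_\Omega\|_2^2 + \lambda|\Omega|$ over rooted connected supports is equivalent to maximizing $\sum_{i \in \Omega}(x_i^2 - \lambda)$, which a single bottom-up pass over the tree solves exactly. The Python sketch below (our own naming, with a complete binary tree stored 0-indexed in array order: root at index 0, children of node $i$ at $2i{+}1$ and $2i{+}2$) then binary-searches over $\lambda$. It is a simplified illustration, not the algorithm analyzed in this paper: in particular, it omits the careful termination criterion needed for the formal guarantee in (1).

```python
import numpy as np

def relaxed_tree_support(x, lam):
    """One O(n) DP pass: maximize sum_{i in Omega} (x_i^2 - lam) over all
    rooted, connected subtrees Omega (equivalently, minimize
    ||x - x_Omega||_2^2 + lam * |Omega|)."""
    n = len(x)
    gain = np.zeros(n)
    # Bottom-up: best gain of the subtree hanging at node i, given that
    # i's parent is included (choosing the empty subtree is allowed).
    for i in range(n - 1, -1, -1):
        g = x[i] ** 2 - lam
        for c in (2 * i + 1, 2 * i + 2):
            if c < n:
                g += gain[c]
        gain[i] = max(0.0, g)
    # Top-down: a node enters the support iff its parent did and its
    # subtree gain is positive (so the support stays rooted and connected).
    support, stack = [], [0]
    while stack:
        i = stack.pop()
        if gain[i] > 0:
            support.append(i)
            stack.extend(c for c in (2 * i + 1, 2 * i + 2) if c < n)
    return support

def approx_tree_project(x, k, iters=32):
    """Binary search over lam: a larger multiplier yields a sparser
    support. Returns the largest support of size at most k encountered."""
    lo, hi = 0.0, float(np.max(x ** 2)) + 1.0
    best = []
    for _ in range(iters):
        lam = 0.5 * (lo + hi)
        omega = relaxed_tree_support(x, lam)
        if len(omega) > k:
            lo = lam          # too dense: increase the penalty
        else:
            best = omega      # feasible: try to get closer to k
            hi = lam
    return best
```

For instance, on `x = [3, 2, 0.1, 1, 0, 0, 0]` with `k = 2`, the search settles on the rooted support `{0, 1}`, discarding the deeper coefficient at index 3 that a non-tree top-$k$ rule might keep.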
