Structured Compressed Sensing - Using Patterns in Sparsity

Johannes Maly
Technische Universität München, Department of Mathematics, Chair of Applied Numerical Analysis
[email protected]
CoSIP Workshop, Berlin, December 9, 2016

Overview

  • Classical Compressed Sensing
  • Structures in Sparsity I - Joint Sparsity
  • Structures in Sparsity II - Union of Subspaces
  • Conclusion

Classical Compressed Sensing

Let x ∈ R^N be an unknown k-sparse signal. Then x can be recovered from few linear measurements

\[ y = A \cdot x, \]

where A ∈ R^{m×N} is a (random) matrix, y ∈ R^m is the vector of measurements, and m ≪ N. It is sufficient to take

\[ m \gtrsim C\, k \log\frac{N}{k} \]

measurements to recover x (with high probability), either by greedy strategies, e.g. Orthogonal Matching Pursuit, or by convex optimization, e.g. ℓ1-minimization.

Orthogonal Matching Pursuit

OMP is a simple algorithm that tries to find the true support of x in k greedy steps: in each step it selects the column of A that is most correlated with the current residual and adds its index to a support estimate T.

OMP
  INPUT:  matrix A, measurement vector y
  INIT:   T_0 = ∅, x_0 = 0
  ITERATION (until a stopping criterion is met):
    j_{n+1} ← argmax_{j ∈ [N]} |(A^T (y − A x_n))_j|
    T_{n+1} ← T_n ∪ {j_{n+1}}
    x_{n+1} ← argmin_{z ∈ R^N} { ‖y − A z‖_2 : supp(z) ⊂ T_{n+1} }
  OUTPUT: the ñ-sparse approximation x̂ := x_{ñ} after ñ iterations

As an illustration, suppose x = e_ℓ is a standard basis vector. Then

\[
A \cdot e_\ell =
\begin{pmatrix} | &  & | \\ a_1 & \cdots & a_N \\ | &  & | \end{pmatrix}
\begin{pmatrix} 0 \\ \vdots \\ 1 \\ \vdots \\ 0 \end{pmatrix}
= a_\ell =: y,
\qquad
A^T y =
\begin{pmatrix} \langle a_1, y \rangle \\ \vdots \\ \langle a_N, y \rangle \end{pmatrix}
=
\begin{pmatrix} \langle a_1, a_\ell \rangle \\ \vdots \\ \langle a_N, a_\ell \rangle \end{pmatrix},
\]

so that for a matrix with normalized, weakly correlated columns the largest entry of A^T y is the ℓ-th one, and the first step already picks j_1 = ℓ.
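As a concrete companion to the pseudocode above, here is a minimal NumPy sketch of OMP. It is not the implementation from the talk: for simplicity it runs a fixed number of k iterations instead of a general stopping criterion, and the function name omp is made up for this example.

```python
# Minimal OMP sketch (assumption: fixed number of iterations k).
import numpy as np

def omp(A, y, k):
    """Greedy OMP: returns a k-sparse approximation x_hat with A @ x_hat ~ y."""
    m, N = A.shape
    support = []                 # current support estimate T_n
    x_hat = np.zeros(N)
    residual = y.copy()          # r_n = y - A @ x_n
    for _ in range(k):
        # 1) pick the column most correlated with the residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # 2) least-squares fit of y on the selected columns
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x_hat = np.zeros(N)
        x_hat[support] = coeffs
        residual = y - A @ x_hat
    return x_hat, support

# Small usage example with a random Gaussian matrix:
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, N, k = 32, 256, 5
    A = rng.standard_normal((m, N)) / np.sqrt(m)
    x = np.zeros(N)
    x[rng.choice(N, k, replace=False)] = rng.standard_normal(k)
    y = A @ x
    x_hat, T = omp(A, y, k)
    print(sorted(T), np.linalg.norm(x - x_hat))
```

In the toy run the selected indices should coincide with the true support and the reconstruction error should be close to machine precision.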
Structures in Sparsity I - Joint Sparsity

Joint Sparsity with Multiple Measurement Vectors (MMV)

We now want to improve on classical CS by exploiting additional structure in the sparsity pattern. Possibly we have not only one measurement vector y of one sparse signal, A · x = y, but L different measurements y_1, …, y_L of L different signals x_1, …, x_L that share a common support T_X ⊂ [N], |T_X| ≤ k:

\[
A \cdot
\begin{pmatrix} | &  & | \\ x_1 & \cdots & x_L \\ | &  & | \end{pmatrix}
=
\begin{pmatrix} | &  & | \\ y_1 & \cdots & y_L \\ | &  & | \end{pmatrix}
\quad\Leftrightarrow\quad
A \cdot X = Y,
\]

where the signals and measurements are collected in a k-row-sparse matrix X ∈ R^{N×L} and a matrix Y ∈ R^{m×L}.

MMV in Theory

Definition (spark(A)). The spark of a matrix A ∈ R^{m×N} is the smallest number of linearly dependent columns of A. It satisfies spark(A) ≤ m + 1.

Based on the spark, a necessary and sufficient condition for the measurements y = A x to uniquely determine every k-sparse vector x is

\[ k < \frac{\operatorname{spark}(A)}{2}, \]

which leads to the requirement m ≥ 2k.
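To make the definition concrete, the spark of a very small matrix can be computed by brute force over all column subsets. This is purely illustrative (it is exponential in N and not how spark-based guarantees are used in practice), and the function name spark below is just for this sketch.

```python
# Brute-force spark: size of the smallest linearly dependent set of columns.
import numpy as np
from itertools import combinations

def spark(A, tol=1e-10):
    m, N = A.shape
    for size in range(1, N + 1):
        for cols in combinations(range(N), size):
            # columns are linearly dependent iff the submatrix is rank-deficient
            if np.linalg.matrix_rank(A[:, cols], tol=tol) < size:
                return size
    return N + 1  # all columns are linearly independent

# Example: the first and third columns are parallel, so spark(A) = 2.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 0.0]])
print(spark(A))  # -> 2
```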
In the MMV case, a sufficient condition for the measurements Y = A X to uniquely determine the jointly k-sparse matrix X is

\[ k < \frac{\operatorname{spark}(A) - 1 + \operatorname{rank}(X)}{2}, \]

which leads to the requirement m ≥ k + 1 if spark(A) and rank(X) are optimal, i.e. spark(A) = m + 1 and rank(X) = k (see [1]).

Simultaneous OMP

SOMP is a small modification of OMP that benefits from the several measurement vectors: the support index is chosen via the largest row norm of the correlated residual, which should improve support recovery (see the code sketch further below).

SOMP
  INPUT:  matrix A, measurement vectors Y = (y_1, …, y_L)
  INIT:   T_0 = ∅, X_0 = 0
  ITERATION (until a stopping criterion is met):
    j_{n+1} ← argmax_{j ∈ [N]} ‖(A^T (Y − A X_n))_j‖_p   (ℓ_p norm of the j-th row)
    T_{n+1} ← T_n ∪ {j_{n+1}}
    X_{n+1} ← argmin_{Z ∈ R^{N×L}} { ‖Y − A Z‖_2 : supp(Z) ⊂ T_{n+1} }
  OUTPUT: the ñ-row-sparse approximation X̂ := X_{ñ} after ñ iterations

SOMP - Numerics

[Figure: SOMP recovery comparison with N = 256, m = 32 and L = 1, 2, 4, 8, 16, 32 (from left to right); source: [2].]

Structures in Sparsity II - Union of Subspaces

Why Structure?

[Figures: classical sparse recovery needs m ≈ k log(N/k) measurements; with additional structure on the support, m ≈ ?]
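Returning to SOMP from the Joint Sparsity section, the following is a minimal NumPy sketch of the iteration above. It is not the implementation behind the numerics: it fixes the number of iterations to k, uses the ℓ2 row norm (p = 2) in the selection step, and the name somp is only for this example. For L = 1 it reduces to the OMP sketch given earlier.

```python
# Minimal SOMP sketch (assumptions: fixed k iterations, l2 row-norm selection).
import numpy as np

def somp(A, Y, k):
    """Jointly k-row-sparse approximation X_hat with A @ X_hat ~ Y."""
    m, N = A.shape
    L = Y.shape[1]
    support = []                   # common support estimate T_n
    X_hat = np.zeros((N, L))
    residual = Y.copy()            # R_n = Y - A @ X_n
    for _ in range(k):
        # 1) pick the index whose residual-correlation row has the largest norm
        row_norms = np.linalg.norm(A.T @ residual, axis=1)
        j = int(np.argmax(row_norms))
        if j not in support:
            support.append(j)
        # 2) joint least-squares fit of all L measurement vectors at once
        coeffs, *_ = np.linalg.lstsq(A[:, support], Y, rcond=None)
        X_hat = np.zeros((N, L))
        X_hat[support, :] = coeffs
        residual = Y - A @ X_hat
    return X_hat, support
```

Because the selection step aggregates the correlations of all L residual columns, a wrong index has to look strong across every measurement vector to be picked, which is the intuition behind the improved recovery seen in the numerics above.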
