Deep Coupled-Representation Learning for Sparse Linear Inverse Problems with Side Information

Evaggelia Tsiligianni and Nikos Deligiannis

Both authors are with the Department of Electronics and Informatics, Vrije Universiteit Brussel, Brussels, Belgium, and with imec, Kapeldreef 75, B-3001, Leuven, Belgium. Email: {etsiligi, ndeligia}@etrovub.be.

arXiv:1907.02511v1 [cs.LG] 4 Jul 2019

Abstract—In linear inverse problems, the goal is to recover a target signal from undersampled, incomplete or noisy linear measurements. Typically, the recovery relies on complex numerical optimization methods; recent approaches perform an unfolding of a numerical algorithm into a neural network form, resulting in a substantial reduction of the computational complexity. In this paper, we consider the recovery of a target signal with the aid of a correlated signal, the so-called side information (SI), and propose a deep unfolding model that incorporates SI. The proposed model is used to learn coupled representations of correlated signals from different modalities, enabling the recovery of multimodal data at a low computational cost. As such, our work introduces the first deep unfolding method with SI, which actually comes from a different modality. We apply our model to reconstruct near-infrared images from undersampled measurements given RGB images as SI. Experimental results demonstrate the superior performance of the proposed framework against single-modal deep learning methods that do not use SI, multimodal deep learning designs, and optimization algorithms.

I. INTRODUCTION

Linear inverse problems arise in various signal processing domains such as computational imaging, remote sensing, seismology and astronomy, to name a few. These problems can be expressed by a linear equation of the form:

$y = \Phi x + e, \qquad (1)$

where $x \in \mathbb{R}^n$ is the unknown signal, $\Phi \in \mathbb{R}^{m \times n}$, $m < n$, is a linear operator, and $y \in \mathbb{R}^m$ denotes the observations contaminated with noise $e \in \mathbb{R}^m$. Sparsity is commonly used for the regularization of ill-posed inverse problems, leading to the so-called sparse approximation problem [1]. Compressed sensing (CS) [2] deals with the sparse recovery of linearly subsampled signals and falls in this category.
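As a minimal toy instantiation of model (1), the sketch below builds an $s$-sparse target signal, a random Gaussian operator $\Phi$ with $m < n$, and noisy measurements $y$. The dimensions, sparsity level and noise scale are arbitrary illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, s = 256, 64, 8                      # ambient dimension, measurements (m < n), sparsity level
x = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x[support] = rng.standard_normal(s)       # s-sparse target signal

Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian measurement operator
e = 0.01 * rng.standard_normal(m)                # additive measurement noise
y = Phi @ x + e                                  # undersampled observations, as in model (1)
```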
In several applications, besides the observations of the target signal, additional information from correlated signals is often available [3]–[10]. In multimodal applications, combining information from multiple signals calls for methods that allow coupled signal representations, capturing the similarities between correlated data. To this end, coupled dictionary learning is a popular approach [8]–[10]; however, dictionary learning methods employ overcomplete dictionaries, resulting in computationally expensive sparse approximation problems.

Deep learning has gained a lot of momentum in solving inverse problems, often surpassing the performance of analytical approaches [11]–[13]. Nevertheless, neural networks have a complex structure and appear as “black boxes”; thus, understanding what the model has learned is an active research topic. Among the efforts trying to bridge the gap between analytical methods and deep learning is the work presented in [14], which introduced the idea of unfolding a numerical algorithm for sparse approximation into a neural network form. Several unfolding approaches [15]–[17] followed that of [14]. Although the primary motivation for deploying deep learning in inverse problems concerns the reduction of the computational complexity, unfolding offers another significant benefit: the model architecture allows a better insight into the inference procedure and enables the theoretical study of the network using results from sparse modelling [15], [18]–[20].

In this paper, we propose a deep unfolding model for the recovery of a signal with the aid of a correlated signal, the side information (SI). To the best of our knowledge, this is the first work in deep unfolding that incorporates SI. Our contribution is as follows: (i) Inspired by [14], we design a deep neural network that unfolds a proximal algorithm for sparse approximation with SI; we coin our model Learned Side Information Thresholding Algorithm (LeSITA). (ii) We use LeSITA in an autoencoder fashion to learn coupled representations of correlated signals from different modalities. (iii) We design a LeSITA-based reconstruction operator that utilizes learned SI provided by the autoencoder to enhance signal recovery. We test our method in an example application, namely, multimodal reconstruction from CS measurements. Other inverse problems of the form (1), such as image super-resolution [8], [21] or image denoising [22], can benefit from the proposed approach. We compare our method with existing single-modal deep learning methods that do not use SI, multimodal deep learning designs, and optimization algorithms, showing its superior performance.

The paper is organized as follows. Section II provides the necessary background and reviews related work. The proposed framework is presented in Section III, followed by experimental results in Section IV. Conclusions are drawn in Section V.

II. BACKGROUND AND RELATED WORK

A common approach for solving problems of the form (1) with sparsity constraints is convex optimization [23]. Let us assume that the unknown $x \in \mathbb{R}^n$ has a sparse representation $\alpha \in \mathbb{R}^k$ with respect to a dictionary $D_x \in \mathbb{R}^{n \times k}$, $n \leq k$, that is, $x = D_x \alpha$. Then, (1) takes the form

$y = \Phi D_x \alpha + e, \qquad (2)$

and a solution can be obtained via the formulation of the $\ell_1$ minimization problem:

$\min_{\alpha} \tfrac{1}{2} \|\Phi D_x \alpha - y\|_2^2 + \lambda \|\alpha\|_1, \qquad (3)$

where $\|\cdot\|_1$ denotes the $\ell_1$-norm ($\|\alpha\|_1 = \sum_{i=1}^{k} |\alpha_i|$), which promotes sparse solutions, and $\lambda$ is a regularization parameter. Numerical methods [1] proposed to solve (3) include pivoting algorithms, interior-point methods, gradient based methods and message passing algorithms (AMP) [24]. Among gradient based methods, proximal algorithms are widely used; for the $\ell_1$ penalty in (3), the proximal operator is the soft thresholding function, illustrated in Figure 1(a) for ISTA.

[Figure 1 (graphics omitted). Graphical representation of the proximal operators of (a) ISTA and (b) SITA (for non-negative SI $w_i \geq 0$, $i = 1, \ldots, k$); $\theta$, $\mu$ are positive parameters.]
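The proximal-gradient viewpoint can be sketched as follows for problem (3): each ISTA iteration takes a gradient step on the data-fidelity term and applies the soft thresholding operator of Figure 1(a). In the sketch below, A stands for $\Phi D_x$; the step size $1/L$ with $L = \|A\|_2^2$ and the fixed iteration count are standard but illustrative choices, not taken from the paper.

```python
import numpy as np

def soft_threshold(u, theta):
    """Elementwise soft thresholding: the proximal operator of theta * ||.||_1 (Fig. 1(a))."""
    return np.sign(u) * np.maximum(np.abs(u) - theta, 0.0)

def ista(y, A, lam=0.1, n_iter=100):
    """ISTA for problem (3): min_a 0.5 * ||A a - y||_2^2 + lam * ||a||_1, with A = Phi @ D_x.
    Uses step size 1/L, where L = ||A||_2^2 bounds the Lipschitz constant of the gradient."""
    L = np.linalg.norm(A, 2) ** 2          # squared spectral norm of A
    a = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ a - y)           # gradient of the data-fidelity term
        a = soft_threshold(a - grad / L, lam / L)
    return a
```

Unfolding a fixed number of such iterations into network layers with learnable weights and thresholds is the idea introduced in [14]; Section III builds the corresponding SI-aware model.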
III. PROPOSED FRAMEWORK

In this paper, we consider that, besides the observations of the target signal, we also have access to SI, that is, a signal $z \in \mathbb{R}^d$ correlated to the unknown $x$. We assume that $x \in \mathbb{R}^n$ and $z \in \mathbb{R}^d$ have similar sparse representations $\alpha \in \mathbb{R}^k$, $w \in \mathbb{R}^k$, under dictionaries $D_x \in \mathbb{R}^{n \times k}$, $D_z \in \mathbb{R}^{d \times k}$, $n \leq k$, $d \leq k$, respectively. Specifically, we assume that $\alpha$ and $w$ are similar by means of the $\ell_1$ norm, that is, $\|\alpha - w\|_1$ is small. The condition holds for representations with partially common support and a number of similar nonzero coefficients; we refer to them as coupled sparse representations. Then, $\alpha$ can be obtained from the $\ell_1$-$\ell_1$ minimization problem

$\min_{\alpha} \tfrac{1}{2} \|\Phi D_x \alpha - y\|_2^2 + \lambda \left( \|\alpha\|_1 + \|\alpha - w\|_1 \right). \qquad (9)$

Problem (9) has been theoretically studied in [29] and has been employed for the recovery of sequential signals in [3]–[5].

We can easily obtain coupled sparse representations of sequential signals that change slowly using the same sparsifying dictionary [3]–[5]. However, this is not the case in most multimodal applications, where, typically, finding coupled sparse representations involves dictionary learning and complex optimization methods [8]–[10]. In this work, we propose instead to learn coupled representations of correlated signals from different modalities with the LeSITA-based autoencoder outlined in contribution (ii), avoiding such costly optimization.
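A minimal sketch of the $\ell_1$-$\ell_1$ proximal operator of Figure 1(b) and of a plain proximal-gradient loop for problem (9), assuming non-negative SI $w_i \geq 0$ as in the figure caption. The closed form follows from the subgradient optimality condition of $\min_a \tfrac{1}{2}(a - u)^2 + \mu(|a| + |a - w|)$; A again stands for $\Phi D_x$, and the step size and iteration count are illustrative.

```python
import numpy as np

def prox_l1_l1(u, w, mu):
    """Elementwise proximal operator of mu * (|a| + |a - w|) for non-negative SI w (Fig. 1(b)):
    identity on [0, w], zero on [-2*mu, 0], clamped to w on [w, w + 2*mu],
    and shifted by +2*mu (resp. -2*mu) below -2*mu (resp. above w + 2*mu)."""
    u = np.asarray(u, dtype=float)
    w = np.asarray(w, dtype=float)
    return (np.clip(u, 0.0, w)
            + np.minimum(u + 2.0 * mu, 0.0)
            + np.maximum(u - w - 2.0 * mu, 0.0))

def prox_gradient_si(y, A, w, lam=0.1, n_iter=100):
    """Proximal-gradient iterations for problem (9):
    min_a 0.5 * ||A a - y||_2^2 + lam * (||a||_1 + ||a - w||_1), with A = Phi @ D_x."""
    L = np.linalg.norm(A, 2) ** 2
    a = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ a - y)           # gradient of the data-fidelity term
        a = prox_l1_l1(a - grad / L, w, lam / L)
    return a
```

For $w = 0$ the operator reduces to soft thresholding with threshold $2\mu$, recovering an ISTA-like update; for $0 \leq u_i \leq w_i$ the estimate passes through unshrunk, which is how the SI relaxes the penalty where the two representations agree.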
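To make the unfolding idea of contribution (i) concrete, here is a deliberately simplified sketch of unrolling such proximal iterations with SI into a feed-forward network, in the spirit of LISTA [14]. The layer parameterization below (a per-layer W, S and threshold mu, with the SI entering only through the nonlinearity) is an assumption for illustration; it is not the exact LeSITA architecture defined in the paper, and the training loop is omitted.

```python
import numpy as np

def prox_l1_l1(u, w, mu):
    # Elementwise l1-l1 proximal operator for non-negative SI w (the nonlinearity of Fig. 1(b)).
    return np.clip(u, 0.0, w) + np.minimum(u + 2.0 * mu, 0.0) + np.maximum(u - w - 2.0 * mu, 0.0)

def init_layers(A, lam=0.1, K=10):
    """ISTA-inspired initialization of K layers: W = A^T / L, S = I - A^T A / L, mu = lam / L.
    In a learned model these quantities would be trained from data; here they stay fixed."""
    L = np.linalg.norm(A, 2) ** 2
    W = A.T / L
    S = np.eye(A.shape[1]) - (A.T @ A) / L
    return [{"W": W.copy(), "S": S.copy(), "mu": lam / L} for _ in range(K)]

def unrolled_forward(y, w, layers):
    """Forward pass through K unrolled proximal steps with SI: each 'layer' applies an affine
    map of the measurements and the previous code, followed by the l1-l1 proximal nonlinearity."""
    a = np.zeros(layers[0]["S"].shape[1])
    for layer in layers:
        a = prox_l1_l1(layer["W"] @ y + layer["S"] @ a, w, layer["mu"])
    return a
```

With this initialization the forward pass coincides with K proximal-gradient iterations for (9); training would replace W, S and mu with values learned from data.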
