Published on Thu Nov 19 2015

Efficient Sum of Outer Products Dictionary Learning (SOUP-DIL) and Its Application to Inverse Problems

Saiprasad Ravishankar, Raj Rao Nadakuditi, Jeffrey A. Fessler

Dictionary learning problems are typically non-convex and NP-hard. This paper exploits the ideas that drive algorithms such as K-SVD, yielding block coordinate descent algorithms with efficient closed-form updates. We also propose novel and efficient algorithms for adaptive image reconstruction.

Abstract

The sparsity of signals in a transform domain or dictionary has been exploited in applications such as compression, denoising and inverse problems. More recently, data-driven adaptation of synthesis dictionaries has shown promise compared to analytical dictionary models. However, dictionary learning problems are typically non-convex and NP-hard, and the usual alternating minimization approaches for these problems are often computationally expensive, with the computations dominated by the NP-hard synthesis sparse coding step. This paper exploits the ideas that drive algorithms such as K-SVD, and investigates in detail efficient methods for aggregate sparsity penalized dictionary learning by first approximating the data with a sum of sparse rank-one matrices (outer products) and then using a block coordinate descent approach to estimate the unknowns. The resulting block coordinate descent algorithms involve efficient closed-form solutions. Furthermore, we consider the problem of dictionary-blind image reconstruction, and propose novel and efficient algorithms for adaptive image reconstruction using block coordinate descent and sum of outer products methodologies. We provide a convergence study of the algorithms for dictionary learning and dictionary-blind image reconstruction. Our numerical experiments show the promising performance and speed-ups provided by the proposed methods over previous schemes in sparse data representation and compressed sensing-based image reconstruction.
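To make the closed-form block coordinate descent updates concrete, here is a minimal numpy sketch of a SOUP-style update with an ℓ0 (hard-thresholding) aggregate sparsity penalty: the data matrix is approximated as a sum of sparse rank-one outer products, and each atom/coefficient pair is updated in closed form. The function name soup_dil, the parameter choices (number of atoms J, threshold lam, iteration count), and the explicit residual computation are illustrative assumptions for readability; the paper derives more efficient update rules.

```python
import numpy as np

def soup_dil(Y, J=64, lam=0.1, n_iters=20, seed=0):
    """Sketch: approximate Y (n x N) as sum_j d_j c_j^T with unit-norm
    atoms d_j and sparse coefficient vectors c_j, via block coordinate
    descent with closed-form updates (l0 penalty -> hard thresholding)."""
    rng = np.random.default_rng(seed)
    n, N = Y.shape
    D = rng.standard_normal((n, J))
    D /= np.linalg.norm(D, axis=0, keepdims=True)   # unit-norm atoms
    C = np.zeros((N, J))                            # sparse coefficients
    for _ in range(n_iters):
        for j in range(J):
            # residual with the j-th outer product removed
            E = Y - D @ C.T + np.outer(D[:, j], C[:, j])
            # closed-form coefficient update: hard-threshold E^T d_j at lam
            b = E.T @ D[:, j]
            b[np.abs(b) < lam] = 0.0
            C[:, j] = b
            # closed-form atom update: normalized matched filter E c_j
            h = E @ C[:, j]
            nrm = np.linalg.norm(h)
            if nrm > 0:
                D[:, j] = h / nrm
    return D, C
```

Each inner step solves its subproblem exactly, which is what enables the convergence analysis mentioned in the abstract; forming the residual E explicitly as above costs O(nNJ) per sweep, which the paper's efficient implementation avoids.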

Fri Nov 27 2015
Machine Learning
Efficient Sum of Outer Products Dictionary Learning (SOUP-DIL) - The Method
Dictionary learning problems are typically non-convex and NP-hard. The usual alternating minimization approaches for these problems are often computationally expensive. We provide a convergence analysis for the proposed block coordinate descent approach.
Tue Jan 13 2015
Machine Learning
Efficient Blind Compressed Sensing Using Sparsifying Transforms with Convergence Guarantees and Application to MRI
Compressed sensing exploits the sparsity of images or image patches in a transform domain or synthesis dictionary to reconstruct images from undersampled measurements. The proposed block coordinate descent type algorithms involve highly efficient optimal updates.
Mon May 28 2012
Machine Learning
Learning Dictionaries with Bounded Self-Coherence
A dictionary learning method adapts an initial dictionary to a particular signal class. The coherence of the dictionary to the signal class and the self-coherence of its atoms are key properties. A high coherence enables the sparse coding of signal observations with a small approximation error. A low self-coherence…
Sat Oct 19 2019
Machine Learning
Dictionary Learning with Almost Sure Error Constraints
Dictionaries are databases of standard vectors. The task of learning a dictionary for given data is to find a good dictionary so that the representation of the data points has desirable features. We impose the constraint that every sample is reconstructed to within a certain threshold.
Thu Jun 06 2019
Machine Learning
Complete Dictionary Learning via ℓ4-Norm Maximization over the Orthogonal Group
This paper considers the fundamental problem of learning a complete (orthogonal) dictionary from samples of sparsely generated signals. The algorithm provably converges locally at a superlinear (cubic) rate, and the cost per iteration is merely an SVD.
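Since the teaser notes that each iteration costs only an SVD, a hedged numpy sketch of that style of iteration may help: maximize ||A Y||_4^4 over orthogonal matrices A by computing the gradient and projecting it back onto the orthogonal group via an SVD. The function name l4_msp and the iteration count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def l4_msp(Y, n_iters=50, seed=0):
    """Sketch: ascend ||A Y||_4^4 over the orthogonal group. Each step
    forms the gradient and replaces A with its orthogonal projection,
    so the per-iteration cost is dominated by one n x n SVD."""
    rng = np.random.default_rng(seed)
    n = Y.shape[0]
    A, _ = np.linalg.qr(rng.standard_normal((n, n)))  # random orthogonal init
    for _ in range(n_iters):
        G = (A @ Y) ** 3 @ Y.T          # gradient of ||A Y||_4^4 (up to scale)
        U, _, Vt = np.linalg.svd(G)     # projection onto the orthogonal group
        A = U @ Vt
    return A
```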
Sun Mar 14 2021
Machine Learning
Exact Sparse Orthogonal Dictionary Learning