Published on Tue Jun 29 2021

Convolutional Sparse Coding Fast Approximation with Application to Seismic Reflectivity Estimation

Deborah Pereg, Israel Cohen, Anthony A. Vassiliou

We propose an accelerated version of the classic iterative thresholding algorithm that produces a good approximation of the convolutional sparse code within 2-5 iterations. The speed advantage is gained mostly from the observation that most solvers are slowed down by inefficient global thresholding.

Abstract

In sparse coding, we attempt to extract features of input vectors, assuming that the data is inherently structured as a sparse superposition of basic building blocks. Similarly, neural networks perform a given task by learning features of the training data set. Recently, both data-driven and model-driven feature-extraction methods have become extremely popular and have achieved remarkable results. Nevertheless, practical implementations are often too slow to be employed in real-life scenarios, especially for real-time applications. We propose an accelerated version of the classic iterative thresholding algorithm that produces a good approximation of the convolutional sparse code within 2-5 iterations. The speed advantage is gained mostly from the observation that most solvers are slowed down by inefficient global thresholding. The main idea is to normalize each data point by the local receptive-field energy before applying a threshold. This way, the natural inclination towards strong feature expressions is suppressed, so that one can rely on a global threshold that can be easily approximated, or learned during training. The proposed algorithm can be employed with a known predetermined dictionary, or with a trained dictionary. The trained version is implemented as a neural network designed as the unfolding of the proposed solver. The performance of the proposed solution is demonstrated on the seismic inversion problem in both synthetic and real-data scenarios. We also provide theoretical guarantees of stable support recovery; namely, we prove that under certain conditions the true support is perfectly recovered within the first iteration.
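To make the main idea concrete, the following is a minimal NumPy/SciPy sketch of one way to realize a locally normalized convolutional iterative-thresholding step: each point of the intermediate estimate is divided by the energy of its local receptive field before a single global soft threshold is applied, then rescaled. The function names, the sliding-window energy estimate, and the parameters lam and eta are illustrative assumptions, not the paper's exact formulation.

import numpy as np
from scipy.signal import fftconvolve

def local_energy(y, filt_len):
    # Sliding-window (receptive-field) energy of the observed signal.
    win = np.ones(filt_len)
    return np.sqrt(fftconvolve(y ** 2, win, mode="same")) + 1e-12

def soft(x, t):
    # Elementwise soft threshold.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fast_conv_ista(y, D, lam=0.1, eta=0.1, n_iter=5):
    # y : observed 1-D signal; D : (n_filters, filt_len) convolutional dictionary.
    # Returns one sparse code map per filter after a few iterations.
    n_filters, filt_len = D.shape
    X = np.zeros((n_filters, len(y)))
    scale = local_energy(y, filt_len)      # per-sample receptive-field energy
    for _ in range(n_iter):
        # Residual: y minus the current reconstruction sum_m d_m * x_m.
        recon = sum(fftconvolve(X[m], D[m], mode="same") for m in range(n_filters))
        r = y - recon
        for m in range(n_filters):
            # Gradient step: correlate the residual with filter m.
            Z = X[m] + eta * fftconvolve(r, D[m][::-1], mode="same")
            # Normalize locally, apply one global threshold, rescale.
            X[m] = scale * soft(Z / scale, lam)
    return X

# Note: in practice eta should satisfy a step-size (Lipschitz) condition for D.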

Fri Jul 20 2018
Machine Learning
Convolutional Neural Networks Analyzed via Inverse Problem Theory and Sparse Representations
Convolutional neural networks (CNNs) have been widely used for many inverse problem areas, but they are not mathematically validated as to how and what they learn. In this paper, we prove that during training, CNNs solve inverse problems.
Sat May 20 2017
Machine Learning
Deep Sparse Coding Using Optimized Linear Expansion of Thresholds
We address the problem of reconstructing sparse signals from noisy and compressive measurements using a feed-forward deep neural network (DNN). We maintain the weights and biases of the network links as prescribed by ISTA, and model the nonlinear activation function using a linear expansion of thresholds (LET). (A hedged sketch of an ISTA iteration with a LET-style activation appears after this list.)
Thu Sep 12 2019
Computer Vision
Rethinking the CSC Model for Natural Images
The Convolutional Sparse Coding (CSC) model, in which the dictionary consists of shift-invariant filters, has gained renewed interest. We suggest a novel feed-forward network that follows an MMSE approximation process to the CSC model, using strided convolutions.
Sat Aug 08 2020
Machine Learning
Representation Learning via Cauchy Convolutional Sparse Coding
Convolutional Sparse Coding (CSC) enables unsupervised learning of features. The performance of the proposed Iterative Cauchy Thresholding (ICT) algorithm in reconstructing natural images is compared against the common choices of soft and hard thresholding.
Tue Jul 23 2019
Machine Learning
Convolutional Dictionary Learning in Hierarchical Networks
We propose an alternating minimization algorithm for learning the filters in this hierarchical model. The algorithm alternates between a coefficient-estimation step and a filter-update step. The coefficient-estimation step performs sparse (detail) coding and, when unfolded, leads to a deep neural network.
Tue May 09 2017
Computer Vision
Convolutional Dictionary Learning via Local Processing
Convolutional Sparse Coding (CSC) is an increasingly popular model in the signal and image processing communities. The proposed method is fast to train, simple to implement, and flexible enough that it can be deployed in a variety of applications.
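For the "Deep Sparse Coding Using Optimized Linear Expansion of Thresholds" entry above, here is a minimal sketch of what an ISTA iteration with a LET-style activation can look like: the linear weights are kept as ISTA prescribes, while the nonlinearity is a linear combination of smooth threshold-like basis functions. The basis phi_k(x) = x * exp(-k * x^2 / (2 * tau^2)), the coefficients c, and the toy problem below are illustrative assumptions, not that paper's exact network.

import numpy as np

def let_activation(x, c, tau=1.0):
    # LET-style nonlinearity: a linear combination of smooth threshold-like
    # basis functions phi_k(x) = x * exp(-k * x**2 / (2 * tau**2)), k = 0, 1, ...
    # The coefficients c would be learned during training.
    return sum(ck * x * np.exp(-k * x ** 2 / (2.0 * tau ** 2))
               for k, ck in enumerate(c))

def ista_let_step(x, A, y, step, c, tau=1.0):
    # One ISTA-style iteration, x <- f(x + step * A^T (y - A x)),
    # with the usual soft threshold f replaced by the LET activation.
    return let_activation(x + step * A.T @ (y - A @ x), c, tau)

# Illustrative usage on a toy compressive-sensing problem (values hypothetical).
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 100)) / np.sqrt(30)   # measurement matrix
x_true = np.zeros(100)
x_true[rng.choice(100, size=5, replace=False)] = 1.0
y = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2             # safe step size (1 / Lipschitz)
x = np.zeros(100)
c = [1.0, -0.5, 0.1]                               # LET coefficients (learned in practice)
for _ in range(20):
    x = ista_let_step(x, A, y, step, c, tau=0.5)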