Published on Mon Nov 05 2018

Exact multiplicative updates for convolutional β-NMF in 2D

Pedro J. Villasana T., Stanislaw Gorlow

We extend the β-CNMF to two dimensions and derive exact multiplicative updates for its factors. The new updates generalize and correct the nonnegative matrix factor deconvolution previously proposed by Schmidt and Mørup.

Abstract

In this paper, we extend the β-CNMF to two dimensions and derive exact multiplicative updates for its factors. The new updates generalize and correct the nonnegative matrix factor deconvolution previously proposed by Schmidt and Mørup. We show by simulation that the updates lead to a monotonically decreasing β-divergence in terms of the mean and the standard deviation and that the corresponding convergence curves are consistent across the most common values for β.
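The paper's exact updates for the 2D convolutional case are not reproduced in this abstract. As background, the classical multiplicative updates for plain (non-convolutional) β-NMF, which the β-CNMF generalizes, can be sketched as follows; the function names and the constant-baseline comparison below are illustrative, not taken from the paper:

```python
import numpy as np

def beta_divergence(V, Vhat, beta):
    """Beta-divergence d_beta(V | Vhat), summed over all entries."""
    if beta == 0:   # Itakura-Saito divergence
        ratio = V / Vhat
        return np.sum(ratio - np.log(ratio) - 1)
    if beta == 1:   # generalized Kullback-Leibler divergence
        return np.sum(V * np.log(V / Vhat) - V + Vhat)
    # generic case (beta = 2 gives the squared Euclidean distance)
    return np.sum((V**beta + (beta - 1) * Vhat**beta
                   - beta * V * Vhat**(beta - 1)) / (beta * (beta - 1)))

def beta_nmf(V, rank, beta=1.0, n_iter=200, seed=0):
    """Plain beta-NMF V ~= W @ H via the standard multiplicative updates."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    # small positive offsets keep the factors strictly positive
    W = rng.random((m, rank)) + 1e-3
    H = rng.random((rank, n)) + 1e-3
    for _ in range(n_iter):
        Vhat = W @ H
        H *= (W.T @ (Vhat**(beta - 2) * V)) / (W.T @ Vhat**(beta - 1))
        Vhat = W @ H
        W *= ((Vhat**(beta - 2) * V) @ H.T) / (Vhat**(beta - 1) @ H.T)
    return W, H
```

These updates are known to decrease the β-divergence monotonically for the most common values of β (0, 1, and 2); the paper extends this scheme to a two-dimensional convolutional model.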

Wed Mar 14 2018
Machine Learning
Multiplicative Updates for Convolutional NMF Under β-Divergence
The new updates unify the β-NMF and the convolutional NMF. We show that our updates are stable and that their convergence performance is consistent across the most common values of β.
Tue May 28 2019
Neural Networks
Network Deconvolution
Network deconvolution is a procedure which optimally removes pixel-wise and channel-wise correlations before the data is fed into each layer. Filtering with such kernels results in a sparse representation, a desired property that has been missing in the training of neural networks.
Wed Nov 14 2018
Machine Learning
Newton Methods for Convolutional Neural Networks
Deep learning involves a difficult non-convex optimization problem, which is often solved by stochastic gradient (SG) methods. Recently, Newton methods have been investigated as an alternative optimization technique.
Thu Aug 26 2021
Artificial Intelligence
Convolutional Neural Networks Demystified: A Matched Filtering Perspective Based Tutorial
Deep Neural Networks (DNN) are a de-facto standard for the analysis of large volumes of signals and images. Their development and underlying principles have been largely performed in an ad-hoc and black box fashion. We establish that the convolution operation within CNNs represents a matched filter.
Thu Sep 22 2016
Computer Vision
Is the deconvolution layer the same as a convolutional layer?
In this note, we want to focus on aspects related to two questions most people asked us at CVPR about the network we presented. What is the relationship between our proposed layer and the deconvolution layer? And why are convolutions in low-resolution (LR) space a better choice?
Wed Jul 27 2016
Machine Learning
Convolutional Neural Networks Analyzed via Convolutional Sparse Coding
Convolutional neural networks (CNN) have led to many state-of-the-art results. However, a clear and profound theoretical understanding of the forward pass, the core algorithm of CNN, is still lacking. We propose a novel multi-layer model, ML-CSC.