Published on Tue Jul 06 2021

Implicit Variational Conditional Sampling with Normalizing Flows

Vincent Moens, Aivar Sootla, Haitham Bou Ammar, Jun Wang

We present a method for conditional sampling with normalizing flows when only a part of an observation is available. We show that our sampling method can be applied with success to invertible residual networks for inference and classification.

Abstract

We present a method for conditional sampling with normalizing flows when only part of an observation is available. We rely on the following fact: if the flow's domain can be partitioned in such a way that the restrictions of the flow to these subdomains remain bijective, a lower bound on the log-probability of the conditioning variable can be derived. Simulation from the variational conditional flow then amounts to solving an equality constraint. Our contribution is three-fold: a) we provide detailed insights on the choice of variational distributions; b) we propose how to partition the input space of the flow so as to preserve bijectivity; c) we propose a set of methods to optimise the variational distribution in specific cases. Through extensive experiments, we show that our sampling method can be applied with success to invertible residual networks for inference and classification.
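To make the simulation step concrete, here is a minimal, hedged Python sketch of the general idea rather than the paper's algorithm: draw the latent coordinates paired with the missing block from a variational distribution, then solve the equality constraint f(z)[observed] = x_observed for the remaining latent coordinates. The toy triangular flow, the standard-normal variational distribution, the index partition, and the use of SciPy's fsolve are all illustrative assumptions.

    # Hedged toy sketch of conditional sampling with an invertible flow when only
    # part of x is observed. Not the paper's implementation.
    import numpy as np
    from scipy.optimize import fsolve

    rng = np.random.default_rng(0)
    D = 4                                    # total dimension
    obs_idx, mis_idx = [0, 1], [2, 3]        # observed / missing coordinates of x

    # Toy invertible flow x = f(z): lower-triangular affine map followed by the
    # strictly increasing element-wise map h -> h + tanh(h).
    A = np.tril(rng.normal(size=(D, D)), k=-1) + np.diag(1.0 + rng.uniform(size=D))
    b = rng.normal(size=D)

    def flow(z):
        h = A @ z + b
        return h + np.tanh(h)

    x_obs = np.array([0.5, -1.0])            # the observed part of x

    def sample_conditional():
        # 1) sample the "free" latent block from a variational distribution
        z_free = rng.normal(size=len(mis_idx))
        # 2) solve the equality constraint f(z)[obs_idx] = x_obs for the rest
        def constraint(z_solve):
            z = np.empty(D)
            z[mis_idx] = z_free
            z[obs_idx] = z_solve
            return flow(z)[obs_idx] - x_obs
        z_solve = fsolve(constraint, x0=np.zeros(len(obs_idx)))
        z = np.empty(D)
        z[mis_idx], z[obs_idx] = z_free, z_solve
        return flow(z)                       # a full x consistent with x_obs

    print(sample_conditional())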

Fri Jul 16 2021
Machine Learning
Efficient Bayesian Sampling Using Normalizing Flows to Assist Markov Chain Monte Carlo Methods
Normalizing flows can generate complex target distributions and thus show promise in many applications in Bayesian statistics. Since no data set from the target posterior distribution is available beforehand, the flow is typically trained using the reverse Kullback-Leibler divergence. This strategy may perform
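Below is a small, hedged sketch of what a reverse-KL training objective for a flow looks like when only an unnormalized target density is available. The diagonal affine "flow" and the banana-shaped target are illustrative assumptions chosen so the log-determinant is analytic; they are not taken from the paper.

    # Monte Carlo estimate of KL(q || p) up to the target's log-normalizer:
    # draw z from the base, push it through the flow, compare log-densities.
    import numpy as np

    rng = np.random.default_rng(0)

    def log_target(x):                       # unnormalized target log-density
        return -0.5 * (x[:, 0] ** 2 + (x[:, 1] - x[:, 0] ** 2) ** 2)

    def reverse_kl(mu, log_sigma, n=4096):
        z = rng.normal(size=(n, 2))          # base samples
        x = mu + np.exp(log_sigma) * z       # flow: element-wise affine map
        # log q(x) = log N(z; 0, I) - log|det J| for this flow
        log_q = -0.5 * np.sum(z ** 2, axis=1) - np.sum(log_sigma) - np.log(2 * np.pi)
        return np.mean(log_q - log_target(x))

    print(reverse_kl(np.zeros(2), np.zeros(2)))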
Thu May 21 2015
Artificial Intelligence
Variational Inference with Normalizing Flows
The choice of approximate posterior distribution is one of the core problems in variational inference. We introduce a new approach for specifying flexible, arbitrarily complex and scalable approximate posterior distributions.
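As an illustration of the kind of flexible posterior family this line of work introduced, here is a hedged sketch of a single planar-flow layer, f(z) = z + u * tanh(w.z + b), whose Jacobian log-determinant has a cheap closed form. The parameter values are arbitrary assumptions; practical use stacks many such layers with constraints on u and w that keep the map invertible.

    import numpy as np

    def planar_flow(z, u, w, b):
        a = z @ w + b                               # (n,) pre-activations
        f = z + np.outer(np.tanh(a), u)             # transformed samples
        psi = (1.0 - np.tanh(a) ** 2)[:, None] * w  # h'(a) * w
        log_det = np.log(np.abs(1.0 + psi @ u))     # log|det df/dz| per sample
        return f, log_det

    z = np.random.default_rng(0).normal(size=(5, 2))
    f, log_det = planar_flow(z, u=np.array([0.5, -0.3]), w=np.array([1.0, 2.0]), b=0.1)
    print(f.shape, log_det.shape)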
Fri Jul 10 2020
Machine Learning
Variational Inference with Continuously-Indexed Normalizing Flows
Continuously-indexed flows (CIFs) have recently achieved improvements over baseline normalizing flows on a variety of density estimation tasks. CIFs do not possess a closed-form marginal density, and so, unlike standard flows, cannot be plugged in directly to a variational inference
Mon Jul 13 2020
Machine Learning
Projected Latent Markov Chain Monte Carlo: Conditional Sampling of Normalizing Flows
Projected Latent Markov Chain Monte Carlo (PL-MCMC) is a technique for sampling from the high-dimensional conditional distributions learned by a normalizing flow. PL-MCMC enables Monte Carlo Expectation Maximization (MC-EM) training of normalizing flows from incomplete data.
Sun Oct 30 2016
Machine Learning
Auxiliary gradient-based sampling algorithms
We introduce a new family of MCMC samplers that combine auxiliary variables, Gibbs sampling and Taylor expansions of the target density. We prove that marginal samplers are superior in terms of asymptotic variance and demonstrate cases where they are slower in computing time compared to auxiliary samplers. We introduce a novel MCMC sampling scheme for hyperparameter
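For context, here is a simplified gradient-based MCMC sketch (plain MALA on a Gaussian target), a close relative of, but not the same as, the auxiliary and marginal samplers studied in the paper. It only illustrates the shared ingredient of proposing with a gradient step plus noise and correcting with a Metropolis accept/reject step; the target and step size are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def log_p(x):                 # unnormalized 2-D Gaussian target
        return -0.5 * np.sum(x ** 2)

    def grad_log_p(x):
        return -x

    def mala(n_steps=1000, eps=0.5):
        x, samples = np.zeros(2), []
        for _ in range(n_steps):
            mean_fwd = x + 0.5 * eps * grad_log_p(x)
            y = mean_fwd + np.sqrt(eps) * rng.normal(size=2)
            mean_bwd = y + 0.5 * eps * grad_log_p(y)
            # proposal log-densities q(y|x) and q(x|y), up to a shared constant
            log_q_fwd = -np.sum((y - mean_fwd) ** 2) / (2 * eps)
            log_q_bwd = -np.sum((x - mean_bwd) ** 2) / (2 * eps)
            if np.log(rng.uniform()) < log_p(y) - log_p(x) + log_q_bwd - log_q_fwd:
                x = y
            samples.append(x)
        return np.array(samples)

    print(mala().mean(axis=0))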
Wed Feb 26 2020
Machine Learning
Composing Normalizing Flows for Inverse Problems
Given an inverse problem with a normalizing flow prior, we wish to estimate the distribution of the underlying signal conditioned on the observations. We first establish that this is computationally hard for a large class of flow models. We propose a framework for approximate inference that estimates the target conditional as a
Wed Feb 11 2015
Machine Learning
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Batch Normalization allows us to use much higher learning rates and be less careful about initialization. It also acts as a regularizer, in some cases eliminating the need for Dropout. Applied to a state-of-the-art image classification model, Batch Normalization achieves the same
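For reference, a minimal sketch of the batch-normalization transform itself: normalize each feature by the mini-batch mean and variance, then apply a learned scale (gamma) and shift (beta). Running statistics for inference, backpropagation, and framework details are omitted; shapes and names are illustrative.

    import numpy as np

    def batch_norm(x, gamma, beta, eps=1e-5):
        mean = x.mean(axis=0)                    # per-feature batch mean
        var = x.var(axis=0)                      # per-feature batch variance
        x_hat = (x - mean) / np.sqrt(var + eps)  # normalized activations
        return gamma * x_hat + beta              # learned scale and shift

    x = np.random.default_rng(0).normal(size=(32, 8))   # batch of 32, 8 features
    y = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))
    print(y.mean(axis=0).round(6), y.std(axis=0).round(3))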
Fri Nov 02 2018
Machine Learning
Invertible Residual Networks
We show that standard ResNet architectures can be made invertible. This allows the same model to be used for classification, density estimation, and generation. Invertible ResNets perform competitively with state-of-the-art image classifiers and flow-based generative models.
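A hedged sketch of the key trick: if the residual branch g is a contraction (Lipschitz constant below 1), the block y = x + g(x) can be inverted by the fixed-point iteration x <- y - g(x). The tiny spectrally scaled linear branch below is an illustrative assumption, not the paper's architecture.

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.normal(size=(3, 3))
    W *= 0.9 / np.linalg.norm(W, 2)          # enforce Lipschitz constant < 1

    def g(x):                                # contractive residual branch
        return np.tanh(W @ x)

    def forward(x):
        return x + g(x)

    def inverse(y, n_iter=50):
        x = y.copy()                         # fixed-point iteration
        for _ in range(n_iter):
            x = y - g(x)
        return x

    x = rng.normal(size=3)
    print(np.allclose(inverse(forward(x)), x))   # True: the block inverts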
Tue Jun 19 2018
Machine Learning
Neural Ordinary Differential Equations
We introduce a new family of deep neural network models. Instead of specifying a discrete sequence of hidden layers, we parameterize the derivative of the hidden state using a neural network. The output of the network is then computed using a black-box differential equation solver.
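A minimal sketch of that idea, with a tiny fixed-weight network standing in for the learned dynamics (an illustrative assumption): the hidden state's time derivative is given by the network, and a black-box ODE solver produces the output state. Training would additionally require differentiating through, or around, the solver.

    import numpy as np
    from scipy.integrate import solve_ivp

    rng = np.random.default_rng(0)
    W1, W2 = rng.normal(size=(8, 2)), rng.normal(size=(2, 8))

    def dynamics(t, h):                      # dh/dt = f(h, t; theta)
        return W2 @ np.tanh(W1 @ h)

    h0 = np.array([1.0, -0.5])               # input = initial state
    sol = solve_ivp(dynamics, t_span=(0.0, 1.0), y0=h0)
    print(sol.y[:, -1])                      # output = state at t = 1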
Mon Jan 04 2016
Machine Learning
Variational Inference: A Review for Statisticians
Variational inference (VI) aims to approximate difficult-to-compute probability densities through optimization. VI is powerful, but it is not yet well understood. We hope to catalyze statistical research on this class of algorithms.
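The optimization behind this view of VI is the standard evidence lower bound (stated here for context, not quoted from the review): maximizing the ELBO over a family q is equivalent to minimizing the KL divergence from q to the intractable posterior.

    \mathrm{ELBO}(q) = \mathbb{E}_{q(z)}\bigl[\log p(x, z) - \log q(z)\bigr]
                     = \log p(x) - \mathrm{KL}\bigl(q(z) \,\|\, p(z \mid x)\bigr)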
Fri Mar 27 2020
Machine Learning
MCFlow: Monte Carlo Flow Models for Data Imputation
We consider the topic of data imputation, a foundational task in machine learning. We propose MCFlow, a deep framework for imputation that leverages normalizing flow and Monte Carlo sampling. We demonstrate that MCFlow is superior to competing methods in terms of the quality of the imputed data.
Mon Dec 22 2014
Machine Learning
Adam: A Method for Stochastic Optimization
Adam is an algorithm for first-order gradient-based optimization of stochastic objective functions. The method is straightforward to implement and has little memory requirements. It is well suited for problems that are large in terms of data and parameters.
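A minimal sketch of the Adam update described above: exponential moving averages of the gradient and of its square, bias correction, and a per-parameter step. The hyperparameters are the commonly used defaults, and the toy quadratic objective is an illustrative assumption.

    import numpy as np

    def adam_step(theta, grad, m, v, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
        m = b1 * m + (1 - b1) * grad            # first-moment estimate
        v = b2 * v + (1 - b2) * grad ** 2       # second-moment estimate
        m_hat = m / (1 - b1 ** t)               # bias correction
        v_hat = v / (1 - b2 ** t)
        theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
        return theta, m, v

    theta = np.array([5.0, -3.0])               # minimize f(theta) = ||theta||^2 / 2
    m, v = np.zeros(2), np.zeros(2)
    for t in range(1, 501):
        theta, m, v = adam_step(theta, theta, m, v, t)  # grad of f is theta itself
    print(theta.round(3))                       # parameters end up near the origin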