Published on Fri Jul 02 2021

Truncated Marginal Neural Ratio Estimation

Benjamin Kurt Miller, Alex Cole, Patrick Forré, Gilles Louppe, Christoph Weniger

Parametric stochastic simulators are ubiquitous in science, often featuring high-dimensional input parameters and/or an intractable likelihood. We present a simulator-based inference algorithm which simultaneously offers simulation efficiency and fast empirical posterior testability.

Abstract

Parametric stochastic simulators are ubiquitous in science, often featuring high-dimensional input parameters and/or an intractable likelihood. Performing Bayesian parameter inference in this context can be challenging. We present a neural simulator-based inference algorithm which simultaneously offers simulation efficiency and fast empirical posterior testability, which is unique among modern algorithms. Our approach is simulation efficient by simultaneously estimating low-dimensional marginal posteriors instead of the joint posterior and by proposing simulations targeted to an observation of interest via a prior suitably truncated by an indicator function. Furthermore, by estimating a locally amortized posterior our algorithm enables efficient empirical tests of the robustness of the inference results. Such tests are important for sanity-checking inference in real-world applications, which do not feature a known ground truth. We perform experiments on a marginalized version of the simulation-based inference benchmark and two complex and narrow posteriors, highlighting the simulator efficiency of our algorithm as well as the quality of the estimated marginal posteriors. Implementation on GitHub.
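The core mechanism behind the algorithm, likelihood-to-evidence ratio estimation, can be illustrated without neural networks: train a binary classifier to distinguish paired samples (theta, x) drawn from the joint from shuffled pairs drawn from the product of marginals; the classifier's logit then approximates log r(x, theta) = log p(x|theta) - log p(x), and multiplying the ratio by the prior gives the posterior. The following is a toy NumPy sketch of that idea only, not the paper's implementation (which uses neural networks, marginal estimation, and prior truncation); the one-dimensional simulator, quadratic features, and hand-rolled logistic regression are illustrative assumptions.

```python
import numpy as np

# Toy 1-D setup: theta ~ Uniform(-1, 1), x = theta + 0.1 * noise.
rng = np.random.default_rng(0)
n = 4000
theta = rng.uniform(-1.0, 1.0, size=n)
x = theta + 0.1 * rng.normal(size=n)
theta_marg = rng.permutation(theta)  # shuffled pairing ~ p(theta) p(x)

def features(t, x_):
    # quadratic features suffice to represent the true log-ratio in this toy
    return np.stack([np.ones_like(t), t, x_, t * x_, t**2, x_**2], axis=1)

X = np.concatenate([features(theta, x), features(theta_marg, x)])
y = np.concatenate([np.ones(n), np.zeros(n)])  # 1 = joint, 0 = marginal

# Logistic regression by gradient descent; the fitted logit approximates
# log r(x, theta) = log p(x | theta) - log p(x).
w = np.zeros(X.shape[1])
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-np.clip(X @ w, -30.0, 30.0)))
    w -= 0.5 * X.T @ (p - y) / len(y)

def log_ratio(t, x_obs):
    return features(t, np.full_like(t, x_obs)) @ w

# With a uniform prior, the posterior is proportional to r(x_obs, theta)
# on the prior support.
grid = np.linspace(-1.0, 1.0, 201)
post = np.exp(log_ratio(grid, 0.5))
post /= post.sum()
print("posterior peak:", grid[np.argmax(post)])  # should lie near x_obs = 0.5
```

The same trick scales to the paper's setting by replacing the logistic regression with a neural network, estimating one such ratio per low-dimensional marginal, and truncating the prior around the region supported by the observation.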

Fri Nov 27 2020
Machine Learning
Simulation-efficient marginal posterior estimation with swyft: stop wasting your precious time
We present algorithms for nested neural likelihood-to-evidence ratio estimation and for simulation reuse via an inhomogeneous Poisson point process cache. The algorithms are applicable to a wide range of physics and astronomy problems.
Wed Nov 21 2018
Machine Learning
Sequential Neural Methods for Likelihood-free Inference
Likelihood-free inference refers to inference when a likelihood function cannot be explicitly evaluated. Most of the literature is based on sample-based `Approximate Bayesian Computation' methods. Recent work suggests approaches based on deep neural conditional density estimators can obtain state-of-the-art results.
Fri May 17 2019
Machine Learning
Automatic Posterior Transformation for Likelihood-Free Inference
Automatic posterior transformation (APT) can modify the posterior estimate using arbitrary, dynamically updated proposals. APT is compatible with powerful flow-based density estimators. It is more flexible, scalable and efficient than previous simulation-based inference techniques.
Tue Jan 12 2021
Machine Learning
Benchmarking Simulation-Based Inference
A public benchmark with appropriate performance metrics for 'likelihood-free' algorithms has been lacking. We provide a benchmark with inference tasks and suitable performance metrics. We highlight the potential of the benchmark to diagnose problems and improve algorithms.
Fri May 18 2018
Machine Learning
Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows
Sequential Neural Likelihood (SNL) is a new method for Bayesian inference in simulator models. SNL trains an autoregressive flow on simulated data in order to learn a model of the likelihood. A sequential training procedure guides simulations and reduces simulation cost by orders of magnitude.
Mon Feb 15 2021
Artificial Intelligence
Posterior-Aided Regularization for Likelihood-Free Inference
Given the diversity of simulation structures, it is difficult to find a single unified inference method for each simulation model. This paper proposes a universally applicable regularization technique, called Posterior-Aided Regularization (PAR).
Tue Jun 10 2014
Machine Learning
Generative Adversarial Networks
We propose a new framework for estimating generative models via an adversarial process. We simultaneously train two models: a generative model G that captures the data distribution, and a discriminative model D. The training procedure for G is to maximize the probability of D making a mistake.
Thu Dec 05 2019
Machine Learning
Normalizing Flows for Probabilistic Modeling and Inference
Normalizing flows provide a general mechanism for defining expressive probability distributions. We place special emphasis on the fundamental principles of flow design. We discuss foundational topics such as expressive power and computational trade-offs.
Fri May 19 2017
Machine Learning
Masked Autoregressive Flow for Density Estimation
Autoregressive models are among the best performing neural density estimators. By constructing a stack of autoregressive models, each modelling the random numbers of the next model in the stack, we achieve a type of normalizing flow suitable for density estimation.