Published on Fri Mar 08 2019

Random Matrix-Improved Estimation of the Wasserstein Distance between two Centered Gaussian Distributions

Malik Tiomoko, Romain Couillet

This article proposes a method to consistently estimate functionals of the eigenvalues of the product of two covariance matrices. As a corollary, a consistent estimate of the Wasserstein distance between centered Gaussian distributions is derived, which is shown to largely outperform the classical sample-based `plug-in' estimator.

Abstract

This article proposes a method to consistently estimate functionals $\frac1p\sum_{i=1}^p f(\lambda_i(C_1C_2))$ of the eigenvalues of the product of two covariance matrices based on the empirical estimates $\hat C_a=\frac1{n_a}\sum_{i=1}^{n_a} x_i^{(a)}x_i^{(a){{\sf T}}}$, when the size and number of the (zero mean) samples are similar. As a corollary, a consistent estimate of the Wasserstein distance (related to the case $f(t)=\sqrt{t}$) between centered Gaussian distributions is derived. The new estimate is shown to largely outperform the classical sample covariance-based `plug-in' estimator. Based on this finding, a practical application to covariance estimation is then devised, which demonstrates potentially significant performance gains with respect to state-of-the-art alternatives.
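To make the baseline concrete, here is a minimal sketch of the classical `plug-in' estimator the article compares against: it computes the closed-form squared 2-Wasserstein distance between centered Gaussians, $W_2^2=\mathrm{tr}(C_1+C_2)-2\,\mathrm{tr}\big((C_1^{1/2}C_2C_1^{1/2})^{1/2}\big)$, with the population covariances replaced by the empirical estimates $\hat C_a$. The function names are illustrative, not from the paper; the random-matrix-improved estimator itself is not reproduced here.

```python
import numpy as np

def wasserstein2_gaussian(C1, C2):
    """Squared 2-Wasserstein distance between N(0, C1) and N(0, C2):
    tr(C1) + tr(C2) - 2 tr((C1^{1/2} C2 C1^{1/2})^{1/2})."""
    # Matrix square root of C1 via eigendecomposition (C1 symmetric PSD).
    w, V = np.linalg.eigh(C1)
    sqrtC1 = V @ np.diag(np.sqrt(np.clip(w, 0, None))) @ V.T
    # Eigenvalues of C1^{1/2} C2 C1^{1/2} give the cross term.
    s = np.linalg.eigvalsh(sqrtC1 @ C2 @ sqrtC1)
    cross = 2.0 * np.sum(np.sqrt(np.clip(s, 0, None)))
    return np.trace(C1) + np.trace(C2) - cross

def plug_in_estimate(X1, X2):
    """Classical plug-in estimator: substitute the sample covariances
    hat C_a = (1/n_a) sum_i x_i x_i^T (zero-mean samples, rows = samples)."""
    C1_hat = X1.T @ X1 / X1.shape[0]
    C2_hat = X2.T @ X2 / X2.shape[0]
    return wasserstein2_gaussian(C1_hat, C2_hat)

# In the regime the article targets (dimension p comparable to the sample
# sizes n_a), the plug-in estimate is biased even for identical populations.
rng = np.random.default_rng(0)
p, n = 100, 200
X1 = rng.standard_normal((n, p))  # samples from N(0, I_p)
X2 = rng.standard_normal((n, p))  # same population, so true distance is 0
print(plug_in_estimate(X1, X2))   # noticeably above 0 when p/n is not small
```

Running this with $p/n = 1/2$ illustrates the bias that motivates the paper: the true distance is zero, yet the plug-in value stays well away from zero, which is the gap the proposed random-matrix-based estimator closes.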

Mon May 23 2016
Machine Learning
Sub-Gaussian estimators of the mean of a random matrix with heavy-tailed entries
Estimation of the covariance matrix has attracted a lot of attention from the statistical research community over the years. We develop a new estimator of the (element-wise) mean of a random matrix. We explain the key ideas behind our construction, as well as applications to covariance estimation.
Wed Oct 10 2018
Machine Learning
Random matrix-improved estimation of covariance matrix distances
This article provides novel estimators for a wide range of distances between covariance matrices and divergences between probability measures. The estimators are derived using recent advances in the field of random matrix theory. They are asymptotically consistent as the dimension and sample sizes grow large with non-trivial ratios.
Thu Aug 26 2021
Machine Learning
Estimation of Riemannian distances between covariance operators and Gaussian processes
In this work we study two Riemannian distances between infinite-dimensional positive definite Hilbert-Schmidt operators. The distances are studied in the context of covariance operators associated with functional stochastic processes, in particular Gaussian processes. Our first main results show that both distances converge in
Fri May 18 2018
Machine Learning
Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator
We introduce a distributionally robust maximum likelihood estimation model. The model minimizes the worst case (maximum) of Stein's loss across all normal reference distributions. We prove that this estimation problem is equivalent to a semidefinite program.
Tue Nov 17 2020
Machine Learning
Optimal Sub-Gaussian Mean Estimation in
We revisit the problem of estimating the mean of a real-valued distribution. We present a novel estimator with sub-Gaussian convergence. Our estimator is as accurate as the sample mean is for the Gaussian distribution of matching variance.
Mon Apr 26 2021
Machine Learning
Finite sample approximations of exact and entropic Wasserstein distances between covariance operators and Gaussian processes