Published on Sat May 09 2020

Exact Asymptotics for Learning Tree-Structured Graphical Models with Side Information: Noiseless and Noisy Samples

Anshoo Tandon, Vincent Y. F. Tan, Shiyao Zhu


Abstract

Given side information that an Ising tree-structured graphical model is homogeneous and has no external field, we derive the exact asymptotics of learning its structure from independently drawn samples. Our results, which leverage the use of probabilistic tools from the theory of strong large deviations, refine the large deviation (error exponents) results of Tan, Anandkumar, Tong, and Willsky [IEEE Trans. on Inform. Th., 57(3):1714--1735, 2011] and strictly improve those of Bresler and Karzand [Ann. Statist., 2020]. In addition, we extend our results to the scenario in which the samples are observed in random noise. In this case, we show that they strictly improve on the recent results of Nikolakakis, Kalogerias, and Sarwate [Proc. AISTATS, 1771--1782, 2019]. Our theoretical results demonstrate keen agreement with experimental results for sample sizes in the hundreds.
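The model class in the abstract can be made concrete with a short sketch. The function below is a hypothetical illustration (not the authors' code): it draws i.i.d. samples from a zero-field Ising tree in which every edge shares a common coupling theta, so that along each edge a child spin agrees with its parent with probability (1 + tanh(theta))/2, giving edge correlation tanh(theta).

```python
import numpy as np

def sample_homogeneous_ising_tree(parents, theta, n, rng):
    """Draw n i.i.d. samples from a zero-field, homogeneous Ising tree.

    parents[v] is the parent of node v (the root, node 0, has parent -1),
    and nodes are assumed topologically ordered. Along each edge, a child
    agrees with its parent with probability (1 + tanh(theta)) / 2.
    """
    p_agree = 0.5 * (1.0 + np.tanh(theta))
    num_nodes = len(parents)
    x = np.empty((n, num_nodes), dtype=int)
    # With no external field the root spin is uniform on {-1, +1}.
    x[:, 0] = rng.choice([-1, 1], size=n)
    for v in range(1, num_nodes):
        agree = rng.random(n) < p_agree
        x[:, v] = np.where(agree, x[:, parents[v]], -x[:, parents[v]])
    return x
```

As a sanity check, the empirical correlation across any edge should concentrate around tanh(theta) as the sample size grows.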

Fri Sep 20 2019
Machine Learning
Optimal Rates for Learning Hidden Tree Structures
We provide high-probability finite-sample complexity guarantees for learning hidden tree-structured graphical models. We show that the sample complexity of the Chow-Liu algorithm for exact tree structure recovery from noisy data is inversely proportional to the square of the information threshold.
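The "noisy data" setting referenced above (and in the main abstract) is often modeled as observing each spin through a memoryless binary symmetric channel. The helper below is a hypothetical sketch of that observation model, not code from either paper:

```python
import numpy as np

def observe_through_bsc(samples, q, rng):
    """Pass +/-1-valued samples through a binary symmetric channel:
    each entry is sign-flipped independently with probability q.
    (Hypothetical noise model illustrating the 'noisy samples' setting.)"""
    flips = np.where(rng.random(samples.shape) < q, -1, 1)
    return samples * flips
```

Under this model the edge correlations are attenuated by a factor of (1 - 2q) per noisy endpoint, which is what drives up the sample complexity of structure recovery.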
Wed Jun 13 2012
Machine Learning
Learning the Bayesian Network Structure: Dirichlet Prior versus Data
The presence of an edge in a Bayesian network is favoured over its absence even if both the Dirichlet prior and the data imply independence. In our second contribution, we focus on realistic ESS-values, and provide an analytical approximation to the "optimal" ESS-value.
Fri Jan 22 2021
Machine Learning
SGA: A Robust Algorithm for Partial Recovery of Tree-Structured Graphical Models with Noisy Samples
We consider learning Ising tree models when the observations from the nodes are corrupted by independent but non-identically distributed noise. Katiyar et al. (2020) showed that although the exact tree structure cannot be recovered, one can recover a partial tree structure. We propose a more
Wed Oct 28 2020
Machine Learning
Sample-Optimal and Efficient Learning of Tree Ising models
We show that n-variable tree-structured Ising models can be learned computationally efficiently to within a given total variation distance. Our guarantees hold, in fact, for the celebrated Chow-Liu [1968] algorithm, using the plug-in estimator for estimating mutual information. While we establish guarantees for a widely known and simple algorithm, the analysis
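The Chow-Liu algorithm with the plug-in mutual-information estimator, as described above, is simple enough to sketch in full. The following is a minimal illustration (variable and function names are my own): estimate all pairwise mutual informations empirically, then return a maximum-weight spanning tree under those weights via Kruskal's algorithm with union-find.

```python
import numpy as np
from itertools import combinations

def empirical_mi(x, y):
    """Plug-in mutual information estimate for two +/-1-valued sample vectors."""
    mi = 0.0
    for a in (-1, 1):
        for b in (-1, 1):
            p_ab = np.mean((x == a) & (y == b))
            p_a, p_b = np.mean(x == a), np.mean(y == b)
            if p_ab > 0:
                mi += p_ab * np.log(p_ab / (p_a * p_b))
    return mi

def chow_liu_tree(samples):
    """Return the edge set of a maximum-weight spanning tree over
    plug-in mutual-information edge weights (Kruskal with union-find).
    samples: (n, p) array of +/-1 entries, one row per sample."""
    _, p = samples.shape
    weights = sorted(
        ((empirical_mi(samples[:, i], samples[:, j]), i, j)
         for i, j in combinations(range(p), 2)),
        reverse=True)
    parent = list(range(p))
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]  # path halving
            u = parent[u]
        return u
    edges = []
    for _, i, j in weights:
        ri, rj = find(i), find(j)
        if ri != rj:  # greedily add the heaviest edge that avoids a cycle
            parent[ri] = rj
            edges.append((i, j))
    return edges
```

On samples from a Markov chain 0 -> 1 -> 2 with moderate edge correlations, the recovered edge set should be {0,1} and {1,2}, since the mutual information between the non-adjacent pair 0 and 2 is strictly smaller.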
Mon Dec 03 2012
Machine Learning
Structure estimation for discrete graphical models: Generalized covariance matrices and their inverses
For certain graph structures, the support of the inverse of the generalized covariance matrix of indicator variables on the vertices of a graph reflects the conditional independence structure of the graph. These population-level results have implications for graph selection methods, both known and novel.
Wed Sep 21 2011
Machine Learning
Robust estimation of latent tree graphical models: Inferring hidden states with inexact parameters
Latent tree graphical models are widely used in computational biology, signal and image processing, and network tomography. Here we design a new, efficient estimation procedure for latent tree models.