Published on Fri Feb 10 2012

High Dimensional Semiparametric Gaussian Copula Graphical Models

Han Liu, Fang Han, Ming Yuan, John Lafferty, Larry Wasserman

Abstract

In this paper, we propose a semiparametric approach, named nonparanormal skeptic, for efficiently and robustly estimating high dimensional undirected graphical models. To achieve modeling flexibility, we consider Gaussian copula graphical models (or the nonparanormal) as proposed by Liu et al. (2009). To achieve estimation robustness, we exploit nonparametric rank-based correlation coefficient estimators, including Spearman's rho and Kendall's tau. In high dimensional settings, we prove that the nonparanormal skeptic achieves the optimal parametric rate of convergence in both graph and parameter estimation. This result suggests that Gaussian copula graphical models can be used as a safe replacement for the popular Gaussian graphical models, even when the data are truly Gaussian. Besides theoretical analysis, we also conduct thorough numerical simulations to compare different estimators for their graph recovery performance under both ideal and noisy settings. The proposed methods are then applied to a large-scale genomic dataset to illustrate their empirical usefulness. The R language software package huge implementing the proposed methods is available on the Comprehensive R Archive Network: http://cran.r-project.org/.
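
To make the estimator concrete, the following is a minimal sketch (not the authors' code) of the Kendall's tau variant of the nonparanormal skeptic in R. It assumes the CRAN package glasso for the graphical lasso step, and simulated placeholder data: estimate the matrix of pairwise Kendall's tau statistics, map it to a correlation estimate via the sine transform, and use that matrix in place of the Pearson sample correlation.

```r
# Minimal sketch of the Kendall's tau version of the nonparanormal skeptic.
# Illustrative only; assumes the CRAN package 'glasso' for the graphical lasso step.
library(glasso)

set.seed(1)
n <- 200; d <- 20
x <- matrix(rnorm(n * d), n, d)        # placeholder data; substitute the real design matrix

# Rank-based correlation: pairwise Kendall's tau, then the sine transform
tau   <- cor(x, method = "kendall")    # d x d matrix of Kendall's tau statistics
S.hat <- sin(pi / 2 * tau)             # correlation estimate under the nonparanormal model
diag(S.hat) <- 1
# If S.hat is not positive semidefinite, project it first (e.g. with Matrix::nearPD).

# Plug the rank-based correlation into the graphical lasso
fit       <- glasso(S.hat, rho = 0.1)              # rho is the l1 regularization parameter
Omega.hat <- fit$wi                                # estimated precision matrix
A.hat     <- (abs(Omega.hat) > 1e-8) & !diag(d)    # estimated edge set (adjacency matrix)
```

The Spearman's rho variant replaces the sine transform above with 2 * sin(pi / 6 * rho_s), where rho_s is the matrix of pairwise Spearman rank correlations, cor(x, method = "spearman").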

Related Papers

Thu Aug 15 2013
Machine Learning
High dimensional Sparse Gaussian Graphical Mixture Model
This paper considers the problem of network reconstruction from heterogeneous data using a Gaussian Graphical Mixture Model (GGMM). We show that, under certain regularity conditions, the Penalized Maximum Likelihood (PML) estimates are consistent.
Thu Nov 12 2015
Machine Learning
Block-diagonal covariance selection for high-dimensional Gaussian graphical models
Gaussian graphical models are widely utilized to infer and visualize networks of dependencies between continuous variables. To reduce the number of parameters to estimate in the model, we propose a non-asymptotic model selection procedure. The performance of the procedure is illustrated on simulated data.
Wed Apr 16 2014
Machine Learning
Stable Graphical Models
Stable random variables are motivated by the central limit theorem for potentially unbounded variance. Because stable densities generally lack closed-form expressions, penalized maximum-likelihood based learning is computationally demanding. We establish that the Bayesian information criterion can asymptotically be reduced to the computationally more tractable minimum dispersion criterion.
Thu Jan 17 2013
Machine Learning
On Graphical Models via Univariate Exponential Family Distributions
Undirected graphical models, or Markov networks, are a popular class of statistical models. Popular instances of this class include Gaussian graphical models and Ising models. In many settings, however, it might not be clear which subclass of graphical models to use.
Wed Oct 28 2015
Machine Learning
Robust Gaussian Graphical Modeling with the Trimmed Graphical Lasso
Many modern applications such as gene network discovery and social interactions analysis often involve high-dimensional noisy data with heavier tails than the Gaussian distribution. In this paper, we propose the Trimmed Graphical Lasso for robust estimation of sparse GGMs.
Fri Jun 26 2020
Machine Learning
The huge Package for High-dimensional Undirected Graph Estimation in R
We describe an R package named huge which provides easy-to-use functions for estimating high dimensional undirected graphs from data. This package implements recent results in the literature, including Friedman et al. (2007, 2012) and Liu and Liu (2010).
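
For orientation, a minimal end-to-end call sequence with huge might look like the sketch below. It assumes the package's documented interface (huge.generator, huge.npn, huge, huge.select) and uses synthetic data; it does not reproduce any example from the paper or the package vignette.

```r
# Hypothetical minimal workflow with the 'huge' package; call names follow its documented interface.
library(huge)

sim   <- huge.generator(n = 200, d = 50, graph = "hub")  # synthetic data with a known graph
x.npn <- huge.npn(sim$data)                              # nonparanormal transformation of the data
# huge.npn(sim$data, npn.func = "skeptic") instead returns the rank-based correlation matrix.
est   <- huge(x.npn, method = "glasso")                  # graphical lasso over a lambda path
sel   <- huge.select(est, criterion = "ebic")            # choose the regularization level
image(as.matrix(sel$refit))                              # view the selected adjacency matrix
```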