Open Access · Posted Content

On Semiparametric Exponential Family Graphical Models

TLDR
In this article, a new class of semiparametric exponential family graphical models is proposed for the analysis of high-dimensional mixed data, together with a symmetric pairwise score test for the presence of a single edge in the graph.
Abstract
We propose a new class of semiparametric exponential family graphical models for the analysis of high-dimensional mixed data. Unlike existing mixed graphical models, we allow the nodewise conditional distributions to be semiparametric generalized linear models with unspecified base measure functions. One advantage of our method is therefore that the type of each node need not be specified, which makes the method more convenient to apply in practice. Under the proposed model, we consider both parameter estimation and hypothesis testing in high dimensions. In particular, we propose a symmetric pairwise score test for the presence of a single edge in the graph. Compared to existing methods for hypothesis testing, our approach takes into account the symmetry of the parameters, so that the inferential results are invariant with respect to the different parametrizations of the same edge. Thorough numerical simulations and a real data example are provided to support our results.
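For orientation, a minimal sketch in our own notation (the standard pairwise exponential-family form, not an equation quoted from the paper): such models typically specify each nodewise conditional as

$$p(x_j \mid x_{-j}) \;\propto\; \exp\Big\{\theta_j x_j + \sum_{k \neq j} \theta_{jk}\, x_j x_k + f_j(x_j)\Big\},$$

where the edge between nodes $j$ and $k$ is absent exactly when $\theta_{jk} = 0$ and $f_j$ is the base measure function. In the semiparametric formulation described above, $f_j$ is left unspecified rather than fixed to, say, a Gaussian, Poisson, or Bernoulli form, so the type of each node need not be declared; the symmetric pairwise score test then treats $\theta_{jk}$ and $\theta_{kj}$ jointly, so that conclusions about an edge do not depend on which of the two nodewise regressions is used.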


Citations

Graphical Models In Applied Multivariate Statistics

Jessika Weiss
TL;DR: Graphical Models in Applied Multivariate Statistics is universally compatible with any device and is available in the digital library; online access to it is set as public, so it can be downloaded instantly.
Journal Article · DOI

Structure Learning in Graphical Modeling

TL;DR: As discussed by the authors, a graphical model is a statistical model associated with a graph whose nodes correspond to variables of interest and whose edges reflect allowed conditional dependencies among the variables.
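In standard notation (ours, for illustration): for a graph $G = (V, E)$ and random vector $X = (X_1, \dots, X_p)$, the pairwise Markov property reads

$$X_j \perp\!\!\!\perp X_k \mid X_{V \setminus \{j,k\}} \quad \text{whenever } (j,k) \notin E,$$

so structure learning amounts to recovering the edge set $E$ from data.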
Report · DOI

Valid Post-Selection and Post-Regularization Inference: An Elementary, General Approach

TL;DR: This analysis provides a set of high-level conditions under which inference for the low-dimensional parameter based on testing or point estimation methods will be regular despite selection or regularization biases occurring in the estimation of the high-dimensional nuisance parameter.
Journal Article · DOI

Valid Post-Selection and Post-Regularization Inference: An Elementary, General Approach

TL;DR: In this article, the authors present an expository, general analysis of valid post-selection or post-regularization inference about a low-dimensional target parameter, $\alpha$, in the presence of a very high-dimensional nuisance parameter, which is estimated using modern selection or regularization methods.
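Roughly, and in our notation rather than the article's, the key ingredient is a moment condition for $\alpha$ that is orthogonal (immunized) to small errors in the nuisance estimate,

$$\mathrm{E}\big[\psi(W; \alpha_0, \eta_0)\big] = 0, \qquad \partial_{\eta}\, \mathrm{E}\big[\psi(W; \alpha_0, \eta)\big]\Big|_{\eta = \eta_0} = 0,$$

so that plugging in a regularized estimate $\hat{\eta}$ perturbs inference about $\alpha$ only at second order.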
Report · DOI

High-dimensional econometrics and regularized GMM

TL;DR: This chapter presents key concepts and theoretical results for analyzing estimation and inference in high-dimensional models, and presents results in a framework where estimators of parameters of interest may be represented directly as approximate means.
References
Journal Article · DOI

Regression Shrinkage and Selection via the Lasso

TL;DR: A new method for estimation in linear models, called the lasso, is proposed; it minimizes the residual sum of squares subject to the sum of the absolute values of the coefficients being less than a constant.
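In the usual notation (a sketch; symbols are ours), the lasso solves

$$\hat{\beta} = \arg\min_{\beta} \sum_{i=1}^{n}\Big(y_i - \sum_{j} x_{ij}\beta_j\Big)^2 \quad \text{subject to} \quad \sum_{j} |\beta_j| \le t,$$

or, equivalently, the Lagrangian form $\min_{\beta} \|y - X\beta\|_2^2 + \lambda \|\beta\|_1$; the $\ell_1$ constraint both shrinks coefficients and sets some exactly to zero, which is what yields simultaneous selection and estimation.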
Journal Article · DOI

Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties

TL;DR: In this article, penalized likelihood approaches are proposed to handle variable selection problems, and it is shown that the newly proposed estimators perform as well as the oracle procedure in variable selection; namely, they perform as well as if the correct submodel were known.
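Schematically, in our notation, the estimator maximizes a penalized log-likelihood

$$\ell(\beta) - n \sum_{j=1}^{p} p_{\lambda}\big(|\beta_j|\big),$$

where $p_{\lambda}$ is a nonconcave penalty such as SCAD, designed so that the resulting estimates are sparse, nearly unbiased for large coefficients, and continuous in the data.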
Journal Article · DOI

Decoding by linear programming

TL;DR: The input vector f can be recovered exactly by solving a simple convex optimization problem (which one can recast as a linear program), and numerical experiments suggest that this recovery procedure works unreasonably well; f is recovered exactly even in situations where a significant fraction of the output is corrupted.
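As a sketch in our own notation: with observations $y = Af + e$, where $e$ is an error vector with a limited fraction of nonzero entries, the decoder solves

$$\hat{f} = \arg\min_{g} \|y - Ag\|_{\ell_1},$$

an $\ell_1$-minimization problem that can be recast as a linear program and, under suitable conditions on the coding matrix $A$, recovers $f$ exactly.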
Book

Probabilistic graphical models : principles and techniques

TL;DR: The framework of probabilistic graphical models, presented in this book, provides a general approach for causal reasoning and decision making under uncertainty, allowing interpretable models to be constructed and then manipulated by reasoning algorithms.
Posted Content

Decoding by Linear Programming

TL;DR: In this paper, it was shown that under suitable conditions on the coding matrix, the input vector can be recovered exactly by solving a simple convex optimization problem (which one can recast as a linear program).