Bayesian Inference in Graphical Gaussian Models

Authored by: Marloes Maathuis, Mathias Drton, Steffen Lauritzen, Martin Wainwright

Handbook of Graphical Models

Print publication date: November 2018
Online publication date: November 2018

Print ISBN: 9781498788625
eBook ISBN: 9780429463976

DOI: 10.1201/9780429463976-10

Abstract

Graphical Gaussian models are one of the main tools for the analysis of high-dimensional data, with applications in a variety of disciplines. A graphical Gaussian model for the random vector $Z \in \mathbb{R}^r$ is a Gaussian model in which the dependencies between the components of $Z$ are represented by means of a graph. The parameter of a centered Gaussian model is its covariance matrix $\Sigma$ or, equivalently, its concentration or precision matrix $K = \Sigma^{-1} \in P_r$, where $P_r$ is the cone of $r \times r$ positive definite matrices. Because the dependencies between the components of $Z$ are represented by a graph $G$, the form of $\Sigma$ or $K$ depends on $G$. In a Bayesian framework, we express uncertainty about the parameters by putting prior distributions $f_2(K \mid G)$ on $K$ given $G$ and $f_3(G)$ on $G$. Given a sample $D = (Z_1, \ldots, Z_n)$ from the Gaussian distribution with precision matrix $K$, the joint distribution of $(D, K, G)$ is $f(z_1, \ldots, z_n, K, G) = f_1(z_1, \ldots, z_n \mid K, G)\, f_2(K \mid G)\, f_3(G)$, from which we derive the posterior distribution of the model, that is of $G$, as $p(G \mid D) \propto f_3(G) \int f_1(z_1, \ldots, z_n \mid K, G)\, f_2(K \mid G)\, dK$.
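The integral in the posterior above is the marginal likelihood of the data under graph $G$. As an illustration not taken from the chapter, the Python sketch below approximates this integral by simple Monte Carlo for the complete graph, where the prior $f_2(K \mid G)$ is taken to be an ordinary Wishart distribution on $K$; for a general graph $G$ one would instead use a prior supported on matrices with zeros at the missing edges (e.g., the G-Wishart). All function and variable names are illustrative.

```python
# Minimal sketch (assumptions: complete graph G, Wishart prior on K).
# Approximates the marginal likelihood  \int f1(D | K, G) f2(K | G) dK
# by averaging the Gaussian likelihood over draws of K from the prior.
import numpy as np
from scipy.stats import wishart, multivariate_normal

rng = np.random.default_rng(0)

r, n = 3, 50                                   # dimension of Z, sample size
K_true = np.array([[2.0, 0.5, 0.0],
                   [0.5, 2.0, 0.5],
                   [0.0, 0.5, 2.0]])           # true precision matrix
D = rng.multivariate_normal(np.zeros(r), np.linalg.inv(K_true), size=n)

def log_marginal_likelihood(D, df=None, scale=None, n_draws=5000):
    r"""Monte Carlo estimate of log \int f1(D | K) f2(K) dK with a
    Wishart(df, scale) prior on the precision matrix K."""
    r = D.shape[1]
    df = r + 2 if df is None else df
    scale = np.eye(r) if scale is None else scale
    # Draw precision matrices from the prior f2(K): shape (n_draws, r, r).
    Ks = wishart.rvs(df=df, scale=scale, size=n_draws, random_state=1)
    log_liks = np.array([
        multivariate_normal.logpdf(D, mean=np.zeros(r),
                                   cov=np.linalg.inv(K)).sum()
        for K in Ks
    ])
    # log-mean-exp of the likelihoods for numerical stability.
    m = log_liks.max()
    return m + np.log(np.mean(np.exp(log_liks - m)))

print("estimated log marginal likelihood:", log_marginal_likelihood(D))
```

Under these assumptions, the posterior over graphs would then weight each candidate graph's estimated marginal likelihood by the graph prior $f_3(G)$ and normalize over the set of graphs considered.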
