Exact and Approximate Inference in Graphical Models
Recommended publications
CS242: Probabilistic Graphical Models Lecture 1B: Factor Graphs, Inference, & Variable Elimination
CS242: Probabilistic Graphical Models, Lecture 1B: Factor Graphs, Inference, & Variable Elimination. Professor Erik Sudderth, Brown University Computer Science, September 15, 2016. Some figures and materials courtesy David Barber, Bayesian Reasoning and Machine Learning, http://www.cs.ucl.ac.uk/staff/d.barber/brml/

Lecture 1B outline: factor graphs; loss functions and marginal distributions; variable elimination algorithms.

Factor graphs. Given a set $\mathcal{V} = \{1, 2, \ldots, N\}$ of $N$ nodes or vertices and a set $\mathcal{F}$ of hyperedges linking subsets of nodes $f \subseteq \mathcal{V}$, define
$$p(x) = \frac{1}{Z} \prod_{f \in \mathcal{F}} \psi_f(x_f), \qquad Z = \sum_x \prod_{f \in \mathcal{F}} \psi_f(x_f),$$
where $Z > 0$ is the normalization constant (partition function) and each $\psi_f(x_f) \geq 0$ is an arbitrary non-negative potential function.
• In a hypergraph, the hyperedges link arbitrary subsets of nodes (not just pairs).
• Visualize by a bipartite graph, with square (usually black) nodes for hyperedges.
• Motivation: factorization is key to computation.
• Motivation: avoid ambiguities of undirected graphical models.

Factor graphs & factorization. For a given undirected graph, there exist distributions with equivalent Markov properties but different factorizations and different inference/learning complexities. An undirected graph may be read with pairwise (edge) potentials, $p(x) = \frac{1}{Z} \prod_{(s,t) \in \mathcal{E}} \psi_{st}(x_s, x_t)$, or with maximal clique potentials, $p(x) = \frac{1}{Z} \prod_{c \in \mathcal{C}} \psi_c(x_c)$; a factor graph makes the choice explicit. Recommendation: use undirected graphs for pairwise MRFs, and factor graphs for higher-order potentials.
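The factorization above is exactly what variable elimination exploits: summing variables out one at a time turns the global sum defining $Z$ into a sequence of small local contractions. A minimal sketch, with a made-up chain graph and potential tables (not taken from the lecture):

```python
import itertools

import numpy as np

# A tiny factor graph over binary variables x1 - x2 - x3 (a chain).
# Each factor is a non-negative table psi(x_s, x_t); the values are
# illustrative only.
psi_12 = np.array([[2.0, 1.0], [1.0, 3.0]])
psi_23 = np.array([[1.0, 2.0], [4.0, 1.0]])

# Brute-force partition function: sum the product of potentials over
# all 2^3 joint configurations.
Z_brute = sum(psi_12[x1, x2] * psi_23[x2, x3]
              for x1, x2, x3 in itertools.product([0, 1], repeat=3))

# Variable elimination: sum out x1, then x3, then x2. Each step is a
# small local contraction instead of a sum over the full joint space.
m_1to2 = psi_12.sum(axis=0)      # message from eliminating x1
m_3to2 = psi_23.sum(axis=1)      # message from eliminating x3
Z_ve = (m_1to2 * m_3to2).sum()   # finally sum out x2

assert np.isclose(Z_brute, Z_ve)
print(Z_ve)
```

On a chain the cost per elimination is quadratic in the state space of one variable, whereas the brute-force sum grows exponentially with the number of variables.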
Symbolic Variable Elimination for Discrete and Continuous Graphical Models
Proceedings of the Twenty-Sixth AAAI Conference on Artificial Intelligence. Scott Sanner (NICTA & ANU) and Ehsan Abbasnejad (ANU & NICTA), Canberra, Australia. [email protected], [email protected]

Abstract: Probabilistic reasoning in the real world often requires inference in continuous variable graphical models, yet there are few methods for exact, closed-form inference when joint distributions are non-Gaussian. To address this inferential deficit, we introduce SVE, a symbolic extension of the well-known variable elimination algorithm, to perform exact inference in an expressive class of mixed discrete and continuous variable graphical models whose conditional probability functions can be well-approximated as oblique piecewise polynomials with bounded support. Using this representation, we show that we can compute all of the SVE operations exactly and in closed form, which crucially includes definite integration w.r.t. multivariate piecewise polynomial functions. To aid in the efficient computation and compact representation of this solution, we use an extended algebraic decision diagram (XADD) data structure that supports all SVE operations.

Figure 1 (the robot localization graphical model and all conditional probabilities): the $x_i$ are the observed measurements of distance (e.g., using a laser range finder). The prior distribution $P(d)$ is uniform, $U(d; 0, 10)$, while the observation model $P(x_i \mid d)$ is a mixture of three distributions: (red) a truncated Gaussian representing noisy measurements $x_i$ of the actual distance $d$ within the sensor range $[0, 10]$; (green) a uniform noise model $U(d; 0, 10)$ representing random anomalies leading to any measurement; and (blue) a triangular distribution peaking at the maximum measurement distance (e.g., caused …).
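The core closed-form operation SVE depends on, exact definite integration of products of polynomial factors with bounded support, is easy to reproduce with exact rational arithmetic. A hedged sketch, with a made-up uniform prior and polynomial likelihood standing in for the paper's actual piecewise models:

```python
from fractions import Fraction

# Polynomials are coefficient lists [c0, c1, c2, ...] meaning
# c0 + c1*d + c2*d^2 + ...  All arithmetic uses Fraction, so every
# result is an exact rational, in the spirit of SVE's closed-form
# operations. The factors below are illustrative, not the paper's
# robot-localization model.

def poly_mul(p, q):
    """Product of two polynomials (the 'factor product' step)."""
    out = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def poly_integral(p, lo, hi):
    """Exact definite integral of p over [lo, hi]."""
    total = Fraction(0)
    for k, c in enumerate(p):
        total += c * (Fraction(hi) ** (k + 1) - Fraction(lo) ** (k + 1)) / (k + 1)
    return total

# Uniform prior 1/10 and polynomial likelihood d*(10 - d), both
# supported on [0, 10].
prior = [Fraction(1, 10)]
likelihood = [Fraction(0), Fraction(10), Fraction(-1)]   # 10*d - d^2

unnorm = poly_mul(prior, likelihood)
Z = poly_integral(unnorm, 0, 10)                         # normalizer
mean = poly_integral(poly_mul([Fraction(0), Fraction(1)], unnorm), 0, 10) / Z
print(Z, mean)
```

Eliminating a continuous variable in SVE is, at heart, such a definite integral applied factor by factor; the XADD structure the paper describes is what keeps the case analysis over polynomial pieces compact.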
Introduction to Probabilistic Graphical Model
Introduction to Probabilistic Graphical Model: Modeling, Inference, and Learning

Overview:
• What's a probabilistic graphical model (PGM) for?
• Tasks in graphical models: modeling, inference, and learning.
• Examples: topic model, hidden Markov model, Markov random field.

What's a PGM for? Use "probability" to "model" dependencies among targets of interest as a "graph": a unified framework for
• Prediction (classification / regression)
• Discovery (clustering / pattern recognition / system modeling)
• State tracking (localization / monitoring / motion tracking)
• Ranking (search engine / recommendation for text / image / item)

Prediction (in earlier lectures): given the variables of interest, apply a perceptron, SVM, linear regression, or k-nearest neighbor. Prediction with a PGM: one prediction may depend on others (collective classification), with data of the form (x1, x2, ..., xd, y), (x1, x2, ..., xd, y), ...; or labels may be missing for most of the data (semi-supervised learning), e.g. (x1, x2, ..., xd, y=0), (x1, x2, ..., xd, y=1), (x1, x2, ..., xd, y=?), ... Discovery with a PGM: we are interested in hidden variables.
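State tracking, one of the tasks listed above, is the classic use case for the hidden Markov model mentioned in the overview: the forward algorithm sums out past hidden states one step at a time. A small sketch with illustrative, made-up transition and emission tables:

```python
import numpy as np

# A tiny HMM: two hidden states, two observation symbols. All numbers
# are illustrative, not taken from the slides.
T = np.array([[0.7, 0.3],    # transition probabilities T[i, j] = p(s'=j | s=i)
              [0.4, 0.6]])
E = np.array([[0.9, 0.1],    # emission probabilities E[i, o] = p(obs=o | s=i)
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])    # initial state distribution

def forward(obs):
    """Likelihood p(obs) by the forward algorithm: sum out the hidden
    state at each step (variable elimination on a chain)."""
    alpha = pi * E[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ T) * E[:, o]
    return alpha.sum()

p = forward([0, 1, 0])
print(p)
```

The same recursion, with `sum` replaced by `max`, yields the Viterbi decoder used for tracking the most likely state sequence.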
Exact or Approximate Inference in Graphical Models: Why the Choice Is Dictated by the Treewidth, and How Variable Elimination Can Be Exploited
Nathalie Peyrard (a), Marie-Josée Cros (a), Simon de Givry (a), Alain Franc (b), Stéphane Robin (c,d), Régis Sabbadin (a), Thomas Schiex (a), Matthieu Vignes (a,e). arXiv:1506.08544v2 [stat.ML], 12 Mar 2018.

(a) INRA UR 875 MIAT, Chemin de Borde Rouge, 31326 Castanet-Tolosan, France; (b) INRA UMR 1202 Biodiversité, Gènes et Communautés, 69 route d'Arcachon, Pierroton, 33612 Cestas Cedex, France; (c) AgroParisTech, UMR 518 MIA, 16 rue Claude Bernard, Paris 5e, France; (d) INRA, UMR 518 MIA, 16 rue Claude Bernard, Paris 5e, France; (e) Institute of Fundamental Sciences, Massey University, Palmerston North, New Zealand.

Abstract: Probabilistic graphical models offer a powerful framework to account for the dependence structure between variables, which is represented as a graph. However, the dependence between variables may render inference tasks intractable. In this paper we review techniques exploiting the graph structure for exact inference, borrowed from optimisation and computer science. They are built on the principle of variable elimination, whose complexity is dictated in an intricate way by the order in which variables are eliminated. The so-called treewidth of the graph characterises this algorithmic complexity: low-treewidth graphs can be processed efficiently. The first message that we illustrate is therefore that, for inference in graphical models, the number of variables is not the limiting factor, and it is worth checking the treewidth before turning to approximate methods. We show how algorithms providing an upper bound of the treewidth can be exploited to derive a 'good' elimination order enabling exact inference.
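The paper's central point, that the elimination order determines cost through the induced width, is easy to experiment with. A minimal sketch of the greedy min-fill heuristic, one standard way to obtain such an upper bound of the treewidth (the example graph and the tie-breaking rule are illustrative):

```python
import itertools

def min_fill_order(adj):
    """Greedy min-fill elimination order for an undirected graph given
    as {node: set_of_neighbours}. Returns the order and the induced
    width (largest neighbourhood at elimination time); the treewidth
    is at most this width."""
    adj = {v: set(ns) for v, ns in adj.items()}  # defensive copy
    order, width = [], 0
    while adj:
        # Pick the node whose elimination adds the fewest fill-in edges.
        def fill(v):
            return sum(1 for a, b in itertools.combinations(adj[v], 2)
                       if b not in adj[a])
        v = min(adj, key=fill)
        width = max(width, len(adj[v]))
        # Connect v's neighbours into a clique, then remove v.
        for a, b in itertools.combinations(adj[v], 2):
            adj[a].add(b)
            adj[b].add(a)
        for n in adj[v]:
            adj[n].discard(v)
        del adj[v]
        order.append(v)
    return order, width

# A 4-cycle has treewidth 2; min-fill recovers an order achieving it.
cycle = {1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3}}
order, width = min_fill_order(cycle)
print(order, width)
```

Running variable elimination along the returned order keeps every intermediate factor over at most `width + 1` variables, which is exactly why a small treewidth makes exact inference feasible.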