
Bayesian Networks: Belief Propagation (Cont'd)

Huizhen Yu
[email protected].fi
Dept. Computer Science, Univ. of Helsinki

Probabilistic Models, Spring, 2010

Outline
• Belief Propagation: Review and Examples
• Generalized Belief Propagation – Max-Product
• Applications to Loopy Graphs

Announcement: The last exercise will be posted online soon.

Belief Propagation: Review and Examples

Review of Last Lecture

We studied an algorithm for computing marginal posterior distributions:
• It works in singly connected networks, which are DAGs whose undirected versions are trees.
• It is suitable for parallel implementation.
• It is derived recursively by
  (i) dividing the total evidence into pieces, according to the independence structure represented by the DAG, and then
  (ii) incorporating the evidence pieces into either the probability terms (π-messages) or the likelihood terms (conditional probability terms; λ-messages).

Queries answerable by the algorithm for a singly connected network:
• P(X = x | e) for a single x;
• P(X_v = x_v | e) for all x_v and v ∈ V;
• most probable configurations, arg max_x p(x, e). This can be related to finding globally optimal solutions by distributed local computation. (Details are given today.)

Practice: Belief Propagation for HMM

(Figure: a hidden Markov model unrolled over times t = 0, 1, 2, 3, ..., n, n+1.)

Observation variables (black) are instantiated; latent variables (white) are X_1, X_2, .... The total evidence at time t is e_t. How would you use message passing to calculate
• p(x_t | e_t), ∀x_t? (You'll obtain as a special case the so-called forward algorithm.)
• p(x_{t+1} | e_t), ∀x_{t+1}? (This is a prediction problem.)
• p(x_k | e_t), ∀x_k, k < t? (You'll obtain as a special case the so-called backward algorithm.)
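For reference, here is a minimal sketch (not from the slides) of the recursions the exercise asks for, specialized to a discrete HMM with hypothetical transition matrix A, emission matrix B, and initial distribution p0. The forward pass propagates π-type messages, the backward pass λ-type messages; the three queries then correspond to normalizing the last forward message, taking one extra transition step, and combining the two passes.

    import numpy as np

    def forward_messages(A, B, p0, obs):
        """Forward (pi-type) pass: alpha[t] is proportional to p(x_t, e_t)."""
        n, k = len(obs), len(p0)
        alpha = np.zeros((n, k))
        alpha[0] = p0 * B[:, obs[0]]
        for t in range(1, n):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        return alpha

    def backward_messages(A, B, obs):
        """Backward (lambda-type) pass: beta[t][x] = p(future observations | x_t = x)."""
        n, k = len(obs), A.shape[0]
        beta = np.ones((n, k))
        for t in range(n - 2, -1, -1):
            beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
        return beta

    # hypothetical 2-state example
    A = np.array([[0.9, 0.1], [0.2, 0.8]])    # transition p(x_{t+1} | x_t)
    B = np.array([[0.7, 0.3], [0.4, 0.6]])    # emission p(e_t | x_t)
    p0 = np.array([0.5, 0.5])
    obs = [0, 1, 1, 0]                        # evidence e_t up to the last time t

    alpha = forward_messages(A, B, p0, obs)
    beta = backward_messages(A, B, obs)
    filtering = alpha[-1] / alpha[-1].sum()             # p(x_t | e_t)
    prediction = filtering @ A                          # p(x_{t+1} | e_t)
    smoothing = alpha * beta                            # p(x_k | e_t) for k <= t,
    smoothing /= smoothing.sum(axis=1, keepdims=True)   # row k after normalization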
A Fault-Detection Example

A logic circuit for fault detection and its Bayesian network (Pearl 1988):

(Figure: a cascade of three AND gates with inputs X_1, X_2, X_3 and outputs Y_1, Y_2, Y_3, together with the corresponding Bayesian network.)

P(X_i = 1) = p_i,   P(X_i = 0) = 1 − p_i = q_i,   Y_i = Y_{i−1} AND X_i.

• Y_0 = 1 always.
• X_i is normal if X_i = 1, and faulty if X_i = 0.
• Normally all variables are on, and a failure occurs if Y_3 = 0.

Example: Belief Updating

Without observing any evidence, all the π-messages are prior probabilities:

    π_{X_i,Y_i}(x_i) = [p_i, q_i], i = 1, 2, 3;    π_{Y_0,Y_1}(y_0) = [1, 0],
    π_{Y_1,Y_2}(y_1) = [p_1, q_1],    π_{Y_2,Y_3}(y_2) = [p_1 p_2, 1 − p_1 p_2],

for x_i = 1, 0 and y_i = 1, 0.

Suppose e : {X_2 = 1, Y_3 = 0} is received. Then X_2 updates its message to Y_2, and Y_2 updates its message to Y_3:

    π_{X_2,Y_2}(x_2) = [p_2, 0],    π_{Y_2,Y_3}(y_2) = [p_1 p_2, q_1 p_2].

The λ-messages starting from Y_3 upwards are given by:

    λ_{Y_3,X_3}(x_3) = [p_2 q_1, p_2],    λ_{Y_3,Y_2}(y_2) = [q_3, 1],
    λ_{Y_2,X_2}(x_2) = [p_1 q_3 + q_1, p_1 + q_1],    λ_{Y_2,Y_1}(y_1) = [p_2 q_3, p_2],
    λ_{Y_1,X_1}(x_1) = [p_2 q_3, p_2].

So

    P(X_3 = 0 | e) = q_3 p_2 / (p_3 p_2 q_1 + q_3 p_2) = q_3 / (p_3 q_1 + q_3) = q_3 / (1 − p_1 p_3),
    P(X_1 = 0 | e) = q_1 p_2 / (p_1 p_2 q_3 + q_1 p_2) = q_1 / (p_1 q_3 + q_1) = q_1 / (1 − p_1 p_3).

Example: Explanations Based on Beliefs

If q_1 = 0.45 and q_3 = 0.4, we obtain

    P(X_1 = 0 | e) = 0.672 > P(X_1 = 1 | e) = 0.328,
    P(X_3 = 0 | e) = 0.597 > P(X_3 = 1 | e) = 0.403.

Is I_1 = {X_1 = 0, X_3 = 0} the most probable explanation of e, however? There are three possible explanations:

    I_1 = {X_1 = 0, X_3 = 0},    I_2 = {X_1 = 0, X_3 = 1},    I_3 = {X_1 = 1, X_3 = 0}.

Direct calculation shows

    P(I_1 | e) = q_1 q_3 / (1 − p_1 p_3),    P(I_2 | e) = q_1 p_3 / (1 − p_1 p_3),    P(I_3 | e) = p_1 q_3 / (1 − p_1 p_3).

So, if 0.5 > q_1 > q_3, then based on the evidence, I_2 is the most probable explanation, while I_1 is the least probable explanation: maximizing the posterior marginals of X_1 and X_3 separately would point to I_1, which is in fact the least likely joint explanation.
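These numbers are easy to verify by brute-force enumeration. Below is a small sketch (mine, not from the slides), using the example's q_1 = 0.45, q_3 = 0.4 and an arbitrary q_2 (its value cancels in the normalization, since X_2 is observed); it recovers the posterior marginals above and ranks the three explanations.

    from itertools import product

    q = {1: 0.45, 2: 0.3, 3: 0.4}             # q_i = P(X_i = 0); q_2 chosen arbitrarily
    p = {i: 1.0 - qi for i, qi in q.items()}  # p_i = P(X_i = 1)

    posterior = {}
    for x1, x3 in product((0, 1), repeat=2):
        x2 = 1                                # evidence: X_2 = 1
        y = 1                                 # Y_0 = 1 always
        for xi in (x1, x2, x3):
            y = y & xi                        # Y_i = Y_{i-1} AND X_i
        if y == 0:                            # evidence: Y_3 = 0
            posterior[(x1, x3)] = (p[1] if x1 else q[1]) * p[2] * (p[3] if x3 else q[3])

    Z = sum(posterior.values())               # equals p_2 * (1 - p_1 * p_3)
    for (x1, x3), w in sorted(posterior.items(), key=lambda kv: -kv[1]):
        print(f"X1={x1}, X3={x3}: {w / Z:.3f}")
    # I2 = (0, 1): 0.403 > I3 = (1, 0): 0.328 > I1 = (0, 0): 0.269

    print(sum(w for (x1, _), w in posterior.items() if x1 == 0) / Z)  # P(X1=0|e) = 0.672
    print(sum(w for (_, x3), w in posterior.items() if x3 == 0) / Z)  # P(X3=0|e) = 0.597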
Recall: Notation for Singly Connected Networks

Consider a vertex v.
• pa(v) = {u_1, ..., u_n}, ch(v) = {w_1, ..., w_m};
• T_{vu}, u ∈ pa(v): the sub-polytree containing the parent u, resulting from removing the edge (u, v);
• T_{vw}, w ∈ ch(v): the sub-polytree containing the child w, resulting from removing the edge (v, w).

For a sub-polytree T, denote by e_T the evidence contained in T.

Generalized Belief Propagation – Max-Product
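In max-product, the summations in the belief-propagation messages are replaced by maximizations, so that arg max_x p(x, e) can be computed by the same kind of distributed local computation. A minimal sketch of the idea (mine, not the lecture's derivation), on a chain with hypothetical unary potentials phi and a shared pairwise potential psi:

    import numpy as np

    def max_product_chain(phi, psi):
        """Most probable configuration of p(x) proportional to
        prod_t phi[t][x_t] * prod_t psi[x_t, x_{t+1}]."""
        n, k = len(phi), len(phi[0])
        m = np.zeros((n, k))                  # max-product messages
        back = np.zeros((n, k), dtype=int)    # argmax pointers for backtracking
        m[0] = phi[0]
        for t in range(1, n):
            scores = m[t - 1][:, None] * psi  # scores[i, j]: best mass reaching x_t = j via x_{t-1} = i
            back[t] = scores.argmax(axis=0)
            m[t] = scores.max(axis=0) * phi[t]
        x = [int(m[-1].argmax())]             # read off the optimum at the end of the chain,
        for t in range(n - 1, 0, -1):         # then follow the pointers back
            x.append(int(back[t, x[-1]]))
        return x[::-1]

    # hypothetical potentials for a 3-variable, binary chain
    phi = [np.array([0.5, 0.5]), np.array([0.3, 0.7]), np.array([0.6, 0.4])]
    psi = np.array([[0.9, 0.1], [0.2, 0.8]])
    print(max_product_chain(phi, psi))        # -> [1, 1, 1]

Backtracking through the stored argmax pointers recovers the most probable configuration. On the fault-detection example, this joint maximization is what singles out I_2, rather than the marginal-by-marginal guess I_1.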