
Bayesian Networks: Belief Propagation (Cont'd)

Huizhen Yu
[email protected]
Dept. Computer Science, Univ. of Helsinki

Probabilistic Models, Spring, 2010

Outline

Belief Propagation
• Review and Examples
• Generalized Belief Propagation – Max-Product
• Applications to Loopy Graphs

Announcement: The last exercise will be posted online soon.


Belief Propagation: Review and Examples

Review of Last Lecture

We studied an algorithm (belief propagation) for computing marginal posterior distributions:
• It works in singly connected networks, which are DAGs whose undirected versions are trees.
• It is suitable for parallel implementation.
• It is recursively derived by
  (i) dividing the total evidence into pieces, according to the independence structure represented by the DAG, and then
  (ii) incorporating the evidence pieces in either the probability terms (π-messages) or the likelihood terms (conditional probability terms; λ-messages).

Queries answerable by the algorithm for a singly connected network:
• $P(X_v = x_v \mid e)$ for a single $x_v$;
• $P(X_v = x_v \mid e)$ for all $x_v$ and $v \in V$;
• Most probable configurations, $\arg\max_x p(x \,\&\, e)$. This can be related to finding global optimal solutions by distributed local computation. (Details are given today.)

Practice: Belief Propagation for HMM

[Figure: a hidden Markov model unrolled over times t = 0, 1, 2, 3, ..., n, n+1.]

Observation variables (black) are instantiated; latent variables (white) are $X_1, X_2, \ldots$. The total evidence at time $t$ is $e_t$. How would you use message-passing to calculate
• $p(x_t \mid e_t)$, $\forall x_t$? (You'll obtain as a special case the so-called forward algorithm.)
• $p(x_{t+1} \mid e_t)$, $\forall x_{t+1}$? (This is a prediction problem.)
• $p(x_k \mid e_t)$, $\forall x_k$, $k < t$? (You'll obtain as a special case the so-called backward algorithm.)
(A code sketch of the first two queries is given after the next slide.)

A Fault-Detection Example

A logic circuit for fault detection and its Bayesian network (Pearl 1988):

[Figure: three AND gates chained as $Y_i = Y_{i-1}$ AND $X_i$, with inputs $X_1, X_2, X_3$ and $Y_0 = 1$, together with the corresponding DAG.]

$$P(X_i = 1) = p_i, \qquad P(X_i = 0) = 1 - p_i = q_i, \qquad Y_i = Y_{i-1} \text{ AND } X_i.$$
• $Y_0 = 1$ always.
• $X_i$ is normal if $X_i = 1$, and faulty if $X_i = 0$.
• Normally all variables are on, and a failure occurs if $Y_3 = 0$.
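A minimal sketch (not part of the lecture) of the first two queries from the HMM practice slide above: it is the $\pi$-message recursion specialized to the chain, i.e., the forward algorithm, with a made-up two-state model. The smoothing query $p(x_k \mid e_t)$ would additionally pass $\lambda$-messages backwards from time $t$.

```python
import numpy as np

# Two-state HMM with made-up parameters (assumptions for illustration only).
A = np.array([[0.7, 0.3],        # A[i, j] = p(X_{k+1} = j | X_k = i)
              [0.2, 0.8]])
B = np.array([[0.9, 0.1],        # B[i, y] = p(Y_k = y | X_k = i)
              [0.3, 0.7]])
init = np.array([0.5, 0.5])      # p(X_1)
obs = [0, 1, 1, 0]               # instantiated observation variables up to time t

# pi-message recursion along the chain, absorbing the evidence at each step;
# the unnormalized message is proportional to p(x_k & e_k).
msg = init * B[:, obs[0]]
for y in obs[1:]:
    msg = (msg @ A) * B[:, y]

p_filter = msg / msg.sum()       # p(x_t | e_t)      (forward algorithm)
pred = msg @ A
p_predict = pred / pred.sum()    # p(x_{t+1} | e_t)  (prediction)
print(p_filter, p_predict)
```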


Example: Belief Updating

Without observing any evidence, all the π-messages are prior probabilities:
$$\pi_{X_i,Y_i}(x_i) = [\,p_i,\ q_i\,],\ i = 1,2,3; \qquad \pi_{Y_0,Y_1}(y_0) = [\,1,\ 0\,],$$
$$\pi_{Y_1,Y_2}(y_1) = [\,p_1,\ q_1\,], \qquad \pi_{Y_2,Y_3}(y_2) = [\,p_1 p_2,\ 1 - p_1 p_2\,],$$
for $x_i = 1, 0$ and $y_i = 1, 0$.

Suppose $e: \{X_2 = 1, Y_3 = 0\}$ is received. Then $X_2$ updates its message to $Y_2$, and $Y_2$ updates its message to $Y_3$:
$$\pi_{X_2,Y_2}(x_2) = [\,p_2,\ 0\,], \qquad \pi_{Y_2,Y_3}(y_2) = [\,p_1 p_2,\ q_1 p_2\,].$$

The λ-messages starting from $Y_3$ upwards are given by:
$$\lambda_{Y_3,X_3}(x_3) = [\,p_2 q_1,\ p_2\,], \qquad \lambda_{Y_3,Y_2}(y_2) = [\,q_3,\ 1\,];$$
$$\lambda_{Y_2,X_2}(x_2) = [\,p_1 q_3 + q_1,\ p_1 + q_1 q_3\,], \qquad \lambda_{Y_2,Y_1}(y_1) = [\,p_2 q_3,\ p_2\,];$$
$$\lambda_{Y_1,X_1}(x_1) = [\,p_2 q_3,\ p_2\,].$$

So
$$P(X_3 = 0 \mid e) = \frac{q_3 p_2}{p_3 p_2 q_1 + q_3 p_2} = \frac{q_3}{p_3 q_1 + q_3} = \frac{q_3}{1 - p_1 p_3},$$
$$P(X_1 = 0 \mid e) = \frac{q_1 p_2}{p_1 p_2 q_3 + q_1 p_2} = \frac{q_1}{p_1 q_3 + q_1} = \frac{q_1}{1 - p_1 p_3}.$$

Example: Explanations based on Beliefs

If $q_1 = 0.45$ and $q_3 = 0.4$, we obtain
$$P(X_1 = 0 \mid e) = 0.672 > P(X_1 = 1 \mid e) = 0.328,$$
$$P(X_3 = 0 \mid e) = 0.597 > P(X_3 = 1 \mid e) = 0.403.$$
Is $I_1 = \{X_1 = 0, X_3 = 0\}$ the most probable explanation of $e$, however?

There are three possible explanations:
$$I_1 = \{X_1 = 0, X_3 = 0\}, \quad I_2 = \{X_1 = 0, X_3 = 1\}, \quad I_3 = \{X_1 = 1, X_3 = 0\}.$$
Direct calculation shows
$$P(I_1 \mid e) = \frac{q_1 q_3}{1 - p_1 p_3}, \quad P(I_2 \mid e) = \frac{q_1 p_3}{1 - p_1 p_3}, \quad P(I_3 \mid e) = \frac{p_1 q_3}{1 - p_1 p_3}.$$
So, if $0.5 > q_1 > q_2 > q_3$, then based on the evidence, $I_2$ is the most probable explanation, while $I_1$ is the least probable explanation.
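As a quick numerical sanity check of the two examples above (not part of the original slides; $p_2$ cancels everywhere and is not needed):

```python
# Sanity check of the fault-detection posteriors with q1 = 0.45, q3 = 0.4.
q1, q3 = 0.45, 0.4
p1, p3 = 1 - q1, 1 - q3

Z = 1 - p1 * p3                      # common denominator 1 - p1*p3
print(round(q1 / Z, 3))              # P(X1 = 0 | e) = 0.672
print(round(q3 / Z, 3))              # P(X3 = 0 | e) = 0.597
print(round(q1 * q3 / Z, 3))         # P(I1 | e)
print(round(q1 * p3 / Z, 3))         # P(I2 | e)  -- the largest
print(round(p1 * q3 / Z, 3))         # P(I3 | e)
```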


Generalized Belief Propagation – Max-Product

Recall Notation for Singly Connected Networks

Consider a vertex $v$.
• $\mathrm{pa}(v) = \{u_1, \ldots, u_n\}$, $\mathrm{ch}(v) = \{w_1, \ldots, w_m\}$;
• $T_{vu}$, $u \in \mathrm{pa}(v)$: the sub-polytree containing the parent $u$, resulting from removing the edge $(u, v)$;
• $T_{vw}$, $w \in \mathrm{ch}(v)$: the sub-polytree containing the child $w$, resulting from removing the edge $(v, w)$.

For a sub-polytree $T$, denote
• $X_T$: the variables associated with the nodes in $T$;
• $e_T$: the partial evidence on $X_T$.

Divide the total evidence $e$ into pieces: $e_{T_{vu}}$, $u \in \mathrm{pa}(v)$; $e_v$; $e_{T_{vw}}$, $w \in \mathrm{ch}(v)$.

We want to solve: $\max_x p(x \,\&\, e)$.


Derivation of the Message Passing Algorithm

Evidence structure: We can express the joint distribution $P(X)$ as
$$p(x) = \prod_{u \in \mathrm{pa}(v)} p(x_{T_{vu}}) \cdot p(x_v \mid x_{\mathrm{pa}(v)}) \cdot \prod_{w \in \mathrm{ch}(v)} p(x_{T_{vw}} \mid x_v). \qquad (1)$$
We then enter the evidence $e$ (put each piece in a proper term) to obtain
$$p(x \,\&\, e) = \prod_{u \in \mathrm{pa}(v)} p(x_{T_{vu}} \,\&\, e_{T_{vu}}) \cdot p(x_v \,\&\, e_v \mid x_{\mathrm{pa}(v)}) \cdot \prod_{w \in \mathrm{ch}(v)} p(x_{T_{vw}} \,\&\, e_{T_{vw}} \mid x_v). \qquad (2)$$
(For a detailed derivation of Eqs. (1) and (2), see slides 24–27.)

Max-Product: To solve $\max_x p(x \,\&\, e)$, we consider maximizing with respect to groups of variables in the following order:
$$\max_x \;\Leftrightarrow\; \max_{x_v} \, \max_{x_{\mathrm{pa}(v)}} \, \max_{x_{T_{vu_1} \setminus \{u_1\}}} \cdots \max_{x_{T_{vu_n} \setminus \{u_n\}}} \, \max_{x_{T_{vw_1}}} \cdots \max_{x_{T_{vw_m}}},$$
where $T_{vu} \setminus \{u\}$ denotes the set of nodes in the sub-polytree $T_{vu}$ except for $\{u\}$.

Notice that for any two functions $f_1(x)$, $f_2(x, y)$, we have the identity
$$\max_{x,y} f_1(x) f_2(x, y) = \max_x \Big\{ f_1(x) \cdot \max_y f_2(x, y) \Big\}.$$
We will similarly move certain maximization operations inside the products in Eq. (2) to obtain a desirable factor form of $\max_x p(x \,\&\, e)$.

Consider first the maximization with respect to $x_{T_{vw}}$, $w \in \mathrm{ch}(v)$. We have
$$\max_{x_{T_{vw_1}}} \cdots \max_{x_{T_{vw_m}}} p(x \,\&\, e) = \Big( \prod_{u \in \mathrm{pa}(v)} p(x_{T_{vu}} \,\&\, e_{T_{vu}}) \Big) \cdot p(x_v \,\&\, e_v \mid x_{\mathrm{pa}(v)}) \cdot \prod_{w \in \mathrm{ch}(v)} \max_{x_{T_{vw}}} p(x_{T_{vw}} \,\&\, e_{T_{vw}} \mid x_v).$$
Maximizing the above expression with respect to $x_{T_{vu_1} \setminus \{u_1\}}, \ldots, x_{T_{vu_n} \setminus \{u_n\}}$, we obtain
$$\Big( \prod_{u \in \mathrm{pa}(v)} \max_{x_{T_{vu} \setminus \{u\}}} p(x_{T_{vu}} \,\&\, e_{T_{vu}}) \Big) \cdot p(x_v \,\&\, e_v \mid x_{\mathrm{pa}(v)}) \cdot \prod_{w \in \mathrm{ch}(v)} \max_{x_{T_{vw}}} p(x_{T_{vw}} \,\&\, e_{T_{vw}} \mid x_v).$$
Define
$$p^*(x_u \,\&\, e_{T_{vu}}) = \max_{x_{T_{vu} \setminus \{u\}}} p(x_{T_{vu}} \,\&\, e_{T_{vu}}), \qquad p^*(e_{T_{vw}} \mid x_v) = \max_{x_{T_{vw}}} p(x_{T_{vw}} \,\&\, e_{T_{vw}} \mid x_v). \qquad (3)$$
We obtain
$$\max_x p(x \,\&\, e) = \max_{x_v} \, \max_{x_{\mathrm{pa}(v)}} \Big( \prod_{u \in \mathrm{pa}(v)} p^*(x_u \,\&\, e_{T_{vu}}) \cdot p(x_v \,\&\, e_v \mid x_{\mathrm{pa}(v)}) \Big) \cdot \prod_{w \in \mathrm{ch}(v)} p^*(e_{T_{vw}} \mid x_v).$$
We will call the expression inside '$\max_{x_v}$' the max-margin of $X_v$, denoted $p^*(x_v \,\&\, e)$.
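A tiny numeric illustration (not from the slides) of the identity above, which is what justifies moving maximizations inside the products: for arbitrary nonnegative tables $f_1$ and $f_2$ (made up below), maximizing jointly and maximizing in stages give the same value.

```python
# max_{x,y} f1(x) f2(x,y) == max_x { f1(x) * max_y f2(x,y) }
f1 = [0.2, 0.5, 0.3]                       # f1(x), x in {0,1,2}
f2 = [[0.1, 0.7], [0.4, 0.2], [0.9, 0.3]]  # f2(x,y), y in {0,1}

joint = max(f1[x] * f2[x][y] for x in range(3) for y in range(2))
staged = max(f1[x] * max(f2[x]) for x in range(3))
assert abs(joint - staged) < 1e-12
print(joint)                               # 0.27 either way
```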


Derivation of the Message Passing Algorithm (Cont'd)

Thus we obtain
$$\max_x p(x \,\&\, e) = \max_{x_v} p^*(x_v \,\&\, e),$$
where
$$p^*(x_v \,\&\, e) = \max_{x_{\mathrm{pa}(v)}} \Big( \prod_{u \in \mathrm{pa}(v)} p^*(x_u \,\&\, e_{T_{vu}}) \Big) \cdot p(x_v \,\&\, e_v \mid x_{\mathrm{pa}(v)}) \cdot \prod_{w \in \mathrm{ch}(v)} p^*(e_{T_{vw}} \mid x_v). \qquad (4)$$

If $v$ can receive messages
• $\pi^*_{u,v}$ from all parents, where $\pi^*_{u,v}(x_u) = p^*(x_u \,\&\, e_{T_{vu}})$, $\forall x_u$,
• $\lambda^*_{w,v}$ from all children, where $\lambda^*_{w,v}(x_v) = p^*(e_{T_{vw}} \mid x_v)$, $\forall x_v$,
then $v$ can calculate its max-margin $p^*(x_v \,\&\, e)$, $\forall x_v$, from which
$$\max_{x_v} p^*(x_v \,\&\, e) = \max_x p(x \,\&\, e).$$

Meanings of the Messages and Max-Margin

• $p^*(x_u \,\&\, e_{T_{vu}})$: If $X_u = x_u$, there exists some configuration of $x_{T_{vu}}$ which best explains the partial evidence $e_{T_{vu}}$ with this probability.
• $p^*(e_{T_{vw}} \mid x_v)$: If $X_v = x_v$, there exists some configuration of $x_{T_{vw}}$ which best explains the partial evidence $e_{T_{vw}}$ conditional on $X_v$, with this probability.
• $p^*(x_v \,\&\, e)$: If $X_v = x_v$, there exists some configuration of the rest of the variables which best explains the evidence $e$ with this probability.

How to obtain $x^* \in \arg\max_x p(x \,\&\, e)$?
• If $x^*$ is unique, then the solutions $x^*_v \in \arg\max_{x_v} p^*(x_v \,\&\, e)$ for all $v$ form the global optimal solution (best explanation) $x^*$.
• If $x^*$ is not unique, then we will need to trace out a solution from some node $v$. This shows that for each $x^*_v \in \arg\max_{x_v} p^*(x_v \,\&\, e)$, $v$ should record the corresponding best values $x^*_{\mathrm{pa}(v)}$ of the parents in the maximization problem defining $p^*(x_v \,\&\, e)$ [Eq. (4)]:
$$\max_{x_{\mathrm{pa}(v)}} \prod_{u \in \mathrm{pa}(v)} p^*(x_u \,\&\, e_{T_{vu}}) \cdot p(x_v \,\&\, e_v \mid x_{\mathrm{pa}(v)}).$$

Derivation of the Message Passing Algorithm (Cont'd)

Now we only need to check whether $v$ can compose messages for its parents and children to calculate their max-margins.
• A child $w$ needs $p^*(x_v \,\&\, e_{T_{wv}})$ for all $x_v$, which incorporates the partial evidence $e_{T_{wv}}$ from the sub-polytree on $v$'s side with respect to $w$:
$$p^*(x_v \,\&\, e_{T_{wv}}) = \max_{x_{T_{wv}} \setminus \{v\}} p(x_{T_{wv}} \,\&\, e_{T_{wv}}).$$
• A parent $u$ needs $p^*(e_{T_{uv}} \mid x_u)$ for all $x_u$, based on the partial evidence $e_{T_{uv}}$ from the sub-polytree on $v$'s side with respect to $u$:
$$p^*(e_{T_{uv}} \mid x_u) = \max_{x_{T_{uv}}} p(x_{T_{uv}} \,\&\, e_{T_{uv}} \mid x_u).$$

By a similar calculation as in the previous slides, one can show that $p^*(x_v \,\&\, e_{T_{wv}})$ can be composed from the messages that $v$ receives. Indeed it is given by
$$p^*(x_v \,\&\, e_{T_{wv}}) = \max_{x_{\mathrm{pa}(v)}} \Big( p(x_v \mid x_{\mathrm{pa}(v)}) \, \ell_v(x_v) \cdot \prod_{u \in \mathrm{pa}(v)} \pi^*_{u,v}(x_u) \Big) \cdot \prod_{w' \in \mathrm{ch}(v) \setminus \{w\}} \lambda^*_{w',v}(x_v).$$
So this is the message $\pi^*_{v,w}(x_v)$ that $v$ needs to send to $w$; it can be composed once $v$ receives the messages from all the other linked nodes.

Similarly,
$$p^*(e_{T_{uv}} \mid x_u) = \max_{x_v} \, \max_{x_{\mathrm{pa}(v) \setminus \{u\}}} p(x_v \,\&\, e_v \mid x_{\mathrm{pa}(v)}) \cdot \prod_{u' \in \mathrm{pa}(v) \setminus \{u\}} p^*(x_{u'} \,\&\, e_{T_{vu'}}) \cdot \prod_{w \in \mathrm{ch}(v)} p^*(e_{T_{vw}} \mid x_v) \qquad (5)$$
$$= \max_{x_v} \, \max_{x_{\mathrm{pa}(v) \setminus \{u\}}} \Big( p(x_v \mid x_{\mathrm{pa}(v)}) \, \ell_v(x_v) \cdot \prod_{u' \in \mathrm{pa}(v) \setminus \{u\}} \pi^*_{u',v}(x_{u'}) \Big) \cdot \prod_{w \in \mathrm{ch}(v)} \lambda^*_{w,v}(x_v).$$
So this is the message $\lambda^*_{v,u}(x_u)$ that $v$ needs to send to $u$; it can be composed once $v$ receives the messages from all the other linked nodes.
(For the details of the derivation of Eq. (5), see slide 28.)

[Figure: illustration of the partial evidence that the messages $\lambda^*_{v,u}(x_u)$ and $\pi^*_{v,w}(x_v)$ carry, e.g. $e_{T_{u_1 v}}$ and $e_{T_{w_1 v}}$.]
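To make these composition rules concrete, here is a minimal Python sketch (an illustration, not from the lecture) of how a single node $v$ could compute all of its outgoing max-product messages from its incoming ones. The data layout (dictionaries keyed by states, a CPD table indexed by parent configurations) and the function name are assumptions for the example.

```python
from itertools import product
from math import prod

def compose_messages(cpd, ell, pi_in, lam_in, K=2):
    """Compose node v's outgoing max-product messages from its incoming ones.

    cpd[x_pa][x_v] : p(x_v | x_pa(v)), x_pa a tuple ordered like pi_in
    ell[x_v]       : evidence likelihood l_v(x_v)
    pi_in[i][x_u]  : incoming pi*-message from parent i
    lam_in[j][x_v] : incoming lambda*-message from child j
    """
    n = len(pi_in)
    # l_v(x_v) times the product of all incoming lambda*-messages
    lam_all = {xv: ell[xv] * prod(lam[xv] for lam in lam_in) for xv in range(K)}

    # lambda*_{v,u}: maximize over x_v and the other parents' configurations
    lam_out = []
    for i in range(n):
        msg = {}
        for xu in range(K):
            msg[xu] = max(
                cpd[x_pa][xv] * lam_all[xv]
                * prod(pi_in[k][x_pa[k]] for k in range(n) if k != i)
                for xv in range(K)
                for x_pa in product(range(K), repeat=n) if x_pa[i] == xu)
        lam_out.append(msg)

    # pi*_{v,w}: a common max over the parents, times lambda* from the other children
    base = {xv: max(cpd[x_pa][xv] * ell[xv]
                    * prod(pi_in[k][x_pa[k]] for k in range(n))
                    for x_pa in product(range(K), repeat=n))
            for xv in range(K)}
    pi_out = [{xv: base[xv] * prod(lam_in[k][xv] for k in range(len(lam_in)) if k != j)
               for xv in range(K)}
              for j in range(len(lam_in))]
    return lam_out, pi_out
```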



Max-Product Message Passing Algorithm Summary

Each node $v$
• sends to each $u$ of its parents
$$\lambda^*_{v,u}(x_u) = \max_{x_v} \, \max_{x_{\mathrm{pa}(v) \setminus \{u\}}} \Big\{ p(x_v \mid x_{\mathrm{pa}(v)}) \, \ell_v(x_v) \cdot \prod_{u' \in \mathrm{pa}(v) \setminus \{u\}} \pi^*_{u',v}(x_{u'}) \cdot \prod_{w \in \mathrm{ch}(v)} \lambda^*_{w,v}(x_v) \Big\}, \quad \forall x_u;$$
• sends to each $w$ of its children
$$\pi^*_{v,w}(x_v) = \prod_{w' \in \mathrm{ch}(v) \setminus \{w\}} \lambda^*_{w',v}(x_v) \cdot \max_{x_{\mathrm{pa}(v)}} p(x_v \mid x_{\mathrm{pa}(v)}) \, \ell_v(x_v) \cdot \prod_{u \in \mathrm{pa}(v)} \pi^*_{u,v}(x_u), \quad \forall x_v;$$
• when receiving all messages from parents and children, calculates
$$p^*(x_v \,\&\, e) = \Big( \prod_{w \in \mathrm{ch}(v)} \lambda^*_{w,v}(x_v) \Big) \cdot \max_{x_{\mathrm{pa}(v)}} \prod_{u \in \mathrm{pa}(v)} \pi^*_{u,v}(x_u) \cdot p(x_v \mid x_{\mathrm{pa}(v)}) \, \ell_v(x_v), \quad \forall x_v.$$
This is identical to the algorithm in the last lecture, with maximization replacing the summation.

To obtain an $x^* \in \arg\max_x p(x \,\&\, e)$:
• If $x^*$ is unique, then it is given by $x^*_v \in \arg\max_{x_v} p^*(x_v \,\&\, e)$ for all $v$.
• If $x^*$ is not unique, we can start from any node $v$, fix $x^*_v$, and then trace out the solutions at other nodes.

HMM Example

[Figure: the HMM from slide 5, unrolled over times t = 0, 1, 2, 3, ..., n, n+1.]

How would you use message-passing to calculate
• $\max_x p(x_1, \ldots, x_t \mid e_t)$? (You'll obtain as a special case the Viterbi algorithm.)
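A minimal sketch (not from the slides) answering the question above for a two-state HMM with made-up parameters: the max-product forward pass records the maximizing predecessor at each step, and the configuration is then traced back, exactly as described on the summary slide; this is the Viterbi algorithm.

```python
import numpy as np

A = np.array([[0.7, 0.3], [0.2, 0.8]])   # p(x_{k+1} | x_k)   (assumed values)
B = np.array([[0.9, 0.1], [0.3, 0.7]])   # p(y_k | x_k)       (assumed values)
init = np.array([0.5, 0.5])
obs = [0, 1, 1, 0]
T = len(obs)

m = np.zeros((T, 2))                     # m[k, j] = max over x_1..x_{k-1} of p(x_1..x_{k-1}, x_k=j & e_k)
back = np.zeros((T, 2), dtype=int)       # recorded argmax of the predecessor
m[0] = init * B[:, obs[0]]
for k in range(1, T):
    scores = m[k - 1][:, None] * A * B[:, obs[k]][None, :]   # scores[i, j]
    m[k] = scores.max(axis=0)
    back[k] = scores.argmax(axis=0)

# Trace out one most probable configuration (ties are broken by argmax).
path = [int(m[-1].argmax())]
for k in range(T - 1, 0, -1):
    path.append(int(back[k, path[-1]]))
path.reverse()
print(path, m[-1].max())                 # x* and max_x p(x & e_t)
```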


Discussion on Differences

[Figure: a singly connected network with two marked nodes, a and b.]

Node a is instantiated. Node b never receives any evidence. New pieces of evidence arrive at other nodes.
• Does a need to update messages to all the linked nodes for belief updating? For finding the most probable configuration?
• Does b need to update messages to all the linked nodes for belief updating? For finding the most probable configuration?

Applications to Loopy Graphs

Illustration of Conditioning

Example (Pearl, 1988): Instantiating variable $X_1$ renders the network singly connected.

[Figure: a loopy network over $X_1, \ldots, X_6$; fixing $X_1$ (shown for $X_1 = 0$) splits it into singly connected copies corresponding to $X' = 0$ and $X' = 1$.]

Turbo Decoding Example

Modified from McEliece et al., 1998:

[Figure: a codeword and its check bits (functions of the codeword); the decoder observes the noisy codeword received and the noisy check bits received.]

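To illustrate the conditioning idea from the Pearl example above, here is a minimal sketch (an illustration, not from the slides) on an assumed toy "diamond" network: clamping the loop-cutset variable leaves a singly connected problem per clamped value, and the unnormalized results are summed over the clamped values. Brute-force enumeration stands in for the inner singly connected computation.

```python
from itertools import product

# Toy network A -> B, A -> C, {B, C} -> D (an assumed example, not Pearl's figure).
pA = {0: 0.6, 1: 0.4}
pB = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}          # p(b | a)
pC = {0: {0: 0.3, 1: 0.7}, 1: {0: 0.7, 1: 0.3}}          # p(c | a)
pD = {(b, c): ({0: 0.9, 1: 0.1} if b == c else {0: 0.2, 1: 0.8})
      for b, c in product((0, 1), repeat=2)}             # p(d | b, c)
d_obs = 1                                                # evidence: D = 1

posterior_b = {0: 0.0, 1: 0.0}
for a in (0, 1):                 # one singly connected subproblem per value of A
    for b, c in product((0, 1), repeat=2):
        posterior_b[b] += pA[a] * pB[a][b] * pC[a][c] * pD[(b, c)][d_obs]
Z = sum(posterior_b.values())                            # = p(e)
print({b: round(v / Z, 3) for b, v in posterior_b.items()})   # p(B | D = 1)
```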

Further Reading

1. Judea Pearl. Probabilistic Reasoning in Intelligent Systems, Morgan Kaufmann, 1988. Chap. 5.

Details of Derivation for Eq. (1)

1. First we argue that $X_{T_{vu}}$, $u \in \mathrm{pa}(v)$, are mutually independent.
Abusing notation, for a sub-polytree $T$, we use $T$ also for the set of nodes in $T$.
Since $G$ is singly connected, the subgraph $G_{\cup_{u \in \mathrm{pa}(v)} T_{vu}}$ consists of $n = |\mathrm{pa}(v)|$ disconnected components, $T_{vu}$, $u \in \mathrm{pa}(v)$. For any two disjoint subsets $U_1, U_2 \subseteq \mathrm{pa}(v)$, the sets of nodes $\cup_{u \in U_1} T_{vu}$ and $\cup_{u \in U_2} T_{vu}$ are disconnected, implying that
$$X_{\cup_{u \in U_1} T_{vu}} \;\perp\; X_{\cup_{u \in U_2} T_{vu}}$$
for any disjoint subsets $U_1, U_2$. This shows that $X_{T_{vu}}$, $u \in \mathrm{pa}(v)$, are mutually independent, so
$$p(x_{T_{vu_1}}, \ldots, x_{T_{vu_n}}) = \prod_{u \in \mathrm{pa}(v)} p(x_{T_{vu}}).$$

2. Next, choosing any well-ordering such that all the nodes in $T_{vu}$, $u \in \mathrm{pa}(v)$, have smaller numbers than $v$, we can argue by (DO) that
$$p(x_v \mid x_{T_{vu_1}}, \ldots, x_{T_{vu_n}}) = p(x_v \mid x_{\mathrm{pa}(v)}).$$
Combining this with the preceding equation, we have
$$p(x_{T_{vu_1}}, \ldots, x_{T_{vu_n}}, x_v) = \prod_{u \in \mathrm{pa}(v)} p(x_{T_{vu}}) \cdot p(x_v \mid x_{\mathrm{pa}(v)}).$$

Details of Derivation for Eq. (1) (Cont'd)

3. Finally, we consider $X_{T_{vw}}$, $w \in \mathrm{ch}(v)$. Since $G$ is singly connected, from $G^m$ we see that $v$ separates the nodes in $T_{vw}$, $w \in \mathrm{ch}(v)$, from the nodes in $T_{vu}$, $u \in \mathrm{pa}(v)$. Therefore,
$$\{X_{T_{vw}},\, w \in \mathrm{ch}(v)\} \;\perp\; \{X_{T_{vu}},\, u \in \mathrm{pa}(v)\} \;\mid\; X_v.$$
Furthermore, removing the node $v$, the subgraph of $G^m$ induced by $T_{vw}$, $w \in \mathrm{ch}(v)$, is disconnected and has $m = |\mathrm{ch}(v)|$ components, each corresponding to a $T_{vw}$. So, arguing as in the first step, we have that given $X_v$, the variables $X_{T_{vw}}$, $w \in \mathrm{ch}(v)$, are mutually independent. This gives us Eq. (1):
$$p(x) = \prod_{u \in \mathrm{pa}(v)} p(x_{T_{vu}}) \cdot p(x_v \mid x_{\mathrm{pa}(v)}) \cdot \prod_{w \in \mathrm{ch}(v)} p(x_{T_{vw}} \mid x_v).$$

Details of Derivation for Eq. (2)

Recall that the total evidence $e$ has a factor form:
$$e(x) = \prod_{v \in V} \ell_v(x_v).$$
For a given node $v$, we can also express $e$ in terms of the pieces of evidence, $e_v$, $e_{T_{vu}}$, $u \in \mathrm{pa}(v)$, and $e_{T_{vw}}$, $w \in \mathrm{ch}(v)$, as
$$e(x) = \Big( \prod_{u \in \mathrm{pa}(v)} e_{T_{vu}}(x_{T_{vu}}) \Big) \cdot e_v(x_v) \cdot \prod_{w \in \mathrm{ch}(v)} e_{T_{vw}}(x_{T_{vw}}),$$
where
$$e_{T_{vu}}(x_{T_{vu}}) = \prod_{v' \in T_{vu}} \ell_{v'}(x_{v'}), \qquad e_v(x_v) = \ell_v(x_v), \qquad e_{T_{vw}}(x_{T_{vw}}) = \prod_{v' \in T_{vw}} \ell_{v'}(x_{v'}).$$
We now combine each piece of evidence with the respective term in $p(x)$, which by Eq. (1) is
$$p(x) = \prod_{u \in \mathrm{pa}(v)} p(x_{T_{vu}}) \cdot p(x_v \mid x_{\mathrm{pa}(v)}) \cdot \prod_{w \in \mathrm{ch}(v)} p(x_{T_{vw}} \mid x_v),$$
to obtain
$$p(x) \cdot e(x) = \prod_{u \in \mathrm{pa}(v)} p(x_{T_{vu}}) \, e_{T_{vu}}(x_{T_{vu}}) \cdot p(x_v \mid x_{\mathrm{pa}(v)}) \, e_v(x_v) \cdot \prod_{w \in \mathrm{ch}(v)} p(x_{T_{vw}} \mid x_v) \, e_{T_{vw}}(x_{T_{vw}}).$$
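A brute-force check (not from the slides) of the factorization in Eq. (1) on a small assumed polytree: the marginals of the parent subtrees, computed by summing the joint, multiply with $p(x_v \mid x_{\mathrm{pa}(v)})$ and the child terms to give back the joint at every configuration.

```python
from itertools import product

# Assumed polytree: A -> U1 -> V <- U2, V -> W,
# so T_{VU1} = {A, U1}, T_{VU2} = {U2}, T_{VW} = {W}.
pA  = [0.3, 0.7]
pU1 = [[0.9, 0.1], [0.4, 0.6]]                        # p(u1 | a)
pU2 = [0.2, 0.8]
pV  = {(0, 0): [0.9, 0.1], (0, 1): [0.6, 0.4],
       (1, 0): [0.3, 0.7], (1, 1): [0.2, 0.8]}        # p(v | u1, u2)
pW  = [[0.7, 0.3], [0.1, 0.9]]                        # p(w | v)

def joint(a, u1, u2, v, w):
    return pA[a] * pU1[a][u1] * pU2[u2] * pV[(u1, u2)][v] * pW[v][w]

def p_T_vu1(a, u1):   # marginal of the parent subtree {A, U1}
    return sum(joint(a, u1, u2, v, w) for u2, v, w in product((0, 1), repeat=3))

def p_T_vu2(u2):      # marginal of the parent subtree {U2}
    return sum(joint(a, u1, u2, v, w) for a, u1, v, w in product((0, 1), repeat=4))

for a, u1, u2, v, w in product((0, 1), repeat=5):
    lhs = joint(a, u1, u2, v, w)
    rhs = p_T_vu1(a, u1) * p_T_vu2(u2) * pV[(u1, u2)][v] * pW[v][w]
    assert abs(lhs - rhs) < 1e-12
print("Eq. (1) verified on this polytree")
```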


Details of Derivation for Eq. (2) (Cont'd)

Using short-hand notation for probabilities of events (defined in Lec. 9), we have
$$p(x) \cdot e(x) = p(x \,\&\, e),$$
$$p(x_v \mid x_{\mathrm{pa}(v)}) \cdot e_v(x_v) = p(x_v \,\&\, e_v \mid x_{\mathrm{pa}(v)}),$$
$$p(x_{T_{vu}}) \cdot e_{T_{vu}}(x_{T_{vu}}) = p(x_{T_{vu}} \,\&\, e_{T_{vu}}),$$
$$p(x_{T_{vw}} \mid x_v) \cdot e_{T_{vw}}(x_{T_{vw}}) = p(x_{T_{vw}} \,\&\, e_{T_{vw}} \mid x_v).$$
So, we may write $P(X = x, e) = p(x) \cdot e(x)$ as
$$p(x \,\&\, e) = \prod_{u \in \mathrm{pa}(v)} p(x_{T_{vu}} \,\&\, e_{T_{vu}}) \cdot p(x_v \,\&\, e_v \mid x_{\mathrm{pa}(v)}) \cdot \prod_{w \in \mathrm{ch}(v)} p(x_{T_{vw}} \,\&\, e_{T_{vw}} \mid x_v),$$
which is Eq. (2).

Details of Derivation for Eq. (5)

We derive the expression for $p^*(e_{T_{uv}} \mid x_u)$. Similar to the derivation of Eqs. (1)–(2),
$$p(x_{T_{uv}} \,\&\, e_{T_{uv}} \mid x_u) = \prod_{u' \in \mathrm{pa}(v) \setminus \{u\}} p(x_{T_{vu'}} \,\&\, e_{T_{vu'}}) \cdot p(x_v \,\&\, e_v \mid x_{\mathrm{pa}(v)}) \cdot \prod_{w \in \mathrm{ch}(v)} p(x_{T_{vw}} \,\&\, e_{T_{vw}} \mid x_v).$$
Also,
$$\max_{x_{T_{uv}}} \;\Leftrightarrow\; \max_{x_v} \, \max_{x_{\mathrm{pa}(v) \setminus \{u\}}} \, \max_{x_{T_{vu'} \setminus \{u'\}},\, u' \in \mathrm{pa}(v) \setminus \{u\}} \, \max_{x_{T_{vw}},\, w \in \mathrm{ch}(v)}.$$
Moving certain maximization operations inside the products, we obtain
$$p^*(e_{T_{uv}} \mid x_u) = \max_{x_v} \, \max_{x_{\mathrm{pa}(v) \setminus \{u\}}} p(x_v \,\&\, e_v \mid x_{\mathrm{pa}(v)}) \cdot \prod_{u' \in \mathrm{pa}(v) \setminus \{u\}} p^*(x_{u'} \,\&\, e_{T_{vu'}}) \cdot \prod_{w \in \mathrm{ch}(v)} p^*(e_{T_{vw}} \mid x_v).$$
By the definitions of the messages in slide 13, this is
$$p^*(e_{T_{uv}} \mid x_u) = \max_{x_v} \, \max_{x_{\mathrm{pa}(v) \setminus \{u\}}} \Big( p(x_v \mid x_{\mathrm{pa}(v)}) \, \ell_v(x_v) \cdot \prod_{u' \in \mathrm{pa}(v) \setminus \{u\}} \pi^*_{u',v}(x_{u'}) \Big) \cdot \prod_{w \in \mathrm{ch}(v)} \lambda^*_{w,v}(x_v).$$
