<p>Goal-Oriented Symbolic Propagation in Bayesian Networks - Sachit Kachraj</p><p>Example 10.3:</p><p>Consider the following network (Figure 10.1 on p. 446) and the given evidence e = {X2 = 1, X5 = 1}.</p><p>From Table 10.1, X3 and X6 are the only symbolic nodes.</p><p>Therefore the set of feasible monomials is given by M = Θ3 × Θ6 = {θ300 θ600, θ300 θ610, θ310 θ600, θ310 θ610} = {m1, m2, m3, m4}.</p><p>From equation (10.12) we obtain the un-normalized conditional probability in polynomial form.</p><p>Our objective is to calculate the coefficients for each node Xi. The canonical components associated with the symbolic parameters Θ are required to do so. Since we are dealing with binary variables, the only extreme values any dependent parameter pair can take are {1, 0} and {0, 1}.</p><p>Intuitively, this means that the combinations of extreme values for the parameters (θ300, θ310, θ600, θ610) are exactly (0, 1, 0, 1), (0, 1, 1, 0), (1, 0, 0, 1), and (1, 0, 1, 0); all other combinations violate the constraints θ300 + θ310 = 1 and θ600 + θ610 = 1.</p><p>Corresponding to M we have the set of canonical components c1, c2, c3, c4, where ci is the instantiation of extreme values that makes the monomial mi equal to 1.</p><p>Then, by instantiating the symbolic parameters to the values given by each canonical component, all the monomials in (10.18) become either 0 or 1. Thus we obtain an expression that depends only on the coefficients.</p><p>Here the matrix Tij is the identity matrix, because all the parameters are associated with the same instantiation of the set of parents; each coefficient therefore equals the un-normalized probability evaluated at the corresponding canonical component.</p><p>The following figure (10.6) shows the un-normalized conditional probabilities of all nodes given X2 = 1, X5 = 1, associated with the four possible canonical components c1, c2, c3 and c4. 
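</p><p>As an illustrative aside (my own sketch, not part of the source text), the enumeration of extreme-value combinations and canonical components can be written in Python; the names t300, t310, t600, t610 stand in for the parameters θ300, θ310, θ600, θ610:</p>

```python
from itertools import product

# Each binary symbolic node contributes a dependent parameter pair
# (theta_k0, theta_k1) with theta_k0 + theta_k1 = 1, so its extreme
# values are (1, 0) and (0, 1).
EXTREMES = [(1, 0), (0, 1)]

# Canonical components: all combinations of extreme values for the two
# symbolic nodes X3 and X6 of Example 10.3.
canonical = []
for (t300, t310), (t600, t610) in product(EXTREMES, EXTREMES):
    canonical.append({"t300": t300, "t310": t310, "t600": t600, "t610": t610})

# Feasible monomials M = Theta3 x Theta6, as pairs of parameter names.
monomials = [("t300", "t600"), ("t300", "t610"),
             ("t310", "t600"), ("t310", "t610")]

# At each canonical component every monomial evaluates to 0 or 1,
# and exactly one monomial evaluates to 1.
for c in canonical:
    values = [c[a] * c[b] for a, b in monomials]
    assert sum(values) == 1
```

<p>The loop confirms that each canonical component makes exactly one monomial equal to 1, which is why the matrix Tij above is the identity.</p><p>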
This can be obtained by using any of the numeric propagation techniques (from Chapters 8 and 9) that use the JPD to calculate these un-normalized probability values.</p><p>Using these values we can obtain all the rational functions in Table 10.3.</p><p>For example, from Figure 10.6 we get the following values for X6:</p><p>These are the coefficients of the numerator polynomials for X6 = 0 and X6 = 1, respectively. Substituting these values in (10.18) we get</p><p>Adding both polynomials, we get the denominator polynomial.</p><p>Thus we have</p><p>where</p><p>Finally, eliminating the dependent parameters θ310 = 1 − θ300 and θ610 = 1 − θ600, we get the expressions in Table 10.3.</p><p>The Goal-Oriented Algorithm</p><p>The theory discussed in the last lecture and the related examples are generic and can be used to obtain a symbolic expression for every node in a given graph. This method can be modified slightly to perform goal-oriented symbolic propagation in Bayesian networks.</p><p>Let Xi be our goal node, for which we want to find the conditional probabilities P(Xi = j | E = e), where E is the set of evidential nodes with known values E = e.</p><p>According to Castillo, Gutiérrez and Hadi, the un-normalized probabilities are polynomials of the form given in equation (4), where the mj are monomials in the symbolic parameters Θ.</p><p>Consider the following Bayesian network with five nodes and Table 1, which shows the corresponding parameters.</p><p>Figure 1</p><p>Table 1</p><p>Note that some nodes are numeric and some are parametric: X4 is numeric and the other four are parametric.</p><p>This algorithm solves the goal-oriented problem by calculating the coefficients in equation (4). 
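</p><p>To make the coefficient-calculation step concrete, here is a minimal sketch (my own illustration, with placeholder numbers rather than the values from Figure 10.6): the monomials are evaluated at the canonical components to form a matrix T, the un-normalized probabilities p are computed by numeric propagation, and the coefficients follow by solving T c = p. In Example 10.3, T is the identity, so the coefficients equal the propagated values directly.</p>

```python
import numpy as np

def solve_coefficients(T, p):
    """Solve T c = p for the polynomial coefficients c.

    T[i][j] is the value of monomial m_j at canonical component c_i,
    and p[i] is the un-normalized probability propagated numerically
    at c_i. When T is the identity, c equals p directly.
    """
    return np.linalg.solve(np.asarray(T, dtype=float),
                           np.asarray(p, dtype=float))

# In Example 10.3 every monomial is 1 at exactly one canonical
# component, so T is the 4x4 identity matrix.
T = np.eye(4)
p = np.array([0.25, 0.10, 0.40, 0.05])  # placeholder propagated values
c = solve_coefficients(T, p)
```

<p>When T is not the identity, the same linear solve still recovers the coefficients from the propagated values.</p><p>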
It is organized into four parts.</p><p>Let's solve this problem for goal node X3 and evidence X2 = 1.</p><p>PART 1: Identify all relevant nodes</p><p>This involves implementing a d-separation algorithm that identifies the nodes relevant to a given node. I have yet to make a choice on this, but I already have a couple in mind: Geiger's algorithm and the one proposed by Dr. Vargas.</p><p>The advantage of Geiger's algorithm is that, given a node, it fetches all the nodes that are d-separated from it at once. The algorithm proposed by Dr. Vargas claims to be much faster, in the sense that it exploits the sparseness of real-world Bayesian networks to reduce the number of paths to be traversed. It tells you whether two given nodes are d-separated or not; we could modify it to actually find the d-separated set.</p><p>Thus, this part involves identifying the relevant nodes and eliminating the rest from the graph.</p><p>Eliminating the d-separated nodes, we are left with the following graph: X1 with children X2 and X3, and Θ = {Θ1, Θ2, Θ3}.</p><p>We proceed with the parameter set Θ that also includes the numeric parameters, only to pull them out as part of the coefficients in the third stage.</p><p>PART 2: Identify sufficient parameters</p><p>Given the evidence, we can identify and remove the parameters that contradict it. We do that using the following two rules.</p><p>Rule 1: For every Xj ∈ E, eliminate the parameters θjkπ with k ≠ xj.</p><p>Here θ200 and θ201 are contradictory given that X2 = 1.</p><p>Rule 2: Eliminate θjkπ if the parents' instantiation π is incompatible with the evidence.</p><p>(To illustrate the concept: if X1 were the only parent of X2 and X1 = 1 were given, then θ200 would contradict the evidence, so we would eliminate that parameter.)</p><p>For the given problem this case doesn't 
exist, since the only evidential node, X2, has no children.</p><p>So the minimal set of sufficient parameters is given by Θ = {{θ10, θ11}, {θ210, θ211}, {θ300, θ310, θ301, θ311}}.</p><p>PART 3: Identify the feasible monomials</p><p>Once the minimal Θ is identified, the monomials are formed by taking the Cartesian product of the parameter sets. The infeasible monomials are eliminated using the following rule.</p><p>Rule 3: Parameters associated with contradictory conditioning instantiations cannot appear in the same monomial.</p><p>For example, in the monomial θ10 θ210 θ301, the first term means that X1 = 0 while the third term means that X1 = 1 (since X1 is the parent of X3); hence the contradiction.</p><p>We also pull the numeric parameters out, because they become part of the coefficients.</p><p>Doing so results in the following set of feasible monomials: M = {θ10 θ210, θ11 θ301, θ11 θ311}.</p><p>PART 4: Calculate coefficients of all the polynomials</p><p>The discussion here follows from Example 10.3.</p><p>For the goal node Xi and for each of its states j, build the subset Mj consisting of those monomials in M that do not contain any parameter of the form θiqπ with q ≠ j.</p><p>So M0 = {θ10 θ210, θ11 θ301} and M1 = {θ10 θ210, θ11 θ311}.</p><p>Substituting these monomials in equation (10.18), we get the polynomials for the un-normalized probabilities P(X3 = j | E = e).</p><p>Taking the appropriate combinations of the extreme values for the symbolic parameters, we can calculate the values of p0(θ1) and p1(θ2), which are P(X3 = 0 | X2 = 1, Θ = θ1) and P(X3 = 1 | X2 = 1, Θ = θ2), respectively.</p><p>Using any of the numeric propagation methods we can calculate the values of p0(θ1) and p1(θ2).</p><p>Solving the following equation, we get the values of the corresponding coefficients.</p><p>Similarly, taking 
the canonical components corresponding to X1 = 1, we get</p><p>Substituting in (8) and (9), we get</p><p>The sum of the coefficients gives the coefficients of the denominator polynomial.</p><p>Normalizing the values in (13) and (14) by the denominator polynomial, we get the probabilities for the goal node.</p><p>Eliminating the dependent parameters, we get the final expressions for the conditional probabilities.</p><p>References:</p><p>1. E. Castillo, J. M. Gutiérrez, and A. S. Hadi. Goal Oriented Symbolic Propagation in Bayesian Networks.</p><p>2. E. Castillo, J. M. Gutiérrez, and A. S. Hadi. Expert Systems and Probabilistic Network Models, Section 10.6.</p>
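<p>As a closing illustration (my own sketch, using a hypothetical (node, state, parent-assignment) encoding of parameters rather than anything defined in the source), Rule 3 from Part 3 can be checked programmatically:</p>

```python
# Rule 3 sketch: a monomial is infeasible when its parameters imply
# contradictory values for the same variable. A parameter theta_jk|pi is
# encoded as (node, state, parents), where parents maps each parent node
# to the value it is conditioned on.

def implied_values(monomial):
    """Collect every value each variable is forced to take by the monomial."""
    implied = {}
    for node, state, parents in monomial:
        implied.setdefault(node, set()).add(state)   # theta_jk.. means Xj = k
        for p_node, p_val in parents.items():        # conditioning fixes parents
            implied.setdefault(p_node, set()).add(p_val)
    return implied

def feasible(monomial):
    """Rule 3: no variable may be forced to two different values."""
    return all(len(vals) == 1 for vals in implied_values(monomial).values())

# theta10 * theta210 * theta301: theta10 says X1 = 0, but theta301
# conditions on X1 = 1, so the monomial is infeasible.
bad = [(1, 0, {}), (2, 1, {1: 0}), (3, 0, {1: 1})]
# theta11 * theta211 * theta311: every term agrees that X1 = 1.
ok = [(1, 1, {}), (2, 1, {1: 1}), (3, 1, {1: 1})]
```

<p>The infeasible example encodes θ10 θ210 θ301, matching the contradiction discussed in Part 3.</p>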