Preventing Unnecessary Groundings in the Lifted Dynamic Junction Tree Algorithm
Marcel Gehrke, Tanya Braun, and Ralf Möller
Institute of Information Systems, Universität zu Lübeck, Lübeck
{gehrke, braun, moeller}@ifis.uni-luebeck.de

arXiv:1807.00744v1 [cs.AI] 2 Jul 2018

Abstract

The lifted dynamic junction tree algorithm (LDJT) efficiently answers filtering and prediction queries for probabilistic relational temporal models by building and then reusing a first-order cluster representation of a knowledge base for multiple queries and time steps. Unfortunately, a non-ideal elimination order can lead to groundings even though a lifted run is possible for a model. We extend LDJT (i) to identify unnecessary groundings while proceeding in time and (ii) to prevent groundings by delaying eliminations through changes in a temporal first-order cluster representation. The extended version of LDJT answers multiple temporal queries orders of magnitude faster than the original version.

1 Introduction

Areas like healthcare, logistics, or even scientific publishing deal with probabilistic data with relational and temporal aspects and need efficient exact inference algorithms. These areas involve many objects in relation to each other, with changes over time and uncertainties about object existence, attribute value assignments, or relations between objects. More specifically, publishing involves publications (the relational part) for many authors (the objects), streams of papers over time (the temporal part), and uncertainties, for example due to missing or incomplete information. By performing model counting, probabilistic databases (PDBs) can answer queries for relational temporal models with uncertainties (Dignös, Böhlen, and Gamper 2012; Dylla, Miliaraki, and Theobald 2013). However, each query embeds a process behaviour, resulting in huge queries with possibly redundant information.

In contrast to PDBs, we build more expressive and compact models including behaviour (offline), enabling efficient answering of more compact queries (online). For query answering, our approach performs deductive reasoning by computing marginal distributions at discrete time steps. In this paper, we study the problem of exact inference and investigate how to prevent unnecessary groundings in large temporal probabilistic models that exhibit symmetries.

We propose parameterised probabilistic dynamic models (PDMs) to represent probabilistic relational temporal behaviour and introduce the lifted dynamic junction tree algorithm (LDJT) to exactly answer multiple filtering and prediction queries for multiple time steps efficiently (Gehrke, Braun, and Möller 2018). LDJT combines the advantages of the interface algorithm (Murphy 2002) and the lifted junction tree algorithm (LJT) (Braun and Möller 2016). Specifically, this paper extends LDJT and contributes (i) means to identify whether groundings occur and (ii) an approach to prevent unnecessary groundings by extending inter first-order junction tree (FO jtree) separators.

LDJT reuses an FO jtree structure to answer multiple queries and reuses the structure to answer queries for all time steps t > 0. Additionally, LDJT ensures a minimal exact inter FO jtree information propagation over a separator. Unfortunately, due to a non-ideal elimination order, unnecessary groundings can occur. In the static case, LJT prevents groundings by fusing parclusters, the nodes of an FO jtree. For the temporal case, fusing parclusters is not applicable, as LDJT would need to fuse parclusters of different FO jtrees. We propose to prevent groundings by extending inter FO jtree separators and thereby changing the elimination order by delaying eliminations to the next time step.

The remainder of this paper has the following structure: We begin by recapitulating PDMs as a representation for relational temporal probabilistic models and present LDJT, an efficient reasoning algorithm for PDMs. Afterwards, we present LJT's techniques to prevent unnecessary groundings and extend LDJT to prevent unnecessary groundings. Lastly, we evaluate the extended version of LDJT against LDJT's original version and LJT. We conclude by looking at possible extensions.

Copyright © 2018, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.

2 Related Work

We take a look at inference for propositional temporal models, relational static models, and give an overview about relational temporal model research.

For exact inference on propositional temporal models, a naive approach is to unroll the temporal model for a given number of time steps and use any exact inference algorithm for static, i.e., non-temporal, models. In the worst case, once the number of time steps changes, one has to unroll the model and infer again. Murphy (2002) proposes the interface algorithm consisting of a forward and backward pass that uses a temporal d-separation with a minimal set of nodes to apply static inference algorithms to the dynamic model.

First-order probabilistic inference leverages the relational aspect of a static model. For models with known domain size, first-order probabilistic inference exploits symmetries in a model by combining instances to reason with representatives, known as lifting (Poole 2003). Poole (2003) introduces parametric factor graphs as relational models and proposes lifted variable elimination (LVE) as an exact inference algorithm on relational models.

[Figure 1: Parfactor graph for G^ex, with variable nodes Pub(X, P), DoR(X), AttC(A), Hot and factor nodes g^0, g^1]
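The gain from reasoning with representatives can be illustrated with a toy computation (not from the paper; the domain size and potential values below are made up): if n interchangeable individuals share one identical factor, a lifted computation handles a single representative and exponentiates by the domain size instead of multiplying n ground copies.

```python
import math

# Toy sketch of lifting (illustrative numbers, not from the paper):
# n interchangeable individuals share one identical boolean factor phi.
n = 20                          # domain size, e.g. 20 authors
phi = {True: 0.7, False: 0.3}   # shared potential values (made up)

# Ground computation: multiply one copy of phi per individual.
ground = 1.0
for _ in range(n):
    ground *= phi[True]

# Lifted computation: reason with one representative, then exponentiate.
lifted = phi[True] ** n

# Both yield the same contribution to the joint, but the lifted
# computation does not grow with n.
assert math.isclose(ground, lifted, rel_tol=1e-9)
```

The exponentiation is the simplest instance of the symmetry exploitation that LVE and LJT build on; a grounding destroys exactly this shortcut, which is why avoiding unnecessary groundings matters.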
Further, de Salvo Braz (2007), Milch et al. (2008), and Taghipour et al. (2013) extend LVE to its current form. Lauritzen and Spiegelhalter (1988) introduce the junction tree algorithm. To benefit from the ideas of the junction tree algorithm and LVE, Braun and Möller (2016) present LJT, which efficiently performs exact first-order probabilistic inference on relational models given a set of queries.

To handle inference for relational temporal models, most approaches are approximative. Additional to being approximative, these approaches involve unnecessary groundings or are only designed to handle single queries efficiently. Ahmadi et al. (2013) propose lifted (loopy) belief propagation. From a factor graph, they build a compressed factor graph and apply lifted belief propagation with the idea of the factored frontier algorithm (Murphy and Weiss 2001), which is an approximate counterpart to the interface algorithm. Thon et al. (2011) introduce CPT-L, a probabilistic model for sequences of relational state descriptions with a partially lifted inference algorithm. Geier and Biundo (2011) present an online interface algorithm for dynamic Markov logic networks (DMLNs), similar to the work of Papai et al. (Papai, Kautz, and Stefankovic 2012). Both approaches slice DMLNs to run well-studied static MLN (Richardson and Domingos 2006) inference algorithms on each slice individually. Two ways of performing online inference using particle filtering are described in (Manfredotti 2009; Nitti, De Laet, and De Raedt 2013). Vlasselaer et al. (2014; 2016) introduce an exact approach, which involves computing probabilities of each possible interface assignment on a ground level.

3 Parameterised Probabilistic Models

Based on (Braun and Möller 2018), we present parameterised probabilistic models (PMs) for relational static models. Afterwards, we extend PMs to the temporal case, resulting in PDMs for relational temporal models, which, in turn, are based on (Gehrke, Braun, and Möller 2018).

3.1 Parameterised Probabilistic Models

PMs combine first-order logic with probabilistic models, representing first-order constructs using logical variables (logvars) as parameters.

… PRV A. Constraint (X, C_X) allows to restrict logvars to certain domain values and is a tuple with a sequence of logvars X = (X^1, ..., X^n) and a set C_X ⊆ D(X^1) × ... × D(X^n). ⊤ denotes that no restrictions apply and may be omitted. The term lv(Y) refers to the logvars in some element Y. The term gr(Y) denotes the set of instances of Y with all logvars in Y grounded w.r.t. constraints.

Let us set up a PM for publications on some topic. We model that the topic may be hot, conferences are attractive, people do research, and publish in publications. From R = {Hot, DoR} and L = {A, P, X} with D(A) = {a1, a2}, D(P) = {p1, p2}, and D(X) = {x1, x2, x3}, we build the boolean PRVs Hot and DoR(X). With C = (X, {x1, x2}), gr(DoR(X) | C) = {DoR(x1), DoR(x2)}.

Definition 2. We denote a parametric factor (parfactor) g with ∀X: φ(A) | C, X ⊆ L being a set of logvars over which the factor generalises and A = (A^1, ..., A^n) a sequence of PRVs. We omit (∀X:) if X = lv(A). A function φ: range(A^1) × ... × range(A^n) → ℝ+ with name φ ∈ Φ is defined identically for all grounded instances of A. A list of all input-output values is the complete specification for φ. C is a constraint on X. A PM G := {g^0, ..., g^{n-1}} is a set of parfactors and semantically represents the full joint probability distribution P(G) = 1/Z · ∏_{f ∈ gr(G)} φ(A_f), where Z is a normalisation constant.

Adding boolean PRVs Pub(X, P) and AttC(A), G^ex = {g^0, g^1} with g^0 = φ^0(Pub(X, P), AttC(A), Hot) and g^1 = φ^1(DoR(X), AttC(A), Hot) forms a model. All parfactors have eight input-output pairs (omitted). Constraints are ⊤, i.e., the φ's hold for all domain values. E.g., gr(g^1) contains six factors with identical φ. Figure 1 depicts G^ex as a graph with four variable nodes for the PRVs and two factor nodes for g^0 and g^1 with edges to the PRVs involved. Additionally, we can observe the attractiveness of conferences. The remaining PRVs are latent.

The semantics of a model is given by grounding and building a full joint distribution. In general, queries ask for a probability distribution of a randvar using a model's full joint distribution and fixed events as evidence.

Definition 3. Given a PM G, a ground PRV Q and grounded
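The grounding semantics P(G) = 1/Z · ∏_{f ∈ gr(G)} φ(A_f) can be sketched in code. The toy script below (the potential values of φ^0 and φ^1 are made up, since the paper omits the eight input-output pairs) grounds g^0 and g^1 of G^ex over the example domains, builds the full joint by multiplying all ground factors, and answers a query P(Hot) by summing out all other randvars.

```python
from itertools import product

# Hypothetical potentials for g0 and g1; the paper omits the actual
# eight input-output pairs, so these values are made up.
def phi0(pub, attc, hot):
    return 2.0 if (pub and attc and hot) else 1.0

def phi1(dor, attc, hot):
    return 3.0 if (dor and hot) else 1.0

# Domains from the running example; constraints are ⊤ (no restriction).
X = ["x1", "x2", "x3"]; P = ["p1", "p2"]; A = ["a1", "a2"]

# Ground randvars: Hot, DoR(X), AttC(A), Pub(X, P) -> 12 booleans.
randvars = (["Hot"]
            + [f"DoR({x})" for x in X]
            + [f"AttC({a})" for a in A]
            + [f"Pub({x},{p})" for x in X for p in P])

def weight(assign):
    """Product over gr(G): 12 groundings of g0 and 6 groundings of g1."""
    w = 1.0
    for x in X:
        for p in P:
            for a in A:
                w *= phi0(assign[f"Pub({x},{p})"], assign[f"AttC({a})"],
                          assign["Hot"])
        for a in A:
            w *= phi1(assign[f"DoR({x})"], assign[f"AttC({a})"],
                      assign["Hot"])
    return w

# P(G) = 1/Z * prod_{f in gr(G)} phi(A_f); query P(Hot) by summation.
Z = 0.0
hot_w = 0.0
for values in product([False, True], repeat=len(randvars)):
    assign = dict(zip(randvars, values))
    w = weight(assign)
    Z += w
    if assign["Hot"]:
        hot_w += w

print(round(hot_w / Z, 4))
```

The enumeration already visits 2^12 states for this tiny model, which is exactly why lifted algorithms such as LVE, LJT, and LDJT avoid materialising gr(G) whenever the symmetries allow it.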