
Exploiting Tree Decomposition and Soft Local Consistency in Weighted CSP

Simon de Givry and Thomas Schiex
INRA, Toulouse, France
{degivry,tschiex}@toulouse.inra.fr

Gerard Verfaillie
ONERA - DCSD, Toulouse, France
[email protected]

Abstract

Several recent approaches for processing graphical models (constraint and Bayesian networks) simultaneously exploit graph decomposition and local consistency enforcing. Graph decomposition exploits the problem structure and offers space and time complexity bounds, while hard information propagation provides practical improvements of space and time behavior inside these theoretical bounds.

Concurrently, the extension of local consistency to weighted constraint networks has led to important improvements in branch and bound based solvers. Indeed, soft local consistencies give incrementally computed strong lower bounds, providing inexpensive yet powerful pruning and better informed heuristics.

In this paper, we consider combinations of tree decomposition based approaches and soft local consistency enforcing for solving weighted constraint problems. The increasing intricacy of weighted information processing leads to different approaches, with different theoretical properties. It appears that the most promising combination sacrifices a bit of theory for improved practical efficiency.

Copyright (c) 2006, American Association for Artificial Intelligence (www.aaai.org). All rights reserved.

Introduction

Graphical model processing is a central problem in AI. In recent years, in order to solve satisfaction, optimization or counting problems, several algorithms have been proposed that simultaneously exploit a decomposition of the graph of the problem and the propagation of hard information using local consistency enforcing. This includes algorithms such as Recursive Conditioning (RC) (Darwiche 2001), Backtracking bounded by Tree Decomposition (BTD) (Terrioux & Jegou 2003), and AND/OR tree and graph search (Marinescu & Dechter 2005b; 2005a), all related to Pseudo-tree search (Freuder & Quinn 1985).

Implicitly or not, all these algorithms exploit the properties of tree decompositions (Bodlaender 2005), which capture structural independence information, in order to obtain theoretical bounds on the time and space complexity of the algorithms. The combination with hard information propagation provides additional pruning, leading to improved practical time and space complexity inside the original bounds. Note however that in the context of tree search, these bounds are obtained at the cost of restricted freedom in variable assignment ordering (see (Bacchus, Dalmao, & Pitassi 2003) on this topic).

In this paper, we are specifically interested in optimization problems in the framework of valued and weighted constraint networks (Schiex, Fargier, & Verfaillie 1995). Weighted constraint networks (WCN) provide a very general model with several applications in domains such as resource allocation and combinatorial auctions.

In a first part, we show that, with a limited weakening of existing theoretical complexity bounds, an alternative combination of tree decomposition and branch and bound can be defined. In a second part, we exploit the extension of local consistency to WCN. This extension has led to increasingly efficient branch and bound algorithms using increasingly strong local consistency properties (Cooper & Schiex 2004; Larrosa & Schiex 2004; 2003). They offer incrementally maintained strong bounds and also contribute to better informed variable and value orderings. Introduced in tree decomposition based approaches, they should enhance their practical time and space behavior and also provide better guidance.

Taken together, these two parts show the increased intricacy of weighted information processing, leading to different possible combinations, each having different theoretical properties and practical efficiencies.

Preliminaries

A weighted binary CSP (WCSP) is a triplet (X, D, W). X = {1, ..., n} is a set of n variables. Each variable i ∈ X has a finite domain D_i ∈ D of values that can be assigned to it. The maximum domain size is d. W is a set of soft constraints (or cost functions). A binary soft constraint W_ij ∈ W is a function W_ij : D_i × D_j → [0, k], where k is a given maximum integer cost corresponding to a completely forbidden assignment (expressing hard constraints). If they do not exist, we add to W one unary cost function W_i : D_i → [0, k] for every variable, and a zero-arity constraint W_∅ (a constant cost paid by any assignment). All these additional cost functions have initial value 0, leaving the semantics of the problem unchanged.
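The WCSP conventions above (domains, unary and binary cost functions, a top cost k acting as "forbidden") can be sketched in a few lines of Python. The variable indices and toy costs below are ours, purely illustrative, not taken from the paper:

```python
# Sketch of a tiny weighted binary CSP (X, D, W) with integer costs in [0, k].
# All names and numbers are illustrative, not from the paper.
k = 10                                   # maximum cost: k means "completely forbidden"
D = {1: [0, 1], 2: [0, 1]}               # domains D_i for variables X = {1, 2}
W0 = 0                                   # zero-arity constraint W_empty (initially 0)
W1 = {1: {0: 0, 1: 2}, 2: {0: 1, 1: 0}}  # unary cost functions W_i
W2 = {(1, 2): {(0, 0): 0, (0, 1): 3, (1, 0): k, (1, 1): 0}}  # binary cost functions W_ij

def cost(assignment):
    """Cost of a complete assignment: W0 + sum W_i(a_i) + sum W_ij(a_i, a_j)."""
    c = W0
    c += sum(W1[i][a] for i, a in assignment.items())
    c += sum(f[(assignment[i], assignment[j])] for (i, j), f in W2.items())
    return min(c, k)                     # costs are capped at k

# Brute-force minimization over complete assignments (feasible only for toys):
best = min(({1: a, 2: b} for a in D[1] for b in D[2]), key=cost)
```

Here the pair (1, 0) carries cost k, i.e. it is a hard constraint forbidding that combination.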

The problem is then to find a complete assignment with a minimum cost:

min_{(a_1,...,a_n) ∈ ∏ D_i} { W_∅ + Σ_{i=1}^{n} W_i(a_i) + Σ_{W_ij ∈ W} W_ij(a_i, a_j) },

an optimization problem with an associated NP-complete decision problem.

The constraint graph of a WCSP is a graph G = (X, E) with one vertex for each variable and one edge (i, j) for every binary constraint W_ij ∈ W. A tree decomposition of this graph is defined by a tree (C, T). The set of nodes of the tree is C = {C_1, ..., C_m}, where C_e is a set of variables (C_e ⊆ X) called a cluster. T is a set of edges connecting clusters and forming a tree (a connected acyclic graph). The set of clusters C must cover all the variables (∪_{C_e ∈ C} C_e = X) and all the constraints (∀(i, j) ∈ E, ∃C_e ∈ C s.t. i, j ∈ C_e). Furthermore, if a variable i appears in two clusters C_e and C_g, i must also appear in all the clusters C_f on the unique path from C_e to C_g in (C, T).

The rationale behind this definition is that acyclic problems can be solved with a good theoretical complexity. Thus, a tree decomposition decomposes a problem into subproblems (clusters) organized in an acyclic graph, in such a way that the variables that relate two adjacent clusters (variables whose removal disconnects the subproblems) are obtained by intersecting the two clusters.

This is illustrated in Figure 1, where the graph of a frequency assignment problem is covered by clusters defining a tree decomposition. Because of the usual emphasis on purely random problems, one may think that such nice decompositions are exceptional, but the emphasis on modularity in design problems and the (spatial or temporal) locality of many problems easily induce such structures.

Figure 1: The constraint graph of the preprocessed RLFAP SCEN-06 problem covered by clusters and the associated tree decomposition using C_1 as root. Problem P_5 is outlined with separator S_5 = {57, 72}. The constraint W_{57,72} inside S_5 is not part of P_5. P_6 has a separator S_6 = {69}.
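The three conditions in the tree decomposition definition (variable cover, constraint cover, and the connectedness requirement on clusters sharing a variable) can be checked mechanically. The following sketch is ours; the tiny example clusters are illustrative and unrelated to Figure 1:

```python
# Checker for the three tree-decomposition conditions (a sketch; the cluster
# numbering and the toy instances in the tests are ours, not the paper's).
from collections import deque

def is_tree_decomposition(clusters, tree_edges, variables, constraint_edges):
    # 1. Every variable is covered by some cluster.
    if set().union(*clusters.values()) != set(variables):
        return False
    # 2. Every constraint (i, j) fits inside at least one cluster.
    for (i, j) in constraint_edges:
        if not any({i, j} <= c for c in clusters.values()):
            return False
    # 3. Running intersection: the clusters containing a given variable
    #    must induce a connected subtree of (C, T).
    adj = {e: set() for e in clusters}
    for (e, f) in tree_edges:
        adj[e].add(f); adj[f].add(e)
    for i in variables:
        holding = {e for e, c in clusters.items() if i in c}
        start = next(iter(holding))
        seen, todo = {start}, deque([start])
        while todo:
            e = todo.popleft()
            for f in adj[e] & holding - seen:
                seen.add(f); todo.append(f)
        if seen != holding:
            return False
    return True
```

A path of clusters {1,2}-{2,3}-{3,4} over variables 1..4 passes; moving variable 2 into a cluster not adjacent to the other cluster containing it violates the third condition.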
The tree-width of a tree decomposition (C, T) is equal to max_{C_e ∈ C} {|C_e|} − 1, denoted w in the sequel. The tree-width w* of G is the minimum tree-width over all tree decompositions of G (also called the induced-width of G). If a root C_r ∈ C is chosen, the maximum number of variables appearing in a path starting from C_r is called the tree-height of the decomposition. Finding a minimum tree-width or a minimum tree-height decomposition is NP-hard.

Exploiting tree decompositions

For a given WCSP, we consider a rooted tree decomposition (C, T) with an arbitrary root C_1. We denote by Father(C_e) (resp. Sons(C_e)) the parent (resp. set of sons) of C_e in T. The separator of C_e is the set S_e = C_e ∩ Father(C_e). The set of proper variables of C_e is V_e = C_e \ S_e. Note that the V_e family defines a partition of X. For a given variable i ∈ X, we denote by [i] the index of the cluster such that i ∈ V_[i].

The essential property of tree decomposition is that assigning S_e separates the initial problem into two subproblems which can then be solved independently. The first subproblem, denoted P_e, is defined by the variables of C_e and all its descendant clusters in T, and by all the cost functions involving at least one proper variable of these clusters. The remaining constraints, together with the variables they involve, define the remaining subproblem.

This property has been exploited by many related algorithms. In a branch and bound framework, it may be exploited by restricting the variable ordering. Imagine all the variables of a cluster C_e are assigned before any of the remaining variables in its son clusters, and consider a current assignment A. Then, for any cluster C_f ∈ Sons(C_e), and for the current assignment A_f of the separator S_f, the subproblem P_f under assignment A_f (denoted P_f/A_f) can be solved independently from the rest of the problem. If memory allows, the optimal cost of P_f/A_f may be recorded, which means it will never be solved again for the same assignment of S_f.

If d is the maximum domain size, h the decomposition tree-height and w the tree-width, this idea applied recursively with no recording guarantees that a cluster C_e will never be visited more than d to the power of the number of variables in the path from the root to C_e (proper variables excluded). This gives a guaranteed bound on the number of visited nodes which is dominated by O(d^h). This covers Pseudo-tree or AND/OR tree search, also applied to WCSP in (Larrosa, Meseguer, & Sanchez 2002; Marinescu & Dechter 2005b).

With recording, a cluster C_e will never be visited more than the number of assignments of S_e. Given the number of variables in V_e, this means that the number of nodes is dominated by O(d^{w+1}), w+1 being the size of the largest cluster. This gives algorithms such as RC with full recording, BTD and AND/OR graph search, which can be considered as structural learning algorithms. The side effect is that the memory required may be exponential in the separator size.

Exploiting better upper bounds

Traditional depth-first branch and bound algorithms are tree-search algorithms exploiting two bounds. In the WCSP case, where a minimum cost solution is sought, the best solution found so far provides an upper bound on the optimum. During search, at a given node, an extra dedicated lower bounding mechanism provides a lower bound on the cost of any complete assignment below the current node. If the lower bound exceeds the upper bound, one can backtrack. Typically, when search starts, the upper bound is either set to the maximum cost k or to the cost of a solution found e.g. by local search.

With the basic combination of tree decomposition and branch and bound, as in BTD applied to optimization problems (Jegou & Terrioux 2004), once a separator S_e is assigned, the corresponding problem P_e/A_e is solved as an independent problem. This means that its initial upper bound is set to k. So, existing algorithms such as BTD or AND/OR search do not impose any initial upper bound when solving a subproblem. Although this effectively guarantees that the optimum of P_e/A_e will be computed, it provides limited pruning. Furthermore, this optimal cost, once combined with the cost of already totally assigned clusters and with existing lower bounds on the remaining unexplored clusters, may well exceed the cost of the best solution found so far: useless optimization has been performed.

It may therefore be better to start solving P_e/A_e with a non-trivial upper bound obtained by taking the difference between the upper bound associated with the parent problem P_Father(C_e) and a lower bound on P_Father(C_e) minus P_e. However, this has a bad side-effect: after solving the subproblem, if a solution with a cost strictly lower than the initial upper bound is found, this solution is optimal and can be recorded as such. But if no solution is found, we just have a new lower bound on the optimum of P_e/A_e. The lower bound and the fact that it is optimal can be recorded in LB_{P_e/A_e} and Opt_{P_e/A_e} respectively, initially set to 0 and false.

If the search comes back later to the problem P_e/A_e, even if this lower bound has been recorded, the subproblem may need to be solved again. However, since the recorded lower bound will necessarily be improved at each resolution, the number of successive resolutions is bounded by k (and more tightly by the optimum of P_e/A_e).

The resulting algorithm has the same space complexity as BTD and the number of visited nodes is bounded by O(k·d^{w+1}), where w is the tree-width of the decomposition used. Note that because it still exploits problem decomposition, the number of nodes is also dominated by O(d^h), where h is the decomposition tree-height (thus still better than branch and bound in O(d^n)).

Local consistency and tree decomposition

Independently of tree decomposition approaches, the algorithmics of WCSP has been widely improved in the last years by exploiting the extension of local consistency to cost functions. Several increasingly strong local consistencies have been defined (soft node and arc consistency (Larrosa & Schiex 2004), full directional arc consistency (DAC/FDAC) (Larrosa & Schiex 2003; Cooper & Schiex 2004)), leading to increasingly efficient branch and bound algorithms.

As in the classical case, enforcing local consistency on the initial problem provides, in polynomial time, an equivalent problem (defining the same cost distribution on complete assignments) with possibly smaller domains. It may also increase the value of W_∅ and the costs W_i(a) associated with domain values. W_∅ defines a strong lower bound which can be exploited by branch and bound algorithms, while the updated W_i(a) can inform variable and value orderings.

For our purpose, we point out that enforcing such local consistencies is done by the repeated application of atomic operations called arc equivalence preserving transformations (Cooper & Schiex 2004). If we consider two cost functions W_ij and W_i, a value a ∈ D_i and a cost α, we can add α to W_i(a) and subtract α from every W_ij(a, b) for all b ∈ D_j. Simple arithmetic shows that the global cost distribution is unchanged, while costs may have moved from the binary to the unary level (if α > 0, this is called a projection) or from the unary to the binary level (if α < 0, this is called an extension). In these operations, any cost above k, the maximum allowed cost, can be considered as infinite and is thus unaffected by subtraction. If no negative cost appears and if all costs above k are set to k, the remaining problem is always a valid and equivalent WCSP.

The same mechanism, at the unary level, can be used to move costs from the W_i to W_∅. Finally, any value a such that W_i(a) + W_∅ is equal to k can be deleted.

From this description, one may already see that, even if it preserves the cost of complete assignments, local consistency may move costs between clusters, thereby invalidating previously recorded information.

Binary projection and extension. Moving costs between cost functions W_ij and W_i may modify the cost distributions (and therefore the recorded optima) of subproblems. This happens only when W_i and W_ij are associated with different subproblems, such as W_72 ∈ C_1 and W_{72,69} ∈ C_5 in Fig. 1, and thus when the first cluster from the root containing i, C_[i], is an ascendant of C_[j]. In this case, the cost distribution of P_[j] and of all ascendant subproblems up to (but not including) P_[i] is decreased or increased (according to α's sign).

We store these cost modifications in a specific backtrackable data structure ∆W_i^e(a). There is one counter for every value of every variable in every separator, resulting in a reasonable O(mnd) space complexity. Initially, ∆W is set to zero. When an arc equivalence preserving operation is performed as above, all the ∆W_i^e(a) counters for all separators S_e, from S_[j] up to S_[i] (excluded), are updated: ∆W_i^e(a) := ∆W_i^e(a) + α.

During the search, when a separator S_e is assigned by assignment A_e, we can obtain the total cost that has been moved out of the subproblem by summing up all the ∆W_i^e(a) for all values (i, a) in the separator assignment A_e, and correct any recorded information. This summation is denoted ∆W_{P_e/A_e} in the following. For instance, in Fig. 1, ∆W_{P_5/A_5} = ∆W_57^5(a) + ∆W_72^5(b) with A_5 = {(57, a), (72, b)}.

Unary projection. Local consistency enforcing uses a similar mechanism to move costs from unary cost functions W_i to the problem lower bound W_∅. In order to avoid the corresponding subproblem cost modifications, we split W_∅ into local W_∅^e cost functions, each associated with one cluster. When cost from a unary constraint W_i is moved to the zero-arity level, it is moved to W_∅^[i]. For a given subproblem P_e, its associated lower bound is obtained by summing up all local W_∅^f for all the clusters C_f in P_e, defining a local lower bound denoted W_∅^{P_e}. For the complete problem, this local lower bound is equal to the global lower bound W_∅.

For any strict subproblem P_e, e ∈ {2, ..., m}, and any assignment A_e, we may therefore have two lower bounds: one provided by the lower bound recording mechanism (LB_{P_e/A_e} − ∆W_{P_e/A_e}) and one provided by local consistency (W_∅^{P_e}). In the following, we use the maximum of these two bounds, denoted lb(P_e/A_e). We generalize the previous definition to any subproblem P_e, e ∈ {1, ..., m}, and any partial assignment A: lb(P_e/A) = W_∅^e + Σ_{C_f ∈ Sons(C_e)} lb(P_f/A_f), taking into account local consistency and recorded lower bounds as soon as their separator is completely assigned by A.

For instance, in Fig. 1, let A = {(57, a), (72, b), (69, c)}; then lb(P_5/A) = W_∅^5 + lb(P_6/A_6) = W_∅^5 + max{W_∅^{P_6}, LB_{P_6/A_6} − ∆W_{P_6/A_6}} = W_∅^5 + max{W_∅^6 + W_∅^7, LB_{P_6/A_6} − ∆W_69^6(c)} with A_6 = {(69, c)}.

Value removal based on local cuts. The last mechanism that must be adapted is value removal. For instance, in Fig. 1, while solving P_5 under assignment A with current upper bound cub, a value u ∈ D_i of a variable i ∈ V_5 is removed if lb(P_5/A) + W_i(u) ≥ cub, and any value v ∈ D_j of a variable j ∈ V_6 ∪ V_7 is removed if W_∅ + W_j(v) ≥ cub.

Value removal based on local and global cuts. Inside a subproblem, a value can be removed for local reasons if W_∅^e + W_i(a) ≥ cub but also, as in traditional branch and bound, for global reasons, if W_∅ + W_i(a) ≥ gub. Because neither of these conditions includes the other, they can both be useful. However, the global condition may prune the exploration of cluster C_e to the point that we lose first the guarantee that if a solution is found it is optimal, but also the guarantee that an improved lower bound can be deduced from the absence of a solution. Even if a lower bound can still be inferred by collecting leaf costs, the number of successive resolutions is no longer bounded by k and the only bound on the total number of nodes is always in O(d^h). When recording is used, this approach is therefore dominated in theory (either in time or space) by the previous approaches. Since experimental results (not reported) did not exhibit interesting performance, this double-cut approach is only considered without recording (pseudo-tree search).

Below, we present the pseudo-code of the Lc-BTD+ algorithm, combining tree decomposition and a given level of local consistency (Lc). This algorithm uses our initial enhanced upper bound (line 1), value removal based on local cuts, and optional recording (lines 2 and 3). The initial call is Lc-BTD+(∅, C_1, V_1, k). It assumes an already Lc-consistent problem and returns its optimum. Function pop(S) returns an element of S and removes it from S.

Function Lc-BTD+(A, C_e, V, cub) : integer
  if (V = ∅) then
    S := Sons(C_e) ;
    /* Solve all cluster sons whose optima are unknown */ ;
    while (S ≠ ∅ and lb(P_e/A) < cub) do
      ...
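The equivalence-preserving nature of a projection can be checked numerically: moving α from a binary cost function onto a unary one leaves the cost of every complete assignment unchanged. The sketch below uses toy costs of our own, not the paper's:

```python
# Sketch of an equivalence-preserving projection: move cost alpha from the
# binary function W_12 onto the unary W_1(a), then check that the cost of
# every complete assignment is unchanged. Toy numbers, illustrative only.
import copy

k = 4
D = {1: [0, 1], 2: [0, 1]}
W1 = {1: {0: 0, 1: 0}, 2: {0: 0, 1: 1}}
W12 = {(0, 0): 2, (0, 1): 3, (1, 0): 0, (1, 1): 1}

def total(W1, W12, a1, a2):
    return min(W1[1][a1] + W1[2][a2] + W12[(a1, a2)], k)

before = {(a1, a2): total(W1, W12, a1, a2) for a1 in D[1] for a2 in D[2]}

# Project alpha = min_b W12(0, b) from W12(0, .) onto W1[1][0]:
alpha = min(W12[(0, b)] for b in D[2])
W1p, W12p = copy.deepcopy(W1), dict(W12)
W1p[1][0] += alpha
for b in D[2]:
    W12p[(0, b)] -= alpha

after = {(a1, a2): total(W1p, W12p, a1, a2) for a1 in D[1] for a2 in D[2]}
```

Here `before == after` holds, while a cost of 2 has moved from the binary to the unary level; running the same update with α < 0 would be an extension.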

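The correction of recorded information by the ∆W counters reduces to simple arithmetic: the usable bound for P_e/A_e is the maximum of the local-consistency bound and the recorded bound minus the cost moved out of the subproblem. A sketch, with separator variables borrowed from Fig. 1 (S_5 = {57, 72}) but counter values of our own:

```python
# Sketch of the corrected recorded bound for a subproblem P_e/A_e:
# lb = max(local W0 of the subproblem, recorded LB minus the moved cost).
# Counter values below are illustrative, not from the paper.

def moved_cost(delta, sep_assign):
    """Sum the Delta_W_i^e(a) counters over the separator assignment A_e."""
    return sum(delta[(i, a)] for i, a in sep_assign.items())

def lb(W0_e, LB, delta, sep_assign):
    """Usable lower bound: max of local consistency and corrected recording."""
    return max(W0_e, LB - moved_cost(delta, sep_assign))

# e.g. for S_5 = {57, 72} with A_5 = {57: 'a', 72: 'b'}:
delta = {(57, 'a'): 1, (72, 'b'): 2}
```

With these numbers, a recorded LB of 9 is corrected down by 3; whether it still beats the local bound W_∅ of the subproblem depends on the latter's value.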
As before, the number of visited nodes remains bounded by O(k·d^{w+1}). Because of branch and bound pruning, it is possible that only a small fraction of the theoretical space will be used. Our implementation therefore uses sparse data structures (hash tables) for recording information.

Experimental results

In this section, we perform an empirical comparison of the algorithm Lc-BTD+ with classical BTD exploiting Forward Checking (FC-BTD) (Jegou & Terrioux 2004), traditional depth-first branch and bound (MFDAC) (Larrosa & Schiex 2003), and Variable Elimination (VE) (Dechter 1999), a dynamic programming algorithm with time and space complexity in O(d^{w+1}), on random and real-world binary instances.

Several versions of our algorithm can be considered. Without recording (lines 2 and 3 removed), we get a variant of pseudo-tree search enhanced with local consistency enforcing, denoted FDAC-PTS. This algorithm is similar to the AOMFDAC algorithm of (Marinescu & Dechter 2005b). In this case, local and global cuts are used for value removal. For the local consistency Lc, we tested Node Consistency (NC-BTD+) and the more powerful FDAC (FDAC-BTD+).

All implementations are in C code (open source solver TOOLBAR, http://carlit.toulouse.inra.fr/cgi-bin/awki.cgi/SoftCSP). Experiments were performed on a 2.4 GHz Xeon with 4 GB.

Randomly generated instances. Tree decomposition approaches are useless for pure unstructured random problems. We designed a structured Max-CSP (violation has unit cost) random problem generator using a binary clique tree. The model parameters are (w, s, h, d, t), where w+1 is the clique size, s the separator size, h the height of the clique tree, d the domain size, and t the constraint tightness (percentage of forbidden tuples). The total number of variables is equal to n = s + (w+1−s)(2^h − 1). Our tree decomposition method produces a tree-width equal to w and a tree-height equal to h(w+1−s) + s.

We have generated and solved (w = 9, h = 3, d = 5) clique trees with varying separator sizes s ∈ {2, 5, 7} and varying constraint tightness t ∈ [30, 90]% (over-constrained instances only). Averages over 50 instances are reported. All algorithms (except VE) use the dom/deg dynamic variable ordering heuristic (locally inside clusters for tree decomposition based methods). For value selection, we consider values in increasing order of unary cost W_i. The DAC variable ordering is lexicographic. The tree decomposition and the variable elimination order are computed using the maximum cardinality search (MCS) heuristic (Tarjan & Yannakakis 1984) with linear time complexity in O(n + |E|). A root minimizing the tree-height is used. The cpu-time reported excludes the time to compute a tree decomposition (always less than 10 ms) and is limited to 5 minutes per instance (unsolved instances are assigned a 5 minute solving time). Fig. 2 reports times in seconds and memory usage in number of recorded lower bounds.

For weak local consistencies such as FC and NC (which provide the same lower bound), our enhanced initial upper bound saves space at the cost of time on these problems. With a stronger local consistency, FDAC-BTD+ is always among the best algorithms and may outperform FC-BTD by two orders of magnitude for large separator sizes and high constraint tightness. For small and medium separators, recording seems to be a crucial ingredient for efficiency, and FDAC-PTS is even dominated by the NC/FC recording approaches in this case. MFDAC was unable to solve the instances with small separator size and large number of variables. Finally, variable elimination took constant time and space (bounded by d^{w+1} = 5^10 = 9.8·10^6). Over all instances, the number of successive resolutions of any subproblem P_e/A_e never exceeded 11 for FDAC-BTD+ and 17 for NC-BTD+, far from the theoretical bound defined by the largest optimal cost, here equal to 202.

Real-world instances. The Radio Link Frequency Assignment Problem (RLFAP) (Cabon et al. 1999) consists in assigning frequencies to a set of radio links in such a way that all the links may operate together without noticeable interference. Some RLFAP instances can be naturally cast as weighted binary CSPs. We focused on instance SCEN-06 and its sub-instances SUB1 and SUB4. Compared to the previous settings, we used the best known solution cost (optimal) as the initial global upper bound, preprocessed the MCS tree decomposition by merging clusters with a separator size strictly greater than 3 (MCS time always less than 10 ms), and used the 2-sided Jeroslow-like heuristic for variable selection. The chosen root cluster maximizes the product of domain sizes. By doing so, the root clusters of SCEN-06 and SUB4 are the same (20 variables). The time limit is 10 hours. The same statistics on time and the number of recorded lower bounds (#LB) are reported.

  RLFAP         SUB1             SUB4             SCEN-06
  optimum       2669             3230             3389
  n, d, w, h    14, 44, 13, 14   22, 44, 19, 21   100, 44, 19, 67
  Method        time     #LB     time     #LB     time      #LB
  FC-BTD        1197     0       -        0       -         -
  NC-BTD+       490      0       -        0       -         -
  FDAC-BTD+     14       0       929      0       10,309    326
  FDAC-PTS      14       n/a     851      n/a     -         n/a
  MFDAC         14       n/a     984      n/a     -         n/a

Again, FDAC-BTD+ shows its robustness. Slightly dominated by FDAC-PTS on the SUB4 problem, it is the only algorithm able to prove the optimality of the full SCEN-06 instance. The recording, even if quite limited (326 recorded lower bounds over a possible maximum of roughly one million (M), used 5.5M times during the exploration of 16M search nodes), is the only difference (except the value removal policy) with the FDAC-PTS algorithm, which did not solve the problem within the 10 hour limit ("-" symbol) but finished in 2 days. Variable elimination did not solve any instance due to the 4 GB memory limit.

Compared with related work, in (Jegou & Terrioux 2004), FC-BTD without an initial global upper bound solved SUB4 in 122,933 seconds. AOMFDAC using a static variable ordering solved the same sub-instance in 47,115 seconds (Marinescu & Dechter 2005b). Both experiments were performed on a 2.4 GHz Pentium IV computer but used a different tree decomposition from ours.
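The generator's bookkeeping can be checked numerically; the functions below simply evaluate the two formulas from the text for the experimental setting (w = 9, h = 3, d = 5):

```python
# Clique-tree generator arithmetic from the text: number of variables
# n = s + (w+1-s)(2^h - 1) and tree-height h(w+1-s) + s.

def n_vars(w, s, h):
    return s + (w + 1 - s) * (2 ** h - 1)

def tree_height(w, s, h):
    return h * (w + 1 - s) + s

# For w = 9, h = 3 and s in {2, 5, 7}:
sizes = {s: n_vars(9, s, 3) for s in (2, 5, 7)}

# Variable elimination space for d = 5: d^(w+1) = 5^10, i.e. about 9.8e6.
space = 5 ** (9 + 1)
```

This recovers the orders of magnitude quoted in the text, e.g. the d^{w+1} = 9.8·10^6 bound on the space used by variable elimination.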

Figure 2: Time in seconds (y-axis in log-scale) and number of recorded lower bounds (y-axis in linear scale; its maximum corresponds to the theoretical maximum) to find the optimum and prove optimality of random binary clique trees with various separator sizes s equal to 2, 5, and 7 (from left to right). Algorithms are sorted from the worst (top) to the best (bottom) at t = 90%. n, d, c, w and h are the variable number, domain size, graph connectivity, tree-width, and tree-height respectively.

First solved by a partitioning approach (SUB4 being the largest component), the full SCEN-06 instance was also solved by a specific dynamic programming algorithm in 27,102 seconds on a DEC 2100 A500MP workstation (Koster 1999). However, our approach is more generic: we do not use any preprocessing dedicated to frequency assignment problems, and it uses less memory (326 recorded lower bounds compared to roughly one million (Koster 1999)).

Conclusion

In this paper, we have proposed a branch and bound algorithm exploiting tree decomposition and local consistency. Among the possible tree decomposition based approaches, we found that the most promising and robust approach combines initial enhanced upper bounds, strong local consistency, corrected recorded lower bounds, and value removal based on local cuts. The resulting algorithm sacrifices a bit of theory for improved practical (time and space) overall efficiency, as shown on random and real-world instances.

We have presented our algorithm in the framework of weighted binary CSP. Our approach is also applicable to non-binary soft constraints, opening the door to other domains such as Max-SAT and MPE in Bayesian networks.

References

Bacchus, F.; Dalmao, S.; and Pitassi, T. 2003. Value elimination: Bayesian inference via backtracking search. In Proc. of UAI-03, 20–28.

Bodlaender, H. 2005. Discovering treewidth. In Theory and Practice of Computer Science - SOFSEM 2005, 1–16.

Cabon, B.; de Givry, S.; Lobjois, L.; Schiex, T.; and Warners, J. 1999. Radio link frequency assignment. Constraints 4:79–89.

Cooper, M., and Schiex, T. 2004. Arc consistency for soft constraints. Artificial Intelligence 154:199–227.

Darwiche, A. 2001. Recursive conditioning. Artificial Intelligence 126(1-2):5–41.

Dechter, R. 1999. Bucket elimination: A unifying framework for reasoning. Artificial Intelligence 113(1-2):41–85.

Freuder, E. C., and Quinn, M. J. 1985. Taking advantage of stable sets of variables in constraint satisfaction problems. In Proc. of IJCAI-85, 1076–1078.

Jegou, P., and Terrioux, C. 2004. Decomposition and good recording. In Proc. of ECAI-04, 196–200.

Koster, A. 1999. Frequency Assignment: Models and Algorithms. Ph.D. Dissertation, Maastricht, The Netherlands.

Larrosa, J., and Schiex, T. 2003. In the quest of the best form of local consistency for weighted CSP. In Proc. of IJCAI-03, 239–244.

Larrosa, J., and Schiex, T. 2004. Solving weighted CSP by maintaining arc-consistency. Artificial Intelligence 159(1-2):1–26.

Larrosa, J.; Meseguer, P.; and Sanchez, M. 2002. Pseudo-tree search with soft constraints. In Proc. of ECAI-02, 131–135.

Marinescu, R., and Dechter, R. 2005a. Advances in AND/OR branch-and-bound search for constraint optimization. In CP-2005 Workshop on Preferences and Soft Constraints.

Marinescu, R., and Dechter, R. 2005b. AND/OR branch-and-bound for graphical models. In Proc. of IJCAI-05, 224–229.

Schiex, T.; Fargier, H.; and Verfaillie, G. 1995. Valued constraint satisfaction problems: hard and easy problems. In Proc. of IJCAI-95, 631–637.

Tarjan, R., and Yannakakis, M. 1984. Simple linear-time algorithms to test chordality of graphs, test acyclicity of hypergraphs, and selectively reduce acyclic hypergraphs. SIAM J. Comput. 13(3):566–579.

Terrioux, C., and Jegou, P. 2003. Hybrid backtracking bounded by tree-decomposition of constraint networks. Artificial Intelligence 146(1):43–75.
