Soft constraint processing

Soft constraint processing. Thomas Schiex, INRA – Toulouse, France. Javier Larrosa, UPC – Barcelona, Spain. These slides are provided as a teaching support for the community. They can be freely modified and used as long as the original authors' (T. Schiex and J. Larrosa) contribution is clearly mentioned and visible, and any modification is acknowledged by its author.


Transcript of Soft constraint processing

  • Soft constraint processing. Thomas Schiex, INRA Toulouse, France. Javier Larrosa, UPC Barcelona, Spain.

    CP06

  • Overview. Frameworks: generic and specific. Algorithms: search (complete and incomplete), inference (complete and incomplete). Integration with CP: soft as hard; soft as global constraint.


  • Parallel mini-tutorial. CSP and SAT have a strong relation: along the presentation, we will highlight the connections with SAT.

    Multimedia trick: SAT slides have a yellow background.


  • Why soft constraints? The CSP framework is natural for decision problems; the SAT framework, for decision problems with boolean variables.

    Many problems are constrained optimization problems and the difficulty is in the optimization part


  • Why soft constraints? Earth Observation Satellite Scheduling: given a set of requested pictures (of different importance), select the best subset of compatible pictures, subject to available resources: 3 on-board cameras, data-bus bandwidth, setup-times, orbiting.

    Best = maximize the sum of importances.


  • Why soft constraints? Frequency assignment: given a telecommunication network, find the best frequency for each communication link, avoiding interferences.

    Best can be: minimize the maximum frequency (max), or minimize the global interference (sum).


  • Why soft constraints? Combinatorial auctions: given a set G of goods and a set B of bids, where bid (bi, vi) requests goods bi with value vi, find the best subset of compatible bids. Best = maximize revenue (sum).


  • Why soft constraints? Probabilistic inference (bayesian nets): given a probability distribution defined by a DAG of conditional probability tables, and some evidence, find the most probable explanation for the evidence (product).


  • Why soft constraints? Even in decision problems, users may have preferences among solutions.

    Experiment: give users a few solutions and they will find reasons to prefer some of them.


  • Observation: optimization problems are harder than satisfaction problems (CSP vs. Max-CSP).


  • Why is it so hard? Problem P(α): is there an assignment of cost lower than α? A proof of inconsistency of P(α) is a proof of optimality: harder than finding an optimum.


  • Notation. X = {x1, ..., xn} variables (n variables); D = {D1, ..., Dn} finite domains (max size d).

    For Z ⊆ Y ⊆ X: tY is a tuple on Y; tY[Z] is its projection on Z; tY[-x] = tY[Y-{x}] projects out variable x; fY : ∏_{xi ∈ Y} Di → E is a cost function on Y.


  • Generic and specific frameworks: valued CN, weighted CN, semiring CN, fuzzy CN.


  • Costs (preferences). E: set of costs (preferences), ordered by ≺; if a ≺ b then a is better than b. Costs are associated to tuples and combined with a dedicated operator ⊕:

    max: priorities (fuzzy/possibilistic CN); +: additive costs (weighted CN); *: factorized probabilities (probabilistic CN, BN).


  • Soft constraint network (CN): (X, D, C). X = {x1, ..., xn} variables; D = {D1, ..., Dn} finite domains; C = {fS, ...} cost functions: fS, fij, fi, f∅ with scopes S, {xi, xj}, {xi}, ∅; fS(t) ∈ E (ordered by ≺, maximum element ⊤).

    Objective function: F(X) = ⊕_S fS(X[S]). Solution: t such that F(t) ≺ ⊤. Task: find an optimal solution. ⊕ has an identity, is commutative, associative and monotonic; ⊤ is an annihilator.
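The objective function above can be sketched in Python for the weighted case (⊕ = bounded addition). This is a hedged illustration, not the authors' code: the table-based representation, all names, and ⊤ = 4 are my assumptions.

```python
# Minimal sketch of a weighted constraint network (assumed representation).
TOP = 4  # maximum acceptable violation (the "T" of the slides; value assumed)

def combine(a, b):
    """Bounded addition: TOP is an absorbing element (annihilator)."""
    return min(a + b, TOP)

# A cost function is (scope, table): scope is a tuple of variable names,
# table maps value tuples to costs in {0, ..., TOP}.
f1 = (("x1",), {("b",): 0, ("g",): 1, ("r",): 1})        # prefer blue
f12 = (("x1", "x2"), {("b", "b"): TOP, ("b", "g"): 0,    # hard: same color forbidden
                      ("g", "b"): 0, ("g", "g"): TOP})

def objective(assignment, cost_functions):
    """F(X) = combination over S of fS(X[S])."""
    total = 0
    for scope, table in cost_functions:
        total = combine(total, table[tuple(assignment[v] for v in scope)])
    return total
```

For example, `objective({"x1": "b", "x2": "g"}, [f1, f12])` evaluates a complete assignment; an assignment violating the hard constraint reaches TOP.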


  • Specific frameworks


  • Weighted clauses. (C, w) is a weighted clause: C a disjunction of literals, w the cost of its violation, w ∈ E (ordered by ≺, with ⊤), ⊕ the combinator of costs. Cost functions = weighted clauses, e.g. (xi ∨ xj, 6), (¬xi ∨ xj, 2), (¬xi ∨ ¬xj, 3):

    xi  xj  f(xi,xj)
    0   0   6
    0   1   0
    1   0   2
    1   1   3
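The correspondence between weighted clauses and their cost function can be checked mechanically. A small sketch (the `(variable, sign)` literal encoding is my assumption, and the clause polarities are reconstructed from the cost table):

```python
# A literal is (var, sign): satisfied when assignment[var] == sign.
# A weighted clause (literals, w) costs w exactly when every literal is falsified.

def clause_violated(literals, assignment):
    return all(assignment[v] != s for v, s in literals)

def cost(clauses, assignment):
    return sum(w for lits, w in clauses if clause_violated(lits, assignment))

# The three clauses of the slide: (xi v xj, 6), (~xi v xj, 2), (~xi v ~xj, 3)
clauses = [([("xi", True), ("xj", True)], 6),
           ([("xi", False), ("xj", True)], 2),
           ([("xi", False), ("xj", False)], 3)]
```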


  • Soft CNF formula. F = {(C, w), ...} is a set of weighted clauses: (C, ⊤) is a mandatory clause, (C, w) with w ≺ ⊤ a soft clause.
  • Specific weighted prop. logics


  • CSP example (3-coloring). Variables x1, ..., x5, one per vertex of a graph; for each edge, a hard constraint:

    xi  xj  f(xi,xj)
    b   b   ⊤
    b   g   0
    b   r   0
    g   b   0
    g   g   ⊤
    g   r   0
    r   b   0
    r   g   0
    r   r   ⊤


  • Weighted CSP example (⊕ = +). Variables x1, ..., x5. F(X): number of non-blue vertices; for each vertex, a unary cost function (cost 1 if not blue).


  • Possibilistic CSP example (⊕ = max). Variables x1, ..., x5. F(X): highest color used (b ≺ g ≺ r).

  • Some important details. ⊤ = maximum acceptable violation. The empty-scope soft constraint f∅ (a constant) gives an obvious lower bound on the optimum; if you do not like it, set f∅ = 0.

    Additional expressive power.


  • Weighted CSP example (⊕ = +). F(X): number of non-blue vertices. A unary cost function for each vertex and, for each edge, a hard constraint. ⊤ = 6, f∅ = 0. With ⊤ = 3: optimal coloration with fewer than 3 non-blue vertices.

    xi  xj  f(xi,xj)
    b   b   ⊤
    b   g   0
    b   r   0
    g   b   0
    g   g   ⊤
    g   r   0
    r   b   0
    r   g   0
    r   r   ⊤


  • General frameworks and cost structures. Hard: costs in {0, ⊤}. Totally ordered costs, with an idempotent or fair ⊕: Valued CSP. Lattice-ordered costs, multicriteria, multiple structures: Semiring CSP.


  • Idempotency: a ⊕ a = a (for any a).

    For any fS implied by (X, D, C): (X, D, C) ≡ (X, D, C ∪ {fS}). Classic CN: ⊕ = and. Possibilistic CN: ⊕ = max. Fuzzy CN: ⊕ = max.


  • Fairness: ability to compensate for cost increases by subtraction, using a pseudo-difference ⊖:

    For b ≺ a: (a ⊖ b) ⊕ b = a. Classic CN: a ⊖ b = a or b (max). Fuzzy CN: a ⊖ b = max(a, b). Weighted CN: a ⊖ b = a − b if a ≠ ⊤, else ⊤. Bayes nets: a ⊖ b = a / b.
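For the weighted case, the pseudo-difference can be sketched as follows (a hedged illustration; representing ⊤ by `float("inf")`, i.e. unbounded costs, is my assumption):

```python
TOP = float("inf")  # stands for the maximum cost ("T" in the slides)

def combine(a, b):
    """The + of weighted CNs: ordinary addition, TOP absorbing."""
    return TOP if a == TOP or b == TOP else a + b

def subtract(a, b):
    """Pseudo-difference: a - b when a != TOP, else TOP (requires b <= a)."""
    if b > a:
        raise ValueError("pseudo-difference requires b <= a")
    return TOP if a == TOP else a - b
```

Fairness is then the identity `combine(subtract(a, b), b) == a` for b ≤ a.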


  • Processing soft constraints. Search: complete (systematic), incomplete (local). Inference: complete (variable elimination), incomplete (local consistency).


  • Systematic searchBranch and bound(s)


  • I - Assignment (conditioning). Assigning xi = b in f(xi, xj) yields a unary function g(xj); assigning g[xj = r] yields the constant 3.

    xi  xj  f(xi,xj)
    b   b   ⊤
    b   g   0
    b   r   3
    g   b   0
    g   g   ⊤
    g   r   0
    r   b   0
    r   g   0
    r   r   ⊤

    xj  g(xj)
    b   ⊤
    g   0
    r   3


  • I - Assignment (conditioning). {(x ∨ y ∨ z, 3), (¬x ∨ y, 2)}:

    conditioning on x = true leaves (y, 2); conditioning on y = false leaves (□, 2), the empty clause. It cannot be satisfied: 2 is a necessary cost.


  • Systematic search. LB = lower bound: an under-estimation of the best solution in the sub-tree. UB = upper bound: the best solution found so far. If LB ≥ UB then prune. Each node is a soft constraint subproblem. Initially LB = f∅ and UB = ⊤.


  • Depth First Search (DFS). BT(X, D, C): if X = ∅ then Top := f∅; else xj := selectVar(X); for all a ∈ Dj: for each fS ∈ C s.t. xj ∈ S, fS := fS[xj = a]; f∅ := ⊕ of all gS ∈ C with S = ∅; if f∅ < Top then BT(X − {xj}, D − {Dj}, C).
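A hedged Python rendering of this depth-first branch and bound (the table-based representation and every name are mine, not the slides'; the lower bound is the trivial one, the ⊕ of the fully assigned cost functions):

```python
def branch_and_bound(variables, domains, cost_functions, top):
    """DFS B&B for a weighted CSP. cost_functions: list of (scope, table)."""
    assignment = {}
    best = {"top": top, "sol": None}

    def lower_bound():
        # Trivial LB: sum the cost functions whose scope is fully assigned.
        lb = 0
        for scope, table in cost_functions:
            if all(v in assignment for v in scope):
                lb += table[tuple(assignment[v] for v in scope)]
        return lb

    def bt(remaining):
        if not remaining:
            cost = lower_bound()
            if cost < best["top"]:
                best["top"], best["sol"] = cost, dict(assignment)
            return
        x = remaining[0]
        for a in domains[x]:
            assignment[x] = a
            if lower_bound() < best["top"]:   # prune when LB >= UB
                bt(remaining[1:])
            del assignment[x]

    bt(list(variables))
    return best["top"], best["sol"]

# Tiny example: prefer blue, forbid (softly, cost 10) equal colors.
domains = {"x1": ["b", "g"], "x2": ["b", "g"]}
cfs = [(("x1",), {("b",): 0, ("g",): 1}),
       (("x2",), {("b",): 0, ("g",): 1}),
       (("x1", "x2"), {("b", "b"): 10, ("b", "g"): 0,
                       ("g", "b"): 0, ("g", "g"): 10})]
opt, sol = branch_and_bound(["x1", "x2"], domains, cfs, top=100)
```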
  • Improving the lower bound (WCSP)Sum up costs that will necessarily occur (no matter what values are assigned to the variables)

    PFC-DAC (Wallace et al. 1994); PFC-MRDAC (Larrosa et al. 1999); Russian Doll Search (Verfaillie et al. 1996); Mini-buckets (Dechter et al. 1998).


  • Improving the lower bound (Max-SAT)Detect independent subsets of mutually inconsistent clauses

    LB4a (Shen and Zhang, 2004); UP (Li et al., 2005); Max Solver (Xing and Zhang, 2005); MaxSatz (Li et al., 2006).


  • Local search: nothing really specific.


  • Local search: based on perturbation of solutions in a local neighborhood.

    Simulated annealing; tabu search; variable neighborhood search; greedy randomized adaptive search (GRASP); evolutionary computation (GA); ant colony optimization.

    See: Blum & Roli, ACM Computing Surveys, 35(3), 2003. For boolean variables: GSAT.


  • Boosting systematic search with local search. Do local search prior to systematic search; use the best cost found as initial ⊤. If it was optimal, we just prove optimality; in all cases, we may improve pruning. Local search on (X, D, C) within a time limit yields a sub-optimal solution.


  • Boosting systematic search with local search. Ex: frequency assignment problem, instance CELAR6-sub4 (#var: 22, #val: 44, optimum: 3230). Solver: toolbar 2.2 with default options. ⊤ initialized to 100000: 3 hours; ⊤ initialized to 3230: 1 hour. Optimized local search can find the optimum in less than 30 s (incop).


  • Complete inference: variable (bucket) elimination; graph structural parameters.


  • II - Combination (join, with ⊕ = + here): h = f ⊕ g.

    xi  xj  f(xi,xj)
    b   b   6
    b   g   0
    g   b   0
    g   g   6

    xj  xk  g(xj,xk)
    b   b   6
    b   g   0
    g   b   0
    g   g   6

    xi  xj  xk  h(xi,xj,xk)
    b   b   b   12
    b   b   g   6
    b   g   b   0
    b   g   g   6
    g   b   b   6
    g   b   g   0
    g   g   b   6
    g   g   g   12


  • III - Projection (elimination), with min. f[xi] projects f(xi, xj) down to xi; projecting everything out gives g[∅] = 0.

    xi  xj  f(xi,xj)
    b   b   4
    b   g   6
    b   r   0
    g   b   2
    g   g   6
    g   r   3
    r   b   1
    r   g   0
    r   r   6

    xi  f[xi]
    b   0
    g   2
    r   0


  • Properties

    Replacing two functions by their combination preserves the problem

    If f is the only function involving variable x, replacing f by f[-x] preserves the optimum
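The two operations behind these properties, combination and projection, can be sketched directly (a hedged illustration; the `(scope, table)` representation and all names are my assumptions):

```python
from itertools import product

def combine(f, g, domains):
    """Join two cost functions, summing costs (weighted CN, combination = +)."""
    (sf, tf), (sg, tg) = f, g
    scope = tuple(dict.fromkeys(sf + sg))            # union, order-preserving
    table = {}
    for values in product(*(domains[v] for v in scope)):
        t = dict(zip(scope, values))
        table[values] = tf[tuple(t[v] for v in sf)] + tg[tuple(t[v] for v in sg)]
    return scope, table

def project_out(f, x):
    """Eliminate variable x from f by minimizing over its values."""
    scope, table = f
    i = scope.index(x)
    new_scope = scope[:i] + scope[i + 1:]
    new_table = {}
    for values, cost in table.items():
        key = values[:i] + values[i + 1:]
        new_table[key] = min(new_table.get(key, float("inf")), cost)
    return new_scope, new_table

# The slides' combination example:
domains = {"xi": ["b", "g"], "xj": ["b", "g"], "xk": ["b", "g"]}
f = (("xi", "xj"), {("b", "b"): 6, ("b", "g"): 0, ("g", "b"): 0, ("g", "g"): 6})
g = (("xj", "xk"), {("b", "b"): 6, ("b", "g"): 0, ("g", "b"): 0, ("g", "g"): 6})
```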


  • Variable elimination: select a variable; sum all functions that mention it; project the variable out.

    Complexity, with deg the degree of the selected variable: time O(exp(deg + 1)), space O(exp(deg)).


  • Variable elimination (aka bucket elimination). Eliminate variables one by one; when all variables have been eliminated, the problem is solved. Optimal solutions of the original problem can be recomputed. Complexity: exponential in the induced width.


  • Elimination order influence. Star graph: center x connected to r, z, ..., y; functions {f(x,r), f(x,z), ..., f(x,y)}.

    Order r, z, ..., y, x: each elimination replaces one f(x,·) by a unary f(x), giving successively {f(x), f(x,z), ..., f(x,y)}, ..., {f(x), f(x), f(x)}, then {f(∅)}. Only unary functions are ever created.

    Order x, y, z, ..., r: eliminating x first combines all functions into f(r, z, ..., y), a CLIQUE over all the remaining variables.


  • Induced widthFor G=(V,E) and a given elimination (vertex) ordering, the largest degree encountered is the induced width of the ordered graph

    Minimizing induced width is NP-hard.
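The definition above can be sketched as a small graph procedure (hedged; the adjacency-set representation and names are my assumptions):

```python
def induced_width(neighbors, order):
    """Eliminate vertices in `order`; return the largest degree encountered.

    Eliminating v connects all of v's current neighbors into a clique,
    as in the CLIQUE example of the previous slides."""
    adj = {v: set(ns) for v, ns in neighbors.items()}
    width = 0
    for v in order:
        ns = adj.pop(v)
        width = max(width, len(ns))
        for u in ns:
            adj[u].discard(v)
            adj[u].update(ns - {u})
    return width

# The star graph of the elimination-order slides: center x, leaves r, z, y.
star = {"x": {"r", "z", "y"}, "r": {"x"}, "z": {"x"}, "y": {"x"}}
```

Eliminating the leaves first gives width 1; eliminating x first creates a clique on the leaves and gives width 3.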


  • History / terminology. SAT: directional resolution (Davis and Putnam, 60). Operations research: non-serial dynamic programming (Bertelè and Brioschi, 72). Databases: acyclic DB (Beeri et al., 1983). Bayesian nets: join-tree (Pearl 88, Lauritzen and Spiegelhalter 88). Constraint nets: adaptive consistency (Dechter and Pearl 88).


  • Boosting search with variable elimination: BB-VE(k). At each node: select an unassigned variable xi; if degi ≤ k then eliminate xi, else branch on the values of xi.

    Properties: BB-VE(-1) is BB; BB-VE(w*) is VE; BB-VE(1) is similar to cycle-cutset.


  • Boosting search with variable elimination. Ex: still-life (academic problem), instance n = 14, #var: 196, #val: 2. Ilog Solver: 5 days; variable elimination: 1 day; BB-VE(18): 2 seconds.


  • Memoization fights thrashing. Different nodes may define the same subproblem: store and retrieve solved subproblems. Detecting subproblem equivalence is hard.


  • Context-based memoization. P = P' if |t| = |t'| and both assignments give the same values to the variables of the partially assigned cost functions.


  • Memoization. Depth-first B&B with context-based memoization and independent sub-problem detection is essentially equivalent to VE, and therefore space expensive. Fresh approach: easier to incorporate typical tricks such as propagation, symmetry breaking, ... Algorithms: Recursive Conditioning (Darwiche 2001), BTD (Jégou and Terrioux 2003), AND/OR search (Dechter et al., 2004).

    Adaptive memoization: time/space tradeoff.


  • SAT inference. In SAT, inference = resolution: from (x ∨ A) and (¬x ∨ B), derive (A ∨ B). Effect: transforms implicit knowledge into explicit. Complete inference: resolve until quiescence. Smart policy: variable by variable (Davis & Putnam, 60); exponential in the induced width.


  • Fair SAT inference. (x ∨ A, u), (¬x ∨ B, w) ⊢ (A ∨ B, m), (x ∨ A, u ⊖ m), (¬x ∨ B, w ⊖ m), (x ∨ A ∨ ¬B, m), (¬x ∨ ¬A ∨ B, m), where m = min{u, w}. Effect: moves knowledge.


  • Example: Max-SAT (⊕ = +, ⊖ = −). (x ∨ y, 3), (¬x ∨ z, 3) ⊢ (y ∨ z, 3), (x ∨ y, 3 − 3), (¬x ∨ z, 3 − 3), (x ∨ y ∨ ¬z, 3), (¬x ∨ ¬y ∨ z, 3).


  • Properties (Max-SAT). In SAT, it collapses to classical resolution; sound and complete. Variable elimination: select a variable x; resolve on x until quiescence; remove all clauses mentioning x.

    Time and space complexity: exponential in the induced width.




  • Incomplete inference: local consistency; restricted resolution.


  • Incomplete inference

    Tries to trade completeness for space/time; produces only specific classes of cost functions; usually runs in polynomial time/space.

    Local consistency (node, arc, ...): yields an equivalent problem; compositional (transparent use); provides a lower bound on the optimal cost.


  • Classical arc consistency. A CSP is AC iff for any xi and cij: ci = ci ∧ (cij ∧ cj)[xi], namely (cij ∧ cj)[xi] brings no new information on xi.

    xi  xj  cij
    v   v   0
    v   w   0
    w   v   ⊤
    w   w   ⊤

    xi  (cij ∧ cj)[xi]
    v   0
    w   ⊤


  • Enforcing AC: for any xi and cij, ci := ci ∧ (cij ∧ cj)[xi], until fixpoint (unique).

    xi  xj  cij
    v   v   0
    v   w   0
    w   v   ⊤
    w   w   ⊤

    xi  ci
    v   0
    w   ⊤


  • Arc consistency and soft constraints. For any xi and fij: f = (fij ⊕ fj)[xi] brings no new information on xi. Always equivalent iff ⊕ is idempotent.

    xi  xj  fij
    v   v   0
    v   w   0
    w   v   2
    w   w   1

    xi  (fij ⊕ fj)[xi]
    v   0
    w   1


  • Idempotent soft CN. The previous operational extension works on any idempotent semiring CN: chaotic iteration of local enforcing rules until fixpoint; terminates and yields an equivalent problem; extends to generalized k-consistency.

    With a total order: ⊕ idempotent implies ⊕ = max.


  • Non-idempotent: weighted CN. For any xi and fij, adding f = (fij ⊕ fj)[xi] brings no new information on xi, but EQUIVALENCE IS LOST.

    xi  xj  fij
    v   v   0
    v   w   0
    w   v   2
    w   w   1

    xi  (fij ⊕ fj)[xi]
    v   0
    w   1


  • IV - Subtraction of cost functions (fair CNs). Combination + subtraction: an equivalence-preserving transformation. The minimum cost of each row can be moved into a unary function and subtracted from fij, preserving the global cost distribution.
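One such step, sketched for a binary weighted cost function (a hedged illustration; the dict-based representation and names are my assumptions, and the example values come from the preceding slides):

```python
def project_to_unary(fij, fi, domain_i, domain_j):
    """Move the minimum cost of each row of fij into fi, then compensate by
    subtraction so the problem stays equivalent. Mutates fij and fi in place."""
    for a in domain_i:
        alpha = min(fij[(a, b)] for b in domain_j)   # smallest extension cost of a
        fi[a] += alpha                               # add to the unary cost...
        for b in domain_j:
            fij[(a, b)] -= alpha                     # ...and subtract to preserve F

# Example values from the weighted-CN slides:
fij = {("v", "v"): 0, ("v", "w"): 0, ("w", "v"): 2, ("w", "w"): 1}
fi = {"v": 0, "w": 0}
project_to_unary(fij, fi, ["v", "w"], ["v", "w"])
```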


  • (K, Y) equivalence-preserving inference. For a set K of cost functions and a scope Y: replace K by ⊕(K); add ⊕(K)[Y] to the CN (it is implied by K); subtract ⊕(K)[Y] from ⊕(K).

    This yields an equivalent network in which all the implicit information of K on Y is explicit.

    Repeat for a class of (K, Y) pairs until fixpoint.


  • Node Consistency (NC*): ({f∅, fi}, ∅) EPI. For any variable xi and value a: f∅ ⊕ fi(a) < ⊤, and some value a has fi(a) = 0.
  • Full AC (FAC*): ({fij, fj}, {xi}) EPI + NC*. For all fij: ∀a ∃b, fij(a,b) ⊕ fj(b) = 0 (b is a full support).

    That's our starting point! But no termination!!!


  • Arc Consistency (AC*): ({fij}, {xi}) EPI + NC*. For all fij: ∀a ∃b, fij(a,b) = 0 (b is a support). Complexity: O(n²d³).


  • Neighborhood resolution. (x ∨ A, u), (¬x ∨ A, w) ⊢ (A, m), (x ∨ A, u ⊖ m), (¬x ∨ A, w ⊖ m), with m = min{u, w}.

    If |A| = 0, it enforces node consistency; if |A| = 1, it enforces arc consistency.


  • Confluence is lost: applying the projections in different orders yields different (equivalent) problems, possibly with different f∅.


  • Confluence is lost. Finding an AC closure that maximizes the lower bound is an NP-hard problem (Cooper & Schiex 2004). Still, one can do better in polynomial time (OSAC, IJCAI 2007).


  • Hierarchy. NC* O(nd); AC* O(n²d³); DAC* O(ed²); FDAC* O(end³); EDAC* O(ed² max{nd, ⊤}). Special case, CSP (⊤ = 1): NC, AC, DAC.


  • Boosting search with LC. BT(X, D, C): if X = ∅ then Top := f∅; else xj := selectVar(X); for all a ∈ Dj: for each fS ∈ C s.t. xj ∈ S, fS := fS[xj = a]; if LC then BT(X − {xj}, D − {Dj}, C).


  • BT < MNC < MAC/MDAC < MFDAC < MEDAC (maintaining increasingly strong local consistencies).


  • Boosting systematic search with local consistency. Frequency assignment problem, CELAR6-sub4 (22 var, 44 val, 477 cost functions):

    MNC*: 1 year; MFDAC*: 1 hour.

    CELAR6 (100 var, 44 val, 1322 cost functions):

    MEDAC + memoization: 3 hours (toolbar-BTD).


  • Beyond arc consistency: path inverse consistency, PIC (Debruyne & Bessière). (x, a) can be pruned because there are two other variables y, z such that (x, a) cannot be extended to any combination of their values: ({fy, fz, fxy, fxz, fyz}, {x}) EPI.


  • Beyond arc consistency: soft path inverse consistency, PIC*. ({fy, fz, fxy, fxz, fyz}, {x}) EPI: compute fy ⊕ fz ⊕ fxy ⊕ fxz ⊕ fyz, project it on x, and subtract.

    x  y  z  cost
    a  a  a  2
    a  a  b  5
    a  b  a  2
    a  b  b  3
    b  a  a  0
    b  a  b  0
    b  b  a  2
    b  b  b  0

    x  projection
    a  2
    b  0

    x  y  z  after subtraction
    a  a  a  0
    a  a  b  3
    a  b  a  0
    a  b  b  1
    b  a  a  0
    b  a  b  0
    b  b  a  2
    b  b  b  0


  • Hyper-resolution (2 steps). Applying the fair resolution rule twice on three clauses of the form (l ∨ h ∨ A, u), (l ∨ q ∨ A, v), (¬h ∨ ¬q ∨ A, u) derives (q ∨ A, m) plus compensation clauses, with m the minimum of the involved costs. If |A| = 0, this is equal to soft PIC. Impressive empirical speed-ups.


  • Complexity & polynomial classes. Tree = induced width 1. Idempotent ⊕ or not.


  • Polynomial classes. Idempotent VCSP: min-max CN.

    One can use α-cuts for lifting CSP classes; sufficient condition: the polynomial class is conserved by α-cuts.

    Simple TCSP are TCSP where all constraints use 1 interval: xi − xj ∈ [aij, bij].

    Fuzzy STCN: any slice of a cost function is an interval (semi-convex functions) (Rossi et al.).


  • Hardness in the additive case (weighted/boolean). MaxSAT is MAXSNP-complete (no PTAS). Weighted MaxSAT is FP^NP-complete; MaxSAT is FP^NP[O(log n)]-complete: weights matter! MaxSAT tractable languages are fully characterized (Creignou 2001).

    MaxCSP language: feq(x,y) = (x = y ? 0 : 1) is NP-hard. The submodular cost function language is polynomial: for u ≤ x, v ≤ y, f(u,v) + f(x,y) ≤ f(u,y) + f(x,v) (Cohen et al.).


  • Integration of soft constraints into classical constraint programmingSoft as hardSoft local consistency as a global constraint


  • Soft constraints as hard constraints. One extra variable xS per cost function fS, all with domain E; fS becomes a hard constraint cS on S ∪ {xS} allowing (t, fS(t)) for every tuple t on S; one criterion variable xC = ⊕ xS (a global constraint).
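The encoding can be sketched in a few lines (hedged; the `(scope, table)` representation and the naming of the cost variable are my assumptions):

```python
def soft_as_hard(scope, table):
    """Turn cost function `table` on `scope` into a hard relation that also
    constrains an extra cost variable x_S to equal f_S(t)."""
    cost_var = "x_" + "_".join(scope)            # assumed naming scheme
    allowed = {values + (cost,) for values, cost in table.items()}
    return scope + (cost_var,), allowed
```

Each allowed tuple pairs an original tuple with its cost, so any hard solver can then reason on the cost variables.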


  • Soft as Hard (SaH). The criterion is represented as a variable; multiple criteria = multiple variables; constraints can be stated on/between criteria.

    Weaknesses: extra variables (domains), increased arities; SaH constraints give weak GAC propagation; the problem structure is changed/hidden.


  • Soft AC is stronger than SaH GAC.

    Take a WCSP and enforce soft AC on it: each cost function then contains at least one tuple with a 0 cost (by definition). In the Soft-as-Hard encoding, the cost variable xC therefore gets a lower bound of 0:

    the lower bound cannot improve by GAC.


  • Example: soft AC stronger than SaH GAC. On a chain x1, x2, x3 with binary costs equal to 1, soft AC yields f∅ = 1, while GAC on the SaH encoding (cost variables x12, x23 and xc = x12 + x23) deduces nothing.


  • Soft local consistency as a global constraint (⊕ = +). Global constraint Soft(X, F, C): X variables, F cost functions, C an interval cost variable (ub = ⊤).

    Semantics: X ∪ {C} satisfy Soft(X, F, C) iff ⊕F(X) = C. Enforcing GAC on Soft is NP-hard. Soft consistency gives a filtering algorithm (lb = f∅).


  • Ex: Spot 5 (Earth satellite scheduling). For each requested photography: lost if not taken, Mb of memory if taken. Variables: requested photographies; domains: {0, 1, 2, 3}; constraints: {rij, rijk} binary and ternary hard constraints, Sum(X)
  • Example: soft quasi-group (motivated by sports scheduling).

    Alldiff(xi1, ..., xin) for i = 1..m; Alldiff(x1j, ..., xmj) for j = 1..n; Soft(X, {fij}, [0..k], +). Minimize the number of neighbors of different parity (cost 1 per such pair).


  • Global soft constraints


  • Global soft constraintsIdea: define a library of useful but non-standard objective functions along with efficient filtering algorithms

    Soft AllDiff (2 semantics: Petit et al. 2001, van Hoeve 2004); soft global cardinality (van Hoeve et al. 2004); soft regular (van Hoeve et al. 2004); all enforce reified GAC.


  • Conclusion. A large subset of the classic CN body of knowledge has been extended to soft CN; efficient solving tools exist. Much remains to be done. Extension: to other problems than optimization (counting, quantification). Techniques: symmetries, learning, knowledge compilation. Algorithmic: still better lower bounds, other local consistencies or dominance, global constraints (SoftAsSoft), exploiting problem structure. Implementation: better integration with classic CN solvers (Choco, Solver, Minion). Applications: problem modelling, solving, heuristic guidance, partial solving.


  • 30 s of publicity.


  • Open source libraries: Toolbar and Toulbar2.

    Accessible from the Soft wiki site: carlit.toulouse.inra.fr/cgi-bin/awki.cgi/SoftCSP. Algorithms: BE-VE, MNC, MAC, MDAC, MFDAC, MEDAC, MPIC, BTD. ILOG connection, large domains/problems. Reads MaxCSP/SAT (weighted or not) and ERGO formats. Thousands of benchmarks in standardized format. Pointers to other solvers (MaxSAT/CSP). Forge: mulcyber.toulouse.inra.fr/projects/toolbar (toulbar2).

    Pwd: bia31


  • Thank you for your attention. This is it! References:

    S. Bistarelli, U. Montanari and F. Rossi. Semiring-based Constraint Satisfaction and Optimization. Journal of the ACM, 44(2):201-236, March 1997.
    S. Bistarelli, H. Fargier, U. Montanari, F. Rossi, T. Schiex and G. Verfaillie. Semiring-Based CSPs and Valued CSPs: Frameworks, Properties, and Comparison. Constraints, 4(3), September 1999.
    S. Bistarelli, R. Gennari and F. Rossi. Constraint Propagation for Soft Constraint Satisfaction Problems: Generalization and Termination Conditions. In Proc. CP 2000.
    C. Blum and A. Roli. Metaheuristics in combinatorial optimization: Overview and conceptual comparison. ACM Computing Surveys, 35(3):268-308, 2003.
    T. Schiex. Arc consistency for soft constraints. In Proc. CP 2000.
    M. Cooper and T. Schiex. Arc consistency for soft constraints. Artificial Intelligence, 154(1-2):199-227, 2004.
    M. Cooper. Reduction operations in fuzzy or valued constraint satisfaction problems. Fuzzy Sets and Systems, 134(3), 2003.
    A. Darwiche. Recursive Conditioning. Artificial Intelligence, 125(1-2):5-41.
    R. Dechter. Bucket Elimination: A unifying framework for reasoning. Artificial Intelligence, October 1999.
    R. Dechter. Mini-Buckets: A general scheme for generating approximations in automated reasoning. In Proc. of IJCAI 97.


  • References

    S. de Givry, F. Heras, J. Larrosa and M. Zytnicki. Existential arc consistency: getting closer to full arc consistency in weighted CSPs. In IJCAI 2005.
    W.-J. van Hoeve, G. Pesant and L.-M. Rousseau. On Global Warming: Flow-Based Soft Global Constraints. Journal of Heuristics, 12(4-5):347-373, 2006.
    P. Jégou and C. Terrioux. Hybrid backtracking bounded by tree-decomposition of constraint networks. Artificial Intelligence, 146(1):43-75, 2003.
    J. Larrosa and T. Schiex. Solving Weighted CSP by Maintaining Arc Consistency. Artificial Intelligence, 159(1-2):1-26, 2004.
    J. Larrosa and T. Schiex. In the quest of the best form of local consistency for Weighted CSP. In Proc. of IJCAI 03.
    J. Larrosa, P. Meseguer and T. Schiex. Maintaining Reversible DAC for Max-CSP. Artificial Intelligence, 107(1):149-163.
    R. Marinescu and R. Dechter. AND/OR Branch-and-Bound for Graphical Models. In Proc. of IJCAI 2005.
    J.-C. Régin, T. Petit, C. Bessière and J.-F. Puget. An original constraint based approach for solving over constrained problems. In Proc. CP 2000.
    T. Schiex, H. Fargier and G. Verfaillie. Valued Constraint Satisfaction Problems: hard and easy problems. In Proc. of IJCAI 95.
    G. Verfaillie, M. Lemaitre and T. Schiex. Russian Doll Search. In Proc. of AAAI 96.


  • References

    M. Bonet, J. Levy and F. Manyà. A complete calculus for Max-SAT. In SAT 2006.
    M. Davis and H. Putnam. A computing procedure for quantification theory. Journal of the ACM, 7(3), 1960.
    I. Rish and R. Dechter. Resolution versus Search: Two Strategies for SAT. Journal of Automated Reasoning, 24(1-2), 2000.
    F. Heras and J. Larrosa. New Inference Rules for Efficient Max-SAT Solving. In AAAI 2006.
    J. Larrosa and F. Heras. Resolution in Max-SAT and its relation to local consistency in weighted CSPs. In IJCAI 2005.
    C. M. Li, F. Manyà and J. Planes. Improved branch and bound algorithms for Max-SAT. In AAAI 2006.
    H. Shen and H. Zhang. Study of lower bounds for Max-2-SAT. In Proc. of AAAI 2004.
    Z. Xing and W. Zhang. MaxSolver: An efficient exact algorithm for (weighted) maximum satisfiability. Artificial Intelligence, 164(1-2), 2005.


  • SoftAsHard GAC vs. EDAC: 25 variables, 2 values, binary MaxCSP.

    Toolbar MEDAC: opt = 34, 220 nodes, cpu-time ≈ 0 s.

    GAC on SoftAsHard (ILOG Solver 6.0, solve): opt = 34, 339136 choice points, cpu-time: 29.1 s. Uses table constraints.


  • Other hints on SoftAsHard GAC: MaxSAT as pseudo-boolean SoftAsHard. For each clause c = (x ∨ z, pc), add a relaxation literal: cSAH = (x ∨ z ∨ rc), with an extra cardinality constraint Σ pc·rc ≤ k.

    Used by SAT4JMaxSat (MaxSAT competition).


  • MaxSAT competition (SAT 2006): unweighted MaxSAT.


  • MaxSAT competition (SAT 2006): weighted MaxSAT.
