Stocs – A Stochastic CSP Solver
Bella Dubrov
IBM Haifa Research Lab
© Copyright IBM
Outline
- CSP solving algorithms: systematic and stochastic
- Limitations of systematic methods
- Stochastic approach
- Stocs algorithm
- Stocs challenges
- Summary
Constraint satisfaction problems
- Variables: Anna, Beth, Cory, Dave
- Domains: the Red, Green, Orange, and Yellow houses
- Constraints:
  - The Red and Green houses are in the city
  - The Orange and Yellow houses are in the countryside
  - The Red and Green houses are neighboring, as are the Orange and Yellow houses
  - Anna and Dave have dogs; Beth owns a cat
  - Dogs and cats cannot be neighbors
  - Dogs must live in the countryside
- Solution: Anna lives in the Orange house, Beth in the Red house, Cory in the Green house, and Dave in the Yellow house
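To make the model concrete, here is a minimal brute-force sketch of the puzzle in Python. The names follow the slide; the encoding of the countryside and the neighboring pairs is our reading of the constraints, and the loop prints every satisfying assignment (the slide's solution among them).

    from itertools import permutations

    PEOPLE = ["Anna", "Beth", "Cory", "Dave"]
    HOUSES = ["Red", "Green", "Orange", "Yellow"]
    COUNTRYSIDE = {"Orange", "Yellow"}  # Red and Green are in the city
    NEIGHBOR_PAIRS = {frozenset(("Red", "Green")), frozenset(("Orange", "Yellow"))}
    DOG_OWNERS = {"Anna", "Dave"}
    CAT_OWNERS = {"Beth"}

    def neighbors(h1, h2):
        return frozenset((h1, h2)) in NEIGHBOR_PAIRS

    def satisfies(assignment):
        # Dogs must live in the countryside.
        if any(assignment[p] not in COUNTRYSIDE for p in DOG_OWNERS):
            return False
        # Dogs and cats cannot be neighbors.
        return not any(neighbors(assignment[d], assignment[c])
                       for d in DOG_OWNERS for c in CAT_OWNERS)

    # Systematic search: try every one-to-one assignment of people to houses.
    for houses in permutations(HOUSES):
        assignment = dict(zip(PEOPLE, houses))
        if satisfies(assignment):
            print(assignment)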
Systematic approach
- Systematically go over the search space
- Use pruning whenever possible
  - Pruning is done by projection
Example: projector for multiply
a × b = c
a ∈ [2, 20], b ∈ [3, 20], c ∈ [1, 20]

- Projection to input 1: a’ = [2, 6]
- Projection to input 2: b’ = [3, 10]
- Projection to result: c’ = {6, 8, 9, 10, 12, 14, 15, 16, 18, 20}
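A quick way to check these projections is to enumerate supports directly. This brute-force sketch (our own illustration, not the solver's actual projector) recomputes a’, b’, and c’ from the definitions above:

    # Domains from the slide: a in [2, 20], b in [3, 20], c in [1, 20].
    A = range(2, 21)
    B = range(3, 21)
    C = range(1, 21)

    # Project the constraint a*b = c onto each variable: keep only the
    # values that participate in at least one satisfying triple.
    a_proj = sorted({a for a in A for b in B if a * b in C})
    b_proj = sorted({b for b in B for a in A if a * b in C})
    c_proj = sorted({a * b for a in A for b in B if a * b in C})

    print(a_proj)  # [2, 3, 4, 5, 6]               -> a' = [2, 6]
    print(b_proj)  # [3, 4, 5, 6, 7, 8, 9, 10]     -> b' = [3, 10]
    print(c_proj)  # [6, 8, 9, 10, 12, 14, 15, 16, 18, 20]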
Limitations of systematic methods: example 1
a × b = c,  with a, b, c ∈ {0, ..., N−1}

Propagation is hard: projecting this constraint amounts to factoring c.
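To see why, note what a projector for this constraint must do when c is fixed: enumerate the divisors of c. The naive sketch below (our illustration) does this by trial division, which is exponential in the bit-width of the variables; doing substantially better would mean solving factoring.

    # Naive projection of a * b = c onto a, with a, b, c in {0, ..., N-1}.
    # For a fixed c this enumerates the divisors of c; a projector that
    # avoids this enumeration would effectively be a factoring algorithm.
    def project_onto_a(c, N):
        return sorted(a for a in range(1, N) if c % a == 0 and c // a < N)

    print(project_onto_a(15, 16))  # [1, 3, 5, 15]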
Limitations of systematic methods: example 2
The Some-Different constraint: given a graph on the variables, the variables connected by an edge must have different values
Propagation is NP-hard for domains of size k ≥ 3, by reduction from graph k-colorability
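The reduction is immediate: deciding whether a Some-Different constraint has any satisfying assignment at all is exactly deciding whether the variable graph is k-colorable. A brute-force support check (illustrative only, exponential by design) makes the correspondence explicit:

    from itertools import product

    # Some-Different over variables 0..n-1 with a common domain of size k:
    # variables joined by an edge must take different values. Deciding
    # whether any assignment exists is exactly graph k-colorability.
    def has_support(n, edges, k):
        return any(
            all(assign[u] != assign[v] for u, v in edges)
            for assign in product(range(k), repeat=n)
        )

    # A triangle needs 3 colors: no support for k = 2, support for k = 3.
    triangle = [(0, 1), (1, 2), (0, 2)]
    print(has_support(3, triangle, 2))  # False
    print(has_support(3, triangle, 3))  # True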
Limitations of systematic methods: example 3
[Slide shows a small system of pairwise constraints over a, b, c ∈ {0, ..., N−1} whose only solution is a single point; the exact equations are not recoverable from the transcript.]

The problem is locally consistent at the onset, so propagation prunes nothing; a solver that guesses a value succeeds with probability 1/N.
Stochastic approach
- State: an assignment of values to all the variables
- Cost: a function from the set of states to {0} ∪ R+
  - Cost = 0 iff all constraints are satisfied by the state
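A common concrete choice that fits this definition (our illustration; the slides do not specify the cost Stocs uses internally) is to charge each constraint a non-negative violation penalty and sum:

    # A state assigns a value to every variable; the cost sums the
    # violation penalties of all constraints. Cost 0 iff all satisfied.
    # `constraints` is a list of functions state -> non-negative number.
    def cost(state, constraints):
        return sum(violation(state) for violation in constraints)

    # Example: two toy constraints over variables "x" and "y".
    constraints = [
        lambda s: 0.0 if s["x"] != s["y"] else 1.0,  # x != y
        lambda s: abs(s["x"] + s["y"] - 5),          # x + y = 5
    ]
    print(cost({"x": 2, "y": 3}, constraints))  # 0.0 -> a solution
    print(cost({"x": 1, "y": 1}, constraints))  # 4.0 -> two violations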
Stochastic approach
General idea (a generic version of this loop is sketched below):
- Start from some state
- Find the next state and move there
- Stop if a state with cost 0 is found

- Stochastic algorithms are usually incomplete
- Different stochastic algorithms use different heuristics for finding the next state
- Examples: simulated annealing, tabu search
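A minimal sketch of that generic loop, assuming a cost function as defined above; the acceptance rule here (move on non-worsening cost) is a placeholder, not the heuristic of any particular algorithm:

    import random

    # Generic stochastic search: start from a state, repeatedly propose a
    # neighbor, move when it does not increase the cost, stop on cost 0.
    # Incomplete: it may exhaust max_steps without finding a solution.
    def stochastic_search(initial, neighbors, cost, max_steps=100_000):
        state = initial
        for _ in range(max_steps):
            if cost(state) == 0:
                return state                    # all constraints satisfied
            candidate = random.choice(neighbors(state))
            if cost(candidate) <= cost(state):  # greedy, allows sideways moves
                state = candidate
        return None                             # gave up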
Stocs algorithm overview
- Check states on length-scales “typical” for the problem; hop to a new state if its cost is lower
- Learn the topography of the problem: learn the typical step sizes and directions
- Get domain knowledge as input strategies
Example: Social Golfer Problem

Problem: 7 groups of players, 6 members in each group, playing for 4 weeks, without any two players playing together (in the same group) twice.

- Exponential decrease: no sense in trying step sizes larger than 20, but the search may benefit strongly from step sizes of 10-15
- Reproducible: the curve characterizes the problem

[Chart: Social Golfer Problem (7-6-4); number of successes (log scale, 1 to 10,000) vs. step size (0 to 50); runs of 4.6 million, 4.7 million, and 93 million tries.]
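A plausible cost function for this problem under the stochastic approach (our own encoding; the slides do not give Stocs' model) counts repeated meetings of the same pair of players:

    from itertools import combinations
    from collections import Counter

    # schedule[w] is the list of groups in week w; each group is a set of
    # players. The cost counts excess meetings of the same pair, so a
    # conflict-free 4-week schedule has cost 0.
    def golfer_cost(schedule):
        meetings = Counter(
            pair
            for week in schedule
            for group in week
            for pair in combinations(sorted(group), 2)
        )
        return sum(count - 1 for count in meetings.values() if count > 1)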
Example: LABS

Problem: minimize the autocorrelation of a sequence of N (= 45) bits.

- Non-exponential decrease, followed by saturation: it makes sense to always try large steps
- Identifies small characteristic features
- Extremely reproducible

[Chart: Low Autocorrelation Binary Sequence (N = 45); number of successes (log scale, 1,000 to 100,000) vs. step size (0 to 50); two runs of 77 million tries each.]
Example: Selection Problem

Problem: select different values for three variables out of a given set of values (smaller than the domains). An easy problem: the results shown are for many runs.

- Prefer larger step sizes (up to a cutoff)
- Reproducible

[Chart: Selection Problem (40, 30); number of successes (log scale, 1 to 10,000) vs. step size (0 to 1,500); runs of 1.5 million, 1.5 million, and 3.3 million tries.]
Example: Selection Problem, different modeling

Problem: same as before, modeled differently.

- Prefer intermediate step sizes
- Reproducible

[Chart: Selection Problem (4, 3); number of successes (log scale, 10,000 to 10,000,000) vs. step size (0 to 14); legend: n tries, n tries, m tries.]
Stocs algorithm
At each step:
    decide attempt type: random, learned, or user-defined
    if random:
        choose a random step
    if learned:
        decide learn-type: step-size, direction, ...
        if step-size:
            choose a step-size which was previously successful (weighted)
            create a random attempt with the chosen step size
        if direction:
            choose a direction which was previously successful (weighted)
            create a random attempt with the chosen direction
    if user-defined:
        get the next user-defined attempt
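A runnable reading of this pseudocode, under our own assumptions about the data structures (success histories kept as weighted Counters, user strategies supplied as an iterator of attempts), since the slides leave these unspecified:

    import random
    from collections import Counter

    def next_attempt(step_size_hits: Counter, direction_hits: Counter,
                     user_attempts, max_step=50):
        # Decide attempt type: random, learned, or user-defined.
        kind = random.choice(("random", "learned", "user-defined"))
        if kind == "random":
            return ("step", random.randint(1, max_step))
        if kind == "learned":
            # Decide learn-type, then reuse a previously successful value,
            # weighted by how often it succeeded.
            learn_type, history = random.choice(
                [("step", step_size_hits), ("direction", direction_hits)])
            if history:
                values, weights = zip(*history.items())
                return (learn_type, random.choices(values, weights=weights)[0])
            return ("step", random.randint(1, max_step))  # nothing learned yet
        return ("user", next(user_attempts, None))        # user-defined strategy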
Optimization problems
- Constraints must be satisfied
- In addition, an objective function that should be optimized is given
- Example: doll houses
  - Constraints as before
  - In addition, each doll has a preferred set of houses
  - The best solution satisfies as many of the preferences as possible
Optimization with Stocs
- Last year we added the optimization capability to Stocs
- Optimization is natural for Stocs:
  - First find a solution
  - Then keep searching for a better state
- Implementation: a cost function from a state to a pair of non-negative numbers (c1, c2), compared in lexicographic order (sketched below):
  - c1 is the cost of the constraints
  - c2 is the value of the objective function
  - A better state will always improve the constraints
  - After a state with c1 = 0 is found, Stocs will continue searching for a better c2
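The lexicographic comparison is a one-liner; a minimal sketch, assuming both components are minimized (the slides do not say which way the objective runs):

    # Lexicographic cost: first drive the constraint cost c1 to 0, then
    # improve the objective value c2. Assumes both are being minimized.
    def better(state_cost, best_cost):
        return state_cost < best_cost  # Python tuples compare lexicographically

    print(better((0, 7), (1, 0)))  # True: satisfying constraints comes first
    print(better((0, 3), (0, 7)))  # True: with c1 = 0, improve the objective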
Preprocessing and initialization
Before starting the search, two things happen:
- Preprocessing of the problem
  - Including: finding bits that should be constant in any solution (see the probing sketch after this list), removing unnecessary variables, and simplifying constraints
  - Preprocessing has a big impact on the search: last year we improved performance by a factor of 100 with it
- Initialization: finding the initial state
  - Starting the search at a good state is critical
  - Currently, each constraint tries to initialize its variables to a satisfying assignment, considering the “wishes” of other constraints
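As one example of such preprocessing, bits that must be constant can be found by probing. This is a hypothetical sketch around an assumed feasibility check `satisfiable`, not a description of Stocs' actual implementation:

    # Probe each boolean variable with both values; if one value makes the
    # problem infeasible, the bit is constant in every solution and can be
    # frozen before the search. `satisfiable` is an assumed (possibly
    # incomplete) feasibility check taking a partial assignment.
    def constant_bits(variables, satisfiable):
        forced = {}
        for var in variables:
            feasible = [v for v in (0, 1) if satisfiable({var: v})]
            if len(feasible) == 1:
                forced[var] = feasible[0]
        return forced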
Summary
- Limitations of systematic methods
- Stochastic approach: move between full assignments
- Stocs: learn the topography of the problem, allow user-defined heuristics
- Optimization with Stocs
- Preprocessing and initialization
- Variable types