Scalable Synthesis
Transcript of Scalable Synthesis
Scalable Synthesis
Brandon Lucia and Todd Mytkowicz
Microsoft Research
Synthesizing Circuits
$\exists w.\ \forall x.\ P_w(x) = \mathit{Spec}(x)$

The inference problem is undecidable in general – these are hard problems to solve! Can we leverage existing work to make this scale?

That is: there exist parameters $w$ such that for all inputs $x$, the program $P_w$ implements the specification.
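The "exists w, forall x" formulation can be made concrete with a brute-force sketch. The circuit template and gate library below are made up for illustration; the point is only the search structure: enumerate candidate parameters $w$ (the existential), and for each check the specification on every input (the universal). Real synthesizers replace the enumeration with SAT/SMT search.

```python
from itertools import product

# Toy gate library for the illustrative template P_w.
GATES = {
    "and":  lambda a, b: a and b,
    "or":   lambda a, b: a or b,
    "xor":  lambda a, b: a != b,
    "nand": lambda a, b: not (a and b),
}

def p(w, x0, x1):
    """The circuit template: P_w(x) = g3(g1(x0, x1), g2(x0, x1))."""
    g1, g2, g3 = w
    return GATES[g3](GATES[g1](x0, x1), GATES[g2](x0, x1))

def synthesize(spec):
    inputs = list(product([False, True], repeat=2))
    for w in product(GATES, repeat=3):                  # exists w ...
        if all(p(w, *x) == spec(*x) for x in inputs):   # ... forall x
            return w
    return None   # spec not expressible in this template

spec = lambda x0, x1: not (x0 or x1)   # Spec(x): NOR
w = synthesize(spec)                   # finds e.g. Or fed into Nand
```

The quantifier alternation is visible in the code: the outer loop is the $\exists w$, the inner `all(...)` is the $\forall x$.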
Neural Networks
$f(x_j) = \phi\!\left(\sum_{i=0}^{n} W^1_{i,j}\,\phi\!\left(\sum_{i=0}^{n} W^0_{i,j}\,x_j\right)\right)$

$\mathit{Error}(F(x_j), f(x_j)) \approx 0$

or…

Find $W^0, W^1$ such that

$\exists W^0, W^1.\ \forall x_j \in x.\ [\,F(x_j) \approx f(x_j)\,]$

| $X_0$ | $X_1$ | $F(X_0, X_1)$ |
|-------|-------|---------------|
| F | F | F |
| F | T | T |
| T | F | F |
| T | T | T |
Neural Networks
A two-layer neural network can approximate any continuous function!
$\mathit{Error}(F(x), f(x)) \approx 0$

or…

$f(x) = \phi(W^1\,\phi(W^0 x))$

| $X_0$ | $X_1$ | $F(X_0, X_1)$ |
|-------|-------|---------------|
| F | F | F |
| F | T | T |
| T | F | T |
| T | T | F |

Find $W^0, W^1$ such that

$\exists W^0, W^1.\ \forall x.\ [\,F(x) \approx f(x)\,]$
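The existential claim here (the truth table is XOR) has a concrete witness. The sketch below, not from the talk, hand-picks weights $W^0, W^1$ for a two-layer network that computes XOR exactly, using a hard threshold for $\phi$ instead of a sigmoid so the match is exact rather than approximate.

```python
import numpy as np

# Hard threshold activation: phi(z) = 1 if z > 0 else 0.
phi = lambda z: (z > 0).astype(float)

# Hidden layer (bias folded into a constant third input):
#   h0 = x0 AND NOT x1,  h1 = NOT x0 AND x1
W0 = np.array([[ 1.0, -1.0, -0.5],
               [-1.0,  1.0, -0.5]])
# Output layer: OR of the two hidden units.
W1 = np.array([[1.0, 1.0, -0.5]])

def f(x0, x1):
    h = phi(W0 @ np.array([x0, x1, 1.0]))   # first layer:  phi(W0 x)
    return phi(W1 @ np.append(h, 1.0))[0]   # second layer: phi(W1 phi(W0 x))

for x0, x1 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert f(x0, x1) == float(x0 != x1)     # f matches XOR on every input
```

These weights are one hand-constructed solution, not the output of training; gradient descent would find a different (approximate) witness for the same $\exists W^0, W^1$ claim.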
Learning Abstractions with ML
First layer learns low-level features
Each subsequent layer learns higher-level features
Unsupervised training, layer by layer
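A minimal sketch of what layer-by-layer unsupervised training looks like: each layer is fit as a small linear autoencoder on the previous layer's output, so later layers see progressively more abstract features. The data, layer sizes, learning rate, and step count below are all made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                  # toy unlabeled data

def train_layer(H, hidden, lr=0.01, steps=500):
    """Fit encoder/decoder weights to reconstruct H; return encoded features."""
    n = H.shape[1]
    We = rng.normal(scale=0.1, size=(n, hidden))   # encoder
    Wd = rng.normal(scale=0.1, size=(hidden, n))   # decoder
    for _ in range(steps):
        Z = H @ We                      # encode
        G = (Z @ Wd - H) / len(H)       # gradient of mean squared reconstruction error
        Wd -= lr * Z.T @ G              # update decoder ...
        We -= lr * H.T @ G @ Wd.T       # ... and encoder
    return H @ We                       # this layer's learned features

H1 = train_layer(X, 4)    # first layer: low-level features
H2 = train_layer(H1, 2)   # next layer trained on the first: higher-level features
```

Each call to `train_layer` uses no labels at all; the "supervision" is reconstruction of the layer below, which is the greedy layer-wise scheme the slide describes.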
Duality of Synthesis and ML
| Machine Learning | Synthesis |
|------------------|-----------|
| $\exists W^0, W^1.\ \forall x.\ (f(x) \approx y)$ | $\exists W^0, W^1.\ \forall x.\ (f(x) = y)$ |
| $f(x) \approx \phi(W^1\,\phi(W^0 x))$ | $f(x) = \phi(W^1\,\phi(W^0 x))$ |
| Specification is implicit in input/output pairs | Specification is a vector of logical formulas |
| Over the reals, where the sum is $+$, the product is $\times$, and $\phi$ is sigmoid | Over the Booleans, where the sum is Or, the product is And, and $\phi$ is Not |
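One way to make this duality concrete is a single layer template parameterized by the algebra it runs over. The generic `layer` function below is an illustrative framing of the slide's point, not code from the talk: the same $\phi(Wx)$ shape is instantiated once over the reals with $(+, \times, \mathrm{sigmoid})$ and once over the Booleans with $(\mathrm{Or}, \mathrm{And}, \mathrm{Not})$.

```python
import math

def layer(W, x, add, mul, phi, zero):
    """One layer: phi applied to the generalized matrix-vector product W x."""
    out = []
    for row in W:
        acc = zero
        for w, xi in zip(row, x):
            acc = add(acc, mul(w, xi))   # generalized dot product
        out.append(phi(acc))             # generalized activation
    return out

# ML instantiation: reals with sum, product, and sigmoid.
sigmoid = lambda z: 1 / (1 + math.exp(-z))
real = dict(add=lambda a, b: a + b, mul=lambda a, b: a * b,
            phi=sigmoid, zero=0.0)

# Synthesis instantiation: Booleans with Or, And, and Not.
boolean = dict(add=lambda a, b: a or b, mul=lambda a, b: a and b,
               phi=lambda a: not a, zero=False)

h = layer([[0.5, -0.5]], [1.0, 2.0], **real)          # one real-valued unit
g = layer([[True, False]], [True, True], **boolean)   # one Boolean "unit"
```

Stacking two such layers gives exactly the $\phi(W^1\,\phi(W^0 x))$ template from both columns of the table.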
Synthesizing Sudoku Recognizer
[Figure: Sudoku grid with cells A0 A1 A2 / B0 B1 B2 / C0 C1 C2, with an encoding K0 of the input x]
If A0 is 0, then:
- no other cell in A0's column is 0
- no other cell in A0's row is 0
- no other cell in A0's unit is 0

If A0 is not 0, then:
- one cell in A0's column must be 0
- one cell in A0's row must be 0
- one cell in A0's unit must be 0
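A brute-force sketch of this specification, scaled down so it stays readable: on a tiny 3×3 grid, digit 0 must appear exactly once per row and per column (the grid size, example grids, and function names below are made up; the real recognizer targets 9×9 with units as well).

```python
# Exactly-one-zero constraint per row and column of a small grid.
N = 3

def exactly_one_zero(cells):
    return sum(1 for c in cells if c == 0) == 1

def is_valid(grid):
    rows_ok = all(exactly_one_zero(grid[r]) for r in range(N))
    cols_ok = all(exactly_one_zero([grid[r][c] for r in range(N)])
                  for c in range(N))
    return rows_ok and cols_ok

good = [[0, 1, 2], [1, 2, 0], [2, 0, 1]]   # 0 appears once per row and column
bad  = [[0, 0, 2], [1, 2, 0], [2, 0, 1]]   # two zeros in the first row
```

Written out as logic over all cells and all digits, this "exactly one" structure is what makes the full specification potentially exponential, which motivates the factorization on the next slide.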
Learning Abstractions in Sudoku
First layer learns local implications
Each subsequent level combines prior levels.
Learns a “factorization” of a potentially exponential specification!
Unsupervised training, layer by layer
Future directions & Questions
• Flesh out the duality into formal details
• Duality goes both ways: can we help ML methods with our understanding of synthesis / formal verification?
  • CEGIS
• Program = Data Structure + Algorithm
  • Learn structure (depth) and algorithm (connectivity)
• Approximate / probabilistic need not mean incorrect
  • But it may help scale inference
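The CEGIS loop mentioned above can be sketched in miniature. The hypothesis space and specs below are made up toys: synthesize any candidate consistent with the examples seen so far, verify it against the spec on all inputs, and if verification fails, feed the counterexample back into the example set.

```python
from itertools import product

INPUTS = list(product([False, True], repeat=2))
CANDIDATES = {   # tiny hypothesis space: a library of 2-input gates
    "and": lambda a, b: a and b,
    "or":  lambda a, b: a or b,
    "xor": lambda a, b: a != b,
}

def cegis(spec):
    examples = []
    while True:
        # Synthesize: any candidate consistent with the examples so far.
        fit = [n for n, f in CANDIDATES.items()
               if all(f(*x) == y for x, y in examples)]
        if not fit:
            return None                        # spec outside hypothesis space
        name = fit[0]
        # Verify: search for an input where the candidate disagrees with spec.
        cex = next((x for x in INPUTS
                    if CANDIDATES[name](*x) != spec(*x)), None)
        if cex is None:
            return name                        # verified on ALL inputs
        examples.append((cex, spec(*cex)))     # learn from the counterexample

result = cegis(lambda a, b: a != b)   # converges to "xor" via counterexamples
```

The loop makes the slide's duality point tangible: the synthesizer plays the role of a learner, and the verifier plays the role of an adversarial data source.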