
DUSD(Labs) GSRC

bX update
March 2003

Aaron Ng, Marius Eriksen and Igor Markov
University of Michigan

Outline

• Motivation, issues in benchmarking
• bX in the picture
• Sample application: Evaluation of tools
• Future focus
• Contact info, links

Motivation, issues in benchmarking

1. Evaluation
• independent reproduction of results and experiments
• explicit methods required: minimum room for misinterpretation of results
• evaluation of algorithms across the entire problem space: conflicting and correlated optimization objectives; separation of placement and routing tasks

Motivation, issues in benchmarking (cont'd)

2. Availability of results
• raw experimental results
• availability allows verification
• results provide insight into the performance of a tool

Motivation, issues in benchmarking (cont'd)

3. Standard formats
• meaningful comparison of results
• compatibility between tools and benchmarks
• correct interpretation of benchmarks

bX in the picture

1. Automation
• 'live' repository: support for execution of tools on benchmarks; distributed network of computational hosts
• online reporting of results: automatic updates when changes in dependencies occur
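The 'live repository' idea, re-running dependent jobs whenever a tool or benchmark changes, can be sketched as a tiny dependency graph. This is purely illustrative (the `Repository` class and all names are invented here, not bX's actual mechanism):

```python
# Sketch of dependency-triggered re-execution in a 'live' repository
# (illustrative only; bX's real mechanism is not shown).
from collections import defaultdict

class Repository:
    def __init__(self):
        self.dependents = defaultdict(list)  # artifact -> jobs that use it
        self.log = []

    def register(self, job, inputs):
        # Record which artifacts (tools, benchmarks) each job depends on.
        for artifact in inputs:
            self.dependents[artifact].append(job)

    def upload(self, artifact):
        # Uploading a new version triggers every job depending on it;
        # the refreshed results would then be posted online.
        for job in self.dependents[artifact]:
            self.log.append(f"re-run {job} (because {artifact} changed)")

repo = Repository()
repo.register("Capo-on-PEKO", ["Capo", "PEKO-01"])
repo.register("Dragon-on-PEKO", ["Dragon", "PEKO-01"])
repo.upload("PEKO-01")
print(repo.log)
# ['re-run Capo-on-PEKO (because PEKO-01 changed)',
#  're-run Dragon-on-PEKO (because PEKO-01 changed)']
```

Uploading a new version of the PEKO benchmark re-triggers both registered jobs; uploading a new Capo binary would re-trigger only the first.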

bX in the picture (cont'd)

2. Scripts and flows
• reproduction of results: scripts and flows describe experiments; scripts can be saved, shared and reused
• representation of the entire problem space: relationships between optimization objectives, e.g. the effect of placement results on routing

bX in the picture (cont'd)

3. Standard formats
• interoperability between tools and benchmarks
• meaningful comparison of results

Sample application: Evaluation of tools

1. Placers
• Capo: randomized, fixed-die placer; emphasis on routability; tuned on proprietary Cadence benchmarks

Sample application: Evaluation of tools (cont'd)

1. Placers (cont'd)
• Dragon: randomized, variable-die placer; tuned on IBM-Place benchmarks

Sample application: Evaluation of tools (cont'd)

1. Placers (cont'd)
• KraftWerk: deterministic, fixed-die placer; results typically have cell overlaps; an additional legalization step by DOMINO

Sample application: Evaluation of tools (cont'd)

2. Benchmarks
• PEKO: artificial netlists designed to match statistical parameters of IBM netlists; known optimal wirelength; concern that they are not representative of industry circuits

Sample application: Evaluation of tools (cont'd)

2. Benchmarks (cont'd)
• grids: 4 fixed vertices, n² 1x1 movables; tests placers on datapath-like circuits; known optimal placement; results are easily visualized for debugging
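The grid benchmarks are simple enough to reconstruct in a few lines. The sketch below is hypothetical (the construction details, such as connecting each cell to its grid neighbors with 2-pin nets, are an assumption, not bX's exact generator): in the known-optimal placement every net spans adjacent grid points, so each contributes exactly 1 to the half-perimeter wirelength.

```python
# Hypothetical reconstruction of a "grid" benchmark: n^2 movable 1x1 cells
# on an n x n grid, with 4 fixed vertices anchoring the corners.
# Assumption: 2-pin nets connect horizontally/vertically adjacent cells.

def make_grid(n):
    # The optimal placement puts cell (i, j) at coordinates (i, j).
    cells = {(i, j): (i, j) for i in range(n) for j in range(n)}
    nets = []
    for i in range(n):
        for j in range(n):
            if i + 1 < n:
                nets.append(((i, j), (i + 1, j)))  # horizontal neighbor
            if j + 1 < n:
                nets.append(((i, j), (i, j + 1)))  # vertical neighbor
    fixed = [(0, 0), (0, n - 1), (n - 1, 0), (n - 1, n - 1)]  # corner anchors
    return cells, nets, fixed

def hpwl(placement, nets):
    # Half-perimeter wirelength summed over all nets.
    total = 0
    for pins in nets:
        xs = [placement[p][0] for p in pins]
        ys = [placement[p][1] for p in pins]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

cells, nets, fixed = make_grid(4)
# Every net spans adjacent grid points, so the optimal HPWL equals the
# number of nets: 2 * n * (n - 1) = 24 for n = 4.
print(hpwl(cells, nets))  # 24
```

This is why the grids make good debugging benchmarks: any placer output can be compared directly against a known optimum, and deviations are visible at a glance in a plot.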

Sample application: Evaluation of tools (cont'd)

3. Example flow

[Flow diagram: a benchmark, parameters and a placer produce a placement; an evaluator produces an evaluation from it; a post-processor performs post-processing.]

A script in bX serves as a template describing an experiment, and can be saved and shared.

Scripts are instantiated by defining the individual components of the script.

Flows are instantiated scripts.

Flows can be re-executed to reproduce results.
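The script/flow distinction above can be sketched as follows. This is hypothetical Python, not bX's actual interface (the `Script` and `Flow` classes and slot names are invented): a script is a template with named component slots, and a flow is a script whose slots have been bound to concrete components, so it can be re-executed to reproduce results.

```python
# Hypothetical sketch of bX-style scripts and flows (names invented).
# A Script is a reusable template naming the components of an experiment;
# a Flow is a Script instantiated with concrete tools and inputs.

class Script:
    def __init__(self, name, slots):
        self.name = name
        self.slots = slots  # ordered component names, e.g. "placer"

    def instantiate(self, **bindings):
        missing = [s for s in self.slots if s not in bindings]
        if missing:
            raise ValueError(f"unbound slots: {missing}")
        return Flow(self, bindings)

class Flow:
    def __init__(self, script, bindings):
        self.script = script
        self.bindings = bindings

    def run(self):
        # Re-executing the flow reproduces the experiment: each stage
        # consumes the previous stage's output.
        result = self.bindings["benchmark"]
        for stage in ("placer", "evaluator", "post-processor"):
            result = self.bindings[stage](result)
        return result

placement_eval = Script("placement evaluation",
                        ["benchmark", "placer", "evaluator", "post-processor"])
flow = placement_eval.instantiate(
    benchmark="PEKO-01",
    placer=lambda bench: f"placement({bench})",
    evaluator=lambda pl: f"hpwl({pl})",
    **{"post-processor": lambda ev: f"report({ev})"})
print(flow.run())  # report(hpwl(placement(PEKO-01)))
```

Saving the `Script` and sharing it lets someone else bind the same slots to their own tools; re-running a saved `Flow` reproduces the original experiment.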

Flow parameters: placer = Capo, benchmark = PEKO, parameters = (default), evaluator = overlap/legality & wirelength, post-processor = placement map.

After completion, the results of the jobs will be automatically posted online. In the case of the placement job, the results include wirelength and runtime.
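The 'overlap/legality & wirelength' evaluator used throughout these flows can be illustrated with a minimal sketch (hypothetical code with invented names; bX's real evaluator and file formats are not shown): it checks that no two cells occupy the same site and sums half-perimeter wirelength over the nets.

```python
# Minimal sketch of an overlap/legality and wirelength evaluator
# (illustrative only; not bX's actual evaluator).

def evaluate(placement, nets):
    """placement: cell -> (x, y) site; nets: list of cell lists."""
    sites = list(placement.values())
    overlaps = len(sites) - len(set(sites))  # cells sharing a site
    wirelength = 0
    for net in nets:
        xs = [placement[c][0] for c in net]
        ys = [placement[c][1] for c in net]
        wirelength += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return {"legal": overlaps == 0, "overlaps": overlaps,
            "hpwl": wirelength}

placement = {"a": (0, 0), "b": (2, 1), "c": (2, 1)}  # b and c overlap
report = evaluate(placement, [["a", "b"], ["b", "c"]])
print(report)  # {'legal': False, 'overlaps': 1, 'hpwl': 3}
```

A KraftWerk-style placement would typically fail the legality check here, which is exactly why the later flows insert a legalization step before evaluation.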

If we swapped Capo with Dragon: only the placer changes; the PEKO benchmark, default parameters, overlap/legality & wirelength evaluator and placement-map post-processor stay the same.

Modify the flow: replace the PEKO benchmark with a grid benchmark, and the placement-map post-processor with a grid-graph post-processor; the Capo placer, default parameters and overlap/legality & wirelength evaluator are unchanged.

Swap Capo with Dragon: again only the placer changes; the grid benchmark, default parameters, evaluator and grid-graph post-processor remain.

Back on PEKO with a congestion-map post-processor, swap the placer to KraftWerk. Because KraftWerk placements typically contain cell overlaps, a legalization step is inserted into the flow between placement and evaluation.

The legalizer is instantiated with DOMINO. A router and a routing step can then be appended after legalization, extending the flow from placement alone to placement plus routing.

Finally, several evaluators can be attached to one flow: evaluator1 reports overlap/legality and wirelength, evaluator2 reports routability, and evaluator3 performs timing analysis.

Future Focus

1. Easy deployment
• downloadable bX distribution, in the form of a binary or installation package

Future Focus (cont'd)

2. Interpretation of results
• multiple views and query support
• for example: 'show all results for solver S'; 'show the hardest benchmarks for solver S'; 'has the solution quality decreased for benchmark B since the upload of the new version of solver S?'
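The three example queries are straightforward to sketch over a table of result records. The schema and field names below are invented for illustration (bX's actual storage is not shown), and 'hardest' is interpreted here as highest reported wirelength:

```python
# Hypothetical result records and the three example queries from the slide
# (field names invented; "hardest" taken to mean highest wirelength).
results = [
    {"solver": "Capo", "benchmark": "PEKO-01", "version": 1, "hpwl": 105},
    {"solver": "Capo", "benchmark": "PEKO-01", "version": 2, "hpwl": 112},
    {"solver": "Capo", "benchmark": "grid-4", "version": 2, "hpwl": 24},
    {"solver": "Dragon", "benchmark": "PEKO-01", "version": 1, "hpwl": 99},
]

def all_results(solver):
    # 'show all results for solver S'
    return [r for r in results if r["solver"] == solver]

def hardest_benchmarks(solver):
    # 'show the hardest benchmarks for solver S'
    return sorted(all_results(solver), key=lambda r: -r["hpwl"])

def quality_decreased(solver, benchmark):
    # 'has the solution quality decreased for benchmark B since the
    #  upload of the new version of solver S?'
    runs = sorted((r for r in all_results(solver)
                   if r["benchmark"] == benchmark),
                  key=lambda r: r["version"])
    return len(runs) >= 2 and runs[-1]["hpwl"] > runs[0]["hpwl"]

print(len(all_results("Capo")))                    # 3
print(hardest_benchmarks("Capo")[0]["benchmark"])  # PEKO-01
print(quality_decreased("Capo", "PEKO-01"))        # True (112 > 105)
```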

Future Focus (cont'd)

3. Type checking
• MIME-like affinity between solvers and benchmarks: compatibility checks; useful for performing queries on different 'families'
• 'learning' of new file types
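A MIME-like affinity check could look like this (purely illustrative; the type strings and tool names are invented): each benchmark carries a type, each solver declares which types it accepts, and the repository can reject incompatible pairings before launching a job.

```python
# Illustrative MIME-like typing of benchmarks and solvers (names invented).

BENCHMARK_TYPES = {
    "PEKO-01": "netlist/bookshelf",
    "grid-4": "netlist/grid",
    "route-03": "routing/global",
}

SOLVER_ACCEPTS = {
    "Capo": {"netlist/bookshelf", "netlist/grid"},
    "Dragon": {"netlist/bookshelf"},
}

def compatible(solver, benchmark):
    # Compatibility check: does the solver accept the benchmark's type?
    return BENCHMARK_TYPES[benchmark] in SOLVER_ACCEPTS[solver]

print(compatible("Capo", "grid-4"))    # True
print(compatible("Dragon", "grid-4"))  # False
```

The same type tags support queries over 'families' of inputs, e.g. selecting every benchmark whose type starts with `netlist/`.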

Future Focus (cont'd)

4. GSRC Bookshelf
• populate bX with implementations from the Bookshelf: still the same 'one-stop shop', except that it will be a live repository

Future Focus (cont'd)

5. OpenAccess
• a method of communicating data between jobs: provides interoperability between tools
• single 'design-through-manufacturing' data model

Contact info, links

For more info or source code: [email protected]. Feedback and comments are appreciated.

OpenAccess: www.openeda.org, www.cadence.com/feature/open_access.html
GSRC Bookshelf: www.gigascale.org/bookshelf

Thanks!