Williamson arm validation metrics
Verification Metrics
Dave Williamson, CPU Verification and Modeling Manager
Austin Design Center
June 2006
Verification Metrics: Why do we care?
Predicting functional closure of a design is hard
Design verification is typically the critical path
CPU design projects rarely complete on schedule
Cost of failure to predict design closure is significant
Two key types of metrics

Verification test plan based metrics:
- Amount of direct tests completed
- Amount of random testing completed
- Number of assertions written
- Amount of functional coverage written and hit
- Verification reviews completed

Health of the design metrics:
- Simulation passing rates
- Bug rate
- Code stability
- Design reviews completed
Challenges and limitations

Limitations of test plan based metrics:
- Will give a best-case answer for the completion date
- The plan will grow as testing continues

Limitations of health of the design based metrics:
- Can give false impressions if used independently of test plan metrics
- Require good historical data on a similar project for proper interpretation

General concerns to be aware of for all metrics:
- What you measure will affect what you do
- Gathering metrics is not free
- Historical data can be misleading
- Don't be a slave to the metrics: they are a great tool, but not the complete answer
Bug rate example

[Chart: Bug History. Left axis: Total Bug Count (0 to 1200); right axis: Bug Rate Rolling Average (0 to 20); x-axis: Week number (1 to 113). Series: Total Bug Count and Weekly Bug Count (4-week rolling average). Annotation: knee in curve.]
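The 4-week rolling average plotted in the chart can be computed with a simple windowed sum. A minimal sketch; the weekly bug counts here are made up for illustration, not project data:

```python
from collections import deque

def rolling_average(weekly_counts, window=4):
    """Rolling average of weekly bug counts over the last `window` weeks.

    Early weeks average over however many weeks exist so far.
    """
    averages = []
    recent = deque(maxlen=window)  # keeps only the last `window` counts
    for count in weekly_counts:
        recent.append(count)
        averages.append(sum(recent) / len(recent))
    return averages

# Hypothetical weekly bug counts rising to a peak, then tailing off
weekly = [3, 8, 12, 17, 14, 9, 5, 2]
print(rolling_average(weekly))
# The averaged series smooths week-to-week noise, making the
# "knee in the curve" easier to spot than in the raw counts.
```

The knee appears where this smoothed rate starts falling steadily, which is the signal the slide uses to judge design closure.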
Bug rate by unit example

[Chart: Bug breakdown per design unit. y-axis: bug count (0 to 300); x-axis: Week number (1 to 113).]
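Producing the per-unit breakdown behind a chart like this is a simple grouping over the bug log. A minimal sketch with hypothetical unit names and records (a real flow would pull these from the bug tracker):

```python
from collections import defaultdict

# Hypothetical bug records: (week_filed, design_unit)
bugs = [
    (1, "fetch"), (1, "decode"), (2, "lsu"),
    (2, "fetch"), (3, "lsu"), (3, "fetch"),
]

def bugs_per_unit(records):
    """Total bug count for each design unit."""
    counts = defaultdict(int)
    for _week, unit in records:
        counts[unit] += 1
    return dict(counts)

print(bugs_per_unit(bugs))
```

A per-unit view like this shows which blocks are still churning out bugs when the overall curve has flattened, so late-closing units are not hidden by the project-wide total.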
Functional Coverage closure example
New coverage points added
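Coverage closure is typically reported as the percentage of functional coverage points hit, and, as this slide's point about newly added coverage suggests, defining new points can push the percentage down even while the absolute hit count rises. A minimal sketch with hypothetical point counts:

```python
def coverage_closure(points_hit, points_total):
    """Percentage of defined functional coverage points hit so far."""
    return 100.0 * points_hit / points_total if points_total else 0.0

# Hypothetical week-by-week progress: (points hit, total points defined).
# New coverage points are added in the third week.
history = [(400, 500), (450, 500), (460, 600)]
for hit, total in history:
    print(f"{hit}/{total} points hit = {coverage_closure(hit, total):.1f}%")
```

In this example the third week hits more points than the second, yet the closure percentage drops because the denominator grew, which is exactly why test-plan-based metrics give a best-case completion estimate.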