Verifica e Validazione Automatica di Sistemi Complessi
(Automatic Verification and Validation of Complex Systems)
Enrico Tronci, Dipartimento di Informatica, Università di Roma “La Sapienza”, Via Salaria 113,
00198 Roma, Italy, [email protected], http://www.dsi.uniroma1.it/~tronci
Workshop on Research and Innovations NEXT
20 May 2005
Automatic Verification: A Money Saver
Testing without automation tends to discover errors towards the end of the design flow. Error fixing is very expensive at that point and may delay product release. Methods to discover errors as soon as possible are needed.
[Chart: errors caught (percent) vs. relative cost to fix (1X, 3-6X, 10X, 15-40X, 30-70X, 40-1000X), from early development to implementation, with and without automated testing. Source: Mercury Interactive, Siebel, Siemens]
Model Checking Game
Inputs to the Model Checker (equivalent to exhaustive testing):
• Sys: the system model (VHDL, Verilog, C, C++, Java, MATLAB, Simulink, …)
• BAD: the undesired behaviors (CTL, CTL*, LTL, …)
Outputs:
• PASS: no sequence of events (states) can possibly lead to an undesired state.
• FAIL: a counterexample showing what went wrong, i.e. a sequence of events (states) leading to an undesired state.
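The PASS/FAIL game above can be sketched as explicit-state reachability analysis: exhaustively explore every state the system can reach and, on hitting a bad one, rebuild the event sequence that led there. This is a minimal sketch on a toy transition system (the counter example below is invented for illustration, not taken from the slides):

```python
from collections import deque

def model_check(init, next_states, is_bad):
    """Exhaustive (breadth-first) exploration of the reachable states.
    Returns ("PASS", None) if no bad state is reachable, or
    ("FAIL", counterexample), a state sequence from init to a bad state."""
    parent = {init: None}
    frontier = deque([init])
    while frontier:
        s = frontier.popleft()
        if is_bad(s):
            # Rebuild the counterexample by walking parent links back to init.
            trace = []
            while s is not None:
                trace.append(s)
                s = parent[s]
            return "FAIL", list(reversed(trace))
        for t in next_states(s):
            if t not in parent:
                parent[t] = s
                frontier.append(t)
    return "PASS", None

# Toy system (hypothetical): a counter mod 5 that may step by +1 or +2
# and must never reach the value 4.
verdict, trace = model_check(
    0,
    lambda s: [(s + 1) % 5, (s + 2) % 5],
    lambda s: s == 4,
)
print(verdict, trace)  # FAIL [0, 2, 4]
```

Breadth-first order makes the returned counterexample a shortest one, which is what makes model-checker fail traces readable.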
Mutual Exclusion (Mutex)
[Diagram: two symmetric automata, S1 with states n1 → t1 → c1 and S2 with states n2 → t2 → c2, plus a turn variable T ∈ {1, 2}. Process 1 may enter c1 when S2=t2 & T=1 or when S2=n2; process 2 symmetrically when S1=t1 & T=2 or when S1=n1. T moves from 1 to 2 when S1=n1 & S2=t2, and from 2 to 1 when S2=n2 & S1=t1.]
State space (all 18 combinations of S1, S2, T):
(n1, n2, 1) (t1, n2, 1) (c1, n2, 1) (n1, t2, 1) (t1, t2, 1) (c1, t2, 1)
(n1, c2, 1) (t1, c2, 1) (c1, c2, 1) (n1, n2, 2) (t1, n2, 2) (c1, n2, 2)
(n1, t2, 2) (t1, t2, 2) (c1, t2, 2) (n1, c2, 2) (t1, c2, 2) (c1, c2, 2)
SPEC
Mutual exclusion: AG (S1 != c1 | S2 != c2) … true
No starvation for S1: AG (S1 = t1 --> AF (S1 = c1)) … true
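The mutual exclusion property can be checked by brute force in a few lines. The sketch below encodes one possible reading of the flattened diagram (the exact transition guards are reconstructed assumptions) and exhaustively explores the states reachable from (n1, n2, 1), exactly what a model checker does symbolically:

```python
def successors(state):
    """One interleaved step of the mutex system; guards are an assumed
    reading of the slide's diagram, not a verified transcription."""
    s1, s2, t = state
    succs = set()
    # Process 1: n1 -> t1 -> c1 -> n1
    if s1 == "n1":
        succs.add(("t1", s2, t))
    elif s1 == "t1" and ((s2 == "t2" and t == 1) or s2 == "n2"):
        succs.add(("c1", s2, t))
    elif s1 == "c1":
        succs.add(("n1", s2, t))
    # Process 2 (symmetric): n2 -> t2 -> c2 -> n2
    if s2 == "n2":
        succs.add((s1, "t2", t))
    elif s2 == "t2" and ((s1 == "t1" and t == 2) or s1 == "n1"):
        succs.add((s1, "c2", t))
    elif s2 == "c2":
        succs.add((s1, "n2", t))
    # Turn variable updates from the diagram.
    if s1 == "n1" and s2 == "t2":
        succs.add((s1, s2, 2))
    if s2 == "n2" and s1 == "t1":
        succs.add((s1, s2, 1))
    return succs

# Exhaustive exploration from the initial state (n1, n2, 1).
reach, todo = set(), [("n1", "n2", 1)]
while todo:
    s = todo.pop()
    if s not in reach:
        reach.add(s)
        todo.extend(successors(s))

# Mutual exclusion: AG (S1 != c1 | S2 != c2)
mutex_ok = all(not (s1 == "c1" and s2 == "c2") for s1, s2, _ in reach)
print(len(reach), mutex_ok)
```

Under these guards no state with S1 = c1 and S2 = c2 is reachable, matching the SMV verdict on the first specification; the AF (liveness) property needs fairness machinery that this safety-only sketch omits.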
Mutex (arbitrary initial state)
Mutual exclusion: AG (S1 != c1 | S2 != c2) …
No starvation for S1: AG (S1 = t1 --> AF (S1 = c1)) …
[Diagram: the same two automata and turn variable as above, now checked from an arbitrary initial state.]
SMV output (mutex)
-- specification AG (S1 != c1 | S2 != c2) is true
-- specification AG (S1 = t1 -> AF S1 = c1) is true
resources used:
user time: 0.02 s, system time: 0.04 s
BDD nodes allocated: 635
Bytes allocated: 1245184
BDD nodes representing transition relation: 31 + 6
Hybrid Systems
Hybrid Systems are systems with both discrete and continuous state variables. Typically, requirements analysis for embedded software/hardware leads to the verification of hybrid systems.
Examples of hybrid systems:
• Industrial plants
• Automotive systems (the cost of software in new cars is comparable with that of the mechanics)
• Avionics
• Biological models
• …
Gas Turbine System
[Diagram: a Controller drives the Gas Turbine (Turbogas) via the fuel valve opening FG102; the turbine is subject to disturbances (electric users, parameter variations, etc.) and feeds its measured outputs Vrot, Texh, Pel, Pmc back to the controller.]
Vrot: turbine rotation speed
Texh: exhaust smoke temperature
Pel: generated electric power
Pmc: compressor pressure
PLAN
• Build a discrete time model of the ICARO Turbogas Control System.
• Code the system model with the Murphi verifier. This is very similar to simulation code, only more abstract because of model checking limitations (state explosion).
• Run verification experiments.
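To illustrate what "similar to simulation code, only more abstract" means, here is a toy discrete-time step of the kind one writes for a model checker. Every name, range and dynamic below is an invented placeholder, not the actual ICARO Turbogas model; only the 10 ms time step, MAX_D_U and the 40-120 speed range come from the slides:

```python
DT = 0.01          # 10 ms time step (100 Hz sampling), as in the fail traces
MAX_D_U = 2500.0   # max rate of change of the electric user demand (KW/sec)

def step(vrot, demand, d_demand):
    """One discrete-time step: the disturbance (user demand) may change by at
    most MAX_D_U * DT per step, and the rotation speed relaxes toward a level
    set by the demand (toy first-order dynamics, invented for illustration)."""
    d = max(-MAX_D_U * DT, min(MAX_D_U * DT, d_demand))
    demand = demand + d
    target = 40.0 + demand / 100.0        # invented demand-to-speed map
    vrot = vrot + 0.1 * (target - vrot)   # invented plant time constant
    return vrot, demand

# A model checker would explore *all* disturbance choices at every step;
# here we just simulate the single worst-case ramp and assert the spec.
vrot, demand = 100.0, 5000.0
for _ in range(100):
    vrot, demand = step(vrot, demand, MAX_D_U * DT)  # worst-case ramp
    assert 40.0 <= vrot <= 120.0  # allowed range for rotation speed
print(round(vrot, 2))
```

The abstraction the slide alludes to lies in replacing this one simulated trajectory with a nondeterministic choice of d_demand at each step, so the verifier covers every disturbance sequence up to the horizon.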
Experimental Results

MAX_D_U (KW/sec)  Reachable States  Rules Fired  Diameter  CPU (sec)  Result
1000              2,246,328         6,738,984    12904     16988.18   PASS
1750              7,492,389         22,477,167   7423      54012.18   PASS
2500              1,739,719         5,186,047    1533      12548.25   FAIL
5000              36,801            109,015      804       271.77     FAIL

Results on an Intel Pentium 4, 2 GHz Linux PC with 512 MB RAM. Murphi options: -b, -c, --cache, -m350
Fail trace: MAX_D_U = 2500 KW/sec, 10 ms time step (100 Hz sampling frequency)
[Plots: electric user demand (KW) and rotation speed (percentage of max = 22500 rpm) over time; allowed range for rotation speed: 40-120.]
Fail trace: MAX_D_U = 5000 KW/sec, 10 ms time step (100 Hz sampling frequency)
[Plots: electric user demand (KW) and rotation speed (percentage of max = 22500 rpm) over time; allowed range for rotation speed: 40-120.]
Probabilistic Model Checking (1)
Sometimes we can associate a probability with each transition. In such cases reachability analysis becomes the task of computing the stationary distribution of a Markov Chain. This can be done using a Probabilistic Model Checker (the state space is too big for explicit matrices).
[Diagram: a 3-state Markov chain over states 0, 1, 2 with transition probabilities 0.4, 0.6, 0.3, 0.7, 0.8, 0.2.]
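For a chain this small the stationary distribution can be computed directly by power iteration, the fixed point of pushing a distribution through the chain. The edge assignment below is one assumed reading of the figure (the picture only lists the six probabilities, not which edge carries which):

```python
# Assumed edges: 0 stays/moves to 1 with 0.4/0.6, 1 stays/moves to 2 with
# 0.3/0.7, 2 stays/moves to 0 with 0.2/0.8. Rows are source states.
P = [
    [0.4, 0.6, 0.0],
    [0.0, 0.3, 0.7],
    [0.8, 0.0, 0.2],
]

def stationary(P, iters=10_000):
    """Power iteration: repeatedly push a distribution through the chain;
    the limit pi satisfies pi = pi * P (row-vector convention)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

pi = stationary(P)
print([round(p, 4) for p in pi])  # [0.3836, 0.3288, 0.2877], i.e. (28, 24, 21)/73
```

A probabilistic model checker does the same computation symbolically (e.g. with MTBDDs) precisely because, at millions of states, the matrix P can never be stored explicitly.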
Finite Horizon Markov Chain Analysis … of our turbogas

MAX_D_U (KW/sec)  Visited States  Rules Fired  Horizon  CPU time (s)  Probability of violating spec
2500              3,018,970       8,971,839    1600     68562         7.373292e-05
3500              2,226,036       6,602,763    1400     50263         1.076644e-04
4500              1,834,684       5,439,327    1300     41403         9.957147e-05
5000              83,189          246,285      900      2212          3.984375e-03
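The "probability of violating spec within horizon H" column can be understood on a toy chain: make the violating state absorbing and push the initial distribution forward H steps; the mass trapped in the absorbing state is the violation probability. The 3-state chain below is an invented illustration, not the turbogas model:

```python
BAD = 2  # index of the "spec violated" state, made absorbing
P = [
    [0.98, 0.019, 0.001],   # tiny per-step chance of drifting toward BAD
    [0.50, 0.490, 0.010],
    [0.00, 0.000, 1.000],   # BAD is absorbing: once violated, stays violated
]

def violation_probability(P, init, horizon):
    """Probability of reaching BAD within `horizon` steps from `init`."""
    dist = [0.0] * len(P)
    dist[init] = 1.0
    for _ in range(horizon):
        dist = [sum(dist[i] * P[i][j] for i in range(len(P)))
                for j in range(len(P))]
    return dist[BAD]

probs = [violation_probability(P, init=0, horizon=h) for h in (100, 400, 900)]
print(probs)
```

As in the table, a longer horizon can only increase the violation probability (the bad state is absorbing), which is why comparing rows with different horizons needs care.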
Obstructions
State explosion: the huge number of reachable states that large systems have.
Integration in the design flow: among other things, people devoted to verification, validation, specification and testing are needed.
Open Source Model Checkers
SMV, NuSMV (Carnegie Mellon University, IRST) [smv,VHDL / CTL]
SPIN (Bell Labs) [PROMELA (C like)/ LTL]
Murphi (Stanford, “La Sapienza”, L’Aquila) [Pascal like/assert() style]
VIS (Berkeley, Stanford, University of Colorado) [BLIF, Verilog/CTL, LTL]
PVS (Stanford) [PVS/PVS]
TVLA (Tel-Aviv) [TVLA/TVLA]
Java PathFinder (NASA) [Java Bytecode/LTL]
BLAST (Berkeley) [C/assert()]
Here are a few examples of open source model checkers.
Java Verification (BANDERA)
SAnToS Group at Kansas State University
Some Commercial Model Checkers
Cadence (Verilog, VHDL)
Synopsys (Verilog, VHDL)
Innologic (Verilog)
Telelogic (inside SDL suite)
Esterel
Coverity (C, C++)
Here are a few examples of commercial model checkers.
In House Model Checkers
FORTE (INTEL) [Verilog, VHDL/Temporal Logic]
SLAM (Microsoft) [C/assert()]
BEBOP (Microsoft) [C/assert()]
Rule Based (IBM) [Verilog, VHDL/CTL, LTL]
CANVAS (IBM) [Java/constraints-guarantees]
Verisoft (Bell Labs) [C/C]
Here are a few examples of in house model checkers.
Summing Up

Automatic Verification (reachability analysis) is a very useful tool for the design and analysis of complex systems such as digital hardware, embedded software and hybrid systems.

Automatic Verification allows us to:
• Decrease the probability of leaving undetected bugs in our design, thus increasing design quality.
• Speed up the testing/simulation process, thus decreasing costs and time-to-market.
• Detect errors early, thus decreasing design costs.
• Support exploration of more complex, hopefully more efficient, solutions by supporting their debugging.
Adoption Paths
• Integrating Automatic Verification in the design flow (to reach the state of the art)
• Custom Model Checker (for competitive advantage, to go beyond the state of the art)