Verification and Validation Overview

References:
• Schach, Object-Oriented and Classical Software Engineering, McGraw-Hill
• Pressman, Software Engineering: A Practitioner's Approach, McGraw-Hill
• Pfleeger, Software Engineering: Theory and Practice, Prentice Hall
• Davis, Software Requirements: Objects, Functions, and States, Prentice Hall
Purpose of V&V
• Groups of 3
• 2 minutes
• What is the purpose of verification and validation?
Purpose of V&V
• Give programmers information they can use to prevent faults
• Give management information to evaluate risk
• Provide software that is reasonably defect free
• Achieve a "testable" design (one that can be easily verified)
• Validate the software (demonstrate that it works)
Definitions-1 (note: usual definitions, but they do not match all authors or the 1990 IEEE glossary)
• Validation
– Evaluation of an object to demonstrate that it meets expectations. (Did we build the right system?)
• Verification
– Evaluation of an object to demonstrate that it meets its specification. (Did we build the system right?)
– Evaluation of the work product of a development phase to determine whether the product satisfies the conditions imposed at the start of the phase.
• Correct program
– Program matches its specification
• Correct specification
– Specification matches the client's intent
Definitions-2
• Error (a.k.a. mistake)
– A human activity that leads to the creation of a fault
– A human error results in a fault, which may, at runtime, result in a failure
– Kaner: "It's an error if the software doesn't do what the user intended"
• Fault (a.k.a. bug)
– May result in a failure
– A discrepancy between what something should contain (in order for failure to be impossible) and what it does contain
– The physical manifestation of an error
• Failure (a.k.a. symptom, problem, incident)
– Observable misbehavior
– Actual output does not match the expected output
– Can only happen when the thing is being used
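The error/fault/failure distinction can be made concrete in code. The sketch below (a hypothetical example, not from the slides) seeds a fault into a max-of-list function; the fault is always present, but a failure is only observable for inputs that exercise it:

```python
def largest(xs):
    """Return the largest element of a non-empty list (correct version)."""
    best = xs[0]
    for x in xs[1:]:
        if x > best:
            best = x
    return best

def largest_faulty(xs):
    """Same intent, but with a seeded fault."""
    best = 0                  # FAULT: should be xs[0]; wrong for all-negative lists
    for x in xs:
        if x > best:
            best = x
    return best

# The fault exists in both calls, but only the second produces a failure:
print(largest_faulty([3, 1, 2]))     # 3  -- no failure observed
print(largest_faulty([-3, -1, -2]))  # 0  -- failure: expected -1
```

The programmer's mistake (the error) created the bad initialization (the fault); the wrong output on the all-negative list (the failure) appears only when that input is used.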
Definitions-3
• Fault identification
– Process of determining what fault caused a failure
• Fault correction
– Process of changing a system to remove a fault
• Debugging
– The act of finding and fixing program errors
Definitions-4
• Testing
– The act of designing, debugging, and executing tests
• Test
– A sample execution to be examined
• Test case
– A particular set of inputs and the expected output
• Oracle
– Any means used to predict the outcome of a test
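One common kind of oracle is a trusted reference implementation. The hypothetical sketch below (not from the slides) checks a faster primality test against a slow but obviously-correct one, which supplies the expected output for each test case:

```python
def is_prime_fast(n):
    """Candidate implementation under test: trial division up to sqrt(n)."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def is_prime_oracle(n):
    """Slow but obviously-correct oracle: try every possible divisor."""
    return n >= 2 and all(n % d != 0 for d in range(2, n))

# Each test case pairs an input with the expected output; here the oracle
# computes the expected output for us.
for n in range(200):
    assert is_prime_fast(n) == is_prime_oracle(n), f"failure at {n}"
print("all 200 cases agree")
```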
Definitions-5
• Significant test case
– A test case with a high probability of detecting an error
– One test case may be more significant than another
• Significant test set
– A test set with a high probability of detecting an error
– A test set is more significant than another if the first is a superset of the second
– The number of test cases does not determine the significance
• Regression testing
– Rerun a test suite to see if
• a change fixed a bug
• a change introduced a new one
Definitions-6
• Let S be a relation: the specification of a program.
• Let P be the implementation of the program.
• R is the range, r ∈ R.
• D is the domain, d ∈ D.
• S(r, d): the specification.
• P(r, d): the implementation.

Definitions-7
• Failure
– P(r, d) but not S(r, d)
• Test case
– A pair (r, d) such that S(r, d)
• Test set T
– A finite set of test cases
• P passes T if ∀t ∈ T, t = (r, d): S(r, d) ⇒ P(r, d)
• T is ideal if (∃ d, r | S(r, d) ∧ ¬P(r, d)) ⇒ (∃ t ∈ T | t = (r′, d′) ∧ S(r′, d′) ∧ ¬P(r′, d′))
In groups, translate these into English
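One way to render the formal definitions concretely is as executable Python over a small finite domain. This is a hypothetical sketch (the max-of-a-pair specification and all names are invented for illustration): S and P are predicates over (r, d) pairs, and "P passes T" is checked exactly as defined.

```python
from itertools import product

# Domain D: pairs of small ints. Range R: ints.
D = list(product(range(-2, 3), repeat=2))

def S(r, d):
    """Specification relation: r is the maximum of the pair d."""
    return r == max(d)

def P_func(d):
    """The implementation as a function."""
    a, b = d
    return a if a > b else b

def P(r, d):
    """The implementation viewed as a relation."""
    return r == P_func(d)

def passes(T):
    """P passes T if for every t = (r, d) in T, S(r, d) implies P(r, d)."""
    return all(P(r, d) for (r, d) in T if S(r, d))

# A test set: every pair (r, d) with S(r, d) over our small domain.
T = [(max(d), d) for d in D]
print(passes(T))   # P agrees with S on every test case
```

Because this T covers the whole (tiny) domain, it is also an ideal test set: if P could fail anywhere, some test in T would expose it.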
Parnas: "There are only three engineering techniques for verification"
• Mathematical analysis
– Works well for continuous functions (software engineering is more difficult than other engineering)
– Cannot interpolate reliably for discrete functions
• Exhaustive case analysis
– Only possible for systems with a small state space
• Prolonged realistic testing
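Exhaustive case analysis becomes practical when the input space is small. A hypothetical sketch (function and oracle chosen for illustration, not from the slides): a days-in-month function has only a few thousand (year, month) cases, so every single one can be checked against an independent oracle:

```python
import calendar

def is_leap(y):
    return y % 4 == 0 and (y % 100 != 0 or y % 400 == 0)

def days_in_month(y, m):
    """Days in month m of year y (proleptic Gregorian calendar)."""
    if m == 2:
        return 29 if is_leap(y) else 28
    return 30 if m in (4, 6, 9, 11) else 31

# Exhaustive case analysis: check every input in the (small) space
# against the standard library, used here as the oracle.
for y in range(1, 3001):
    for m in range(1, 13):
        assert days_in_month(y, m) == calendar.monthrange(y, m)[1]
print("all", 3000 * 12, "cases verified")
```

Contrast this with, say, a 64-bit adder: 2^128 input combinations rule out exhaustive checking, which is exactly Parnas's point.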
Hierarchy of V&V Techniques
[Tree diagram, approximately:
V&V
  Dynamic techniques
    Testing
    Symbolic execution
  Static techniques (static analysis)
    Informal analysis
      Review
        Inspection
        Walkthrough
    Formal analysis
      Proofs
      Model checking]
Types of Faults
• In groups
• 3 minutes
• List all the types and causes of faults: what can go wrong in the development process?
Some Types of Faults
• Algorithmic: algorithm or logic does not produce the proper output for the given input
• Syntax: improper use of language constructs
• Computation (precision): formula's implementation is wrong, or result is not to the correct degree of accuracy
• Documentation: documentation does not match what the program does
• Stress (overload): data structures filled past capacity
• Capacity: system's performance unacceptable as activity reaches its specified limit
• Timing: code coordinating events is inadequate
• Throughput: system does not perform at the speed required
• Recovery: failure encountered and system does not behave correctly
Causes of Faults (by phase)
• Requirements: incorrect or missing requirements
• System design: incorrect translation; incorrect design specification
• Program design: incorrect design specification
• Program implementation: incorrect design interpretation; incorrect semantics; incorrect documentation
• Unit testing: incomplete testing
• System testing: new faults introduced while correcting others
Some Verification and Validation Techniques
[Diagram mapping techniques onto life-cycle phases (requirements, design, implementation, testing, operation, maintenance); techniques shown include reviews (walkthroughs/inspections), synthesis, model checking, correctness proofs, traditional testing, and runtime monitoring.]
Effectiveness of Fault Detection Techniques
[Bar chart: percent of errors discovered (0-70%) in each phase (requirements, design, coding, documentation) by each technique: prototyping, requirements review, design review, code inspection, unit testing.]
Groups: 2 min. What does this slide say?
Error Estimates
• 3 errors per 1000 keystrokes for trained typists
• 1 bug per 100 lines of code (after publication)
• 1.5 bugs per line of code (all together, including typing errors)
• Testing is 30-90% of the cost of a product
• Probability of correctly changing a program:
– 50% if fewer than 10 lines
– 20% if between 10 and 50 lines