A Practical Approach to Verification and Validation
(Transcript)

Dr. Eugene W. P. Bingue, U.S. Navy
Dr. David A. Cook, Stephen F. Austin State University
[email protected]
The Three "Domains" of V&V

[Diagram: three domains, each paired with a V&V question and activity]
• PROBLEM DOMAIN – "Fit for intended use?" (Requirements Validation)
• TOOL DOMAIN – "Built well?" (Verification)
• USER DOMAIN – "Used right?" (Program Validation)
Verification and Validation – Definitions
• Verification – The process of determining that a model implementation accurately represents the developer's conceptual description and specifications. "Did I build the system right?"
• Validation – The process of determining the manner and degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model. "Did I build the right system?"
V&V vs. Testing
• Testing is a discrete phase
• VV&A should occur during each phase

"The first mistake that people make is thinking that the testing team is responsible for assuring quality."
Brian Marick, as quoted in Pressman, Software Engineering: A Practitioner's Approach
Non-Simulation

[Diagram: informal acceptance claims for non-simulation software]
• "It gives the right solutions!"
• "It works well, and as I expected!"
Accreditation (for simulations)
• Accreditation is the official certification that a model or simulation is acceptable for use for a specific application
• Three steps:
  – Identify gaps in the program (what it WON'T do)
  – Assess the risks
  – Recommend acceptable uses, and list limitations
Why V&V?
“When Quality is vital, independent checks are necessary, not because people are untrustworthy but because they are human.”
Watts Humphrey, Managing the Software Process
Mature Process for System Development

[Diagram: the V-model, annotated with the users', designers', developers', and V&V views]
• Left (development) leg: User's Views → Reqts (Rqmts. Analysis) → System Spec → System Design → Module Design → module coding → Coded Units
• Right (testing) leg: unit testing → Modules → System & integration testing → Integ System → Acceptance testing → Delivered System
• Each development phase produces the plan for its matching test phase: Unit Test Plan, Integ Test Plan, Accept Test Plan, User Trial Plan

Source: Ould and Unwin, Testing in Software Development, 1988
VV&A: Right Product built Right

[Diagram: the same V-model, overlaid with V&V activities at each phase]
• Inspect CM practices; Formal Document Review
• Inspect Conceptual Model; Inspect Requirements; Inspect Design; Inspect Code
• Inspect Test Plans / Test Results; Functionality Testing
• Validate Equations/Algorithms; Verify Equations/Algorithms
• VV&C Input/Default Data
• Phases shown: User's Views → Reqts → System Spec → System Design → Module Design → module coding → Coded Units → unit testing → Modules → integration testing → Integ System → System acceptance testing → Delivered System (with Unit, Integ, and Accept Test Plans and the User Trial Plan)
Basic (and Practical) Taxonomy for V&V

Potential Verification & Validation Techniques

Informal:
• Audit • Desk Checking • Face Validation • Inspections • Reviews • Turing Test • Walkthroughs

Static:
• Cause-Effect Graphing
• Control Analysis (Calling Structure, Concurrent Process, Control Flow, State Transition)
• Data Analysis (Data Dependency, Data Flow)
• Fault/Failure Analysis
• Interface Analysis (Model Interface, User Interface)
• Semantic Analysis • Structural Analysis • Symbolic Evaluation • Syntax Analysis • Traceability Assessment

Dynamic:
• Acceptance Testing • Alpha Testing • Assertion Checking • Beta Testing • Bottom-Up Testing • Comparison Testing
• Compliance Testing (Authorization, Performance, Security, Standards)
• Debugging
• Execution Testing (Monitoring, Profiling, Tracing)
• Fault/Failure Insertion Testing • Field Testing • Functional (Black-Box) Testing • Graphical Comparisons
• Interface Testing (Data, Model, User)
• Object-Flow Testing • Partition Testing • Predictive Validation • Product Testing • Regression Testing • Sensitivity Analysis
• Special Input Testing (Boundary Value, Equivalence Partitioning, Extreme Input, Invalid Input, Real-Time Input, Self-Driven Input, Stress, Trace-Driven Input)
• Statistical Techniques
• Structural (White-Box) Testing (Branch, Condition, Data Flow, Loop, Path, Statement)
• Submodel/Module Testing • Symbolic Debugging • Top-Down Testing • Visualization/Animation

Formal:
• Induction • Inference • Logical Deduction • Inductive Assertions • Lambda Calculus • Predicate Calculus • Predicate Transformation • Proof of Correctness

Source: DMSO Best Practices
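To make one branch of the taxonomy concrete, here is a minimal sketch of special input testing: boundary-value and equivalence-partitioning cases for a hypothetical validator. The function, its name, and the chosen range are illustrative, not from the original briefing.

```python
# Special-input testing sketch: boundary values and equivalence partitions
# for a hypothetical validator accepting percentages in [0, 100].

def accept_percentage(value: int) -> bool:
    """Return True only for values in the valid partition [0, 100]."""
    return 0 <= value <= 100

# Equivalence partitions: below range, in range, above range.
# Boundary values: each partition edge and its nearest neighbor.
cases = {
    -1: False,   # boundary: just below the valid partition
    0: True,     # boundary: lower edge
    50: True,    # representative of the valid partition
    100: True,   # boundary: upper edge
    101: False,  # boundary: just above the valid partition
}

for value, expected in cases.items():
    assert accept_percentage(value) == expected, value
```

The point of the technique is case selection, not the code: most input-handling defects cluster at partition edges, so the five cases above exercise more failure modes than fifty random in-range values would.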
V&V Techniques
• Informal V&V techniques are among the most commonly used. They are called informal because their tools and approaches rely heavily on human reasoning and subjectivity, without stringent mathematical formalism.
• Static V&V techniques assess the accuracy of the static model design and source code. Static techniques do not require machine execution of the model, although mental execution can be used. These techniques are very popular and widely used, and many automated tools are available to assist in the V&V process. Static techniques can reveal a variety of information about the structure of the model, the modeling techniques used, data and control flow within the model, and syntactical accuracy (Whitner and Balci, 1989).
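As a small illustration of static analysis, the sketch below performs syntax analysis and calling-structure extraction on a code fragment without ever executing it, using Python's standard `ast` module. The two model functions are invented for the example.

```python
# Static-technique sketch: syntax analysis plus calling-structure
# extraction via the ast module -- the analyzed code is never run.
import ast

source = """
def update(state):
    return integrate(state)

def integrate(state):
    return state + 1
"""

tree = ast.parse(source)          # syntax analysis: raises on malformed code

calls = {}                        # calling structure: who calls whom
for node in ast.walk(tree):
    if isinstance(node, ast.FunctionDef):
        callees = [c.func.id for c in ast.walk(node)
                   if isinstance(c, ast.Call) and isinstance(c.func, ast.Name)]
        calls[node.name] = callees

print(calls)  # {'update': ['integrate'], 'integrate': []}
```

Real static-analysis tools build far richer models (control flow, data dependency), but the principle is the same: everything here is derived from the program text alone.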
V&V Techniques (continued)
• Dynamic V&V techniques require model execution; they evaluate the model based on its execution behavior. Most dynamic V&V techniques require model instrumentation: the insertion of additional code (probes or stubs) into the executable model to collect information about model behavior during execution.
  – Dynamic V&V techniques are usually applied in three steps:
    • the executable model is instrumented
    • the instrumented model is executed
    • the model output is analyzed, and dynamic model behavior is evaluated
• Formal V&V techniques (or formal methods) are based on formal mathematical proofs of correctness and are the most thorough means of model V&V. The successful application of formal methods requires the model development process to be well defined and structured. Formal methods should be applied early in the model development process to achieve maximum benefit. Because formal techniques require significant effort, they are best applied to complex problems that cannot be handled by simpler methods.
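The three instrumentation steps can be sketched in a few lines: a decorator acts as the inserted probe, the instrumented model is executed, and the collected trace is analyzed afterward. The model function and its growth-rate behavior are invented for the example.

```python
# Dynamic-technique sketch: instrument (step 1), execute (step 2),
# analyze the collected trace (step 3).
import functools

trace = []  # information collected by the probes during execution

def probe(fn):
    """Step 1: instrument -- wrap a function to record each call."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        trace.append((fn.__name__, args, result))
        return result
    return wrapper

@probe
def step(population, rate):
    """A stand-in 'model' update rule."""
    return population + rate * population

# Step 2: execute the instrumented model.
p = 100.0
for _ in range(3):
    p = step(p, 0.1)

# Step 3: analyze dynamic behavior from the trace.
assert len(trace) == 3
assert all(result > args[0] for name, args, result in trace)  # monotone growth
```

In practice the probes would feed a profiler, coverage tool, or assertion checker rather than a list, but the shape of the workflow is the same.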
Verification and Validation Technique Taxonomy

Informal Techniques:
• audit • desk check • face validation • inspection • review • Turing test • walk-through

Static Techniques:
• control analyses (calling structure, cause-effect graphing, concurrent process, control flow, state transition)
• data analyses (data dependency, data flow)
• fault/failure analysis
• interface analyses (model interface, user interface)
• semantic analysis • structural analysis • symbolic evaluation • syntax analysis • traceability assessment

Dynamic Techniques:
• acceptance test • alpha test • assertion check • beta test
• compliance tests (authorization, performance, security, standards)
• bottom-up test • comparison test • debugging
• execution tests (monitor, profile, trace)
• fault/failure insertion test • field test • functional test (black-box test)
• interface tests (data, model, user)
• graphical comparison • object-flow test • partition test
• predictive validation • product test • regression test • sensitivity analysis
• special input tests (boundary value, equivalence partitioning, extreme input, invalid input, real-time input, self-driven input, stress, trace-driven input)
• structural tests (white-box tests: branch, condition, data flow, loop, path, statement)
• statistical techniques
• submodel/module test • symbolic debugging • top-down test • visualization/animation

Formal Techniques:
• induction • inference • logical deduction • inductive assertion
• lambda calculus • predicate calculus • predicate transformation
• proof of correctness
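Of the formal techniques listed, inductive assertion is the easiest to illustrate executably: a loop invariant is checked at every iteration, approximating at runtime what a formal proof would establish once and for all. The function and invariant below are an invented example, not from the briefing.

```python
# Inductive-assertion sketch: a loop invariant checked on every iteration.

def summed(n: int) -> int:
    """Return 0 + 1 + ... + n, checking the invariant total == i*(i+1)//2."""
    total = 0
    for i in range(n + 1):
        total += i
        # Inductive assertion: the invariant holds initially (i = 0) and is
        # preserved by each iteration, so it holds on exit -- which gives
        # the postcondition total == n*(n+1)//2.
        assert total == i * (i + 1) // 2
    return total

assert summed(10) == 55
```

A true formal proof discharges the assertion symbolically for all n; the runtime check only confirms it for the inputs actually tried, which is why formal methods are called the most thorough of the four families.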
What activities do you select?
• It depends upon
  – Available time
  – Available funds
  – Confidence in the development, the developers, and the process
  – Accreditation needs (very important for simulations)
  – Type of activity
  – User needs and desires
  – "Criticality" of the application
• Formally, you should document what activities you will perform in a V&V Plan
Lesson 1
• Identify intended uses of the product early
  – Create use cases, scenarios, or an SRS
  – Verify and validate the requirements. Do it again.
  – Keep the requirements separate and current.
  – Insist on a design (or future maintenance will be problematic).
• Plan for V&V early
• Insist on user involvement in V&V of requirements
Lesson 2
• Software Engineering 101
  – Ten 1,000-line programs are easier to V&V than one 10,000-line one
• Separate different classes of uses and users. Plan and design accordingly.
• You MUST have a design.
Lesson 3
• Determine acceptability criteria as early as possible
  – Determine how you will know when the product is "good enough"
  – Know what the user really needs: "perfect" vs. the "80% solution"
  – This is another way of saying "the requirements must be very clear" and "the user must agree with the developers as to what the requirements are"
Lesson 4
• Keep track of "complex" requirements
  – Accuracy
  – Fidelity
  – Speed
  – Response time
  – Interfaces
  – Interoperability
  – Real-time requirements
• You will need domain-specific expertise for these areas
Lesson 5
• Start the V&V early (which is a nice way of saying “FUND the V&V early”)
• Manage, organize and update the V&V artifacts
• Do not confuse V&V with testing
Lesson 6 – use a taxonomy
• Conceptual Model (SRS, briefing, conversation)
• Requirements (formal and informal)
• Equations and Algorithms
• Design (validity, coupling, cohesion)
• Code (documentation and coding standards)
• Equations / algorithms / dimensional analysis
• Test plans, test results
• Check input data / default values / constants
• Functionality check (final user approval)
• Configuration Management
• Documentation
• Risks
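One checklist item above, checking input data and default values, is easy to mechanize. The sketch below declares each parameter's valid range once and audits a configuration against it; the parameter names and ranges are illustrative assumptions, not from the briefing.

```python
# Input-data / default-value check sketch: declare ranges once, audit
# every configuration against them. All names and bounds are invented.

RANGES = {
    "time_step": (0.001, 1.0),     # seconds
    "max_iters": (1, 100000),
    "tolerance": (1e-12, 1e-2),
}

def audit(config):
    """Return (name, value) pairs that are missing or out of range."""
    problems = []
    for name, (low, high) in RANGES.items():
        value = config.get(name)
        if value is None or not (low <= value <= high):
            problems.append((name, value))
    return problems

good = {"time_step": 0.01, "max_iters": 500, "tolerance": 1e-6}
bad = {"time_step": 0.0, "max_iters": 500, "tolerance": 1e-6}

assert audit(good) == []
assert audit(bad) == [("time_step", 0.0)]
```

Keeping the ranges in one table also makes them part of the configuration-management baseline, which ties this item to the CM entries on the same checklist.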
Lesson 7
• Know the limits of YOUR expertise
• Know when to use
  – Subject Matter Experts (SMEs), who usually don't understand program development but can help you understand "the real world"
  – Statisticians and mathematicians
  – Domain experts, who can translate SME input into program requirements
  – Specialized domain-specific developers
Lesson 8
• Configuration Management is critical. Small changes to the program can invalidate V&V results.
• It is critical to re-evaluate program results and perform incremental V&V after any changes that might affect the validity of the program.
• Limit access to the code and requirements.
• Update requirements and design as needed.
• Save the test cases you use for V&V – you will need to reuse them frequently!
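The saved-test-cases advice is essentially a regression suite: store input/expected pairs once, then rerun them after every change. The sketch below uses an invented stand-in function and JSON-encoded cases to show the shape of that workflow.

```python
# Regression-suite sketch: saved V&V cases rerun after each change.
# The model function and its cases are illustrative only.
import json

def model(x):
    return 2 * x + 1   # stand-in for the program being maintained

# Saved V&V cases; in practice these live in version control as data files.
saved_cases = json.loads('[{"input": 0, "expected": 1},'
                         ' {"input": 3, "expected": 7},'
                         ' {"input": -2, "expected": -3}]')

def regression_suite():
    """Return the cases the current model gets wrong (empty means pass)."""
    return [c for c in saved_cases
            if model(c["input"]) != c["expected"]]

assert regression_suite() == []   # rerun after every change to the model
```

Because the cases are data rather than code, they survive refactorings of the program under test, which is exactly why they are worth keeping under configuration management alongside the source.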
Final Lessons
• The cost of V&V is a small part of overall development
• V&V saves you time and money
  – Without V&V, you get stuck in the "code – fix – code – fix" loop late in the development process
  – The risk of overall program failure increases dramatically without V&V
• V&V pays for itself and saves you $$s in decreased future maintenance costs