Software Inspections of
Requirements Specifications
Smita Chaganty
Brandon Vega
Overview
What are inspections?
Reviews and walkthroughs
Inspection teams
Steps to formal inspection
Detection methods
Usage Based Reading (UBR)
UBR vs. CBR (Checklist Based Reading)
Defects in use case models
Study & results
Importance of Requirements
Software Requirements Specifications (SRS) are used as input to planning, estimating, design, and testing
The SRS serves as a "contract" between customers and developers
In "use-case driven" development, the quality of requirements documents is especially important
What are Inspections? - 1
A means for detecting defects and violations of development standards
Inspections improve quality in software documents
A team of reviewers reads the SRS and identifies as many defects as possible
Defects are sent to the document's author for repair
What are Inspections? - 2
Objectives of an inspection:
verify that the software element(s) satisfy their specifications,
verify that the software element(s) conform to applicable standards,
identify deviations from standards and specifications, and
collect software engineering data (such as defect and effort data).
Reviews and Walkthroughs - 1
REVIEWS
Manual process
Multiple readers
Checks for anomalies and omissions
Representatives of stakeholders should participate in a review
Reviews and Walkthroughs - 2
WALKTHROUGH
Peer group review
Several people involved
Typically a walkthrough involves at least one person (usually the author)
Reviews are for consensus; walkthroughs are for training
Inspection team consists of…
Moderator – leads inspections, schedules meetings, and controls the meetings
Author – creates or maintains the product being inspected
Reader – describes the sections of the work product to the team as they proceed
Recorder – classifies and records defects and issues raised
Inspector – attempts to find errors in the product
Formal inspection consists of…
Planning
Overview meeting
Preparation
Inspection meeting
Causal analysis
Rework
Follow-up
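The phases of a formal inspection run in a fixed order. As a minimal illustration, they can be modeled as an ordered workflow; the class and names below are hypothetical, not part of any inspection standard.

```python
# Formal inspection phases, in the order they must occur.
PHASES = [
    "Planning",
    "Overview meeting",
    "Preparation",
    "Inspection meeting",
    "Causal analysis",
    "Rework",
    "Follow-up",
]

class Inspection:
    """Hypothetical sketch: tracks which phase an inspection is in."""

    def __init__(self):
        self.current = 0  # index into PHASES

    @property
    def phase(self):
        return PHASES[self.current]

    def advance(self):
        """Move to the next phase; phases run strictly in order."""
        if self.current < len(PHASES) - 1:
            self.current += 1
        return self.phase

insp = Inspection()
insp.advance()
print(insp.phase)  # Overview meeting
```

A real process tool would also attach participants, artifacts, and exit criteria to each phase; this sketch only captures the ordering.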
Detection Methods - 1
A set of procedures coupled with the assignment of responsibilities to individual reviewers
Methods:
Ad Hoc
Checklist
Scenario-based
Formal proofs of correctness (e.g., Z)
Detection Methods - 2
Ad Hoc
Informal, non-systematic detection technique
No explicit assignment of reviewer responsibility
Relies on the knowledge and experience of the inspector
Checklist
Most popular method
Reuses "lessons learned"
Defines reviewer responsibilities
Suggests ways for reviewers to identify defects
Detection Methods - 3
Checklist (continued)
Reviewers answer a list of questions based on knowledge from previous inspections
Beyond the questions themselves, no further guidance is provided (as with the ad hoc technique)
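Checklist-based reading can be operationalized very simply: the reviewer works through the fixed question list, and any question that cannot be answered "yes" is logged as a potential defect for the author. A hedged sketch, with question texts abbreviated from the example checklist shown on the next slides and data structures that are purely assumptions:

```python
# Abbreviated checklist questions (illustrative subset).
checklist = [
    "Do the class names reflect their responsibilities?",
    "Are all attributes private?",
    "Does each method have unit tests defined?",
]

def inspect_with_checklist(answers):
    """answers maps question -> bool (True = the document passes).

    Any question answered False, or not answered at all, is logged
    as a potential defect to send back to the author.
    """
    return [q for q in checklist if not answers.get(q, False)]

# One question passes, one fails, one was never answered.
answers = {checklist[0]: True, checklist[1]: False}
for defect in inspect_with_checklist(answers):
    print("Potential defect:", defect)
```

Note how the checklist itself is the only guidance the reviewer gets; there is no procedure for *how* to establish each answer, which is the weakness the scenario-based methods later address.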
Checklist Example- 1
Does the development team understand the Use Cases that this design supports?
Can every member of the development team trace a Use Case Scenario through the designs using a small number of high-level components?
Are the bulk of the Class names understandable and recognizable by the domain experts?
Do the Class names reflect the overall responsibility of the Class with respect to the Use Cases and the design?
Does the name of every Message reflect the intended outcome of the Method? (There should be no actions that could not be inferred from the Message name.)
Does the purpose of the Method match the overall responsibility of the Class that it is in? (Should the method be moved to another class, or does it belong in a different place in the inheritance hierarchy?)
Have the Class diagrams been drawn to emphasize the Classes used by a set of Use Cases?
Have Packages been used to group related Classes, and are dependencies between packages noted?
Are all parameters specified for every message on the interaction diagrams, and is there a way for the sending object to know the parameters it is passing?
Checklist Example- 2
Is flow of control obvious and correct within the interaction diagrams? (Hyperspace leaps are not allowed.)
Do Object names on Interaction Diagrams conform to coding guidelines?
Do Parameter names in Messages conform to coding guidelines?
For cases when there is more than one message to the same object, could this set of messages be replaced with a single message?
Is the design supposed to conform to the Law of Demeter? (An object can only send Messages to itself, its Attributes, and passed parameters.)
Does each Method have a set of Unit Tests defined?
Are all of the Failure Conditions from each Use Case tested for?
Does the implementation of every Method on a Class use at least one attribute or method in that class? (If a Method does not refer to its object, then it is just a utility function attached to the class, so it should be moved elsewhere.)
Overall, is there an even balance of responsibilities between the classes? (No workaholics allowed.)
Do the bulk of the Classes have an even balance of methods and attributes? (Trivial accessor methods do not count.)
For Messages with similar groups of parameters, can a simple data holder class be defined to simplify the message?
Checklist Example- 3
Whenever Inheritance is used, is the phrase "Sub-class is a kind of Super-class" correct for the life of the object?
Does Interface Inheritance make more sense than Implementation Inheritance?
Even where inheritance is not applicable, are consistent names used for similar Methods?
Are all Attributes private?
Is the visibility of Methods appropriate? (If a Method is not invoked from outside the class, it should be private.)
Are there any unused Attributes or Methods on a Class? (Unused implies untested, which implies code defects.)
Is Run Time Type Information used to switch behavior? (This is only needed for interaction with non-object systems.)
Are the Classes partitioned so that as many Classes as possible are platform independent?
Is the persistence and transaction policy defined?
For containers, is ownership of the contained objects defined? (If the container is deleted, are the contents also deleted?)
Has the design been tested by the development of an Executable Architecture?
Detection Methods – 4
Ad Hoc and Checklist Methods:
Non-systematic approaches
Reviewers' responsibilities are both general and identical
Detection Methods – 5
Scenario
Defect-specific procedures
Used to detect particular classes of defects
Each reviewer executes a single scenario; therefore, multiple reviewers are needed to achieve broad coverage of the document
Each reviewer has a distinct responsibility
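Since each reviewer owns exactly one scenario, broad coverage comes from spreading distinct defect classes across the team. A sketch of that assignment; the scenario names and procedures below are hypothetical examples, not the actual scenarios from any study:

```python
# Hypothetical scenarios: each targets one class of requirements defect
# and gives the reviewer a concrete procedure to execute.
scenarios = {
    "omitted functionality": "Walk each input; confirm a response is specified.",
    "ambiguity": "Flag terms with more than one plausible interpretation.",
    "inconsistency": "Cross-check requirement pairs for conflicts.",
}

reviewers = ["Alice", "Bob", "Carol"]

# One scenario per reviewer: distinct responsibility, and full coverage
# of the defect classes requires all three reviewers.
assignments = dict(zip(reviewers, scenarios.items()))
for reviewer, (defect_class, procedure) in assignments.items():
    print(f"{reviewer} -> {defect_class}: {procedure}")
```

If a reviewer drops out, their defect class simply goes uninspected, which is exactly the coverage trade-off the slide describes.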
Results of an experiment using different detection methods:
The Scenario approach increased the defect detection rate compared to Ad Hoc and Checklist
Performance of reviewers: reviewers using Scenarios found more defects than those who used Ad Hoc or Checklist
The Ad Hoc and Checklist techniques were equivalent
Inspections of requirements specifications
Problems with ad hoc and checklist techniques – use scenario-based instead
Why? – the checklist is used as a starting point for a more elaborate technique
The scenario-based technique:
teaches inspectors how to read requirements for defect detection
offers a strategy for decomposition
Usage Based Reading (UBR)
Assumes that use cases (UC) and scenarios have been defined earlier in the development process
Utilizes UC to focus the inspection effort
Steps:
1. Prioritize the UC in order of importance
2. Select the UC with the highest priority
3. Track the UC scenarios through the document under inspection
4. Ensure the document fulfills the UC goal
5. Select the next use case and repeat from step 3
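The five UBR steps amount to a loop over prioritized use cases. A minimal sketch, assuming hypothetical data structures (the real technique tracks each scenario through the actual document; here that check is stubbed as a flag):

```python
# Hypothetical use cases for a taxi fleet system, each with a priority
# (lower number = more important) and a stubbed "goal fulfilled" check.
use_cases = [
    {"name": "Dispatch taxi", "priority": 1, "goal_met": True},
    {"name": "Generate invoice", "priority": 3, "goal_met": False},
    {"name": "Book trip", "priority": 2, "goal_met": True},
]

def usage_based_reading(ucs):
    defects = []
    # Steps 1-2: order use cases by importance, take the highest first.
    for uc in sorted(ucs, key=lambda u: u["priority"]):
        # Steps 3-4: track the UC scenarios through the document and
        # check that the document fulfills the UC goal (stubbed here).
        if not uc["goal_met"]:
            defects.append(f"Document does not fulfil goal of '{uc['name']}'")
        # Step 5: the loop continues with the next-highest-priority UC.
    return defects

print(usage_based_reading(use_cases))
```

The point of the ordering is that inspection effort runs out: defects blocking the most important use cases are found first.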
Usage Based Reading (UBR) vs. Checklist Based Reading (CBR)
Study: compare the effectiveness (number of defects found) and efficiency (time to find defects)
23 CIS graduate students, many with software engineering experience
11 used UBR, 12 used CBR
Taxi fleet management system
Defects were ranked as A (Crucial), B (Important), or C (Unimportant)
Usage Based Reading (UBR) vs. Checklist Based Reading (CBR)
Results:
UBR is significantly more efficient and effective than CBR
UBR finds more faults per time unit for crucial and important faults
UBR finds a larger share of faults
UBR reviewers spent an average of 6.5 minutes less in preparation and 4 minutes less in inspection
UBR reviewers found twice as many crucial faults per hour as CBR reviewers
UBR reviewers identified on average 21% more faults than CBR reviewers
CBR discovered 63% more unimportant faults
Which means… CBR wastes effort searching for unimportant issues
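The two study measures are simple rates. The sketch below only takes its *definitions* (effectiveness = faults found; efficiency = faults found per hour) from the study; the input numbers are made up purely for illustration and are not the study's data.

```python
def efficiency(faults_found, minutes_spent):
    """Faults found per hour of inspection time."""
    return faults_found / (minutes_spent / 60.0)

# Illustrative (invented) figures for one reviewer per technique:
# crucial faults found and total inspection minutes.
ubr = {"crucial_faults": 8, "minutes": 90}
cbr = {"crucial_faults": 5, "minutes": 110}

print(f"UBR: {efficiency(ubr['crucial_faults'], ubr['minutes']):.2f} crucial faults/hour")
print(f"CBR: {efficiency(cbr['crucial_faults'], cbr['minutes']):.2f} crucial faults/hour")
```

Ranking the faults by severity (A/B/C) before computing the rate is what lets the study claim UBR wins on the faults that matter, even though CBR found more unimportant ones.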
Defects in Use Case Models
To create an inspection technique for UC models, knowledge of typical defects and their consequences is needed
Defects in Use Case Models
We must consider different stakeholders to find a comprehensive list of defects
Stakeholder: Clients and End Users
Clients and end users want to be sure they get the expected functionality:
The correct actors should be identified and described
The correct UC should be identified and should describe how use case goals are reached
The actors should be associated with the correct UC
The flow of events in each UC should be realistic, easy to understand, and described at an appropriate level of detail
Functionality should be shown through the use of pre- and post-conditions
Defects in Use Case Models
Stakeholder: Project Manager
Managers need to plan projects. To support the planning:
The UC model should cover all functional requirements
All interactions between actors and the system that are relevant to the user should be described
Defects in Use Case Models
Stakeholder: Designers
Designers will apply UC models to produce an OO design, therefore:
Terminology should be consistent throughout the UC descriptions
UC descriptions should be written at a suitable level of detail
Defects in Use Case Models
Stakeholder: Testers
Testers apply UC models to test that the functionality is correctly implemented, therefore:
Pre-conditions and post-conditions for the execution of a UC should be testable, and
All terms in UC descriptions should be testable
Defects in Use Case Models
UC can be described in many different formats
The actual format may have an impact on defect detection
Inspection techniques must be tailored to the actual format used
Study 1
No inspection technique exists that is specific to UC models
Study: Anda & Sjoberg – create a checklist-based inspection technique for UC models
Students were organized into teams of clients and developers
Each team acted as clients for one system and as developers for the other system
Client teams created informal textual requirements specifications
Developer teams constructed UC models
Fall 2000: client teams evaluated UC models and found very few defects, despite the presence of many defects
Fall 2001: client and development teams evaluated UC models using the checklist method
Study 1 Results
With the checklist, almost all teams found defects and suggested corrections
Very few defects were missed
The clients found twice as many defects as the developers
Very few common defects were found between clients and developers… Why?
There is a large difference between what clients and developers consider a defect in a UC model
The difference in defects found suggests a technique based on different perspectives may be useful
It may also be useful to involve different stakeholders in the inspections
Study 2
Fall 2001: 45 students received textual requirements for a hospital roster management system
Defects were inserted into the UC by the authors
Half used the checklist, half used ad hoc
Inspections were performed individually
Study 2 Results
The checklist group found slightly more defects regarding the actors in the UC
The ad hoc group found more defects in the other categories
Overall, the difference in detected defects was negligible
The checklist method was found to be more time consuming
Many defects relating to the flow of events were not found by either group, indicating that these defects are difficult to detect
Conclusion: a checklist may not be useful when inspectors have good knowledge of the defects they are expected to find (the students had recently performed similar inspections), and…
Experienced inspectors may be more efficient without a checklist
Study 3
Porter & Votta, 1994
Hypothesized that systematic approaches such as scenario-based reading will increase the overall effectiveness of an inspection
Remember, scenarios target a specific class of defects
Results: the scenario detection method had the highest defect detection rate, followed by ad hoc and checklist (keep in mind, the checklist is the industry standard)
Conclusion: reviewers are able to focus on a specific class of defects, which facilitates a higher rate of defect detection
Study 4
Lanubile & Visaggio replicated the previous study
Hypothesis: scenario-based methods would result in more defect detection
Conclusions: the team defect detection rate when using the Scenario technique was not significantly different from those obtained with the Ad Hoc or Checklist techniques
Why?
Subjects were asked to learn too many new things
Defects in the introductory parts created confusion
Training was unfair
Subjects who had trouble with the Scenario approach used different techniques to execute the task
The time limit was too short
Conclusions…
The quality of the requirements specification is important for the quality of the resulting product.
Different methods can be used for software inspections; the best are the ones that are systematic and performed by experienced inspectors.
A poorly designed inspection method can lead to poor inspection performance.
UBR is significantly more efficient and effective than CBR.
To create an inspection technique for UC models, knowledge of typical defects and their consequences is needed.
References
Towards an Inspection Technique for Use Case Models, Bente Anda & Dag I. K. Sjoberg
http://portal.acm.org
Software Inspection of Requirements Specification Documents: A Pilot Study, Tereza G. Kirner & Janaina C. Abib
http://portal.acm.org
An Experiment to Assess Different Defect Detection Methods for Software Requirements, A. A. Porter & L. G. Votta
www.acm.org/pubs/citations/proceedings/sotf/257734/p103-porter/
Prioritized Use Cases as a Vehicle for Software Inspections, Thomas Thelin & Per Runeson
http://computer.org/publication/dlib
Optimizing Software Inspections, Tom Gilb
http://216.239.41.104/search?q=cache:P6xvIMfH4VEJ:www.gilb.com/Download/Crosstalk.pdf+how+to+do+software+inspections&hl=en&ie=UTF-8
Assessing Defect Detection Methods for Software Requirements Inspections Through External Replication, Filippo Lanubile & Giuseppe Visaggio
http://216.239.39.104/search?q=cache:D7H45enpweYJ:www2.umassd.edu/SWPI/ISERN/isern-96-1.pdf+assessing+defect+detection+methods+for+software+requirements+inspections+through+external+replication&hl=en&ie=UTF-8