Software Metrics
Transcript of Software Metrics
![Page 1: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/1.jpg)
Software Metrics
By Touseef Tahir
![Page 2: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/2.jpg)
Agenda
- Introduction
- Metrics in the Process Domain
- Metrics in the Project Domain
- Software Measurement
- Integrating Metrics within the Software Process
![Page 3: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/3.jpg)
Definitions

• Measure - a quantitative indication of the extent, amount, dimension, capacity, or size of some attribute of a product or process.
– E.g., number of errors
• Metric - a quantitative measure of the degree to which a system, component, or process possesses a given attribute; “a handle or guess about a given attribute.”
– E.g., number of errors found per person-hour expended
• Indicator - a combination of metrics that provides insight into the software process, project, or product itself.
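As a minimal illustration of the distinction, a measure is a raw count while a metric relates measures to one another (all numbers below are hypothetical):

```python
errors_found = 12     # measure: a raw count of errors
person_hours = 48.0   # measure: effort expended

# metric: errors found per person-hour expended
error_density = errors_found / person_hours
print(error_density)  # 0.25
```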
![Page 4: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/4.jpg)
What are Software Metrics?
Measurement-based techniques, applied to software processes, products, and services, to supply engineering and management information and to improve those processes, products, and services.
![Page 5: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/5.jpg)
Measurement Defined
• Entity - the object being measured
• Attribute - its features and properties
• Mapping - from attributes to numbers and symbols
![Page 6: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/6.jpg)
Metrics of Process Improvement
• Focus on a manageable, repeatable process
• Use of statistical SQA on the process
• Defect removal efficiency
![Page 7: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/7.jpg)
Statistical Software Process Improvement
1. All errors and defects are categorized by origin
2. The cost to correct each error and defect is recorded
3. The number of errors and defects in each category is counted and ranked in descending order
4. The overall cost in each category is computed
5. Resultant data are analyzed and the “culprit” category is uncovered
6. Plans are developed to eliminate the errors
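The steps above amount to a small Pareto-style analysis; the sketch below runs them over hypothetical defect records, each tagged with an origin category and a correction cost:

```python
from collections import defaultdict

# Hypothetical defect records: (origin category, cost to correct in dollars)
defects = [
    ("incomplete specification", 120), ("logic error", 60),
    ("incomplete specification", 150), ("data handling", 40),
    ("logic error", 80), ("incomplete specification", 90),
]

counts = defaultdict(int)   # step 3: number of defects per category
costs = defaultdict(int)    # steps 2 and 4: overall cost per category
for origin, cost in defects:
    counts[origin] += 1
    costs[origin] += cost

# Rank categories by count, descending, then uncover the costliest "culprit"
ranked = sorted(counts, key=counts.get, reverse=True)
culprit = max(costs, key=costs.get)
print(ranked[0], culprit)  # "incomplete specification" on both counts
```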
![Page 8: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/8.jpg)
Metrics of Project Management

• Budget
• Schedule/resource management
• Risk management
• Project goals met or exceeded
• Customer satisfaction
![Page 9: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/9.jpg)
Metrics of the Software Product

• Focus on deliverable quality
• Analysis products
• Design product complexity – algorithmic, architectural, data flow
• Code products
• Production system
![Page 10: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/10.jpg)
Two Historic Schools of Thought

1. Collect data on everything -- then find meaning
2. Implement a random selection of metrics -- a Jeopardy approach to metrics: start with the answer and try to guess the question
![Page 11: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/11.jpg)
Motivation for Metrics
• Estimate the cost & schedule of future projects
• Evaluate the productivity impacts of new tools and techniques
• Establish productivity trends over time
• Improve software quality
• Forecast future staffing needs
• Anticipate and reduce future maintenance needs
![Page 12: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/12.jpg)
12 Steps to Useful Software Metrics

Step 1 - Identify Metrics Customers
Step 2 - Target Goals
Step 3 - Ask Questions
Step 4 - Select Metrics
Step 5 - Standardize Definitions
Step 6 - Choose a Model
Step 7 - Establish Counting Criteria
Step 8 - Decide On Decision Criteria
Step 9 - Define Reporting Mechanisms
Step 10 - Determine Additional Qualifiers
Step 11 - Collect Data
Step 12 - Consider Human Factors
![Page 13: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/13.jpg)
Metric Classification
• Products
– Explicit results of software development activities
– Deliverables, documentation, by-products
• Processes
– Activities related to the production of software
• Resources
– Inputs into the software development activities
– Hardware, knowledge, people
![Page 14: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/14.jpg)
Product vs. Process
• Process Metrics
– Provide insight into the process paradigm, software engineering tasks, work products, and milestones
– Lead to long-term process improvement
• Product Metrics
– Assess the state of the project
– Track potential risks
– Uncover problem areas
– Adjust workflow or tasks
– Evaluate the team’s ability to control quality
![Page 15: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/15.jpg)
What are Metrics?

• Software process and project metrics are quantitative measures
• They are a management tool
• They offer insight into the effectiveness of the software process and the projects that are conducted using the process as a framework
• Basic quality and productivity data are collected
• These data are analyzed, compared against past averages, and assessed
• The goal is to determine whether quality and productivity improvements have occurred
• The data can also be used to pinpoint problem areas
• Remedies can then be developed and the software process can be improved
![Page 16: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/16.jpg)
A Quote on Measurement

“When you can measure what you are speaking about and express it in numbers, you know something about it; but when you cannot measure, when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the stage of science.”
LORD WILLIAM KELVIN (1824 – 1907)
![Page 17: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/17.jpg)
Role of Measurement
Measurement of processes, products, and resources serves four purposes: to characterize, to evaluate, to predict, and to improve.
![Page 18: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/18.jpg)
Reasons to Measure

• To characterize, in order to
– Gain an understanding of processes, products, resources, and environments
– Establish baselines for comparisons with future assessments
• To evaluate, in order to
– Determine status with respect to plans
• To predict, in order to
– Gain understanding of relationships among processes and products
– Build models of these relationships
• To improve, in order to
– Identify roadblocks, root causes, inefficiencies, and other opportunities for improving product quality and process performance
![Page 19: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/19.jpg)
Metrics in the Process Domain
![Page 20: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/20.jpg)
Metrics in the Process Domain

• Process metrics are collected across all projects and over long periods of time
• They are used for making strategic decisions
• The intent is to provide a set of process indicators that lead to long-term software process improvement
• The only way to know how/where to improve any process is to
– Measure specific attributes of the process
– Develop a set of meaningful metrics based on these attributes
– Use the metrics to provide indicators that will lead to a strategy for improvement
(More on next slide)
![Page 21: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/21.jpg)
Metrics in the Process Domain (continued)

• We measure the effectiveness of a process by deriving a set of metrics based on outcomes of the process, such as
– Errors uncovered before release of the software
– Defects delivered to and reported by the end users
– Work products delivered
– Human effort expended
– Calendar time expended
– Conformance to the schedule
– Time and effort to complete each generic activity
![Page 22: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/22.jpg)
Etiquette of Process Metrics

• Use common sense and organizational sensitivity when interpreting metrics data
• Provide regular feedback to the individuals and teams who collect measures and metrics
• Don’t use metrics to evaluate individuals
• Work with practitioners and teams to set clear goals and the metrics that will be used to achieve them
• Never use metrics to threaten individuals or teams
• Metrics data that indicate a problem should not be considered “negative”
– Such data are merely an indicator for process improvement
• Don’t obsess on a single metric to the exclusion of other important metrics
![Page 23: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/23.jpg)
Metrics in the Project Domain
![Page 24: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/24.jpg)
Metrics in the Project Domain

• Project metrics enable a software project manager to
– Assess the status of an ongoing project
– Track potential risks
– Uncover problem areas before their status becomes critical
– Adjust work flow or tasks
– Evaluate the project team’s ability to control the quality of software work products
• Many of the same metrics are used in both the process and project domains
• Project metrics are used for making tactical decisions
– They are used to adapt project workflow and technical activities
![Page 25: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/25.jpg)
Use of Project Metrics

• The first application of project metrics occurs during estimation
– Metrics from past projects are used as a basis for estimating time and effort
• As a project proceeds, the amount of time and effort expended are compared to original estimates
• As technical work commences, other project metrics become important
– Production rates are measured (represented in terms of models created, review hours, function points, and delivered source lines of code)
– Errors uncovered during each generic framework activity (i.e., communication, planning, modeling, construction, deployment) are measured
(More on next slide)
![Page 26: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/26.jpg)
Use of Project Metrics (continued)

• Project metrics are used to
– Minimize the development schedule by making the adjustments necessary to avoid delays and mitigate potential problems and risks
– Assess product quality on an ongoing basis and, when necessary, modify the technical approach to improve quality
• In summary
– As quality improves, defects are minimized
– As defects go down, the amount of rework required during the project is also reduced
– As rework goes down, the overall project cost is reduced
![Page 27: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/27.jpg)
Software Measurement
![Page 28: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/28.jpg)
Categories of Software Measurement

• Two categories of software measurement
– Direct measures of the
• Software process (cost, effort, etc.)
• Software product (lines of code produced, execution speed, defects reported over time, etc.)
– Indirect measures of the
• Software product (functionality, quality, complexity, efficiency, reliability, maintainability, etc.)
![Page 29: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/29.jpg)
Size-oriented Metrics

• Derived by normalizing quality and/or productivity measures by considering the size of the software produced
• Thousand lines of code (KLOC) is often chosen as the normalization value
• Metrics include
– Errors per KLOC
– Defects per KLOC
– Dollars per KLOC
– Pages of documentation per KLOC
– Errors per person-month
– KLOC per person-month
– Dollars per page of documentation
(More on next slide)
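These normalizations are simple ratios; the sketch below computes a few of them from hypothetical project data:

```python
# Hypothetical project data; all values are illustrative only.
loc = 12_100        # delivered lines of code
errors = 134        # errors found before release
defects = 29        # defects reported after release
effort_pm = 24      # effort in person-months
cost = 168_000      # total cost in dollars

kloc = loc / 1000
print(f"Errors per KLOC:       {errors / kloc:.1f}")
print(f"Defects per KLOC:      {defects / kloc:.1f}")
print(f"KLOC per person-month: {kloc / effort_pm:.2f}")
print(f"Dollars per KLOC:      {cost / kloc:.0f}")
```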
![Page 30: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/30.jpg)
Size-oriented Metrics (continued)

• Size-oriented metrics are not universally accepted as the best way to measure the software process
• Opponents argue that KLOC measurements
– Are dependent on the programming language
– Penalize well-designed but short programs
– Cannot easily accommodate nonprocedural languages
– Require a level of detail that may be difficult to achieve
![Page 31: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/31.jpg)
Function-oriented Metrics

• Function-oriented metrics use a measure of the functionality delivered by the application as a normalization value
• The most widely used metric of this type is the function point:

FP = count total × [0.65 + 0.01 × Σ(value adjustment factors)]
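The FP computation can be sketched directly; the count total below is hypothetical, with the 14 value adjustment factors each rated 0–5:

```python
def function_points(count_total, value_adjustment_factors):
    """FP = count total * [0.65 + 0.01 * sum(value adj. factors)]."""
    return count_total * (0.65 + 0.01 * sum(value_adjustment_factors))

# 14 adjustment factors all rated 3 ("average") give a sum of 42:
fp = function_points(320, [3] * 14)
print(round(fp, 1))  # 320 * (0.65 + 0.42) = 342.4
```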
![Page 32: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/32.jpg)
Function Point Controversy

• Like the KLOC measure, function point use also has proponents and opponents
• Proponents claim that
– FP is programming-language independent
– FP is based on data that are more likely to be known in the early stages of a project, making it more attractive as an estimation approach
• Opponents claim that
– FP requires some “sleight of hand” because the computation is based on subjective data
– FP has no direct physical meaning…it’s just a number
![Page 33: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/33.jpg)
Reconciling LOC and FP Metrics

• The relationship between LOC and FP depends upon
– The programming language that is used to implement the software
– The quality of the design
• FP and LOC have been found to be relatively accurate predictors of software development effort and cost
– However, a historical baseline of information must first be established
• LOC and FP can be used to estimate object-oriented software projects
– However, they do not provide enough granularity for the schedule and effort adjustments required in the iterations of an evolutionary or incremental process
• The table on the next slide provides a rough estimate of the average LOC per FP in various programming languages
![Page 34: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/34.jpg)
LOC Per Function Point

| Language | Average | Median | Low | High |
| --- | --- | --- | --- | --- |
| Ada | 154 | -- | 104 | 205 |
| Assembler | 337 | 315 | 91 | 694 |
| C | 162 | 109 | 33 | 704 |
| C++ | 66 | 53 | 29 | 178 |
| COBOL | 77 | 77 | 14 | 400 |
| Java | 55 | 53 | 9 | 214 |
| PL/1 | 78 | 67 | 22 | 263 |
| Visual Basic | 47 | 42 | 16 | 158 |

www.qsm.com/?q=resources/function-point-languages-table/index.html
![Page 35: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/35.jpg)
Metrics and Software Quality
FURPS
• Functionality - features of the system
• Usability - training time, skill level necessary to use, increase in productivity, subjective questionnaire or controlled experiment
• Reliability - frequency of failure, security
• Performance - speed, throughput
• Supportability - maintainability
![Page 36: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/36.jpg)
Measures of Software Quality

• Correctness - the degree to which a program operates according to specification
– Defects/KLOC
– A defect is a verified lack of conformance to requirements
– Failures/hours of operation
• Maintainability - the degree to which a program is open to change
– Mean time to change
– Change request to new version (analyze, design, etc.)
– Cost to correct
• Integrity - the degree to which a program is resistant to outside attack
– Fault tolerance, security & threats
![Page 37: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/37.jpg)
McCall’s Triangle of Quality

• PRODUCT REVISION: maintainability, flexibility, testability
• PRODUCT TRANSITION: portability, reusability, interoperability
• PRODUCT OPERATION: correctness, reliability, efficiency, integrity, usability
![Page 38: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/38.jpg)
A Comment
McCall’s quality factors were proposed in the early 1970s. They are as valid today as they were at that time. It’s likely that software built to conform to these factors will exhibit high quality well into the 21st century, even if there are dramatic changes in technology.
![Page 39: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/39.jpg)
Quality Model
• Product operation: reliability, efficiency, usability
• Product revision: maintainability, testability
• Product transition: portability, reusability
• Each quality factor is assessed via metrics
![Page 40: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/40.jpg)
High-level Design Metrics

Card & Glass ’90: structural complexity, data complexity, system complexity
• Structural complexity S(i) of a module i
– S(i) = fout(i)²
– Fan-out is a count of the number of modules immediately subordinate to module i (i.e., the modules that module i calls)
![Page 41: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/41.jpg)
Design Metrics
• Data complexity D(i)
– D(i) = v(i) / [fout(i) + 1]
– v(i) is the number of inputs and outputs passed to and from module i
• System complexity C(i)
– C(i) = S(i) + D(i)
– As each increases, the overall complexity of the architecture increases
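A sketch of these three measures under the definitions above; the module's fan-out and v(i) values are hypothetical:

```python
def structural_complexity(fan_out):
    """S(i) = fout(i)^2, where fout(i) is the module's fan-out."""
    return fan_out ** 2

def data_complexity(v, fan_out):
    """D(i) = v(i) / [fout(i) + 1], with v(i) the I/O values of module i."""
    return v / (fan_out + 1)

def system_complexity(v, fan_out):
    """C(i) = S(i) + D(i)."""
    return structural_complexity(fan_out) + data_complexity(v, fan_out)

# A module with fan-out 3 that passes 8 values in and out:
print(system_complexity(v=8, fan_out=3))  # 9 + 2.0 = 11.0
```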
![Page 42: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/42.jpg)
System Complexity Metric
• Another metric: length(i) × [fin(i) + fout(i)]²
– length(i) is the LOC of module i
– Fan-in is the number of modules that invoke module i
• Cyclomatic complexity
![Page 43: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/43.jpg)
Metrics for Software Quality

• Correctness
– This is the number of defects per KLOC, where a defect is a verified lack of conformance to requirements
– Defects are those problems reported by a program user after the program is released for general use
• Maintainability
– This describes the ease with which a program can be corrected if an error is found, adapted if the environment changes, or enhanced if the customer has changed requirements
– Mean time to change (MTTC): the time to analyze, design, implement, test, and distribute a change to all users
– Maintainable programs on average have a lower MTTC
![Page 44: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/44.jpg)
Defect Removal Efficiency

• Defect removal efficiency provides benefits at both the project and process level
• It is a measure of the filtering ability of QA activities as they are applied throughout all process framework activities
– It indicates the percentage of software errors found before software release
• It is defined as DRE = E / (E + D)
– E is the number of errors found before delivery of the software to the end user
– D is the number of defects found after delivery
• As D increases, DRE decreases (i.e., becomes a smaller and smaller fraction)
• The ideal value of DRE is 1, which means no defects are found after delivery
• DRE encourages a software team to institute techniques for finding as many errors as possible before delivery
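DRE is a one-line computation; the error and defect counts below are hypothetical:

```python
def dre(errors_before_delivery, defects_after_delivery):
    """DRE = E / (E + D); 1.0 means no defects escaped to the field."""
    e, d = errors_before_delivery, defects_after_delivery
    return e / (e + d)

print(dre(95, 5))  # 0.95
print(dre(95, 0))  # 1.0, the ideal value
```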
![Page 45: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/45.jpg)
Object-oriented Metrics

• Average number of support classes per key class
– Key classes are identified early in a project (e.g., at requirements analysis)
– An estimate of the number of support classes can be made from the number of key classes
– GUI applications have between two and three times as many support classes as key classes
– Non-GUI applications have between one and two times as many support classes as key classes
• Number of subsystems
– A subsystem is an aggregation of classes that support a function that is visible to the end user of a system
![Page 46: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/46.jpg)
Coupling
• Data and control flow
– di - input data parameters
– ci - input control parameters
– do - output data parameters
– co - output control parameters
• Global
– gd - global variables for data
– gc - global variables for control
• Environmental
– w - fan-in
– r - fan-out
![Page 47: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/47.jpg)
Metrics for Coupling
• Mc = k/m, with k = 1
– m = di + a·ci + do + b·co + gd + c·gc + w + r
– a, b, c, and k can be adjusted based on actual data
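A sketch of this coupling metric, assuming the weights a, b, and c default to 1 along with k = 1 as the slide suggests; the counts describe a hypothetical module:

```python
def coupling_metric(di, ci, do, co, gd, gc, w, r, a=1, b=1, c=1, k=1):
    """Mc = k / m, with m = di + a*ci + do + b*co + gd + c*gc + w + r."""
    m = di + a * ci + do + b * co + gd + c * gc + w + r
    return k / m

# A module with a small interface, no globals, and modest fan-in/fan-out:
mc = coupling_metric(di=2, ci=1, do=1, co=0, gd=0, gc=0, w=2, r=2)
print(mc)  # 1 / 8 = 0.125
```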
![Page 48: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/48.jpg)
Component Level Metrics
• Cohesion (internal interaction) - a function of data objects
• Coupling (external interaction) - a function of input and output parameters, global variables, and modules called
• Complexity of program flow - hundreds have been proposed (e.g., cyclomatic complexity)
• Cohesion – difficult to measure– Bieman ’94, TSE 20(8)
![Page 49: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/49.jpg)
Using Metrics
• The Process
– Select appropriate metrics for the problem
– Utilize the metrics on the problem
– Assessment and feedback
• Steps: formulate, collect, analyze, interpret, provide feedback
![Page 50: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/50.jpg)
Metrics for the Object Oriented
• Chidamber & Kemerer ’94, TSE 20(6)
• Metrics specifically designed to address object-oriented software
• Class-oriented metrics
• Direct measures
![Page 51: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/51.jpg)
Weighted Methods per Class
WMC = Σ ci (summed over the class’s n methods, i = 1 … n)

• ci is the complexity (e.g., volume, cyclomatic complexity, etc.) of each method
• Viewpoints (of Chidamber and Kemerer):
– The number of methods and the complexity of methods is an indicator of how much time and effort is required to develop and maintain the object
– The larger the number of methods in an object, the greater the potential impact on the children
– Objects with a large number of methods are likely to be more application specific, limiting the possibility of reuse
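WMC is just a sum over per-method complexities; here the methods of a hypothetical class are weighted by cyclomatic complexity:

```python
# Hypothetical class: method name -> cyclomatic complexity of that method
method_complexities = {"open": 2, "read": 4, "write": 5, "close": 1}

wmc = sum(method_complexities.values())  # WMC = sum of ci over all methods
print(wmc)  # 12
```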
![Page 52: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/52.jpg)
Depth of Inheritance Tree
• DIT is the maximum length from a node to the root (base class)
• Viewpoints:
– Lower-level subclasses inherit a number of methods, making behavior harder to predict
– Deeper trees indicate greater design complexity
![Page 53: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/53.jpg)
Number of Children

• NOC is the number of subclasses immediately subordinate to a class
• Viewpoints:
– As NOC grows, reuse increases - but the abstraction may be diluted
– Depth is generally better than breadth in a class hierarchy, since it promotes reuse of methods through inheritance
– Classes higher up in the hierarchy should have more subclasses than those lower down
– NOC gives an idea of the potential influence a class has on the design: classes with a large number of children may require more testing
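DIT and NOC can both be read off an inheritance tree; the hierarchy below is hypothetical, expressed as child -> parent with None marking the root:

```python
parents = {"A": None, "B": "A", "C": "A", "D": "B", "E": "B"}

def dit(cls):
    """Depth of Inheritance Tree: number of edges from cls up to the root."""
    depth = 0
    while parents[cls] is not None:
        cls, depth = parents[cls], depth + 1
    return depth

def noc(cls):
    """Number of Children: count of immediate subclasses of cls."""
    return sum(1 for parent in parents.values() if parent == cls)

print(dit("D"), noc("B"))  # 2 2
```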
![Page 54: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/54.jpg)
Coupling between Classes

• CBO is the number of collaborations between two classes (fan-out of a class C)
– the number of other classes that are referenced in the class C (a reference to another class, A, is a reference to a method or a data member of class A)
• Viewpoints:
– As collaboration increases, reuse decreases
– High fan-out represents class coupling to other classes/objects and is thus undesirable
– High fan-in represents good object design and a high level of reuse
– It is not possible to maintain high fan-in and low fan-out across the entire system
![Page 55: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/55.jpg)
Response for a Class
• RFC is the number of methods that could be called in response to a message to a class (local + remote)
• Viewpoints: as RFC increases
– testing effort increases
– the complexity of the object is greater
– it is harder to understand
![Page 56: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/56.jpg)
Lack of Cohesion in Methods
• LCOM – poorly described in Pressman
• Class Ck with n methods M1,…Mn
• Ij is the set of instance variables used by Mj
![Page 57: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/57.jpg)
LCOM
• There are n such sets I1, …, In
– P = {(Ii, Ij) | Ii ∩ Ij = ∅}
– Q = {(Ii, Ij) | Ii ∩ Ij ≠ ∅}
• If all n sets Ii are ∅, then P = ∅
• LCOM = |P| − |Q|, if |P| > |Q|
• LCOM = 0 otherwise
![Page 58: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/58.jpg)
Example LCOM
• Take class C with methods M1, M2, M3
• I1 = {a, b, c, d, e}
• I2 = {a, b, e}
• I3 = {x, y, z}
• P = {(I1, I3), (I2, I3)}; Q = {(I1, I2)}
• Thus LCOM = |P| − |Q| = 2 − 1 = 1
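The example can be checked mechanically; this sketch forms all method pairs and counts disjoint versus overlapping instance-variable sets:

```python
from itertools import combinations

def lcom(instance_var_sets):
    """LCOM = |P| - |Q| if |P| > |Q|, else 0: P holds method pairs sharing
    no instance variables, Q holds pairs sharing at least one."""
    p = q = 0
    for a, b in combinations(instance_var_sets, 2):
        if a & b:
            q += 1   # non-empty intersection
        else:
            p += 1   # empty intersection
    return max(p - q, 0)

# I1 and I2 overlap on {a, b, e}; I3 is disjoint from both:
print(lcom([{"a", "b", "c", "d", "e"}, {"a", "b", "e"}, {"x", "y", "z"}]))  # 1
```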
![Page 59: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/59.jpg)
Explanation
• LCOM is the number of empty intersections minus the number of non-empty intersections
• This is a notion of the degree of similarity of methods
• If two methods use common instance variables, then they are similar
• An LCOM of zero does not imply maximal cohesion, since LCOM is also zero whenever |P| = |Q| or |P| < |Q|
![Page 60: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/60.jpg)
Some other cohesion metrics
![Page 61: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/61.jpg)
Class Size
• CS
– Total number of operations (inherited, private, public)
– Number of attributes (inherited, private, public)
• May be an indication of too much responsibility for a class
![Page 62: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/62.jpg)
Number of Operations Overridden
• NOO
• A large number for NOO indicates possible problems with the design
• Poor abstraction in inheritance hierarchy
![Page 63: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/63.jpg)
Number of Operations Added
• NOA
• The number of operations added by a subclass
• As operations are added, the subclass moves farther away from its superclass
• As depth increases, NOA should decrease
![Page 64: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/64.jpg)
Method Inheritance Factor
MIF = Σ Mi(Ci) / Σ Ma(Ci) (both sums taken over all classes Ci)

• Mi(Ci) is the number of methods inherited and not overridden in Ci
• Ma(Ci) is the number of methods that can be invoked with Ci
• Md(Ci) is the number of methods declared in Ci
![Page 65: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/65.jpg)
MIF
• Ma(Ci) = Md(Ci) + Mi(Ci)
• All that can be invoked = newly declared or overriding methods + inherited methods
• MIF lies in [0, 1]
• MIF near 1 means little specialization
• MIF near 0 means a large degree of specialization (most methods are declared or overridden)
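A sketch of MIF over a small hypothetical hierarchy, using per-class counts of declared versus inherited-and-not-overridden methods:

```python
def mif(classes):
    """MIF = sum of Mi(Ci) / sum of Ma(Ci), with Ma(Ci) = Md(Ci) + Mi(Ci)."""
    inherited = sum(c["inherited"] for c in classes)                  # sum Mi(Ci)
    available = sum(c["declared"] + c["inherited"] for c in classes)  # sum Ma(Ci)
    return inherited / available

hierarchy = [
    {"declared": 10, "inherited": 0},  # base class declares 10 methods
    {"declared": 2,  "inherited": 8},  # subclass overrides 2, inherits 8
]
print(mif(hierarchy))  # 8 / 20 = 0.4
```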
![Page 66: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/66.jpg)
Coupling Factor
CF = Σi Σj is_client(Ci, Cj) / (TC² − TC)

• is_client(x, y) = 1 iff a relationship exists between the client class x and the server class y, and 0 otherwise
• TC is the total number of classes, so (TC² − TC) is the total number of relationships possible
• CF is [0, 1], with 1 meaning high coupling
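CF can be sketched by counting actual client/server references against all possible ones; the classes and references below are hypothetical:

```python
def coupling_factor(classes, references):
    """CF = number of pairs with is_client(Ci, Cj) = 1, over (TC^2 - TC)."""
    tc = len(classes)
    clients = sum(1 for i in classes for j in classes
                  if i != j and (i, j) in references)
    return clients / (tc * tc - tc)

# Three classes; A uses B and C, and B uses C:
refs = {("A", "B"), ("A", "C"), ("B", "C")}
print(coupling_factor(["A", "B", "C"], refs))  # 3 / 6 = 0.5
```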
![Page 67: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/67.jpg)
Polymorphism Factor
PF = Σi Mo(Ci) / Σi [Mn(Ci) × DC(Ci)]

• Mn() is the number of new methods
• Mo() is the number of overriding methods
• DC() is the number of descendant classes of a base class
• PF is the number of methods that redefine inherited methods, divided by the maximum number of possible distinct polymorphic situations
![Page 68: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/68.jpg)
Operation-Oriented Metrics

• Average operation size (LOC, volume)
• Number of messages sent by an operation
• Operation complexity - e.g., cyclomatic complexity
• Average number of parameters per operation
– The larger the number, the more complex the collaboration
![Page 69: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/69.jpg)
Encapsulation
• Lack of cohesion
• Percent public and protected
• Public access to data members
![Page 70: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/70.jpg)
Inheritance
• Number of root classes
• Fan in – multiple inheritance
• NOC, DIT, etc.
![Page 71: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/71.jpg)
Metric tools
• McCabe & Associates (founded by Tom McCabe, Sr.)
– The Visual Quality ToolSet
– The Visual Testing ToolSet
– The Visual Reengineering ToolSet
• Metrics calculated
– McCabe cyclomatic complexity
– McCabe essential complexity
– Module design complexity
– Integration complexity
– Lines of code
– Halstead metrics
![Page 72: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/72.jpg)
CCCC

• A metric analyser for C, C++, Java, Ada-83, and Ada-95 (by Tim Littlefair of Edith Cowan University, Australia)
• Metrics calculated
– Lines of code (LOC)
– McCabe’s cyclomatic complexity
– C&K suite (WMC, NOC, DIT, CBO)
• Generates HTML and XML reports
• freely available
• http://cccc.sourceforge.net/
![Page 73: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/73.jpg)
Jmetric
• OO metric calculation tool for Java code (by Cain and Vasa for a project at COTAR, Australia)
• Requires Java 1.2 (or JDK 1.1.6 with special extensions)
• Metrics
– Lines Of Code per class (LOC)
– Cyclomatic complexity
– LCOM (by Henderson-Sellers)
• Availability: distributed under GPL
• http://www.it.swin.edu.au/projects/jmetric/products/jmetric/
![Page 74: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/74.jpg)
JMetric tool result
![Page 75: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/75.jpg)
GEN++ (University of California, Davis and Bell Laboratories)
• GEN++ is an application-generator for creating code analyzers for C++ programs
– simplifies the task of creating analysis tools for C++
– several tools have been created with GEN++, and come with the package
– these can both be used directly, and as a springboard for other applications
• Freely available
• http://www.cs.ucdavis.edu/~devanbu/genp/down-red.html
![Page 76: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/76.jpg)
Integrating Metrics within the Software Process
![Page 77: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/77.jpg)
77
Arguments for Software Metrics
• Most software developers do not measure, and most have little desire to begin
• Establishing a successful company-wide software metrics program can be a multi-year effort
• But if we do not measure, there is no real way of determining whether we are improving
• Measurement is used to establish a process baseline from which improvements can be assessed
• Software metrics help people to develop better project estimates, produce higher-quality systems, and get products out the door on time
![Page 78: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/78.jpg)
78
Establishing a Metrics Baseline
• By establishing a metrics baseline, benefits can be obtained at the software process, product, and project levels
• The same metrics can serve many masters
• The baseline consists of data collected from past projects
• Baseline data must have the following attributes
– Data must be reasonably accurate (guesses should be avoided)
– Data should be collected for as many projects as possible
– Measures must be consistent (e.g., a line of code must be interpreted consistently across all projects)
– Past applications should be similar to the work that is to be estimated
• After data is collected and metrics are computed, the metrics should be evaluated and applied during estimation, technical work, project control, and process improvement
![Page 79: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/79.jpg)
79
Software Metrics Baseline Process

[Figure: the Software Engineering Process, Software Project, and Software Product feed Data Collection, which produces Measures; Metrics Computation turns Measures into Metrics; Metrics Evaluation turns Metrics into Indicators]
![Page 80: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/80.jpg)
80
Getting Started with Metrics
1) Understand your existing process
2) Define the goals to be achieved by establishing a metrics program
3) Identify metrics to achieve those goals
– Keep the metrics simple
– Be sure the metrics add value to your process and product
4) Identify the measures to be collected to support those metrics
(More on next slide)
![Page 81: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/81.jpg)
81
Getting Started with Metrics(continued)
5) Establish a measurement collection process
a) What is the source of the data?
b) Can tools be used to collect the data?
c) Who is responsible for collecting the data?
d) When are the data collected and recorded?
e) How are the data stored?
f) What validation mechanisms are used to ensure the data are correct?
6) Acquire appropriate tools to assist in collection and assessment
7) Establish a metrics database
8) Define appropriate feedback mechanisms on what the metrics indicate about your process so that the process and the metrics program can be improved
![Page 82: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/82.jpg)
IEEE Software Metrics Methodology
1. Establish software quality requirements
2. Identify software quality metrics
3. Implement the software quality metrics
4. Analyze the software metrics results
5. Validate the software quality metrics
![Page 83: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/83.jpg)
Establish Software Quality Requirements
• What group is empowered to define software quality requirements?
• How should customers provide input?
• How are requirements conflicts resolved?
![Page 84: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/84.jpg)
Identify Software Quality Metrics
• Specify important quality factors and subfactors
• Identify direct metrics
– Name
– Costs
– Target value
– Tools
– Application
– Data items
– Computation
![Page 85: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/85.jpg)
Example of Documenting a Metric
Name: Number of defects detected in selected modules
Costs: Minimal; data can be obtained from bug-tracking tool
Target Value: 5
Tools: Spreadsheet
Application: Metric is used for relative comparison to values obtained for other modules
Data Items: Count of defects detected at code inspections
Computation: Sum the number of defects reported against specific modules
![Page 86: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/86.jpg)
Implement the Collection of Data
Name: Name given to a data item
Metrics: Metrics associated with the data item
Definition: Straightforward description of the data item
Source: Location where the data originates
Procedures: Procedures (manual or automated) for collecting the data
Representation: Manner in which data is represented, e.g., precision, format, units
Storage: Location where the data is stored
![Page 87: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/87.jpg)
Analyze Software Quality Metric Results
• Results need to be analyzed within the context of the project’s overall software quality requirements
• Any metrics that fall outside of their respective targets should be identified for further analysis
![Page 88: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/88.jpg)
Validate the Software Quality Metrics
• Assess the statistical significance of the metrics to the quality factors they represent
• See the IEEE Standard 1061-1998 for a thorough description of this process
![Page 89: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/89.jpg)
Metrics that Support Software Verification Activities
• Complexity Metrics
– The McCabe Cyclomatic Complexity Metric
– Halstead’s Software Science Complexity Metric
• Defect Metrics
• Product Metrics
• Process Metrics
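The McCabe metric listed above can be computed directly from a module's control-flow graph; a minimal sketch (the example graph is hypothetical):

```python
# McCabe cyclomatic complexity from a control-flow graph -- a sketch.
# V(G) = E - N + 2P, where E = edges, N = nodes, P = connected components.

def cyclomatic_complexity(edges, nodes, components=1):
    return edges - nodes + 2 * components

# Hypothetical module: an if/else nested inside a loop yields a graph
# with 8 edges and 7 nodes.
print(cyclomatic_complexity(edges=8, nodes=7))  # 3
```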
![Page 90: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/90.jpg)
Complexity Metrics Can Be Used to Identify
• Candidate modules for code inspections
• Areas where redesign may be appropriate
• Areas where additional documentation is required
• Areas where additional testing may be required
![Page 91: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/91.jpg)
Product Metrics
• Number and type of defects found during requirements, design, code, and test inspections
• Number of pages of documentation delivered
• Number of new source lines of code created
• Number of source lines of code delivered
• Total number of source lines of code delivered
• Average complexity of all modules delivered
![Page 92: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/92.jpg)
Product Metrics (cont’d)
• Average size of modules
• Total number of modules
• Total number of bugs found as a result of unit testing
• Total number of bugs found as a result of integration testing
• Total number of bugs found as a result of validation testing
• Productivity, as measured by KLOC per person-hour
![Page 93: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/93.jpg)
Process Metrics
• Average find-fix cycle time
• Number of person-hours per inspection
• Number of person-hours per KLOC
• Average number of defects found per inspection
• Number of defects found during inspections in each defect category
• Average amount of rework time
• Percentage of modules that were inspected
![Page 94: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/94.jpg)
Attributes of a Measurement Program – according to Humphrey
• The measures should be robust
• The measures should suggest a norm
• The measures should relate to specific product and process properties
• The measures should suggest an improvement strategy
![Page 95: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/95.jpg)
Attributes of a Measurement Program – according to Humphrey (cont’d)
• The measures should be a natural result of the software development process
• The measures should be simple
• The measures should be predictable and trackable
• The measures should not be used as part of a person’s performance evaluation
![Page 96: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/96.jpg)
Template for Software Quality Goal Definition
• Purpose: To (characterize, evaluate, predict, monitor, etc.) the (process, product, model, metric, etc.) in order to (understand, plan, assess, manage, control, engineer, learn, improve, etc.) it.
– Example: To evaluate the maintenance process in order to improve it.
![Page 97: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/97.jpg)
Template for Software Quality Goal Definition (cont’d)
• Perspective: Examine the (cost, effectiveness, correctness, defects, changes, product measures, etc.) from the viewpoint of the (developer, manager, customer, etc.)
– Example: Examine the effectiveness from the viewpoint of the customer.
![Page 98: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/98.jpg)
Template for Software Quality Goal Definition (cont’d)
• Environment: The environment consists of the following: process factors, people factors, methods, tools, constraints, etc.
– Example: The maintenance staff are poorly motivated programmers who have limited access to tools.
![Page 99: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/99.jpg)
Determining Metrics
Goal: Evaluate
Questions:
– How fast are fixes to customer-reported problems made?
– What is the quality of fixes delivered?
Metrics:
– Average effort to fix a problem
– Percentage of incorrect fixes
![Page 100: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/100.jpg)
What We Are Talking About Today
• What Is Software Cost Estimation
• How It’s Done (Models, Methods, Tools)
• Issues and Problems
04/21/23 100
![Page 101: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/101.jpg)
Introduction To Software Cost Estimation
• A Few Definitions (Vidger/Kark 1994)
– Software Cost
• Manpower Loading
• Effort
• Duration
– Software Cost Estimation Process
• Set of techniques and procedures that an organization uses to arrive at a software cost estimate
04/21/23 101
![Page 102: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/102.jpg)
Classical View - Inputs and Outputs
Classical View Of Software Estimation Process [Vidger/Kark]
04/21/23 102
![Page 103: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/103.jpg)
Why and How Cost Estimates Are Used
• Understanding why and how cost estimates are used within an organization will likely determine how cost estimations will be done
• Common Examples (Vidger/Kark 1994)
– Planning and Budgeting
– Project Management
– Communication Among Team Members
04/21/23 103
![Page 104: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/104.jpg)
Cost Estimation Process
• Determine Objectives
– Who needs what data for what purpose(s)
• Gather Data
– Focus Should Be Given To ‘Hard’ Data
• Well-Defined Requirements
• Available Resources
• Analyze Data Using A Variety Of Methods
04/21/23 104
![Page 105: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/105.jpg)
Cost Estimation Process (Cont’d)
• Re-estimate Costs Throughout The Project
– Effective Monitoring
– Refine and Make Changes As Necessary
• Compare End Costs With Estimated Costs
– That is, if the project actually completes!
04/21/23 105
![Page 106: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/106.jpg)
Issues and Problems
• Inaccurate Estimation Models, Methods, and Tools
• Ineffective Management of Requirements
• Duration, Size, and Number of People Involved
• Ineffective or Non-existent Monitoring of Project
• New Technology
• Mixed Technology
• Inexperienced Project Managers and Estimators
• Lack of Application Domain Expertise
• Software Processes and Process Maturity
• Lack of Historical Data
• Business / Legal / Monetary Issues
04/21/23 106
![Page 107: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/107.jpg)
What is Needed To Make Reliable Cost Estimates
• A Combination Of Models, Methods, and Tools
• Gathering/Improving of Historical Data
• Well-defined and Well-controlled Software Development Processes
• Better Managing of Requirements
• Experienced Project Managers, Estimators, and Team Members
04/21/23 107
![Page 108: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/108.jpg)
Expert Judgment Method
Expert judgment techniques involve consulting with software cost estimation experts to use their experience and understanding of the proposed project to arrive at an estimate of its cost.
Technique: Delphi technique, a group consensus technique.
• Advantages: Empirical
• Disadvantages: Subjective
04/21/23 108
![Page 109: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/109.jpg)
Estimating by Analogy
Estimating by analogy means comparing the proposed project to previously completed similar projects where the project development information is known.

• Advantages: Based on actual experience of projects
• Disadvantages: Difficult to ensure the degree of representativeness between previous projects and the new one
04/21/23 109
![Page 110: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/110.jpg)
Top-Down Estimating Method
The top-down estimating method is also called the Macro Model. Using it, a cost estimate is derived from the global properties of the software project, and then the project is partitioned into various low-level components.

• Advantages: Efficient; system-level view
• Disadvantages: Too rough
04/21/23 110
![Page 111: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/111.jpg)
Bottom-up Estimating Method
In the bottom-up estimating method, the cost of each software component is estimated and the results are then combined to arrive at an estimated cost for the overall project.

Advantages: Detailed and stable
Disadvantages: May overlook many of the system-level costs; less accurate; more time-consuming
04/21/23 111
![Page 112: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/112.jpg)
Algorithmic Method
The algorithmic method is designed to provide some mathematical equations to perform software estimation.
Models: COCOMO & COCOMO II, Putnam, ESTIMACS and SPQR/20.
04/21/23 112
![Page 113: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/113.jpg)
Algorithmic Method (Cont’d)
Advantages: 1. Objective, repeatable
2. Has modifiable and analyzable formulas
3. Efficient and able to support sensitivity analysis
4. Objectively calibrated to previous experience.
04/21/23 113
![Page 114: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/114.jpg)
Algorithmic Method (Cont’d)
Disadvantages:
1. Unable to deal with exceptional conditions
2. Poor sizing inputs and inaccurate cost driver rating will result in inaccurate estimation
3. Some experience and factors can not be easily quantified
04/21/23 114
![Page 115: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/115.jpg)
The Selection and Use of Estimation Methods
No one method is necessarily better or worse than the others; in fact, their strengths and weaknesses are often complementary.

1. Do not depend on a single cost or schedule estimate; use several techniques or cost models and compare the results
2. Document the assumptions made when making the estimates
3. Monitor the project to detect when assumptions that turn out to be wrong jeopardize the accuracy of the estimate
4. Improve the software process
5. Maintain a historical database
04/21/23 115
![Page 116: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/116.jpg)
Commercial Tools of Cost Estimation (Cont’d)
Some Promising Tools:
• ACEIT
• COCOMO II *
• Construx Estimate *
• COSMOS *
• COSTAR *
04/21/23 116
• Cost Xpert
• ESTIMATE Pro *
• PRICE-S
• SEER-SEM
• SLIM-Estimate
Most parametric models are likely to have the COCOMO II equations at the core...
![Page 117: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/117.jpg)
The COCOMO Model
Model to estimate the development cost and schedule of a software project.
Introduced by Barry Boehm of USC-CSE in 1981.
The name comes from the first two letters of each word in Constructive Cost Model.
Primarily based on software development practices prior to the 1980s, i.e., based on the Waterfall model.
04/21/23 117
![Page 118: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/118.jpg)
The COCOMO Model
• Effort equation is the basis of the COCOMO II model.
• The nominal effort for a project of a given size is given by:

PM(nominal) = A × (Size)^B

where PM(nominal) is the nominal effort in person-months, A is the multiplicative effect of the cost drivers, and B is the constant representing the effect of the scale factors.
04/21/23 118
![Page 119: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/119.jpg)
COCOMO Model
• MM = a × KLOC^b
• Time for development: TDEV = c × MM^d

Mode            a     b     c     d
Organic         2.4   1.05  2.5   0.38
Semi-Detached   3.0   1.12  2.5   0.35
Embedded        3.6   1.20  2.5   0.32

04/21/23 119
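The coefficient table above plugs directly into the Basic COCOMO equations; a sketch for a hypothetical 32-KLOC organic-mode project:

```python
# Basic COCOMO (1981) effort and schedule -- a sketch using the table above.
# MM = a * KLOC^b (person-months), TDEV = c * MM^d (months).

COEFFS = {
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, mode="organic"):
    a, b, c, d = COEFFS[mode]
    mm = a * kloc ** b          # effort in person-months
    tdev = c * mm ** d          # development time in months
    return mm, tdev

mm, tdev = basic_cocomo(32, "organic")
print(f"{mm:.1f} person-months over {tdev:.1f} months")
```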
![Page 120: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/120.jpg)
The COCOMO II Model
• Has three series of models:
– The Application Composition Model
– The Early Design Model
– The Post-Architecture Model
04/21/23 120
![Page 121: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/121.jpg)
The COCOMO II Model
Cost Drivers are used in the model to adjust the nominal effort in the software project
Cost Drivers are multiplicative factors required to determine the effort required to complete the software project. Ratings range from VL, L, N, H, VH, EH
Model has 17 cost drivers divided into 4 categories:
– Product
– Computer
– Personnel
– Project
04/21/23 121
![Page 122: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/122.jpg)
The COCOMO II Model
• Cost drivers in the Product category:
– Required Software Reliability
– Database Size
– Software Product Complexity
– Required Reusability (new)
– Documentation match to life-cycle needs
04/21/23 122
![Page 123: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/123.jpg)
The COCOMO II Model
• Cost drivers in the Computer category:
– Execution Time Constraint
– Main Storage Constraint
– Platform Volatility
04/21/23 123
![Page 124: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/124.jpg)
The COCOMO II Model
• Cost drivers in the Personnel category:
– Analyst Capability
– Programmer Capability
– Applications Experience
– Platform Experience
– Language and Tool Experience
– Personnel Continuity
04/21/23 124
![Page 125: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/125.jpg)
The COCOMO II Model
• Cost drivers in the Project category:
– Use of Software Tools
– Multisite Development
– Required Development Schedule
04/21/23 125
![Page 126: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/126.jpg)
The COCOMO II Model
Scale Drivers are important factors determining the cost and duration of the software development effort.
The five scale drivers in the COCOMO II model are:
– Precedentedness
– Development Flexibility
– Architecture / Risk Resolution
– Team Cohesion
– Process Maturity
04/21/23 126
![Page 127: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/127.jpg)
Scaling Factors
• Cause an exponential cost increase
04/21/23 127
Cost Driver                           Very Low  Low  Nom  High  Very High  Extra High
Precedentedness (PREC)                5         4    3    2     1          0
Development flexibility (FLEX)        5         4    3    2     1          0
Architecture/risk resolution (RESL)   5         4    3    2     1          0
Team cohesion (TEAM)                  5         4    3    2     1          0
Process Maturity (PMAT)               5         4    3    2     1          0
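In COCOMO II.2000, these scale-factor points feed the effort exponent via E = B + 0.01 × ΣSF with B = 0.91. A sketch using the simplified 5..0 ratings shown above (real calibrations assign non-integer points per factor):

```python
# COCOMO II scale-factor exponent -- a sketch.
# E = B + 0.01 * sum(SF), with B = 0.91 in the COCOMO II.2000 calibration.

def scale_exponent(sf_ratings, b=0.91):
    return b + 0.01 * sum(sf_ratings)

# All five scale drivers rated Nominal (3 points each on this slide's scale):
e = scale_exponent([3, 3, 3, 3, 3])
print(round(e, 2))  # 1.06 -- diseconomy of scale, since E > 1
```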
![Page 128: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/128.jpg)
Scaling Factors
04/21/23 128
![Page 129: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/129.jpg)
Effort Multipliers (Post-Architecture)
• Product factors
04/21/23 129
Cost Driver             Very Low  Low   Nom   High  Very High  Extra High
Reliability (RELY)      0.75      0.88  1.00  1.15  1.39       -
Database size (DATA)    -         0.93  1.00  1.09  1.19       -
Complexity (CPLX)       0.75      0.88  1.00  1.15  1.30       1.66
Reusability (RUSE)      -         0.91  1.00  1.14  1.29       1.49
Documentation (DOCU)    0.89      0.95  1.00  1.06  1.13       -
![Page 130: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/130.jpg)
Effort Multipliers (Post-Architecture)
• Platform factors
04/21/23 130
Cost Driver                         Very Low  Low   Nom   High  Very High  Extra High
Execution time constraints (TIME)   -         -     1.00  1.11  1.31       1.67
Main storage constraints (STOR)     -         -     1.00  1.06  1.21       1.57
Platform volatility (PVOL)          -         0.87  1.00  1.15  1.30       -
![Page 131: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/131.jpg)
Effort Multipliers (Post-Architecture)
• Personnel factors
04/21/23 131
Cost Driver                       Very Low  Low   Nom   High  Very High
Analyst capability (ACAP)         1.50      1.22  1.00  0.83  0.67
Programmer capability (PCAP)      1.37      1.16  1.00  0.87  0.74
Application experience (APEX)     1.22      1.10  1.00  0.89  0.81
Platform experience (PLEX)        1.24      1.10  1.00  0.92  0.84
Language/tool experience (LTEX)   1.25      1.12  1.00  0.88  0.81
Personnel continuity (PCON)       1.24      1.10  1.00  0.92  0.84
![Page 132: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/132.jpg)
Effort Multipliers (Post-Architecture)
• Project factors
04/21/23 132
Cost Driver                            Very Low  Low   Nom   High  Very High  Extra High
Use of software tools (TOOL)           1.24      1.12  1.00  0.86  0.72       -
Multi-site development (SITE)          1.25      1.10  1.00  0.92  0.84       0.78
Required development schedule (SCED)   1.29      1.10  1.00  1.00  1.00       -
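The multipliers selected from these tables are multiplied together into an overall effort adjustment that scales the nominal effort. A sketch with hypothetical ratings (any driver left at Nominal contributes 1.00 and can be omitted):

```python
# Effort Adjustment Factor -- the product of the chosen effort multipliers
# from the Post-Architecture tables above. A sketch with hypothetical ratings.
from math import prod

def eaf(multipliers):
    return prod(multipliers)

# Hypothetical project: High reliability (1.15), High complexity (1.15),
# High analyst capability (0.83); all other drivers Nominal (1.00).
adjustment = eaf([1.15, 1.15, 0.83])
print(round(adjustment, 3))  # 1.098
```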
![Page 133: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/133.jpg)
Effort Multipliers (Post-Architecture)
04/21/23 133
![Page 134: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/134.jpg)
Effort Multipliers (Post-Architecture)
04/21/23 134
![Page 135: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/135.jpg)
Application Composition Model
• Intended mainly for prototyping activities
• Uses Object Point (OP) counts (not FPs): a count of the number and complexity of large-granularity entities such as screens, reports, and components
• Factors in code reuse and productivity:
– New object points: NOP = OP × (100 − % reuse) / 100
– Productivity: PROD = NOP / person-months
– Effort: PM = NOP / PROD
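A sketch of these formulas chained together, with hypothetical inputs (PROD would normally come from historical productivity data):

```python
# Application Composition Model estimate -- a sketch of the slide's formulas.

def app_composition_effort(op, reuse_pct, prod):
    """op: object-point count; reuse_pct: % reused; prod: NOP per person-month."""
    nop = op * (100 - reuse_pct) / 100  # new object points after reuse credit
    return nop / prod                    # effort in person-months

# Hypothetical project: 120 object points, 25% reuse, productivity 13 NOP/PM.
print(round(app_composition_effort(120, 25, 13), 1))  # 6.9
```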
04/21/23 135
![Page 136: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/136.jpg)
Functional Size Measurement
136
![Page 137: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/137.jpg)
Functional Size measurement
137
![Page 138: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/138.jpg)
Applicability of FSM
• FUR can be extracted from software engineering artifacts BEFORE the software exists (using UML, for instance).
• Inputs:
– Requirements definition artifacts
– Data analysis / modeling artifacts
– Artifacts from functional decomposition of requirements
• FUR can also be extracted from software engineering artifacts AFTER the software has been constructed:
– Physical programs and screens
– Software operations manuals and procedures
– Physical data storage artifacts
138
![Page 139: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/139.jpg)
ISO 14143-1 Terminology for FSM (I)
• Functional User Requirements (FUR): A subset of the user requirements. The FURs represent the user practices and procedures that the software must perform to fulfill the users’ needs.
• Functional Size: A size of the software derived by quantifying the FUR.

139
![Page 140: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/140.jpg)
ISO 14143-1 Terminology for FSM (II)
• Base Functional Component (BFC): An elementary unit of FUR defined by and used by an FSM Method for measurement purposes.
• BFC Type: A defined category of BFCs. A BFC is classified as one and only one BFC Type.

140
![Page 141: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/141.jpg)
Functional Size Measurement
• Function points (FP)
– International Function Point Users Group (IFPUG) FP
– Common Software Measurement International Consortium (COSMIC) FP
141
![Page 142: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/142.jpg)
Functional Size Measurement
• Function points (FP)
– International Function Point Users Group (IFPUG) FP
– Common Software Measurement International Consortium (COSMIC) FP
142
![Page 143: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/143.jpg)
ISO Approved FSM Methods
• IFPUG Function Point Analysis (ISO/IEC 20926)
• Mark II Function Point Analysis (ISO/IEC 20968)
• NESMA Functional Size Measurement Method (ISO/IEC 24570)
• COSMIC Function Points (ISO/IEC 19761)

143
![Page 144: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/144.jpg)
IFPUG Function Point Analysis Method
144
![Page 145: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/145.jpg)
IFPUG FPA Measurement Procedure
• Identify elementary processes from the Functional User Requirements.
• Identify the BFCs and their types.
• Rate the complexity of each BFC Type.
• Assign Function Points to each BFC Type according to the complexity rates.
• Calculate the functional size by summing the FPs assigned to each distinct BFC Type.
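The "assign and sum" steps can be sketched as follows, using the standard IFPUG complexity weights per BFC type (the counted application below is hypothetical):

```python
# Unadjusted IFPUG function-point count -- a sketch.
# Weights are the standard IFPUG complexity weights (low/avg/high) per BFC type.

WEIGHTS = {
    "EI":  {"low": 3, "avg": 4,  "high": 6},   # External Inputs
    "EO":  {"low": 4, "avg": 5,  "high": 7},   # External Outputs
    "EQ":  {"low": 3, "avg": 4,  "high": 6},   # External Inquiries
    "ILF": {"low": 7, "avg": 10, "high": 15},  # Internal Logical Files
    "EIF": {"low": 5, "avg": 7,  "high": 10},  # External Interface Files
}

def unadjusted_fp(counts):
    """counts: list of (bfc_type, complexity, how_many) tuples."""
    return sum(WEIGHTS[t][c] * n for t, c, n in counts)

# Hypothetical application: 4 low-complexity EIs, 2 average EOs, 1 average ILF.
print(unadjusted_fp([("EI", "low", 4), ("EO", "avg", 2), ("ILF", "avg", 1)]))  # 32
```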
145
![Page 146: Software Metrics](https://reader035.fdocuments.in/reader035/viewer/2022062423/56814cf7550346895dba0273/html5/thumbnails/146.jpg)
IFPUG FPA - BFC
• Elementary Process: the smallest unit of activity that is meaningful to the user(s).
– E.g., FUR: “The user will be able to add a new employee to the application.” is an Elementary Process.
• The elementary process must be self-contained and leave the business of the application being counted in a consistent state.
• E.g., If all the employee information is not added, an employee has not yet been created. Adding only some of the information leaves the business of adding an employee in an inconsistent state.
146