MTAT.03.094
Software Engineering
Lecture 13:
Measurement and Process
Improvement
Dietmar Pfahl
email: [email protected]
Fall 2013
Schedule of Lectures
Week 01: Introduction to SE
Week 02: Requirements Engineering I
Week 03: Requirements Engineering II
Week 04: Analysis
Week 05: Development Infrastructure I
Week 06: Development Infrastructure II
Week 07: Architecture and Design
Week 08: Refactoring
Week 09: Quality Management
Week 10: Verification and Validation I
Week 11: Verification and Validation II
Week 12: Agile/Lean Methods
Week 13: Measurement and Process Improvement
Week 14: Course wrap-up, review and exam preparation
Week 15: no lecture
Week 16: no lecture
Structure of Lecture 13
• Measurement
• Measurement Basics
• Example Measures
• Measurement Method
• Process Improvement
• Plan-Do-Check-Act
• Process Assessment
• CMMI
• TMMI
Definitions: Measurement and Measure
• Measurement: the process through which values (e.g., numbers) are assigned to attributes of entities of the real world.
• Measure: the result of the measurement process, i.e., the assignment of a value to an entity with the goal of characterizing a specified attribute.
• Source: Sandro Morasca, “Software Measurement”, in: Handbook of Software Engineering and Knowledge Engineering, Volume 1: Fundamentals (refereed book), pp. 239-276, Knowledge Systems Institute, Skokie, IL, USA, 2001, ISBN 981-02-4973-X.
Software Measurement Challenge
• Measuring physical properties (attributes):

entity | attribute   | unit* | scale (type) | value | range*
Human  | Height      | cm    | ratio        | 178   | (1, 300)
Human  | Temperature | °C    | interval     | 37    | (30, 45)

• Measuring non-physical properties (attributes):

entity  | attribute       | unit* | scale (type) | value | range*
Human   | Intelligence/IQ | index | ordinal      | 135   | [0, 200]
Program | Modifiability   | ?     | ?            | ?     | ?

Software properties are usually non-physical:
• size, complexity, functionality, reliability, maturity, portability, flexibility, understandability, maintainability, correctness, testability, coupling, coherence, interoperability, …

* ‘unit’ and ‘range’ are sometimes used synonymously with ‘scale’
Measurement Scale Types
The four classic scale types:
• Nominal: values are labels/categories; only equality can be compared (e.g., the programming language of a module)
• Ordinal: values are ordered, but distances between them are not meaningful (e.g., defect severity: low < medium < high)
• Interval: distances between values are meaningful, but there is no natural zero point (e.g., temperature in °C)
• Ratio: an interval scale with a natural zero point, so ratios of values are meaningful (e.g., LOC, duration)
Measurement Scale Types – cont’d
The classification of scales has an important impact on their practical use, in particular on the statistical techniques and indices that can be used.
Example: indicators of central tendency of a distribution of values (“Location”):
• Mode = the most frequent value of the distribution
• Median = the value such that not more than 50% of the values of the distribution are less than the median and not more than 50% of the values of the distribution are greater than the median
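As a minimal illustration of how the scale type constrains the choice of index (a Python sketch; the data values are invented): the mode is defined for nominal data, while the median additionally requires an ordering of the values.

```python
from statistics import median, mode

# Nominal data (labels): only the mode is meaningful.
languages = ["Java", "C", "Java", "Python", "Java"]
print(mode(languages))  # 'Java'

# Ordinal data (ordered labels): mode and median are meaningful;
# encode the ordering explicitly to compute the median.
severity_rank = {"low": 0, "medium": 1, "high": 2}
severities = ["low", "high", "medium", "medium", "high"]
print(median(severity_rank[s] for s in severities))  # 1, i.e., 'medium'

# The arithmetic mean would additionally require an interval or ratio scale.
```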
More Terminology
Direct versus Indirect Measures:
• Direct Measure: the measurement does not rely on other measurements.
• Indirect Measure: the measure is calculated (derived) from other measures.
Quantitative versus Qualitative Measures:
• Quantitative Measure: measurement on a ratio or interval scale.
• Qualitative Measure: measurement on an ordinal or nominal scale.
Reliability:
• A measurement procedure is 100% reliable if the measurement of the same attribute of an entity always gives the same result, no matter when and by whom the measurement is conducted.
Objective versus Subjective Measure:
• Objective Measure: requires a measurement procedure (ideally tool-supported for automatic measurement) that guarantees close to 100% reliability.
• Subjective Measure: measurement based on human judgment.
Measurable entities in a SW Process Model
An entity can represent any of the following:
• Process/Activity: any activity (or set of
activities) related to software
development and/or maintenance (e.g.,
requirements analysis, design, testing) –
these can be defined at different levels
of granularity
• Product/Artifact: any artifact produced or
changed during software development
and/or maintenance (e.g., source code,
software design documents)
• Resources: people, time, money,
hardware or software needed to perform
the processes
[Figure: an Activity transforms an input Product into an output Product, using Resources (tool, role)]
Examples of Software Product Attributes
• Size
  • Length, Churn, Complexity, Functionality
• Modularity
  • Cohesion
  • Coupling
• Quality
• Value (Price)
• ...
• Quality (-> ISO 9126)
  • Functionality
  • Reliability
  • Usability
  • Efficiency
  • Maintainability
  • Portability
Common OO Code Measures

Measure                                | Desirable Value
Coupling                               | Lower
Cohesion                               | Higher
Cyclomatic Complexity                  | Lower
Method Hiding Factor                   | Higher
Attribute Hiding Factor                | Higher
Depth of Inheritance Tree              | Low (tradeoff)
Number of Children                     | Low (tradeoff)
Weighted Methods Per Class             | Low (tradeoff)
Number of Classes                      | Higher (with identical functionality)
Lines of Code (net and total; comment) | Lower (with identical functionality)
Churn (new + changed LoC)              | Lower (with identical functionality)
Modularisation – Coupling
Measure: Coupling (many different measures have been proposed, e.g., Coupling Factor (CF), Coupling Between Objects (CBO))
Desirable value: Lower
Description: Coupling is a measure of the interdependence of two objects. For example, objects A and B are coupled if a method of object A calls a method or accesses a variable in object B. Classes are coupled when methods declared in one class use methods or attributes of other classes.
Empirical evidence supports the benefits of low coupling between objects. The stronger the coupling between software artifacts,
(i) the more difficult it is to understand individual artifacts, and hence to correctly maintain or enhance them;
(ii) the larger the sensitivity to (unexpected) change and defect propagation effects across artifacts; and
(iii) consequently, the more testing is required to achieve satisfactory reliability levels.
Additionally, excessive coupling between objects is detrimental to modular design and prevents reuse. To summarize, low coupling is desirable.
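As a minimal sketch of how a CBO-style count could be computed, assume we have already extracted, for each class, the set of other classes whose methods or attributes it uses (the class names and dependency data below are invented):

```python
# Hypothetical dependency data: class -> classes it uses (calls/field accesses).
uses = {
    "Order":    {"Customer", "Invoice"},
    "Invoice":  {"Order"},
    "Customer": set(),
}

def cbo(cls: str) -> int:
    """CBO-style count: number of other classes this class is coupled to,
    counting both directions (it uses them, or they use it)."""
    outgoing = uses.get(cls, set())
    incoming = {c for c, deps in uses.items() if cls in deps}
    return len((outgoing | incoming) - {cls})

for c in uses:
    print(c, cbo(c))  # Order 2, Invoice 1, Customer 1
```

Note that CBO counts each coupled class once, no matter how many individual call sites connect the two classes.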
Modularisation – Cohesion
Measure: Cohesion (many different measures have been proposed, e.g., Lack of Cohesion in Methods (LCOM))
Desirable value: Higher
Description: Cohesion refers to how closely the methods in a class are related to each other. The cohesion of a class is the degree to which its local methods are related to the local instance variables of the class.
A class’s Lack of Cohesion in Methods (LCOM) metric counts the sets of methods in a class that are not related through the sharing of some of the class’s fields. The original definition of this metric considers all pairs of a class’s methods. In some of these pairs both methods access at least one common field of the class, while in other pairs the two methods do not share any common field accesses. The lack of cohesion in methods is then calculated by subtracting the number of method pairs that do share a field access from the number of method pairs that don’t.
High cohesion indicates good class subdivision. Lack of cohesion or low cohesion increases complexity, thereby increasing the likelihood of errors during the development process. Classes with low cohesion could probably be subdivided into two or more subclasses with increased cohesion. This metric evaluates the design implementation as well as reusability.
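A small sketch of the pairwise LCOM computation described above, assuming we already know which fields each method accesses (the class and data are invented; following the common convention, a negative result is reported as 0):

```python
from itertools import combinations

# Hypothetical class: method -> set of fields it accesses.
field_access = {
    "deposit":   {"balance"},
    "withdraw":  {"balance"},
    "set_owner": {"owner"},
    "get_owner": {"owner"},
}

def lcom(accesses: dict) -> int:
    p = q = 0  # p: pairs sharing no field, q: pairs sharing at least one
    for m1, m2 in combinations(accesses, 2):
        if accesses[m1] & accesses[m2]:
            q += 1
        else:
            p += 1
    return max(p - q, 0)

print(lcom(field_access))  # 6 pairs in total: p=4, q=2 -> LCOM = 2
```

The LCOM of 2 hints that this class bundles two unrelated responsibilities (balance handling and owner handling) and could be split.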
Complexity – McCabe
Measure: Cyclomatic Complexity (CC)
Desirable value: Lower
Description: Defines the number of independent (simple) paths in a Control Flow Graph (CFG).
Draw the CFG, then calculate CC in either of two equivalent ways:
CC = #(edges) - #(nodes) + 2
CC = #(decisions) + 1
[Figure: example CFG with 5 decisions]
Example: CC = 5 + 1 = 6
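A rough sketch of the “decisions + 1” rule applied directly to source code: count branching constructs in a Python function (this keyword-counting approximation is only illustrative and, e.g., treats each boolean operator chain as a single decision):

```python
import ast

def cyclomatic_complexity(source: str) -> int:
    """Approximate CC = number of decision points + 1."""
    tree = ast.parse(source)
    decisions = sum(
        isinstance(node, (ast.If, ast.For, ast.While,
                          ast.ExceptHandler, ast.BoolOp))
        for node in ast.walk(tree)
    )
    return decisions + 1

code = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    for d in range(x):
        if d % 2 == 0:
            pass
    return "positive"
"""
print(cyclomatic_complexity(code))  # 4 decisions (if, elif, for, if) + 1 = 5
```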
Encapsulation – Visibility of Members of a Class
Example: four Java classes used to illustrate access levels. Package One contains Alpha and Beta; Package Two contains AlphaSub (a subclass of Alpha) and Gamma.
The following table shows where the members of the Alpha class are visible for each of the access modifiers that can be applied to them:

Access Modifier | Alpha | Beta | AlphaSub | Gamma
public          | Y     | Y    | Y        | Y
protected       | Y     | Y    | Y        | N
no modifier     | Y     | Y    | N        | N
private         | Y     | N    | N        | N
Encapsulation – Methods
Measure: Method Hiding Factor (MHF)
Desirable value: Higher
Description: MHF is defined as the ratio of the sum of the invisibilities of all methods defined in all classes to the total number of methods defined in the system under consideration. The invisibility of a method is the percentage of the total classes from which this method is not visible. In other words, MHF is the ratio of hidden methods (protected or private methods) to total methods.
Methods should be encapsulated (hidden) within a class and not available for use by other objects. Method hiding increases reusability in other applications and decreases complexity. If there is a need to change the functionality of a particular method that is not hidden, corrective actions will have to be taken in all the objects accessing that method. Thus hiding methods also reduces modifications to the code. The Method Hiding Factor should have a large value.
Encapsulation – Attributes
Measure: Attribute Hiding Factor (AHF)
Desirable value: Higher
Description: AHF is defined as the ratio of the sum of the invisibilities of all attributes defined in all classes to the total number of attributes defined in the system under consideration. In other words, AHF is the ratio of hidden attributes (protected or private) to total attributes.
It is desirable for the Attribute Hiding Factor to have a large value; ideally this metric should always be 100%. As a rule, systems should try to hide nearly all instance data. Design guidelines suggest that public attributes should not be used, because they are generally considered to violate the rules of OO encapsulation by exposing the object’s implementation. For the benefit of performance, the use of accessor and modifier methods (getters and setters) is sometimes avoided.
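A minimal sketch of the simplified “hidden over total” reading of MHF and AHF given above (the per-class member visibilities are invented; the full MOOD definitions instead weight each member by the percentage of classes from which it is invisible):

```python
# Hypothetical system: visibility of each class's methods and attributes.
classes = {
    "Account": {"methods":    {"deposit": "public", "audit": "private"},
                "attributes": {"balance": "private", "id": "public"}},
    "Report":  {"methods":    {"render": "public", "format_row": "protected"},
                "attributes": {"rows": "private"}},
}

HIDDEN = {"private", "protected"}

def hiding_factor(kind: str) -> float:
    """Share of hidden members of the given kind across all classes."""
    members = [v for c in classes.values() for v in c[kind].values()]
    return sum(v in HIDDEN for v in members) / len(members)

print(f"MHF = {hiding_factor('methods'):.2f}")     # 2 hidden / 4 total = 0.50
print(f"AHF = {hiding_factor('attributes'):.2f}")  # 2 hidden / 3 total = 0.67
```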
Inheritance
Measure: Depth of Inheritance Tree (DIT)
Desirable value: Low (tradeoff)
Description: The Depth of Inheritance Tree (DIT) metric provides, for each class, a measure of the number of inheritance levels from the top of the object hierarchy. In Java, where all classes inherit from Object, the minimum value of DIT is 1.
Large inheritance depths indicate complex objects that may be difficult to test and reuse. Small inheritance depths can indicate functional code that does not take advantage of the inheritance mechanism.

Measure: Number of Children (NOC)
Desirable value: Low (tradeoff)
Description: This metric is the number of direct descendants (subclasses) of each class.
Classes with a large number of children are considered to be difficult to modify and usually require more testing because of the effects of changes on all the children. They are also considered more complex and fault-prone, because a class with numerous children may have to provide services in a larger number of contexts and therefore must be more flexible.
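In a reflective language both metrics can be read directly off the class hierarchy. A small Python sketch over an invented hierarchy (in Python, object plays the role that Object plays in Java, so a direct subclass of object has DIT 1):

```python
class Vehicle: pass            # DIT 1 (inherits object)
class Car(Vehicle): pass       # DIT 2
class Truck(Vehicle): pass     # DIT 2
class SportsCar(Car): pass     # DIT 3

def dit(cls: type) -> int:
    """Depth of Inheritance Tree: inheritance levels below `object`.
    (Assumes single inheritance; with multiple inheritance the MRO
    length can exceed the depth of the inheritance tree.)"""
    return len(cls.__mro__) - 1  # MRO includes cls itself and object

def noc(cls: type) -> int:
    """Number of Children: count of direct subclasses."""
    return len(cls.__subclasses__())

for c in (Vehicle, Car, Truck, SportsCar):
    print(c.__name__, "DIT:", dit(c), "NOC:", noc(c))
# Vehicle DIT: 1 NOC: 2
# Car DIT: 2 NOC: 1
# Truck DIT: 2 NOC: 0
# SportsCar DIT: 3 NOC: 0
```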
Size / Complexity
Measure: Weighted Methods per Class (WMC)
Desirable value: Low (tradeoff)
Description: WMC measures the complexity of an individual class by counting the number of methods in the class. Often, the WMC calculation takes complexity into account: instead of a plain count, each method is weighted by a complexity factor (e.g., its McCabe complexity).
A class with more member functions than its peers is considered to be more complex and therefore more fault-prone. The larger the number of methods in a class, the greater the potential impact on children, since children will inherit all the methods defined in the class. Classes with large numbers of methods are likely to be more application-specific, limiting the possibility of reuse. This reasoning indicates that a smaller number of methods is good for usability and reusability.
However, more recent studies support having more, smaller methods over fewer, larger methods, for reduced complexity, increased readability, and improved understanding. On the other hand, for a class in a large inheritance tree, having a large number of methods may not be advisable.
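A tiny sketch of the weighted variant, assuming the per-method complexity weights (e.g., the McCabe CC of each method) are already known; with all weights set to 1 this reduces to the plain method count:

```python
# Hypothetical per-method McCabe complexities for one class.
method_complexity = {"parse": 6, "validate": 3, "emit": 2, "reset": 1}

wmc = sum(method_complexity.values())  # weighted: 6 + 3 + 2 + 1 = 12
method_count = len(method_complexity)  # unweighted (all weights 1): 4
print(wmc, method_count)
```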
Common OO Code Measures (recap)

Measure                                | Desirable Value
Coupling                               | Lower
Cohesion                               | Higher
Cyclomatic Complexity                  | Lower
Attribute Hiding Factor                | Higher
Method Hiding Factor                   | Higher
Depth of Inheritance Tree              | Low (tradeoff)
Number of Children                     | Low (tradeoff)
Weighted Methods Per Class             | Low (tradeoff)
Number of Classes                      | Higher (with identical functionality)
Lines of Code (net and total; comment) | Lower (with identical functionality)
Churn (new + changed LoC)              | Lower (with identical functionality)
Simple Quality Measures (Examples)
Correctness:
• Entity: Document (e.g., Code)
• Attribute: Quality (Correctness)
• Unit: Defect (found during QA activity)
• Range: [0, ∞)
• Scale type: ratio
• Characterisation: Direct, Quantitative, Objective/Subjective ???
Defect Density:
• Entity: Document (e.g., Code)
• Attribute: Quality (Defect Density)
• Unit: Defect/LOC
• Range: [0, ∞)
• Scale type: ratio
• Characterisation: Indirect, Quantitative, Objective/Subjective ???
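A one-line illustration of why defect density is an indirect measure: it is derived from two direct measures, a defect count and a size measure (the numbers are invented):

```python
defects_found = 42      # direct measure: defects found during QA
lines_of_code = 12_000  # direct measure: size in LOC

defect_density = defects_found / lines_of_code      # defects per LOC
print(f"{defect_density * 1000:.1f} defects/KLOC")  # 3.5 defects/KLOC
```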
Example – Performance Requirements
How to test:
• Define a ’standard work load’
• Expose the system to the standard work load for a defined period of time
• Measure CPU usage
Q: Should we do this for different kinds of CPUs?
Example – Usability Requirements
How to test:
• Define several (typical) usage scenarios involving tasks Q and R
• Select test users and classify them as ’novice’ or ’experienced’
• Let 5 (or better 10 or 15) novices perform the scenarios
• Observe what problems they encounter
• Classify and count the observed problems
Examples of Software Process and Resource
Attributes that can be measured
• Process-related:
• Efficiency:
• How fast (time, duration), how much effort (effort, cost), how much
quantity/quality per time or effort unit (velocity, productivity)?
• Effectiveness:
• Do we get the results (quantity/quality) we want? – e.g., test coverage
• Capability: CMMI level
• Resource-related:
• People: Skill, knowledge, experience, learning, motivation, personality
• Organisation: Maturity
• Method/Technique/Tool: Effectiveness, efficiency, learnability, cost
‘Time’ versus ‘Effort’
Time:
• Entity: Some Activity (e.g., Test)
• Attribute: Time (or Duration)
• Unit: Year, Month, Week,
(Work) Day, Hour, Minute,
Second, ...
• Range: [0, ∞)
• Scale type: ratio
• Characterisation:
• Direct
• Quantitative
• Objective/Subjective ???
Effort:
• Entity: Some Activity (e.g., Test)
• Attribute: Effort
• Unit: Person-Year, …, Person-
Day, Person-Hour, …
• Range: [0, ∞)
• Scale type: ratio
• Characterisation:
• Direct
• Quantitative
• Objective/Subjective ???
‘Time’ versus ‘Effort’ (cont’d)
What does it mean when I say:
• ”This task (e.g., testing) takes 4 days”? This states a duration: 4 calendar days elapse, regardless of how many people work on the task.
• ”This task (e.g., testing) needs 4 person-days”? This states an effort: for example, one person working for 4 days, or two people working for 2 days each.
Agile Measurement: Burn-Down & Burn-Up
Both can be used to calculate (average) team velocity = story points (or: stories) per team per sprint.
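A minimal sketch of the velocity calculation, assuming we know the story points completed in each finished sprint (the numbers are invented):

```python
# Story points completed in each of the last five sprints (made-up data).
completed_points = [21, 18, 25, 20, 22]

avg_velocity = sum(completed_points) / len(completed_points)
print(f"average velocity: {avg_velocity:.1f} story points per sprint")  # 21.2
```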
Structure of Lecture 13
• Measurement
• Measurement Basics
• Example Measures
• Measurement Method
• Process Improvement
• Plan-Do-Check-Act
• Process Assessment
• CMMI
• TMMI
Plan-Do-Check-Act (PDCA): A systematic approach to Software Process Improvement (SPI)
• PLAN what you want to accomplish over a period of time and what you might do, or need to do, to get there
• DO what you planned to do
• CHECK the results of what you did to see if the objective was achieved
• ACT on the information: standardize, or plan for further improvement
Plan-Do-Check-Act – “Plan”
Origins of the Capability Maturity Model (CMM)
• The DoD** decided in the 1980s to do something about the many problems in its expensive software projects (often involving suppliers).
  – Ada didn’t solve the problems (as many had thought/hoped)
  – Appraisals showed that there was a management problem
• 1986: Watts Humphrey left IBM, joined the SEI (Software Engineering Institute, Carnegie Mellon University) and began developing the CMM
• 1989: “Managing the Software Process” published by W. Humphrey
• 1993: CMM Version 1.1 published – still used
• CMMI first published in 1999; Version 1.3 published in 2010
CMM(I) = Capability Maturity Model (Integrated); DoD** = Department of Defense
CMM(I) History
Source: CMMI® for Development, Version 1.3 (CMMI-DEV, V1.3), CMU/SEI-2010-TR-033, ESC-TR-2010-033, November 2010 http://www.sei.cmu.edu/cmmi/
Software Process Assessment with CMMI
Staged representation: defines 5 maturity levels (MLs). In order to achieve a maturity level, all process areas associated with this level, plus all process areas associated with the levels below, must have a certain minimal capability.
Continuous representation: a maturity profile is established based on the capabilities of the individual process areas.
[Figure: the staged representation drawn as a staircase from ML 1 to ML 5; the continuous representation drawn as a bar chart of Process Area Capability (0 to 3) per process area (PA)]
CMMI Levels and Process Areas (staged)

Level                    | Process Areas
5 Optimizing             | Causal Analysis and Resolution (CAR), Organizational Performance Management (OPM)
4 Quantitatively Managed | Quantitative Project Management (QPM), Organizational Process Performance (OPP)
3 Defined                | Requirements Development (RD), Technical Solution (TS), Product Integration (PI), Verification (VER), Validation (VAL), Organizational Process Focus (OPF), Organizational Process Definition (OPD), Organizational Training (OT), Risk Management (RSKM), Integrated Project Management (IPM), Decision Analysis and Resolution (DAR)
2 Managed (Repeatable)   | Requirements Management (REQM), Project Planning (PP), Project Monitoring and Control (PMC), Supplier Agreement Management (SAM), Measurement and Analysis (MA), Process and Product Quality Assurance (PPQA), Configuration Management (CM)
1 Performed              | (no process areas assigned)

The 22 Process Areas (PAs) are grouped into four categories: Engineering, Process Management, Project Management, and Support.
Capability vs. Maturity Levels

Level   | Continuous Representation: Capability Levels | Staged Representation: Maturity Levels
Level 0 | Incomplete                                   | --
Level 1 | Performed                                    | Initial
Level 2 | Managed                                      | Managed
Level 3 | Defined                                      | Defined
Level 4 | --                                           | Quantitatively Managed
Level 5 | --                                           | Optimizing
Target Profiles and Equivalent Staging
The following rules define equivalent staging:
• To achieve Maturity Level 2, all Process Areas assigned to Maturity Level 2 must achieve Capability Level 2 or 3.
• To achieve Maturity Level 3, all Process Areas assigned to Maturity Levels 2 and 3 must achieve Capability Level 3.
• To achieve Maturity Level 4, all Process Areas assigned to Maturity Levels 2, 3, and 4 must achieve Capability Level 3.
• To achieve Maturity Level 5, all Process Areas assigned to Maturity Levels 2, 3, 4, and 5 must achieve Capability Level 3.
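A small sketch of these rules as a check over a continuous-representation profile (the capability values are illustrative; ml_pas maps each maturity level to the process areas assigned to it, as in the staged table above):

```python
# Which process areas are assigned to which maturity level (abbreviated names).
ml_pas = {
    2: ["REQM", "PP", "PMC", "SAM", "MA", "PPQA", "CM"],
    3: ["RD", "TS", "PI", "VER", "VAL", "OPF", "OPD", "OT", "RSKM", "IPM", "DAR"],
    4: ["QPM", "OPP"],
    5: ["CAR", "OPM"],
}

def equivalent_maturity_level(capability: dict) -> int:
    """Map a capability profile (PA -> capability level) to a maturity level."""
    def pas_up_to(ml):
        return [pa for lvl in range(2, ml + 1) for pa in ml_pas[lvl]]

    achieved = 1
    # ML 2: all ML 2 process areas at capability level 2 or 3.
    if all(capability.get(pa, 0) >= 2 for pa in ml_pas[2]):
        achieved = 2
        # ML 3 to ML 5: all PAs assigned up to that level at capability level 3.
        for ml in (3, 4, 5):
            if all(capability.get(pa, 0) >= 3 for pa in pas_up_to(ml)):
                achieved = ml
            else:
                break
    return achieved

profile = {pa: 3 for pa in ml_pas[2] + ml_pas[3]}  # all ML2+ML3 PAs at CL 3
print(equivalent_maturity_level(profile))          # 3
```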
Continuous Representation: Target vs. Achievement Profiles
[Figure: for a Maturity Level 1 organization, target vs. achievement profile over Capability Levels 1 to 3 for the ML 2 process areas: Requirements Management, Project Planning, Project Monitoring and Control, Supplier Agreement Management, Measurement and Analysis, Process and Product Quality Assurance, Configuration Management]
Continuous Representation: Assessment Profiles
[Figure: example assessment profiles for Capability Levels 2 and 3]
Progression of Process Maturity Levels (Staged)
CMMI: the start is always at ML 1.
1. Initial -> 2. Managed: basic project management
2. Managed -> 3. Defined: process definition
3. Defined -> 4. Quantitatively Managed: quantitative process management
4. Quantitatively Managed -> 5. Optimizing: process control
Stage “Managed”
[Figure: the project as a single activity, ’construct the system’, with input (requirements), control (budget, schedule, standards), resources (staff, tools), and output (code, documentation)]
PAs – “Managed”
To move to ML 2 (Managed), the focus is on these process areas:
• Requirements management
• Project planning
• Project monitoring & control
• Sub-contract management
• Measurement and analysis
• Quality assurance
• Configuration management
Stage “Defined”
[Figure: the project broken into sub-activities (design & define, code & unit test, integrate/system test), each with its inputs (requirements, system design, tested modules), controls (design methods, inspection criteria, test plans), and resources (tools, staff), producing the software system]
Management at the previous level tends to see the project as a whole. At this level an attempt is made to break down the project into a set of component sub-activities and to make the progress and effectiveness of these sub-activities visible to the wider world. Sub-activities are broken down into even lower level activities until you get to activities carried out by individuals or small teams.
PAs – “Defined”
To move to ML 3 (Defined), the focus is on these process areas:
• Requirements development and technical solution
• Product integration
• Verification and validation
• Organizational process definition
• Organizational process focus
• Organizational training
• Risk management
• Integrated project management
• Decision analysis and resolution
Stage “Quantitatively Managed”
[Figure: the activity network of the ’Defined’ stage (design & define, code & unit test, integrate/system test, with their inputs, controls, and resources) extended with quantitative feedback: management directives steer the sub-activities based on measured data such as design faults and system failures]
PAs – “Quantitatively Managed”
To move to ML 4 (Quantitatively Managed), the focus is on these process areas:
• Quantitative project management
• Organizational process performance
Stage “Optimizing”
[Figure: at this stage the development system itself is optimized: the old development system is continuously improved and replaced by a new development system]
PAs – “Optimizing”
To move to ML 5 (Optimizing), the focus is on these process areas:
• Causal analysis and resolution
• Organizational performance management
Maturity Levels: What do we know about the projects?
[Figure: for each maturity level ML 1 to ML 5, a box between project input (’In’) and output (’Out’); the higher the maturity level, the more visibility into what happens inside the box]
Next Lecture
• Date/Time:
• Friday, 06-Dec, 10:15-12:00
• Topic:
• Course wrap-up, review and exam preparation
• For you to do:
• Lab Task 6 assessment – recall that teams must
be present with all their team members!
• Use consultation opportunity for Lab Task 7