©2008 Information Architects, Inc. 1
Zen & the Art of Software Quality
Jim Highsmith
©2008 Information Architects, Inc. 2
Things to Think About
• What is Quality?
• How do we manage for Quality?
• How do we measure Quality?
• How do we measure success in Agile organizations?
©2008 Information Architects, Inc. 3
What is Quality?
How many of you come to work each Monday morning determined to perform poor quality work?
©2008 Information Architects, Inc. 4
What is Quality?
“Quality is the continuing stimulus which causes us to create the world in which we live.”
“Quality… you know what it is, yet you don’t know what it is.”
Robert Pirsig, Zen and the Art of Motorcycle Maintenance (1974)
©2008 Information Architects, Inc. 5
The Dichotomy of Quality
• Intrinsic or Extrinsic?
  – Does quality exist in the things we observe, or is it subjective, existing only in the eye of the observer?
• The “House of Cards”
“You take your analytic knife, put the point directly on the term Quality and just tap, not hard, gently, and the whole world splits, cleaves, right in two—hip and square, classic and romantic, technological and humanistic—and the split is clean.”
—Robert Pirsig, Zen and the Art of Motorcycle Maintenance (1974)
©2008 Information Architects, Inc. 6
What is Quality?
• “Quality is conformance to user requirements.” —Philip Crosby, Quality is Free (1980s)
• “Quality is the absence of defects that would make software stop completely or produce unacceptable results.” —Capers Jones, Applied Software Measurement (1991)
• “Quality is achieving excellent levels of fitness for use, conformance to requirements, reliability, and maintainability.” —Watts Humphrey, Managing the Software Process (1980s)
• “Quality is value to some person.” —Jerry Weinberg, Quality Software Management (1992)
• “Quality is customer expectations delivered (2007).” —Internal company definition
©2008 Information Architects, Inc. 7
Definition of Quality
Quality is always value to some person.
“To have value, software must be more than perfect. It must be useful to someone.” —Jerry Weinberg
©2008 Information Architects, Inc. 8
Quality is Personal
• High Performance is High Quality
  – to users who notice low performance.
• Elegant Coding is High Quality
  – to developers who place high value on the opinions of their peers.
• Zero Defects is High Quality
  – to users who would be disturbed by those defects.
• Lots of Features is High Quality
  – to marketers who believe that features sell products.
©2008 Information Architects, Inc. 9
How Many Persons?
• Total Value = the sum of each person’s value
• Customer’s Quality measures might include:
  – Usefulness to their job
  – Ease of use
  – Timeliness
  – Correctness
• Developer’s Quality measures might include:
  – Personal view of excellence
  – Maintainability
  – Level of defects
  – Understandable code
©2008 Information Architects, Inc. 10
The Dichotomy of Quality
• A false dichotomy?
Total Quality
Extrinsic Quality = Value
Intrinsic Quality = Quality
©2008 Information Architects, Inc. 11
Is Intrinsic Quality Important?
• Leading-edge enterprises employ technologies that can approach 99% cumulative defect removal rates.
• The norm for US firms is a cumulative defect removal rate of 75%.
• A cumulative defect removal rate of 95% appears to be a nodal point where several other benefits accrue. Among projects of similar size and type, these projects:
  – have the shortest schedules.
  – have the lowest quantity of effort in terms of person-months.
  – have the highest levels of user satisfaction after release.
—Capers Jones, Applied Software Measurement (1991)
Remember this number: 95%.
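“Cumulative defect removal rate” is the share of total defects taken out before release. A minimal sketch of that arithmetic, using hypothetical counts rather than Jones’ data:

    # Removal rate = defects removed before release / total defects
    # (defects found after release reveal the ones that escaped).

    def removal_rate(removed_before_release: int, escaped_after_release: int) -> float:
        return removed_before_release / (removed_before_release + escaped_after_release)

    print(removal_rate(950, 50))   # 0.95 -- Jones' "nodal point"
    print(removal_rate(750, 250))  # 0.75 -- the US norm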
©2008 Information Architects, Inc. 12
Science Instruments Co. From Technology Focus to Customer Focus
• Overhaul the entire product development process
• Average results from 6 before- and 6 after-Agile projects
Metric               Previous Performance   Current Performance   Percent Improvement
Project Cost         $2.8 million           $1.1 million          -$1.7M (-61%)
Project Schedule     18 months              13.5 months           -4.5 mo (-24%)
Cumulative Defects   2,270                  381                   -1,889 (-83%)
Staffing             18                     11                    -7 (-39%)
Source: Michael Mah, Cutter Consortium and QSM Associates
©2008 Information Architects, Inc. 13
Software Company I
• Team of about 35 people
• >1 million LOC, >17,000 automated tests

Metric               Industry Average   Current Performance   Improvement
Project Cost         $3.5 million       $2.2 million          -$1.3M (-37%)
Project Schedule     12.6 months        7.8 months            -4.8 mo (-38%)
Cumulative Defects   2,890              1,450                 -1,440 (-50%)
Staffing             35                 35                    n/a
Source: Michael Mah, QSM Associates
©2008 Information Architects, Inc. 14
Software Company II
• Team of about 100 people, highly distributed
• Database: 7,300 projects, 500+ organizations, 18 countries; PI among the very highest recorded

Metric               Industry Average   Current Performance   Improvement
Project Cost         $5.5 million       $5.2 million          -$0.3M (-5%)
Project Schedule     15 months          6.3 months            -8.7 mo (-58%)
Cumulative Defects   713                635                   -78 (-11%)
Staffing             40                 92                    +52 (+130%)
Source: Michael Mah, Cutter Consortium and QSM Associates
©2008 Information Architects, Inc. 15
Productivity Index (industry values)

[Chart: Productivity Index (PI) with ±1 standard deviation, by application type: Avionics, Business, Command and Control, Microcode, Process Control, Real Time, Scientific, System, Telecommunications, and Information Engineering. PI scale 0-24.]
Source: Michael Mah, QSM Associates
©2008 Information Architects, Inc. 16
Why is Intrinsic Quality so Important?
• The Impact of code quality on testing
• Error Location Dynamics
• Error Feedback Ratio
• Technical Debt
©2008 Information Architects, Inc. 17
The Impact of Code Quality on Testing
How long to test? Assume ½ day to find and fix each defect.
– Development: 10 days, 4 people, 4 KLOC, 1 defect/KLOC → test time = 2 days
– Development: 10 days, 4 people, 4 KLOC, 15 defects/KLOC → test time = 30 days
Outcome: no time to finish testing; technical debt increases!
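The slide’s arithmetic is a one-liner; here is a minimal sketch (the function and its names are mine, for illustration):

    # Test time = size (KLOC) x defect density (defects/KLOC)
    #             x find-and-fix cost (days/defect)

    def test_time_days(kloc, defects_per_kloc, fix_days_per_defect=0.5):
        return kloc * defects_per_kloc * fix_days_per_defect

    print(test_time_days(4, 1))   # clean code:  2.0 days
    print(test_time_days(4, 15))  # buggy code: 30.0 days

A 15x difference in defect density becomes a 15x difference in test time, and the schedule absorbs the difference as untested code, i.e., technical debt.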
©2008 Information Architects, Inc. 18
Error Location Dynamics

[Chart: errors located vs. time. Difficult errors take longer to find: from 1 hr/defect up to 50 hr/defect.]
©2008 Information Architects, Inc. 19
Error Feedback Ratio

ERROR FEEDBACK: errors put into a system when attempting to correct other faults.
ERROR FEEDBACK RATIO: the number of problems created per fix.
EFR = ERRORS CREATED / ERRORS RESOLVED

The time to finish removing errors is critically dependent on the error feedback ratio. The three simulations differ only in their feedback ratios: a 20% difference in feedback ratio leads to an 88% difference in completion time, and the next 10% increase leads to a 112% increase.
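The direction of this effect is easy to reproduce with a toy model (mine, not the simulation behind the 88%/112% figures, which involves richer dynamics than a single ratio):

    # Toy model: each fix resolves one error but injects `efr` new ones on
    # average, so net progress per fix is (1 - efr); clearing a backlog of
    # N errors therefore takes about N / (1 - efr) fixes.

    def total_fixes(initial_errors: float, efr: float) -> float:
        return initial_errors / (1.0 - efr)

    for efr in (0.1, 0.3, 0.4):
        print(f"EFR {efr}: ~{total_fixes(100, efr):.0f} fixes")
    # EFR 0.1: ~111 fixes, EFR 0.3: ~143, EFR 0.4: ~167

Even this linear toy model shows rework growing nonlinearly in EFR; Weinberg’s simulated completion times grow faster still.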
©2008 Information Architects, Inc. 20
Technical Debt
• Once on the far right of the curve, all choices are hard
• If nothing is done, it just gets worse
• In applications with high technical debt, estimating is nearly impossible
• Only 3 strategies:
  – Do nothing; it gets worse
  – Replace; high cost/risk
  – Incremental refactoring; commitment to invest

[Chart: Cost of Change (CoC) over years 1-8. After product release, the actual CoC curve climbs away from the optimal CoC curve as technical debt accumulates, and customer responsiveness falls.]
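To make the “do nothing” strategy concrete, here is a deliberately crude compounding sketch; the growth rates are invented for illustration and are not from the slide:

    # Crude model: unaddressed technical debt compounds the cost of change.

    def cost_of_change(years: int, annual_debt_growth: float) -> float:
        return (1 + annual_debt_growth) ** years

    for year in (1, 4, 8):
        do_nothing  = cost_of_change(year, 0.30)  # hypothetical 30%/yr compounding
        refactoring = cost_of_change(year, 0.05)  # hypothetical 5%/yr with steady refactoring
        print(year, round(do_nothing, 2), round(refactoring, 2))
    # Year 8: ~8.16x vs ~1.48x the original cost of change. Once the
    # do-nothing curve is on the far right, every option is expensive.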
©2008 Information Architects, Inc. 21
Why is Intrinsic Quality so Important?
• The impact of code quality on testing—potential for exponential increases in testing time
• Error Location Dynamics—difficult errors take much more time
• Error Feedback Ratio—bad fixes can cause catastrophic dynamics
• Technical Debt—the combination of these (plus design problems) can wreck the best intentions toward customers
©2008 Information Architects, Inc. 22
Total Quality
Extrinsic Quality = Value
Intrinsic Quality = Quality
You have to do a lot of things to reach 95% cumulative defect removal levels.
Delivering Total Quality
©2008 Information Architects, Inc. 23
Introductory Stories
Cutter Sr. Consultant Helen Pukszta
“I recently asked a colleague [CIO] whether he would prefer to deliver a project somewhat late and overbudget but rich with business benefits or one that is on-time and underbudget but of scant value to the business. He thought it was a tough call, and then went for the on-time scenario. Delivering on-time and within budget is part of his IT department’s performance metrics. Chasing after the elusive business value, over which he thought he had little control anyway, is not.”
©2008 Information Architects, Inc. 24
Introductory Stories
• A recent BusinessWeek-Boston Consulting Group survey found that 72% of the senior executives surveyed named innovation as one of their top three priorities.
• “Experimentation matters because it is through learning equally what works and what doesn’t that people develop great new products, services, and entire businesses. But in spite of the lip service that is paid to ‘testing’ and ‘learning from failure,’ today’s organizations, processes, and management of innovation often impede experimentation.”
—Stefan Thomke, Experimentation Matters: Unlocking the Potential of New Technologies for Innovation, Harvard Business School Press, 2003.
©2008 Information Architects, Inc. 25
Introductory Stories
• In Artful Making, Harvard Business School Professor and Cutter Fellow Rob Austin and coauthor Lee Devin discuss a US $125 million IT project disaster in which the company refused to improvise and change from the detailed plan set down prior to the project’s start. “Plan the work and work the plan” was their implicit mantra, and it led them directly to a costly and destructive course of action. … We’d all like to believe that this kind of problem is rare in business. It’s not.
©2008 Information Architects, Inc. 26
Introductory Stories
• Standish Group “Chaos Reports”
  – 1994 — 82% challenged or failures
  – 2001 — 72% challenged or failures
• Definition of “success”:
  – successful — the project is completed on time and on budget, with all the features and functions originally specified;
  – challenged — the project is completed and operational, but over budget, late, and with fewer features and functions than initially specified; and
  – failed — the project is canceled before completion or never implemented.
• Where are OUTCOMES in this definition? When we play into the traditional measures of “success,” we obscure the benefits of agile development.
©2008 Information Architects, Inc. 27
Conclusions About the Current Environment
• Innovation is critical to business success and agility is key to innovation.
• Management processes & performance measures restrict agility and innovation.
• Conformance to Plan (schedule, scope, cost) is the norm for measuring success.
• Schedule is more important than customer value (outcome).
©2008 Information Architects, Inc. 28
So,
• To build agile organizations we MUST change how we measure performance.
• We must acknowledge that our performance measurement system impacts agility and our ability to innovate.
• We must alter our obsession with time to an obsession for outcomes; that is, customer value.
• We must separate the outcome performance measurements from the output performance measurements.
©2008 Information Architects, Inc. 29
Conceptual Background Sources
• Beyond Budgeting: How Managers Can Break Free from the Annual Performance Trap (2003), Jeremy Hope and Robin Fraser.
• Measuring and Managing Performance in Organizations (1996), Rob Austin.
©2008 Information Architects, Inc. 30
Conceptual Background
• Beyond Budgeting: How Managers Can Break Free from the Annual Performance Trap (2003), Jeremy Hope and Robin Fraser:
“Budgets have since been hijacked by a generation of financial engineers that have used them as remote control devices to ‘manage by the numbers.’ They have turned budgets into fixed performance contracts that force managers at all levels to commit to delivering specified financial outcomes, even though many of the variables underpinning those outcomes are beyond their control.”
©2008 Information Architects, Inc. 31
Budgeting Problems
• They are cumbersome and expensive.
  – “Some project leaders figure they have saved 95% of the time that used to be spent on budgeting and forecasting.”
  – “A vast area of well-documented ignorance.” — Ken Orr
• They are out of kilter with the competitive environment.
  – “It is when the gap between key budget assumptions and emerging reality widens to the point at which the two bear little relation to each other that the problems begin.” — Hope & Fraser
• Gaming the numbers is rampant.
  – Hiring criteria for budget analyst (project manager).
©2008 Information Architects, Inc. 32
Beyond Budgeting: Principles
1. Provide a governance framework based on clear principles and boundaries.
2. Create a high-performance climate based on relative success.
3. Give people freedom to make local decisions that are consistent with governance principles and the organization’s goals.
4. Place the responsibility for value-creating decisions on frontline teams.
5. Make people accountable for customer outcomes.
6. Support open and ethical information systems.
©2008 Information Architects, Inc. 33
Conceptual Background
• Measuring and Managing Performance in Organizations (1996), Rob Austin:
“If there is a single message that comes from this book, it is that trust, honesty, and good intentions are more efficient in many social contexts than verification, guile, and self-interest.”
©2008 Information Architects, Inc. 34
Austin — How Measurement Systems Become Dysfunctional
Step 1: Measurement system installed.
Step 2: Performance tends to improve while people figure out the system.
Step 3: People, under pressure, focus on measurement goals rather than outcomes. (There is always a disconnect between the desired outcome and the measurement. Examples: (1) productivity measured as lines of code per day; (2) compensation based on CMMI level.)
Step 4: [Chart: over time, the performance measurement keeps rising while the desired outcome falls away from it.]
©2008 Information Architects, Inc. 35
Austin — Dysfunctional Measurement Systems
• Furthermore, “the implementation of more sophisticated measures caused more sophisticated dysfunctional reactions.” Those being measured “can increase their rewards if they can successfully obliterate the correlation between true output and measured performance.”
• “The ideal to which organizations should aspire is one in which workers are internally motivated and measurement provides them with self-assessment information. Measurement and motivation are decoupled. The challenge for managers is to become more trusting, able to inspire and communicate, and willing to help rather than be helped.”
©2008 Information Architects, Inc. 36
APMS — Goals
• To focus any enterprise group (project team, department, division, or company) on a set of desired strategic or tactical outcomes.
• To encourage those groups (project teams, etc.) to perform at a high level — measures of output.
©2008 Information Architects, Inc. 37
APMS — Guidelines
• Build the measurement system on a foundation of trust, honesty, and an intent to increase organizational value.
• Place the most emphasis on measuring outcomes, not outputs, even if those metrics are harder to obtain and less precise.
• Create outcome metrics whose constraint metrics are as broad as possible (tolerance for variations) in order to encourage agility.
• Create output informational metrics that support people’s innate internal motivational needs and provide them with aggregate measures of overall, rather than individual, performance.
©2008 Information Architects, Inc. 38
Outcome Performance Metrics
• Outcomes are a project community responsibility.
• Outcomes are business related.
  – IRACIS — Increase Revenue, Avoid Cost, Improve Service
  – Value of features/stories delivered
  – Value may be linked to reduced risk
  – Basis for decision making, not just reporting
• We must alter our obsession with time to an obsession for outcomes; that is, customer value.
  – Fixing time (time boxing) isn’t the same as obsessing over schedule (fixing time enables us to focus on delivering value).
  – Time is a component of value.
©2008 Information Architects, Inc. 39
Outcome Performance Metrics
• Capabilities, Features, Use Cases, and Stories enable outcomes
  – They are NOT outcomes! Scope is NOT an outcome!
  – Variable Priority — MoSCoW (from DSDM): Must have, Should have, Could have, Won’t have
  – Variable Implementation — Minimal, Optimal, Expansive

Example of a “Should Have” feature at three implementation levels:
  Ability to process a special retail sale: Minimal (2), Optimal (4), Expansive (8)
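A planning tool might represent variable priority and variable implementation roughly like this; a hypothetical sketch (the class, enum, and field names are mine, not from the deck):

    from dataclasses import dataclass, field
    from enum import Enum

    class Priority(Enum):  # MoSCoW, from DSDM
        MUST = "Must have"
        SHOULD = "Should have"
        COULD = "Could have"
        WONT = "Won't have"

    @dataclass
    class Feature:
        name: str
        priority: Priority
        sizes: dict = field(default_factory=dict)  # implementation level -> story points

    special_sale = Feature(
        name="Ability to process a special retail sale",
        priority=Priority.SHOULD,
        sizes={"Minimal": 2, "Optimal": 4, "Expansive": 8},
    )
    # Planning can then trade scope within a feature: dropping from
    # Expansive to Minimal recovers 6 points without cutting the capability.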
©2008 Information Architects, Inc. 40
The New Challenge
[Diagram of competing demands on IT: Innovation, Business Value, Workload, Security, Compliance, Demand, Cost Reduction]

Martin Curley, Director IT Innovation, Intel Corporation
©2008 Information Architects, Inc. 41
Business Value Maturity Model (Intel)
Level 1: No or Ad Hoc Practices
Level 2: Total Cost of Ownership
Level 3: Simple ROI & IT Business Case Discipline
Level 4: Options/Portfolio Mgt
Level 5: Optimized Value
©2008 Information Architects, Inc. 42
Intel’s 17 Standard Measures of Value
• Days of inventory reduction
• Days of receivables outstanding
• Headcount reduction
• Headcount productivity
• System end-of-life
• Materials discounts
• Capital, hardware, and software avoidance
• Unit cost avoidance
• Factory uptime
• Scrap reduction
• Risk avoidance
• Time-to-Market
• Opening new markets
• Optimizing existing markets
• Cross selling
• Vendor-of-choice
Intel White Paper: Defining the Value of E-business: Seventeen Standard Measures
©2008 Information Architects, Inc. 43
Traditional Gantt Chart — What Does It Emphasize?
[Figure: a traditional Gantt chart spanning Qtr 1, 1994 through Qtr 4, 1995, with 26 tasks from “Secure Financing” and “Form Corporation” down to “Vendor System Installation and Training,” each with a percent-complete bar (a handful of early tasks at 63-100%, the rest at 0%). Emphasis: tasks and schedule.]
©2008 Information Architects, Inc. 44
A Parking Lot Diagram — What Does It Emphasize?
[Figure: a feature “parking lot” diagram. Feature sets are grouped into major areas: Establish Product Catalog (PC), Establish Network Arrangement (NW), Establish Order (OM), Establish Diversity (DV), Establish Design Product (DP), Inter-System Pathing (XP), Intra-System Pathing (SP), System Selection (SS), Establish Trails (TR), and Users (UM). Each box shows a feature set (e.g., “Physical Design”), its feature count, a target month, an owner’s initials, and a percent-complete progress bar; the key distinguishes Work In Progress, Attention, and Full Completion. Emphasis: client-valued features and their completion.]
Courtesy Jeff DeLuca
©2008 Information Architects, Inc. 45
Outcome Performance Metrics — Constraints
• Scope, Schedule, and Cost are not Outcomes but CONSTRAINTS
  – Agile projects need constraints, not predictions
• Agility enables us to alter course in mid-project in order to deliver the desired outcomes; therefore:
• Tighter constraints mean less room for agility, reducing the probability of achieving the desired outcomes.
©2008 Information Architects, Inc. 46
Story Performance

[Chart: stories (0-100) across iterations 1-5, with three series: Planned, Revised, and Completed.]
©2008 Information Architects, Inc. 47
Capability Value Delivered

Iteration        1    2    3     4     5     6     7     8
Features        20   45   75   105   130   160   170   175
Value ($000s)  350  650  925  1150  1350  1450  1525  1575

Denne, Mark, and Jane Cleland-Huang. Software by Numbers: Low-Risk, High-Return Development. Prentice Hall, 2003.
©2008 Information Architects, Inc. 48
Metrics for Constraints: Quality
[Chart: growth curves in KLOCs across iterations 1-7, with three series: Functions, Functional Test, and Unit Test.]
©2008 Information Architects, Inc. 49
Outcomes and Constraints
Outcomes first: Value, Quality
Constraints second: Scope, Schedule, Cost
©2008 Information Architects, Inc. 50
Output Performance Metrics
• Intentions are the key to success
• Give a team the data they need to get better
  – Compare its progress over time against itself — are we getting better?
  – Compare performance against other internal teams — how do we compare with others in our organization?
  – Compare performance against an external measure of like industries or projects — how do we compare with other companies?
• “Did we do the best we could do?” rather than “Did we meet the plan?”
©2008 Information Architects, Inc. 51
Output Performance Metrics
• Five Core Metrics (Larry Putnam, SLIM Model):
  1. Quantity of function — i.e., scope, measured in terms of user stories, use cases, requirements, or features, depending on the situation (these may ultimately be measured as objects, modules, classes, or lines of code)
  2. Productivity — expressed as functionality produced for the time and effort expended
  3. Time — the duration of the project in calendar months
  4. Effort — the amount of effort expended in person-months
  5. Reliability — expressed in terms of defect rate
• Next 4 slides courtesy of Cutter Sr. Consultant Michael Mah.
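These five metrics hang together through Putnam’s software equation, from which SLIM derives its Productivity Index (PI). A sketch of the commonly cited form; the calibration constant B and the table mapping raw process productivity onto the integer PI scale are QSM details omitted here:

    # Putnam's software equation: Size = PP * (Effort / B)^(1/3) * Time^(4/3),
    # with Effort in person-years and Time in years. Solving for PP gives a
    # raw process-productivity number; SLIM maps that onto the PI scale.

    def process_productivity(sloc: float, effort_pm: float,
                             time_months: float, b: float = 1.0) -> float:
        effort_py = effort_pm / 12.0
        time_years = time_months / 12.0
        return sloc / ((effort_py / b) ** (1 / 3) * time_years ** (4 / 3))

    # Life-cycle figures from the SLIM panel two slides ahead:
    # 376,022 effective SLOC, 397 person-months, 27.0 months.
    print(round(process_productivity(376_022, 397, 27.0)))  # raw PP, ~40,000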
©2008 Information Architects, Inc. 52
Gathering Project Data
©2008 Information Architects, Inc. 53
Project Model: Staffing & Probability Analysis

[Screenshot: SLIM solution panel for a single goal (MBI 1.1946). An average-staffing curve runs from Jul ’01 to Sep ’03 against milestones CSR, SRR, HLDR, LLDR, CUT, IC, STC, UAT, FCR, 99R, and 99.9R. C&T phase: 24.0 months, 347 person-months, $5,897K, peak staff 19.5, MTTD 0.3 days, starting 9/29/2001. Full life cycle: 27.0 months, 397 person-months, $6,747K, starting 7/1/2001. Control panel: PI = 17.0 (range 13.6-20.4), peak staff 19.5 (15.6-23.4), effective SLOC 376K (301-451K); MBI = 1.2. A risk gauge shows 0-100% probability for duration, effort, peak staff, and quality.]
©2008 Information Architects, Inc. 54
Trend Analysis — Productivity Index (PI)

[Chart: PI vs. effective SLOC (thousands, log scale) for Projects 1-7, whose releases include Elan 3.0, Elan 3.1, Elan 3.2, Analyst 1.4, Analyst QS 1.1, Condor, and Shrubbery IR1/IR2, plotted against the QSM 2002 scientific average and ±1 sigma trend lines.]
©2008 Information Architects, Inc. 55
Trend Analysis — Defects During Test

[Chart: errors (SIT-FOC) vs. effective SLOC (thousands, log scale) for the same projects, plotted against the QSM 2002 scientific average and ±1 sigma trend lines.]
©2008 Information Architects, Inc. 56
Current & Project-to-Date Team Performance
Team     Productivity Index   Schedule   Reliability   Effort   Total   Average   % of Company
Team 4   5                    3          4             4        16      4.0       110 to >124%
Team 2   4                    1          3             5        13      3.3       99 to >109%
Team 1   2                    3          4             1        10      2.5       75 to >99%
Team 5   2                    3          2             2        9       2.3       75 to >99%
Team 3   1                    2          3             1        7       1.8       < 75%
©2008 Information Architects, Inc. 57
Remember
• Both Outcomes and Outputs must be considered
  – A very difficult project with good outcomes may have lower outputs
• Outcome measures are a shared responsibility
• Output measures are used by teams to improve themselves
• All measurement systems become dysfunctional over time
©2008 Information Architects, Inc. 58
The Declaration of InterDependence
• We increase return on investment by making continuous flow of value our focus.
• We deliver reliable results by engaging customers in frequent interactions and shared ownership.
• We expect uncertainty and manage for it through iterations, anticipation, and adaptation.
• We unleash creativity and innovation by recognizing that individuals are the ultimate source of value, and creating an environment where they can make a difference.
• We boost performance through group accountability for results and shared responsibility for team effectiveness.
• We improve effectiveness and reliability through situationally specific strategies, processes, and practices.
©2005 David Anderson, Sanjiv Augustine, Christopher Avery, Alistair Cockburn, Mike Cohn, Doug DeCarlo, Donna Fitzgerald, Jim Highsmith, Ole Jepsen, Lowell Lindstrom, Todd Little, Kent McDonald,
Pollyanna Pixton, Preston Smith and Robert Wysocki. www.APLN.com
©2008 Information Architects, Inc. 59
The Product Lifecycle
[Diagram: the product lifecycle. Product Conceptualization leads into Initial Product Development and then Continuing Product Development; a Product Roadmap, Product Backlog, and Product Release Plan(s) drive Product Delivery as a continuous flow of value.]
©2008 Information Architects, Inc. 60
Delivering Total Quality

[Diagram: Total Quality splits into Extrinsic Quality (Functionality, User Experience) and Intrinsic Quality (Works as specified, Adaptable).]
©2008 Information Architects, Inc. 61
Measuring Performance in an Agile Organization
The Traditional Iron Triangle: Scope, Cost, Schedule

The Agile Triangle: Value (extrinsic quality), Quality (intrinsic quality), Constraints (cost, schedule, scope)
©2008 Information Architects, Inc. 62
Project Performance Goals
• Value
  – Meet or exceed customer value expectations
  – Value “chunks” are delivered in a release timebox
• Quality
  – All known defects are repaired
  – Defect levels through development are less than ½ the industry average
  – Technical debt remains low
• Constraints
  – Scope: all planned major value-generating capabilities are delivered; story scope varies by mutual agreement
  – Cost: actual costs are within agreed-to limits (limits can change by mutual agreement during the project)
  – Schedule: actual schedule is within agreed-to limits (limits can change by mutual agreement during the project)
©2008 Information Architects, Inc. 63
Things to Think About
• What is Quality?
• How do we manage for Quality?
• How do we measure Quality?
• How do we measure success in Agile organizations?