CSI SPIN Mumbai Chapter 2011
© Cybercom Datamatics Information Solutions.
-Priyank
email: [email protected]
METRICS FOR AGILE
ABOUT US
Measure vs. Metric
[Slide diagram: measures and metrics classified along two axes – qualitative vs. quantitative, and partially vs. fully supervised]
DEFINITIONS
Effort – the actual hours required to write the software.
Defect – unaccepted functionality, ideally identified by a test case. A commonly cited definition: a flaw in a component or system that can cause the component or system to fail to perform its required function.
Schedule/Duration – the calendar time to get something done.
Cost – strongly correlated with effort, but duration also plays a role.
Size – something that can be counted or measured; ideally it is representative of effort.
Plan/Estimated – our educated guess; a probability, not a certainty.
Actual – the measured result.
Quality – a delight.
METRICS FOR AGILE
- Effort, Top-line, Velocity, Burn-down
- Cost
- Schedule, time to market, cycle time
- Defects
- Technical debt
These can help you:
- Understand Scrum performance
- Track Scrum progress, productivity and predictability
- Analyze quality and value
- Identify pain points and improvement areas
- Motivation & performance
Simple Scrum (time-boxed continuous iterations & releases)
NEED FOR THESE METRICS
AGILE IS VALUE DRIVEN & ADAPTIVE
Plan driven (predictive): requirements are the fixed constraint; schedule and cost are the estimates.
Value driven (Agile, adaptive): schedule and cost are the fixed constraints; features are the estimates.
TOP-LINE, RELEASE BURN-UP
Base Measures:
• Total number of story points
• Total number of sprints planned
• Story points planned in each sprint
• Story points completed in each sprint
VELOCITY
Velocity is a relative measure of progress. It can be measured by the features delivered in an iteration, and it is a measure of how much Product Backlog the team can complete in a given amount of time.
Features are usually initial stories, and sometimes a set of features together with some non-feature work.
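As a sketch, velocity can be computed as average story points accepted per sprint and used for a rough release forecast. The sprint numbers below are illustrative, not from the slides.

```python
# Velocity sketch: average story points accepted per sprint, plus a
# rough forecast of sprints remaining for a backlog of a given size.

accepted_per_sprint = [9, 10, 10, 10]  # illustrative: points accepted each sprint

def velocity(accepted):
    """Average story points completed per sprint so far."""
    return sum(accepted) / len(accepted)

def sprints_remaining(total_points, accepted):
    """Rough forecast: remaining backlog divided by average velocity."""
    remaining = total_points - sum(accepted)
    return remaining / velocity(accepted)

print(velocity(accepted_per_sprint))             # 9.75
print(sprints_remaining(120, accepted_per_sprint))
```

Teams commonly use a rolling average of the last few sprints rather than the full history, so the forecast adapts as the team's pace changes.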
BURN DOWN
The burn-down chart shows the estimated number of hours required to complete the remaining tasks of the sprint.
It is similar to an earned-value chart if you count delivered functionality (accepted work) over time.
It shows both the status and the rate of progress ("velocity") in a way that is clear and easy to discuss.
BURN UP
The burn-up chart shows the amount of accepted work (work that has been completed, tested, and has met its acceptance criteria).
It also shows the scope: how much work is in the project as a whole.
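Both charts can be derived from the same per-sprint acceptance data; a minimal sketch, with illustrative numbers:

```python
# Burn-down vs. burn-up from per-sprint accepted story points.
# Burn-down: work remaining after each sprint, starting from full scope.
# Burn-up: cumulative accepted work, plotted against total scope.

total_scope = 120            # total story points in the release (illustrative)
accepted = [9, 10, 10, 10]   # points accepted per sprint (illustrative)

def burn_down(total, accepted):
    """Remaining points after each sprint."""
    series, remaining = [total], total
    for pts in accepted:
        remaining -= pts
        series.append(remaining)
    return series

def burn_up(accepted):
    """Cumulative accepted points after each sprint."""
    series, done = [0], 0
    for pts in accepted:
        done += pts
        series.append(done)
    return series

print(burn_down(total_scope, accepted))  # [120, 111, 101, 91, 81]
print(burn_up(accepted))                 # [0, 9, 19, 29, 39]
```

The burn-up series plus the `total_scope` line makes scope changes visible: if scope grows, the target line moves, while a burn-down silently absorbs the change.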
SCHEDULE & COST METRICS
The following metrics can be derived from the base measures:
Actual Percent Complete (APC) = completed story points / total story points
Expected Percent Complete (EPC) = completed iterations / planned iterations
Planned Value (PV) = EPC × Budget
Actual Cost (AC) = actual cost in $, or soft cost in hours spent
Earned Value (EV) = APC × Budget
Schedule Performance Index (SPI) = EV / PV; greater than 1 is good (ahead of schedule)
Cost Performance Index (CPI) = EV / AC; greater than 1 is good (under budget)
Cost Variance (CV) = EV − AC; greater than 0 is good (under budget)
Schedule Variance (SV) = EV − PV; greater than 0 is good (ahead of schedule)
Value realization, or velocity.
Base Measures:
• Budget allocated for the project
• Total number of story points
• Total number of sprints planned
• Story points planned in each sprint
• Story points completed in each sprint
• Release variance – plan vs. actual
In the given example:
Budget = $100
Total story points = 120
Total sprints planned = 12
After the 4th sprint: 9 of 10 planned story points were accepted in the first sprint, 10 of 10 in the second, 10 of 10 in the third, and 10 of 10 in the fourth.
APC = 39/120 = 0.325 (32.5%)
EPC = 4/12 = 0.333 (33.3%)
PV = 0.333 × 100 = 33.3
EV = 0.325 × 100 = 32.5
Let's assume AC = $40 (or 400 hrs, where 10 hrs = $1)
SPI = 32.5/33.3 = 0.98
CPI = 32.5/40 = 0.81
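The whole earned-value calculation above can be packaged as one small function; this sketch reproduces the example's numbers (budget $100, 120 story points, 12 sprints, 39 points accepted after 4 sprints, AC $40):

```python
# Earned-value metrics for a Scrum release, per the formulas above.

def evm(budget, total_points, planned_sprints, done_points, done_sprints, actual_cost):
    apc = done_points / total_points      # Actual Percent Complete
    epc = done_sprints / planned_sprints  # Expected Percent Complete
    pv = epc * budget                     # Planned Value
    ev = apc * budget                     # Earned Value
    return {
        "SPI": ev / pv,          # > 1 means ahead of schedule
        "CPI": ev / actual_cost, # > 1 means under budget
        "SV": ev - pv,           # > 0 means ahead of schedule
        "CV": ev - actual_cost,  # > 0 means under budget
    }

m = evm(budget=100, total_points=120, planned_sprints=12,
        done_points=39, done_sprints=4, actual_cost=40)
print(m)  # SPI ≈ 0.98 (slightly behind schedule), CPI ≈ 0.81 (over budget)
```

The CPI of roughly 0.81 is the more worrying number here: the team has spent $40 to earn $32.50 of value, so the cost trend, not the schedule, is what should trigger a conversation.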
VALUE REALIZATION (VELOCITY)
DEFECTS
Defect Removal Efficiency (DRE) is a base measure which we can tailor for Scrum:
DRE = E / (E + D)
where E = number of errors found before delivery of the software, and
D = number of errors found after delivery of the software.
In Scrum:
E = number of errors found before delivery in any iteration (i.e., during sprint execution), and
D = number of errors found after delivery (i.e., in production).
The ideal DRE is 1; a DRE less than 1 calls for root cause analysis (RCA).
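The DRE formula is a one-liner; this sketch uses illustrative defect counts:

```python
# Defect Removal Efficiency: E = defects found during sprint execution,
# D = defects found in production. Counts below are illustrative.

def dre(e, d):
    """DRE = E / (E + D); 1.0 means nothing leaked to production."""
    if e + d == 0:
        return 1.0  # no defects found anywhere; treat as perfect removal
    return e / (e + d)

print(dre(18, 2))  # 0.9 -> 90% of defects caught before production
print(dre(10, 0))  # 1.0 -> ideal; no RCA needed
```

Tracking DRE per sprint (rather than per release) shows whether the team's in-sprint testing is improving over time.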
TECHNICAL DEBT
Quality is best viewed through the code itself…
Reference
http://nemo.sonarsource.org
Copyright
http://sonarsource.org
FEW MORE BASIC QUALITY METRICS
Technical debt
Test cases, bugs
Complexity (cyclomatic complexity)
Violations
Classes, methods, duplication, comments, etc.
QUALITY METRICS -
REFERENCES -
http://www.mountaingoatsoftware.com
http://www.agilemodeling.com
http://jamesshore.com/Agile-Book/assess_your_agility.html
http://java.net/projects/hudson/
http://www.sonarsource.org/
http://docs.codehaus.org/display/SONAR/Metric+definitions
https://wiki.rallydev.com
http://www.infoq.com/
cdis.in