Software Release Readiness Metric ShipIT [0,1]


Transcript of Software Release Readiness Metric ShipIT [0,1]

Page 1: Software Release Readiness   Metric ShipIT [0,1]

Software Release Readiness Metric

ShipIT [0,1]

Piyush Ranjan Satapathy

Department of Computer Science & Engineering

University of California Riverside

Page 2: Software Release Readiness   Metric ShipIT [0,1]

Outline

- Why Release Metric?
- Stages & Factors Considered
- Metrics Considered
- Formulation of ShipIT
- Justification & Elaboration of the Formula
- Validation of the Formula
- Conclusion
- References
- Q & A

Page 3: Software Release Readiness   Metric ShipIT [0,1]

Why Release Metric?

- Cost: Penalty for late delivery?
- Confidence: Accurate estimate? Complexity?
- Quality: Danger of breaking something else? How much testing is needed?
- Schedule: Holidays? Vacation? Customer availability? Hard deadline?
- Relationship: Unhappy customer? Reference site?
- Workaround: Manual? Send staff on site?
- Quantify: How much loss per second?
- Accuracy: Performance test environment?
- Alternatives: Optional download? Automatic download?
- Support: Ease of download? Increased support load?
- Competition: Market leader? Slipping?
- Usability: Lose customers if slow?

Page 4: Software Release Readiness   Metric ShipIT [0,1]

Phases or Factors Considered

- Requirement, Analysis & Design Phase
- Coding or Implementation Phase
- Testing Phase
- Quality Assurance Phase
- Manuals & Documentation
- Early Deployment
- Early Support

Page 5: Software Release Readiness   Metric ShipIT [0,1]

Metrics Considered

1. Requirement, Analysis & Design Phase
- Planned vs. implemented features
- List of unimplemented features
- List of new features coming on

2. Coding Phase

2.1 Creating Source
- System modules (already implemented no. vs. total planned no.)
- Application modules (planned vs. implemented no.)
- GUI modules (planned vs. implemented no.)

2.2 Creating Object (use another way…)
- KSLOC (thousands of source lines of code)
- Number of function points (determines complexity)
- Known anomalies in the code

2.3 Building Process
- Compilation time
- No. of warnings during compilation
- Incremental build time as per the platform dependency
- Incremental build time as per the compiler dependency
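The planned-vs-implemented counts above reduce to per-module-type completion ratios. A minimal sketch, in which the `module_progress` helper and all module counts are hypothetical (only the names Sm, Am and Gm come from the Source formula on the validation slide):

```python
# Turn planned-vs-implemented module counts into completion ratios.
# All counts here are hypothetical, not from the slides.
def module_progress(implemented, planned):
    """Fraction of planned modules implemented, clamped to [0, 1]."""
    return min(implemented / planned, 1.0) if planned else 1.0

Sm = module_progress(23, 25)   # system modules
Am = module_progress(7, 10)    # application modules
Gm = module_progress(4, 4)     # GUI modules
```

Clamping at 1.0 keeps each ratio inside [0, 1] even when more modules ship than were planned, which the overall formula requires of every leaf metric.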

Page 6: Software Release Readiness   Metric ShipIT [0,1]

Metrics Considered…

3. Testing Phase

3.1 Testing (Finding Bugs)
- Unit testing (planned vs. completed no. of L1 test cases)
- Integration testing (planned vs. completed no. of L2 test cases)
- System testing (planned vs. completed no. of L3 test cases)

3.2 Debugging
- No. of open issues
- Line coverage

4. Quality Assurance (Regression Testing)
- Total no. of test hours planned
- Debugged faults till date
- Acceptable no. of faults

5. Manuals & Documentation
- Requirement documentation (% completed)
- Design documentation (% completed)
- Implementation and usability documentation (% completed)
- Test plan documentation (% completed)
- User guide documentation (% completed)

Page 7: Software Release Readiness   Metric ShipIT [0,1]

Metrics Considered…

6. Supervision (Early Deployment)

6.1 Installation Process
- Distribution of software (planned vs. completed)
- Installation of software (planned no. vs. completed no.)
- Acceptance testing (no. of planned vs. executed test cases)

6.2 Training Process
- Developing training materials (% completion)
- Validating training program (% completion)
- Implementing training program (% completion)

7. Support (Early Customer Feedback)
- Handling beta customer bugs
- Reapplying the software development cycle

Major metric: Maintainability Index desired vs. Maintainability Index reached

Page 8: Software Release Readiness   Metric ShipIT [0,1]

Formulation of ‘ShipIT’

ShipIT = [(W_RAD × RAD) + (W_CODE × CODE) + (W_TEST × TEST) + (W_QA × QA) + (W_MD × MD) + (W_SV × SV) + (W_SP × SP)] / 100

Where RAD = factor of contribution towards the completion of software development from the Requirement/Analysis/Design stage, and likewise for CODE, TEST, QA, MD, SV and SP.

And W_RAD, W_CODE, W_TEST, W_QA, W_MD, W_SV, W_SP are all ∈ [0, 100] and sum up to 100.
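As a sketch, the weighted average above is straightforward to compute. The factor values and the even-handed weighting below are purely illustrative, not from the slides:

```python
# Sketch of the ShipIT formula: a weighted average of the seven phase
# factors, each in [0, 1]; weights lie in [0, 100] and must sum to 100.
def shipit(factors, weights):
    assert set(factors) == set(weights)
    assert abs(sum(weights.values()) - 100) < 1e-9, "weights must sum to 100"
    return sum(weights[k] * factors[k] for k in factors) / 100.0

# hypothetical readiness values and an illustrative weighting
factors = {"RAD": 0.9, "CODE": 0.8, "TEST": 0.7, "QA": 0.95,
           "MD": 0.6, "SV": 0.5, "SP": 0.85}
weights = {"RAD": 20, "CODE": 20, "TEST": 20, "QA": 10,
           "MD": 10, "SV": 10, "SP": 10}
score = shipit(factors, weights)
```

Because each factor is in [0, 1] and the weights sum to 100, the result always lands in [0, 1], matching the metric's name.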

Page 9: Software Release Readiness   Metric ShipIT [0,1]

Justification & Elaboration (1)

Assumptions:

1. Perfect waterfall model (no coding or testing phase until requirement analysis is done).
2. Target is only a major release.
3. Release happens after deploying at the customer's site and obtaining the proper MI.
4. New requirements are accepted until the end of the detailed design phase, not after that; if they arrive later, they go into the next version of the release.

Elaboration:

RAD = [(W_R × R) + (W_A × A) + (W_D × D)] / 100, where W_R + W_A + W_D = 100 and R, A, D ∈ [0, 1].

CODE = [(W_Source × Source) + (W_Object × Object) + (W_Build × Build)] / 100

TEST = [(W_Bugfinding × Bugfinding) + (W_Debugging × Debugging)] / 100

QA = (Pseudo test hours completed) / (Total test hours planned)

MD = [(W_RD × RD) + (W_DD × DD) + (W_ID × ID) + (W_TD × TD) + (W_UD × UD)] / 100

SV = [(W_IP × IP) + (W_TP × TP)] / 100

SP = (Maintainability Index reached) / (Maintainability Index desired)
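QA and SP are plain ratios rather than weighted sums. A small sketch with hypothetical numbers; the clamp to 1.0 is an added assumption so both stay within [0, 1] like the other factors:

```python
def ratio(achieved, target):
    """Completion ratio clamped to [0, 1]; nonpositive target means no progress."""
    return min(achieved / target, 1.0) if target > 0 else 0.0

QA = ratio(340, 400)   # pseudo test hours completed vs. total planned
SP = ratio(76, 80)     # Maintainability Index reached vs. desired
```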

Page 10: Software Release Readiness   Metric ShipIT [0,1]

Justification & Elaboration (2)

Used Methods & Models:

1. COCOMO prediction model: Effort = a (KSLOC)^b; Time = a (Effort)^b
2. Halstead's metrics model: Effort E = n1 · N2 · N · log N / (2 · n2); T = E / 18 sec
3. Albrecht's function points model: LOC = SourceStatement × FP; FP = UFC × TCF
4. Zero-failure method
5. Stopping-rules method
6. Maintainability Index method
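A hedged sketch of the first two models. The COCOMO coefficients below are the basic organic-mode values, an assumption since the slides leave a and b unspecified; the Halstead effort uses the standard vocabulary-based form E = (n1/2)(N2/n2) · N · log2(n1+n2), which the slide abbreviates as n1·N2·N·logN / 2n2:

```python
import math

# Basic COCOMO, organic-mode coefficients (a=2.4, b=1.05; c=2.5, d=0.38)
# -- an illustrative assumption, not fixed by the slides.
def cocomo_effort(ksloc, a=2.4, b=1.05):
    """Effort in person-months: Effort = a * KSLOC^b."""
    return a * ksloc ** b

def cocomo_time(effort, c=2.5, d=0.38):
    """Development time in months: Time = c * Effort^d."""
    return c * effort ** d

# Halstead effort: E = (n1/2) * (N2/n2) * N * log2(n), with program
# length N = N1 + N2 and vocabulary n = n1 + n2; T = E / 18 seconds.
def halstead_effort(n1, n2, N1, N2):
    N, n = N1 + N2, n1 + n2
    return (n1 / 2) * (N2 / n2) * N * math.log2(n)

effort = cocomo_effort(32)             # person-months for a 32-KSLOC project
months = cocomo_time(effort)
t_secs = halstead_effort(20, 45, 300, 250) / 18   # hypothetical counts
```

Both give schedule-side inputs to the release decision: COCOMO predicts whether the plan's dates are credible, Halstead predicts the effort implied by the code actually written.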

Page 11: Software Release Readiness   Metric ShipIT [0,1]

Validation of the Formula

ShipIT = [(22 × RAD) + (19 × CODE) + (30 × TEST) + (8 × QA) + (7 × MD) + (9 × SV) + (5 × SP)] / 100 (from research data) [ref 1]

RAD = [(30 × R) + (20 × A) + (50 × D)] / 100

CODE = [(40 × Source) + (20 × Object) + (40 × Build)] / 100

TEST = [(65 × Bugfinding) + (35 × Debugging)] / 100

QA = QA (used directly)

MD = [(15 × RD) + (15 × DD) + (15 × ID) + (25 × TD) + (30 × UD)] / 100

SV = [(50 × IP) + (50 × TP)] / 100

SP = SP (used directly)

Source = [(57 × Sm) + (28 × Am) + (15 × Gm)] / 100 (from ref 2)

Build = [(40 × CT) + (30 × HT) + (15 × BPT) + (15 × BCT)] / 100 (from ref 2)

Bugfinding = [(35 × L1) + (35 × L2) + (30 × L3)] / 100
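Plugging the validated weights into the nested formulas gives a worked example. Every leaf value below is a hypothetical completion ratio in [0, 1]; only the weights come from the slides:

```python
# Worked example with the validated ShipIT weights (refs 1-2 in the slides).
def wavg(pairs):
    """Weighted average over (weight, value) pairs; weights sum to 100."""
    assert abs(sum(w for w, _ in pairs) - 100) < 1e-9
    return sum(w * v for w, v in pairs) / 100.0

RAD = wavg([(30, 0.95), (20, 0.90), (50, 0.85)])                # R, A, D
Source = wavg([(57, 0.90), (28, 0.80), (15, 1.00)])             # Sm, Am, Gm
Build = wavg([(40, 0.90), (30, 0.80), (15, 0.70), (15, 0.70)])  # CT, HT, BPT, BCT
CODE = wavg([(40, Source), (20, 0.75), (40, Build)])            # Source, Object, Build
Bugfinding = wavg([(35, 0.90), (35, 0.80), (30, 0.70)])         # L1, L2, L3
TEST = wavg([(65, Bugfinding), (35, 0.85)])                     # Bugfinding, Debugging
QA = 0.90                                 # pseudo test hours done / planned
MD = wavg([(15, 1.00), (15, 1.00), (15, 0.90), (25, 0.80), (30, 0.70)])
SV = wavg([(50, 0.80), (50, 0.75)])                             # IP, TP
SP = 0.95                                 # MI reached / MI desired
ShipIT = wavg([(22, RAD), (19, CODE), (30, TEST), (8, QA),
               (7, MD), (9, SV), (5, SP)])                      # ≈ 0.85
```

With these illustrative inputs the project scores roughly 0.85; because testing carries the largest weight (30), it dominates the remaining gap to 1.0.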

Page 12: Software Release Readiness   Metric ShipIT [0,1]

Conclusions

- A software release readiness metric is important for weighing market vs. features vs. quality.
- No one tool or method should be relied on to arbitrarily make the final determination of whether a software product should be released.
- Detecting the measurable factors in the software development life cycle is a skill that comes from experience.
- Considering the most practical metrics, and under certain assumptions, the formula defined for "ShipIT" holds true.

Page 13: Software Release Readiness   Metric ShipIT [0,1]

References

(…24 research papers; can't list all.)

Ref 1. Robert B. Grady, Hewlett-Packard, "Successfully Applying Software Metrics", IEEE Trans. Soft. Engr., September 1994 (Vol. 27, No. 9), pp. 18-25.

Ref 2. Gregory A. Hansen, GAPI, "Simulating Software Development Processes", IEEE Software, January 1996 (Vol. 29, No. 1), pp. 73-77.

Page 14: Software Release Readiness   Metric ShipIT [0,1]

Q & A