PhD trial lecture
Trends within Software Process Improvement (SPI)
Tor Erlend Fægri
March 13, 2012
Lecture outline
● Fundamentals and origins of Software Process Improvement (SPI)
● Trend 1: Benchmarking
● Trend 2: Analytical
● Trend 3: Agile
● Challenges ahead
Introduction
Software development is a type of work that demands effective processes to support organizing the work:
“A software process can be defined as the coherent set of policies, organizational structures, technologies, procedures, and artifacts that are needed to conceive, develop, deploy, and maintain a software product” (Fuggetta, 2000).
The software business is highly competitive:
• High speed of innovation (Baskerville et al. 2003)
• Demands continuous awareness of improvement opportunities
One area of opportunity is Software Process Improvement (SPI)
SPI fundamentals
Basic assumption of SPI:
“Better processes will give better outcomes” (Dybå, 2001).
Definitions:
“[SPI] deals primarily with the professional management of software firms, and the improvement of their practice, displaying a managerial focus rather than dealing directly with the techniques that are used to write software” (Hansen et al., 2004).
“Software process improvement (SPI) is a continuous and evolutionary approach to improve a software organization’s capability to develop quality software in response to customer requirements” (Iversen et al., 2004).
How can better software processes help?
• The software process influences people’s daily work and their collaboration with others
• Standardization of work practices can simplify collaboration (Adler, 2005)
• Influence on work satisfaction (Sawyer et al. 1997, Melnik & Maurer, 2006)
• Improved performance:
• Better products (customer satisfaction, product qualities) (Ashrafi, 2003)
• Improved projects; timeliness, productivity (Staples & Niazi 2008)
• Improved organizational performance; ROI (van Solingen, 2004)
• Marketing tool:
• Certifications and assessments as ‘profiling’ (Aaen et al. 2001) and demonstration of capability (Staples & Niazi 2008)
Software processes: The engineering heritage (from Boehm, 2006)
[Figure: a decade-by-decade timeline aligning business drivers — crafting (1960s), formality (1970s), productivity (1980s), concurrency and time-to-market (1990s), agility (2000s), global integration and innovation speed (2010s), with an emergent QA focus and a quantitative basis for improvement along the way — with the dominant process focus (software craft and code-and-fix; waterfall processes; standards and maturity models; risk-driven processes and RUP; agile methods; hybrid, collaborative methods) and influential technologies (O-O programming; software reuse and UML; COTS and Open Source development; global resources; systems of systems).]
Core SPI concerns and corresponding key ideas (from Aaen et al., 2001)
Management of intervention process:
• Organization. “A dedicated and adapted organization of SPI activities in a dynamic fashion relying primarily on projects.”
• Plan. “Goals, activities, and responsibilities of the overall intervention as well as specific improvement efforts are carefully planned.”
• Feedback. “Feedback is ensured through systematic measurements and assessments of the effects on software engineering practices.”
Approaches to intervention process:
• Evolution. “Evolutionary in nature focusing on experiential learning and stepwise improvements.”
• Norm. “It is based on idealized, and a priori defined normative and stable models of software engineering.”
• Commitment. “Careful building and development of commitments between the involved actors is essential to ensure dedication and legitimacy.”
Perspectives in intervention process:
• Process. “The main lever for improving quality and productivity is the software process that integrates people, management and technology.”
• Competence. “The building of software developers’ competencies is seen as the key resource for the software process.”
• Context. “The intention is to change the context of the software operation to establish sustainable support for the actors in the software process.”
Trend 1: Benchmarking SPI
• Conformant with the idea of “… idealized, and a priori defined normative and stable models of software engineering”
• Benchmarking SPI synonyms:
• Top-down (Thomas, 1994)
• Blueprints (Aaen et al., 2003)
• ‘Best-practice’ (Dybå, 2001)
• Norm-driven (Hansen et al., 2004)
• Key characteristics (Dybå, 2001):
• Pre-defined set of best-practices, common measure of success
• Focus on improving process
• Generalized goal
• Domain independent
Benchmarking SPI example: CMM
Objectives and origins
• To solve the problem of poor timeliness of software projects run by US Department of Defense.
• Watts Humphrey and his colleagues at SEI created the Capability Maturity Model for software (CMM-SW) in the late 1980s (see Humphrey et al., 1991; Paulk et al., 1993).
• Inspired by Osterweil’s idea of ‘programming the software process’ to enable separation of process designers and process users. (Osterweil, 1987; Aaen 2003)
Key CMM concepts (Paulk et al., 1993):
• Mature software organizations
• Maturity levels
• Operational definitions
• Key Process Areas (goals, commitment, ability, measurement, verification)
• Process assessment
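The maturity-level idea can be illustrated with a small sketch: an organization is rated at a level only when the Key Process Areas (KPAs) of that level and of every level below it are satisfied. This is only an illustration under stated assumptions — the KPA names below are an abbreviated subset, and this is not the official SEI assessment method:

```python
# Illustrative subset of CMM-SW Key Process Areas per maturity level.
# Level 1 ("initial") has no KPAs by definition.
LEVEL_KPAS = {
    2: ["requirements management", "project planning", "project tracking"],
    3: ["organization process definition", "training program", "peer reviews"],
    4: ["quantitative process management", "software quality management"],
    5: ["defect prevention", "process change management"],
}

def maturity_level(satisfied_kpas):
    """Return the highest level whose KPAs, and those of all lower
    levels, are all satisfied; skipping a level is not allowed."""
    level = 1
    for lvl in sorted(LEVEL_KPAS):
        if all(kpa in satisfied_kpas for kpa in LEVEL_KPAS[lvl]):
            level = lvl
        else:
            break
    return level

print(maturity_level(set(LEVEL_KPAS[2]) | set(LEVEL_KPAS[3])))  # → 3
```

The cumulative check captures the model’s core claim: higher maturity presupposes all lower-level practices being in place.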
CMM case study; Motorola – Ireland (from Fitzgerald & O’Kane, 1999)
• Development of switching and communications infrastructure, 300 developers.
• Moving from level 1 (initial) to level 4 (managed).
• Moving from level 3 to level 4 required that two KPAs were addressed:
1) Quantitative process management (establishing quantitative goals for the software process)
2) Software quality management (establishing quantitative goals for the software product)
• Implementation:
1) Individual managers were made responsible for developing metrics (for example, requirements estimation accuracy) to indicate their group’s performance levels.
2) Full-time collection and analysis of data. This allowed the identification of special causes of process variance (see figure.)
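Identifying “special causes” of process variance is classically done with a control chart: points outside the three-sigma limits of an individuals (XmR) chart indicate variation beyond what the process normally produces. A minimal sketch of the rule, using made-up defect counts rather than Motorola’s data:

```python
def special_causes(samples):
    """Flag indices outside the 3-sigma limits of an individuals (XmR)
    control chart; sigma is estimated from the average moving range."""
    mean = sum(samples) / len(samples)
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128  # d2 for n=2
    lower, upper = mean - 3 * sigma, mean + 3 * sigma
    return [i for i, x in enumerate(samples) if not (lower <= x <= upper)]

# Hypothetical weekly defect counts; week 5 shows special-cause variation.
defects = [12, 11, 13, 12, 11, 40, 12, 13, 11, 12]
print(special_causes(defects))  # → [5]
```

Estimating sigma from the moving range, rather than the overall standard deviation, keeps a single outlier from inflating the limits and hiding itself.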
CMM research findings and critique
Findings:
● Majority of empirical evidence from CMM/CMMI (Unterkalmsteiner et al., to appear)
● Many success stories from non-representative cases, e.g. very large organizations (Hansen et al., 2004)
● Non-conclusive results; numerous, but both positive and negative reports using CMM (Conradi & Fuggetta, 2002; Nielsen & Kautz, 2008)
● Failures are rarely reported (Hansen et al., 2004)
Critique:
• Higher maturity levels also reflect increasing bureaucratization and expand the barrier between process models and software practice (Adler, 2005).
• The common ‘yardstick’ drives change – not experience and organizational goals (Dybå, 2001).
• Too costly and not appropriate for smaller organizations (Conradi & Fuggetta, 2002; Nielsen & Kautz, 2008).
• Smaller organizations adapt through exploration rather than optimization (Dybå, 2005).
Trend 2: Analytical SPI
• Not conformant to the idea of “… idealized, and a priori defined normative and stable models of software engineering” (Aaen et al., 2001)
• Synonyms:
• Bottom-up (McGarry, 1994)
• ‘Recipe-based’ (Aaen, 2003)
• Problem-driven (Hansen et al., 2004)
• Key characteristics:
• No common ‘yardstick’ to performance or conformance (Dybå, 2001).
• The individual software organization becomes point of origin:
• “… assumes that the organization's individual goals, characteristics, product attributes, and experiences must drive process change; that change is defined by a local domain instead of a universal set of ‘best practices’” (McGarry, 1994).
• SPI – both design and use – becomes a collaborative effort:
• “… process users collectively design the software processes through facilitation, reflection, and improvisation” (Aaen, 2003).
Analytical SPI example: Quality Improvement Paradigm (QIP)
Objectives and origins:
• Using ‘lean enterprise concept’ by “concentrating production and resources on value-added activities that represent an organization's critical business processes” (Basili et al. 2002).
• Developed in cooperation between Univ. of Maryland and NASA (Basili & Caldiera, 1995).
• Overall ambition is to create a learning- and user-oriented organization (Conradi & Fuggetta, 2002).
Key QIP concepts (Basili & Caldiera, 1995):
• Mandates a local point of departure in SPI work (should be based in the particulars of the organization)
• Should be incremental and based on learning loops producing ‘experience packages’ (see figure)
• Goal-Question-Metric driven. Quantitative metrics inferred from conceptual level goals and operational level questions.
• ‘Experience Factory’ – organizational infrastructure that enables learning.
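The Goal-Question-Metric chain can be sketched as a small hierarchy: a conceptual goal is refined into operational questions, each answered by quantitative metrics. All goal, question, and metric names below are made up for illustration:

```python
# Illustrative GQM plan: goal -> questions -> metrics.
gqm = {
    "Improve review effectiveness": {
        "How many defects do reviews catch?": ["defects found per review"],
        "How expensive are reviews?": ["review hours per KLOC"],
    },
}

def metrics_for(goal, plan=gqm):
    """Collect every metric that must be measured to evaluate a goal."""
    return [m for metrics in plan[goal].values() for m in metrics]

print(metrics_for("Improve review effectiveness"))
# → ['defects found per review', 'review hours per KLOC']
```

The point of the hierarchy is traceability: every number collected can be traced back through a question to a goal, so measurement stays driven by the organization’s own concerns rather than by a universal yardstick.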
QIP case study: Experience Factory at Daimler-Benz (from Houdek et al., 1998; Dingsøyr & Conradi, 2002)
Approach:
● Implementation of separate EF’s in three different business areas (see figure at left).
● Measurements based on Goal-Question-Metric approach
Key findings:
• “Many sources of reusable experiences, and measurement is just one of them.”
• Elimination of many consistency errors due to improved reviewing process (see figure at right).
• Handling qualitative data was a bottleneck.
• Building predictive models from quantitative data was difficult when context information was missing.
QIP research findings, critique
Findings:
The impact on productivity from learning from experience is significant, but differs across analysis level [individuals, groups, organizational units] (Boh et al, 2007).
“We find claims such as the systems have saved time, made work easier, and removed problems due to new personnel who existed before” (Dingsøyr & Conradi, 2002)
Critique:
SPI is a top-down task that critically depends on organizational commitment (Thomas, 1994).
Trend 3: Agile SPI (from Börjesson et al., 2006; Salo & Abrahamsson, 2007; Pino et al., 2010)
• An emerging trend
• Pragmatic and practice-oriented
• Rooted in agile principles and development methods
“Flexible approach adapted with collective understanding of contextual needs to provide faster development times, responsiveness to rapid changes, increased customer satisfaction, and lower defect rates” (Salo & Abrahamsson, 2007).
Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan
[The Agile Manifesto, 2001]
• Key characteristics:
• Iterative, feedback driven (Börjesson et al., 2006)
• Hybrid between analytical SPI and software process
• Highly contextual, yet strongly tied to the (agile) software process (Salo & Abrahamsson, 2007)
Agile SPI example: ‘Guerilla tactics’ at Ericsson (from Börjesson et al. 2006)
• Diffusion-group strategy for process improvement using pilot-projects to create ‘critical mass’ and organization-wide adoption
• Pilot projects acting as change agents.
• The critical mass enables the crossing of ‘the chasm’ (see figure at right).
• Results:
• Chasm crossed after 5 months.
• Late majority was largely won after 1 year.
Agile SPI: Research findings and critique
• Research findings:
• Too early to conclude; not yet independently evaluated
• However, pragmatic, practice-oriented SPI can show much greater ROI than e.g. CMM (van Solingen, 2004).
• Critique:
• None available yet.
Evolution of trends
[Figure: a timeline from 1980 to 2020 showing three overlapping trends — the benchmarking trend (CMM, PSP/TSP, CMMI), the analytical trend (QIP, TQM, learning-centric SPI), and the agile SPI trend (light methodologies, agile methods, agile SPI).]
References
P.S. Adler (2005). “The evolving object of software development,” Organization 12(3):401-435.
N. Ashrafi (2003). “The impact of software process improvement on quality: in theory and
practice,” Information & Management 40(7):677-690.
R. Baskerville, B. Ramesh, L. Levine, J. Pries-Heje and S. Slaughter (2003). “Is internet-
speed software development different?,” IEEE Software 20(6):70-77.
V.R. Basili and G. Caldiera (1995). “Improve software quality by reusing knowledge and
experience,” Sloan Management Review 37(1):55-64.
V.R. Basili, F.E. McGarry, R. Pajerski and M. Zelkowitz (2002). “Lessons learned from 25 years of process improvement: the rise and fall of the NASA software engineering laboratory.” Proceedings of the 24th International Conference on Software Engineering (ICSE 2002), ACM.
B. Boehm (2006). “A view of 20th and 21st century software engineering.” Proceedings
of the 28th international conference on software engineering, Shanghai, China, ACM.
W.F. Boh, S.A. Slaughter and J.A. Espinosa (2007). “Learning from experience in
software development: A multilevel analysis,” Management Science 53(8):1315-1331.
A. Börjesson, F. Martinsson and M. Timmerås (2006). “Agile improvement practices in
software organizations,” European Journal of Information Systems 15(2):169-182.
R. Conradi and A. Fuggetta (2002). “Improving software process improvement,” IEEE
Software 19(4):92-99.
T. Dingsøyr and R. Conradi (2002). “A survey of case studies of the use of knowledge
management in software engineering,” International Journal of Software Engineering and
Knowledge Engineering 12(4):391-414.
T. Dybå (2001). Enabling software process improvement: An investigation of the
importance of organizational issues. PhD thesis, Trondheim, Norwegian University of
Science and Technology, November 5, 2001.
T. Dybå (2005). “An empirical investigation of the key factors for success in software
process improvement,” IEEE Transactions on Software Engineering 31(5):410-424.
B. Fitzgerald and T. O'Kane (1999). “A longitudinal study of software process
improvement,” IEEE Software 16(3):37-45.
A. Fuggetta (2000). Software process: A roadmap. In The future of software engineering. A. Finkelstein, ed. New York, ACM Press: 25-34.
B. Hansen, J. Rose and G. Tjørnehøy (2004). “Prescription, description, reflection: the
shape of the software process improvement field,” International Journal of Information
Management 24(6):457-472.
F. Houdek, K. Schneider and E. Wieser (1998). Establishing experience factories at Daimler-Benz - an experience report. In Proceedings of the 1998 International Conference on Software Engineering. Los Alamitos, IEEE Computer Society: 443-447.
W.S. Humphrey, T.R. Snyder and R.R. Willis (1991). “Software process improvement at
Hughes-Aircraft,” IEEE Software 8(4):11-23.
J.H. Iversen, L. Mathiassen and P.A. Nilsen (2004). “Managing risk in software process
improvement: An action research approach,” MIS Quarterly 28(3): 395-433.
F. McGarry (1994). “Process improvement is a bottom-up task,” IEEE Software 11(4):13-
13.
G. Melnik and F. Maurer (2006). Comparative Analysis of Job Satisfaction in Agile and
Non-agile Software Development Teams. In XP2006. P. Abrahamsson et al, eds. Berlin
Heidelberg, Springer-Verlag. LNCS 4044:32-42.
P.A. Nielsen and K. Kautz, Eds. (2008). Software processes & knowledge: Beyond
conventional software process improvement. Aalborg, Software Innovation Publisher.
L. Osterweil (1987). “Software processes are software too.” Proceedings of the 9th
international conference on Software Engineering, Monterey, California, United States,
IEEE Computer Society Press.
M.C. Paulk, B. Curtis, M.B. Chrissis and C.V. Weber (1993). Capability Maturity Model
for Software, version 1.1. Technical report CMU/SEI-93-TR-024: 82.
F.J. Pino, O. Pedreira, F. Garcia, M.R. Luaces and M. Piattini (2010). “Using Scrum to
guide the execution of software process improvement in small organizations,” Journal of
Systems and Software 83(10):1662-1677.
O. Salo and P. Abrahamsson (2007). “An iterative improvement process for agile
software development,” Software Process: Improvement and Practice 12(1): 81-100.
S. Sawyer, J. Farber and R. Spillers (1997). “Supporting the social processes of software
development,” Information Technology & People 10(1):46-62.
M. Staples and M. Niazi (2008). “Systematic review of organizational motivations for
adopting CMM-based SPI,” Information and Software Technology 50(7–8): 605-620.
M. Thomas (1994). “Process improvement is a top-down task,” IEEE Software 11(4):12-
12.
M. Unterkalmsteiner, T. Gorschek, A.K.M.M. Islam, C.K. Cheng, B. Permadi and R. Feldt
(2012 - To appear). “Evaluation and measurement of software process improvement - a
systematic literature review,” IEEE Transactions on Software Engineering.
R. van Solingen (2004). “Measuring the ROI of software process improvement,” IEEE
Software 21(3):32-38.
I. Aaen, J. Arent, L. Mathiassen and O. Ngwenyama (2001). “A conceptual MAP of
software process improvement,” Scandinavian Journal of Information Systems 13:81-
101.
I. Aaen (2003). “Software process improvement: Blueprints versus recipes,” IEEE
Software 20(5):86-93.