Page 1

BIO / PRESENTATION / SUPPLEMENTAL MATERIALS

International Conference On Software Testing Analysis & Review
November 15-19, 2004, Anaheim, CA, USA

W12
November 17, 2004, 3 PM

APPLYING EXTREME PROGRAMMING TECHNIQUES TO AUTOMATED TESTING

Neill McCarthy
BJSS

Page 2

Neill McCarthy

Neill is a career tester with over 14 years of experience. He has worked through the project development life cycle and covered a variety of methods and approaches. Neill has brought his experience to clients from a strategic level through to hands-on delivery and has adapted his ideas to fit a variety of contexts. In the last five years Neill has come across a number of different approaches, including Agile, that challenged his previously structured approach to testing, and he has been learning to adapt these to assist him in his own work. In the last three years Neill has become interested in the open source community and has been involved in the Hyades project within Eclipse. Neill is a Technical Test Manager for BJSS. BJSS are a UK-based IT consultancy that specialise in delivering projects to a methodology appropriate to the client's context. As well as providing Development, Outsource Management and Local Provisioning services, they provide Technical Testing and Test Consultancy on complex projects.

Page 3

Applying eXtreme Programming Techniques to Automated Testing

Neill McCarthy

StarWest 2004

[email protected]

www.bjss.co.uk

StarWest 2004 Session W12 © 2004

Page 4

Overview

• The Learning Objectives
• A Historic Context
• The Big Idea
• The Journey And The Destination
• Future Journeys: A Rough Map

Page 5

The Learning Objectives

• Why we adopted XP for test automation
• The people challenges of using XP in testing
• A framework for agile test automation development

Page 6

The Historic Context: The Good

• +ve: TPI/TMM approaches had been tried
• +ve: Investment in automation tools ($350K spent)
• +ve: Automated regression tests existed (8,000+)
• +ve: Some coding standards were documented

We knew we had problems & we were trying to solve them

Page 7

The Historic Context: The Bad

• -ve: Automation was seen as a magic bullet that failed
• -ve: Slow reporting did not provide business confidence
• -ve: Limited interaction with development and users
• -ve: Test code had single owners

Other areas had lost confidence in us, and we had become inward-looking

Page 8

The Historic Context: The Ugly

• Testing had a poor image
• Lack of results: buggy releases
• Too long to test; too long to automate
• Automated scripts required manual interventions
• Test script updates took forever and broke other tests
• Test code: lack of applied standards
• Test code: difficult to maintain
• Test code: coverage did not reflect use

We did ourselves no favours by trying to be too smart

Page 9

James Bach's Automation Formula

1. Purchase an expensive GUI test execution tool
2. Define a lot of paper procedures
3. Hire an automation team to automate each one
4. Build a comprehensive test library and framework
5. Keep fixing it...

4 out of 5 of these criteria were met

Our approach did not fit our context

Page 10

Drivers and Constraints

DRIVERS
• Required Improvement: Perception of Test Team
• Required Improvement: Relevant Test Coverage
• Required Improvement: Test Automation Quality
• Required Improvement: Timeliness of Test Creation

CONSTRAINTS
• No Additional Head Count (recruitment freeze)
• Limited Time to Demonstrate Improvements (3 months)
• No New Tools Allowed ($350K spent)
• No Impact to Other Deliveries (2 regulatory projects)
• Limited Budget for Training ($10K)

Page 11

The Cultural Challenge

• Time
• Spend
• Blame culture
• Lack of ownership
• Regulated, process-orientated

Page 12

The Big Idea

• Test-driven development for development code
• Waterfall-based, code-driven development for test code

If Development tests prior to coding, why doesn't Test?

[Diagram: two cycles contrasted: an iterative loop (Measure → Adapt → Deliver → Execute) and a sequential flow (Analyse → Design → Build → Execute)]
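The big idea above, treating test code like production code and writing its unit test first, can be sketched in miniature. This is an illustrative sketch only, not the project's actual code: the helper `build_login_keystrokes` and its behaviour are invented for the example.

```python
import unittest

def build_login_keystrokes(username, password):
    """Hypothetical test-automation helper: returns the step sequence a GUI
    driver would replay. The point is that the TestCase below was written
    (and failed) before this function existed, test-first style."""
    if not username or not password:
        raise ValueError("username and password are required")
    return [("type", "user_field", username),
            ("type", "pass_field", password),
            ("click", "login_button", None)]

class BuildLoginKeystrokesTest(unittest.TestCase):
    def test_produces_three_steps_ending_with_click(self):
        steps = build_login_keystrokes("alice", "s3cret")
        self.assertEqual(len(steps), 3)
        self.assertEqual(steps[-1][0], "click")

    def test_rejects_empty_username(self):
        with self.assertRaises(ValueError):
            build_login_keystrokes("", "s3cret")

# Run the suite programmatically so the script is safe to import.
result = unittest.TestResult()
unittest.defaultTestLoader.loadTestsFromTestCase(BuildLoginKeystrokesTest).run(result)
print("tests run:", result.testsRun, "failures:", len(result.failures))
# prints: tests run: 2 failures: 0
```

The ordering is what makes this "test-first": the unit test specifies the helper's contract before any helper code is cut, exactly as XP prescribes for production code.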

Page 13

Building the approach

• Identified the organisation's existing methods
• Leveraged existing development good practice from those who would show us
• Took elements that fitted limited team experience
• Prototyped a number of approaches without fear of failure
• Built on each success and held regular retrospectives

[Diagram labels: Agile: XP, Agile: Scrum, Agile: DSDM, Agile: RUP]

Page 14

XP Techniques – Overview 1

Code
• The customer is always available
• Code must be written to agreed standards
• Code the unit test first
• All production code is pair programmed
• Only one pair integrates code at a time
• Integrate often
• Use collective code ownership
• Leave optimisation till last

Plan
• User stories are written
• Release planning creates the schedule
• Project velocity is measured
• Project is divided into iterations
• Iteration planning starts each iteration
• Move people around
• A stand-up meeting starts each day
• Fix XP when it breaks
• No overtime

Page 15

XP Techniques – Overview 2

Test
• All code must have unit tests
• All code must pass its unit tests prior to release
• When a bug is found, tests are created
• Acceptance tests are run often and the score is published

Design
• Simplicity
• Choose a system metaphor
• Use cards for design sessions
• Create spike solutions to reduce risk
• No functionality is added early
• Refactor whenever and wherever possible
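The last Test rule above, running acceptance tests often and publishing the score, can be reduced to a few lines. This sketch is illustrative only: the placeholder tests stand in for the real GUI-driven acceptance checks.

```python
import unittest

class AcceptanceSuite(unittest.TestCase):
    # Invented placeholder acceptance checks; in the case study these would
    # exercise user stories through the GUI automation tool.
    def test_login_story(self):
        self.assertTrue(True)

    def test_statement_story(self):
        self.assertTrue(True)

def run_and_score(suite_case):
    """Run a TestCase class and return a publishable 'passed/total' score."""
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(suite_case)
    result = unittest.TestResult()
    suite.run(result)
    total = result.testsRun
    passed = total - len(result.failures) - len(result.errors)
    return f"{passed}/{total}"

print("acceptance score:", run_and_score(AcceptanceSuite))
# prints: acceptance score: 2/2
```

Posting that one-line score on a visible dashboard after every run is the whole practice; the tooling can stay this small.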

Page 16

Starting the Journey: Our 1st Roadmap

• Agreed initial scope
• Brown bags on "the big idea"
• 3 sets of 1-week bursts agreed
• Each burst piloted some of XP
• Retrospectives after each burst
• 12-week proving: weekly goals

[Diagram: iterative loop: Context → Goal → Technique → Pilot → Retrospective → Adapt]

Page 17

The People Challenges (1)

• Over-thinking
• Feedback
• Celebrating success
• Breaking cliques
• Disorientation
• Over-confidence
• Inexperience

Page 18

The People Challenges (2)

• Buy-in
• Common ownership
• Introverted team
• Roles challenged
• Used to process
• Need for numbers
• Fear of change

Page 19

Our Immediate Successes

• Pair programming
• User stories
• Project dashboard
• Stand-up meetings

[Chart: Pair Programmed Scripts, weeks 1-12, y-axis 0-1000]

Page 20

Our Successes: over 12 Weeks

[Chart: Script Maintenance Hours, weeks 1-12, y-axis 0-120]
[Chart: Number of Relevant Tests, weeks 1-12, y-axis 0-1000]
[Chart: Number of Defects at Release, weeks 1-12, y-axis 0-20]
[Chart: Test Interventions, weeks 1-12, y-axis 0-4000]
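Weekly series like those charted above lend themselves to a simple trend check. The sketch below fits a least-squares line to a weekly metric so a dashboard can report whether, say, script maintenance hours are falling; the sample data is invented, not the project's real figures.

```python
def linear_trend(values):
    """Least-squares slope and intercept for values indexed 1..n."""
    n = len(values)
    xs = range(1, n + 1)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Invented sample: script maintenance hours over 12 weeks.
hours = [110, 95, 88, 80, 72, 60, 55, 48, 40, 35, 30, 28]
slope, intercept = linear_trend(hours)
print(f"trend: {slope:+.1f} hours/week")  # negative slope = improving
```

A negative slope on maintenance hours or test interventions, and a positive one on relevant tests, is the numeric shape of the improvements the slides report.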

Page 21

Our Failures: Over 12 Weeks

• Did not keep communicating
• Stopped celebrating success
• Started to work overtime
• Failed to embrace all feedback
• Over-complication of process

[Chart: Average hours between dashboard updates, weeks 1-12, y-axis 0-10]

Page 22

Our Gains

Improved: Perception of Test Team
+ Improved: Relevant Test Coverage
+ Improved: Test Automation Quality
+ Improved: Timeliness of Test Creation
= Improved: Team Morale, Productivity & Product Quality

Page 23

The New Context

[Diagram: two linked iterative loops, each cycling Measure → Adapt → Deliver → Execute]

Page 24

XP Principles We Left

Test
• NONE

Design
• Choose a system metaphor
• Use cards for design sessions

Code
• The customer is always available
• Code the unit test first
• All production code is pair programmed
• No overtime

Plan
• NONE

Page 25

We Wished We Knew That!

• Simplicity: people have had this idea (or similar) and want to share it
• Communication: people want you to succeed, if informed regularly
• Feedback: learning to give rapid feedback is an art
• Courage: if it does not work, fix it, communicate and learn from failure

Agile is a paradigm and it's a steep learning curve
It's easy if you embrace it and you have an agile state of mind
An experienced coach is not optional, but they can be remote

Page 26

The Next Journey

• New techniques
• Additional tools
• Additional projects
• Performance tests
• Improved process

Page 27

Getting Started: A Simple Map

[Diagram: process map: Define Context, Agree Quality Goals, Publish Manifesto, Create Dashboard, Execute Process, Hold Retrospectives, Refactor Process]
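The "Create Dashboard" step in the map above need not involve tooling. A plain-text render like this sketch (the section names are invented, not from the case study) is enough to post the day's goals, successes and blocks where everyone can see them:

```python
from datetime import date

def render_dashboard(goals, successes, blocks, on=None):
    """Render a stand-up dashboard as plain text, one section per list.
    Empty sections are shown as 'none' so the board is never ambiguous."""
    on = on or date.today().isoformat()
    lines = [f"Team dashboard for {on}"]
    for title, items in (("Goals", goals), ("Successes", successes), ("Blocks", blocks)):
        lines.append(f"{title}:")
        lines.extend(f"  - {item}" for item in (items or ["none"]))
    return "\n".join(lines)

print(render_dashboard(
    goals=["Pair on 20 regression scripts"],
    successes=["Cut script maintenance by 4 hours"],
    blocks=[],
))
```

Pinning the output in the test meeting room, updated after each stand-up, is the practice the slides describe; the format matters far less than the freshness.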

Page 28

Key Lessons

• Take the best methods from all schools
• Learn the lessons Development and Project Management have already learned
• Tailor an approach to fit your context

Page 29

Links

• www.bjss.co.uk
• www.xprogramming.com
• www.agilealliance.org
• www.ambysoft.com
• www.martinfowler.com
• www.tejasconsulting.com
• www.compendiumdevelopment.co.uk
• www.stickyminds.com
• www.satisfice.com
• www.methodsandtools.com
• www.testdriven.com
• www.balagan.org.uk
• www.softwarereality.com
• www.workroom-productions.com

Page 30

StarWest 2004 Track Session W12 Paper: Applying eXtreme Programming Techniques to Automated Testing
© Neill McCarthy 2004, 1 of 11

Biography: Neill is a career tester with over 14 years of experience. He has worked through the project development life cycle and covered a variety of methods and approaches. Neill has brought his experience to clients from a strategic level through to hands-on delivery and has adapted his ideas to fit a variety of contexts. In the last five years Neill has come across, and successfully delivered projects with, a number of different approaches, including Agile, that challenged his previously structured approach to testing, and he has learnt to adapt these to assist him in his own work. In the last three years Neill has become interested in the open source community and has been involved in the Hyades project within Eclipse. Neill is a Technical Test Manager for BJSS. BJSS are a UK-based IT consultancy that specialise in delivering projects to a methodology appropriate to the client's context. As well as providing Development, Outsource Management and Local Provisioning services, they provide Technical Testing and Test Consultancy on complex projects.

Abstract: Automating manual tests was taking too long, and we believed the overhead of maintaining the automated tests would become too high. As the code base evolved and expanded, the performance and value of the older automated tests deteriorated noticeably. We needed to find a way of reinvigorating test and bringing the automated tests back under our control. This paper presents the lessons learned from developers working in an eXtreme Programming (XP) framework, how we began applying these practices to our test automation project, and the results of this change in paradigm.

XP: The Theory
The fundamentals of eXtreme Programming have been covered many times in other papers, and in depth in the XP Series of books from Addison Wesley, with significant case studies of its application. A list of some of these can be found in the Bibliography within this paper. The best summation of the process and concepts of XP that I have so far come across is Ron Jeffries':

"Extreme Programming is a discipline of software development based on the values of simplicity, communication, feedback and courage. It works by bringing the whole team together in the presence of simple practices, with enough feedback to enable the team to see where they are and to tune the practices to their unique situation." (www.xprogramming.com)

We will come back to the key points of this later in the paper. There are a number of agile techniques, of which XP is just one; for a further range see diagram one within this paper. We chose to mainly follow the tenets of XP because this was the technique the team was most familiar with, and it was also the one the management team had heard of. These two factors alone made it a much easier sell into the projects during our programme of improvement.

Page 31

[Diagram one: A limited selection of Agile Techniques: Agile: XP, Agile: Scrum, Agile: DSDM, Agile: Crystal Methods]

XP: The Practice
As practised, XP often involves small teams, with testing managed through the reuse of harnesses created at the start of each iteration, before any code is cut. The user stories are then executed manually by the user community selected at the start of the iteration. This has led to the process often being ignored by the wider test community, which felt cut out of the loop, or to the perception that highly skilled technical coders were needed to create the tests at the unit level, which again left the test team out of the process. Standard system test was not perceived to fit the process, and neither was the common use of vendor-created automated GUI testing tools. The first time this concept was challenged, and the results of the challenge published, was through the work of Lisa Crispin, which led to her co-authorship of a book on the subject; see the bibliography within this paper.

Case Study: Applying XP Techniques to Automated Testing
This paper is based on work carried out during the period 2002-2004. As permission for details of the clients has not been granted, these cannot be explicitly referenced and no specific code examples can be supplied in this version of the paper.

Setting the Context: While working on a site with a large number of automated tests across a range of complex, inter-related and frequently changing systems, we were finding that the overhead of managing the automated tests from a GUI test automation tool was becoming onerous. The systems were running on a four-weekly release cycle, the majority of which was made up of fixes for the production environment. These defects were often not discovered during project testing because of a lack of time and incorrect coverage being executed during automation.

Priority was being given to the execution of the regression tests rather than to further investigation of the defects. A long-standing entrance criterion for the production environment was that the full regression suite would be executed: eight thousand scripts which, though they took only 16 hours to execute from 10 machines in the test lab, were not running successfully in practice because of the effort required to maintain the scripting and the significant number of manual interventions required.

Page 32

The Challenge: Our challenge was to improve the quality of the test automation and to rebuild both the reputation of the testing group and the management team's view of test automation. The perceived failure of the tools was going to have a direct impact on the future spend of the test team, as we had not demonstrated a return on investment through improved process in the 12 months since the spend was incurred. From initial feedback from management, it was felt the following four goals would be the drivers to demonstrate the success of the actions we were taking:

• Required Improvement: Perception of Test Team
• Required Improvement: Relevant Test Coverage
• Required Improvement: Test Automation Quality
• Required Improvement: Timeliness of Test Creation

No specific targets were set at the beginning of the process, as we did not wish to force artificial constraints on ourselves or to set targets that were not achievable while we learnt the new methodology. This also helped us remove the fear of the initiative being perceived and judged as another failure. The actions we could take to enable the improvements were limited by the following constraints:

• No Additional Head Count (recruitment freeze): No additional recruitment was allowed, as the team was at headcount for the year and, within the division, the additional contract spend had already been reached to meet the demands of regulatory projects.

• Limited Time to Demonstrate Improvements (3 months): The test team felt they had a quarter to retrieve the testing effort and implement significant improvements. This was self-imposed by the team, to be in line with the year end and the planning cycle for the following business year's initiatives and spending. This alignment was felt to be the best way to bed in these practices as an ongoing activity.

• No New Tools Allowed: $350K had been spent in the previous year on the purchase of a major vendor's tool set. In addition, money had been spent training the teams in these tools, and a significant amount of test collateral had been created with them by the point we were certain the solution was no longer appropriate. The corporate standard had been signed off to these tools, and the decision process was perceived as taking too long for this to change before the collapse of our automated testing effort.

• No Impact to Other Deliveries: There were deadlines to meet for two regulatory projects, non-compliance with which would have had a material impact on the business. No resources were diverted from these, and it was the pressure of these programmes that led to the breaking of the no-overtime principle.

• Limited Budget for Training: $10K was the spend remaining within the test team's current-year plan, and no new budget was agreed for that quarter.

The Process Adopted: The process adopted followed a two-iteration approach. The first was a guerrilla exercise in which we piloted internally within the test and development teams, to learn the concepts and build our own confidence. The second was a more formal approach in which we invited management to the brown bag sessions to explain the initiatives we had taken and how we felt we could use them to take testing within the organisation forward.

Page 33

[Diagram two: the process adopted during the three-week concept proofing: Context → Goal → Technique → Pilot → Retrospective → Adapt]

The process adopted in diagram two was applied through the initial three one-week bursts of activity. These were executed over a twelve-week period prior to the agreement to enter the true change programme, and were short one-off experiments tried as guerrilla testing while delivering the normal service to the ongoing programme. The guerrilla approaches used were adapted from talks previously given by James Lyndsay and Harry Robinson; see bibliography: conference materials and papers. The key to the approach was demonstration, and working with the test team and a lead developer as our "tool-smith" and mentor through the process of XP, assisting us in reframing our testing context to apply these approaches to automated test scripting within the constraints of the GUI automation tools. The initial activities broke down into the following steps:

• Context: Agreed Initial Scope. The initial activities for adoption were discussed at a high level. It was agreed we would try the techniques in short bursts against a couple of projects which were rapid deliveries of quick fixes, each under 40 hours of delivery. The work done on these would run in parallel with business-as-usual testing and would be fitted around that work as a way of proving the processes and the fit of the approach to our context. These were seen as the projects where we could quickly prove whether the ideas and approaches would work for us prior to approaching the management team with another proposal.

• Goal: Brown bags on "the big idea". These set the scene with the key technical peers within the development and test communities, to enable the process to begin and for each side to share their experiences and views in a non-confrontational manner. They were facilitated internally, with no involvement from other management until the initial concepts had been thrashed out and agreed.

• Technique: Stand-ups on the techniques in use, and those we wished to use, were held with clear statements of the day's goals, successes and any blocks that had prevented goals being achieved. These were recorded on our dashboard in the test meeting room, clearly visible to all interested parties.

Page 34

• Pilot: Three sets of one-week bursts were agreed within the test team to try to establish some of the processes, and as a way of gathering initial information, including our feelings and comfort with the ideas in action as well as more conventional metrics, to help us decide whether this was something we were willing to take to management.

• Each burst piloted some of XP: Each one-week burst, as described in the Scrum method, was used to prove some of the techniques we would be looking to adopt for the ongoing test effort. These were selected by the team in our early brown bag sessions and were used with the rule of no change to these processes during that forty-hour burst. Though counter to the XP ideal of constant change, this allowed us to try these concepts without the team being distracted by other activities and losing the focus of proving them. The code these tests were piloted on was also "frozen", as the tests were executed during the last week before a release window.

• Retrospectives after each burst: The principle of holding retrospectives was introduced from the start of the process, to let us share our experiences in an open forum and enable continuous learning. Initially these were not totally successful, as we as a team learnt new behaviours and had to be willing to break from our traditional analytical introspection.

• 12-week proving: Weekly goals were set, as were daily task goals. The success, or not, of these was tracked through the morning stand-up meetings and the weekly Thursday-night team get-together. The items that were not working, or were preventing us from achieving the four identified quality targets, were either removed or replaced with new goals, and the expectations of all parties were managed at the stand-up meetings.

[Diagram three: the process adopted during the twelve-week exercise: Define Context, Agree Quality Goals, Publish Manifesto, Create Dashboard, Execute Process, Hold Retrospectives, Refactor Process]

Page 35: BJSS - StickyMinds€¦ · Building the approach • Identified the organisations existing methods • Levered existing dev good practice from those who would show us • Took elements

StarWest 2004 Track Session W12 Paper Applying eXtreme Programming Techniques to Automated Testing

© Neill McCarthy 2004 6 of 11

The Process Used During the 12 week exercise: Through the 12 week exercise the roadmap in diagram three was used. This was created to help guide us through the process and provide a closed loop for communication and feedback. It also acted as a check for us if we felt we were losing velocity or straying from our intended destination. This is simplified as actually we iterated around this loop a number of times during the 12 week process and found that as we became more agile we could rapidly progress towards a goal. We also found that in the periods after the initial 12 week pilot, with out a coach or mentor, we often used this iterative loop to lose site of the initial goals and go off route as a team as we no longer used the process to guide us and allowed our scope to creep to far. This was resolved when a coach, albeit a remote one, assisted us in finding our path again and allowing us to deliver to our goals and occasionally say no to requests if we could not achieve them. The Techniques Used: The table below lists the techniques associated with XP as defined in the XP series of books published by Addison Wesley. Those in the table, diagram 4: XP techniques, marked with a star are techniques that by the end of the process we were no longer using. This was because we could not adapt them to our context due to constraints within the organisation, were not well enough understood within the group to be applied or were attempted during the pilot and found not to bring value to the process for us as a test team. The impact of these exclusions and exceptions are discussed in the results section of this document. Fuller explanations of these descriptions can be found within the literature given within the bibliography section of this document and have not been covered here for the sake of brevity. 
Plan
• User stories are written
• Release planning creates the schedule
• Project velocity is measured
• The project is divided into iterations
• Iteration planning starts each iteration
• Move people around
• A stand-up meeting starts each day
• Fix XP when it breaks
• No overtime*

Code
• The customer is always available*
• Code must be written to agreed standards
• Code the unit test first*
• All production code is pair programmed*
• Only one pair integrates code at a time
• Integrate often
• Use collective code ownership
• Leave optimisation until last

Design
• Simplicity
• Choose a system metaphor*
• Use cards for design sessions*
• Create spike solutions to reduce risk
• No functionality is added early
• Refactor whenever and wherever possible

Test
• All code must have unit tests
• All code must pass its unit tests prior to release
• When a bug is found, tests are created
• Acceptance tests are run often and the score is published

Diagram 4: XP processes and techniques

The Results: The graphs and tables numbered one to six at the end of this paper capture some of the information we gathered during the 12 week exercise to understand the actions taken and their impact on the project. The twelve week deadline provided us with some simple trend lines for the improvements. Further applications of the approach, with their own graphs and trends, will be needed to identify additional patterns of improvement and, via the dashboard, to rapidly spot any drop in velocity or improvement so that immediate action can be taken.
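As an illustration of the test practices listed in Diagram 4, "code the unit test first" applies to test-automation code as much as to production code. The following is a minimal sketch only: the project used vendor GUI test tools, not Python, and the `parse_result_line` helper and its pipe-delimited format are hypothetical.

```python
# Test-first for test-automation code: the unit test below was agreed and
# written before the helper it exercises. parse_result_line and the
# "STEP|PASS/FAIL|seconds" format are hypothetical illustrations.

def parse_result_line(line):
    """Split 'STEP|PASS/FAIL|seconds' into (step, passed, seconds)."""
    step, outcome, seconds = line.strip().split("|")
    return step, outcome == "PASS", float(seconds)

def test_parse_result_line():
    # Expected behaviour, captured as an executable check first.
    assert parse_result_line("LOGIN|PASS|0.8") == ("LOGIN", True, 0.8)
    assert parse_result_line("SEARCH|FAIL|2.1") == ("SEARCH", False, 2.1)

test_parse_result_line()
```

Because the helper stays this small, a creeping error in the scripting layer shows up as a failing assertion rather than as silent drift across hundreds of scripts.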

The results of not using the specific techniques marked in diagram four are explained below.

The customer is always available: This was restricted due to the regulatory projects in progress. The immediate impact was that we occasionally lost velocity during the exercise, as we were unable to confirm our position on user stories or the coverage of the scripts we had created. At times it also prevented us from celebrating the success of activities with all the players involved, as they were not available at the time of the success despite having contributed to it.

Code the unit test first: This principle was not always adhered to and was dropped due to the nature of the projects. However, by the end of the process we agreed to reinstate it, as it was part of the cause of some creeping errors we observed during the 12 week cycle that could have been addressed had it been followed.

All production code is pair programmed: We did not always follow this tenet due to resource constraints. When it was not adhered to, code that was not really acceptable often crept in, typically the result of a team member running off in a bout of excitement over an idea. The loss of this principle was a mistake and, in retrospect, should never have occurred. Future use of the approach is expected to reinstate it.

No overtime: We broke this rule more often than I can count during the 12 week period. This was not constructive and led to some fractious discussions between tired team members, myself included. It was caused by our initial enthusiasm running away with us and driving us forward with undue haste. We also had a culture of overtime and long hours which, though counterproductive, is hard to shift once established.
The other problem caused by dropping this rule was that we ended up using our initial metrics and goals for script creation not as a guide and yardstick but as a self-enabling and self-limiting measure. If we were falling behind, or felt we were at a stand-up, we would use overtime to get back to the goal; hence the straight line in graph 5 of this paper rather than the more expected stepped improvement. After the 12 weeks our velocity dropped significantly as we set new, more realistic goals and targets.

Use cards for design sessions: As a team we failed to embrace the concept of cards. We ended up using a whiteboard and often over-complicated the solution to our scripting effort. We found the limited size of cards a step too far, and we were also uncomfortable with it within the regulated framework in which we were operating. We also tended to over-formalise the test scripts to fit the test repository and management tool in place. In retrospect we should have taken this step, though at the time it felt too far removed from our structured world.

Conclusions: XP helped us deliver rapid improvements in our testing and in the quality of the products delivered into the business. These improvements came with a rapid turnaround and a noticeable effect on team morale and on the perception of the team within the business and development communities. This was due not only to the improvement in the quality of the code but also to the rituals and practices associated with XP, with its core drivers of:

• Simplicity
• Communication
• Feedback
• Courage

These four simple principles helped us in the following ways.

Simplicity: This helped us take a step back from our complex automation coding, which was based on end to end test scripts within a rigid framework built on the Action-Word approach. Though once useful, that framework was limiting our thinking, pushing us down a single path of thought and preventing us from looking at other solutions. It was also complex and not flexible enough to cope with the regular release cycle and rapid changes within the environment.
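The contrast can be sketched as follows. This is a hedged illustration only: `FakeApp`, the action names, and the flow are invented, and the project used vendor GUI tools rather than Python. The point is the difference between routing every step through a keyword interpreter and writing the test directly.

```python
class FakeApp:
    """Stand-in for a GUI test tool's driver object (hypothetical)."""
    def __init__(self):
        self.user = None
    def login(self, user, password):
        self.user = user
    def search(self, term):
        return [term + "-1", term + "-2"] if self.user else []

# Before: an Action-Word style interpreter. Flows live in data tables,
# so a failure surfaces inside this loop, far from the intent of the test.
def run_keyword_script(app, rows):
    actions = {"login": app.login, "search": app.search}
    return [actions[name](*args) for name, args in rows]

# After: the same flow as a short, direct test that names what it checks
# and can be debugged or refactored in isolation.
def test_login_then_search():
    app = FakeApp()
    app.login("alice", "secret")
    assert len(app.search("widgets")) == 2

test_login_then_search()
```

Both styles exercise the same application; the direct style simply removes a layer of indirection that, in our context, was costing more than it returned.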

The simplification of test scripts, and the removal of many of the complex error-handling routines within them, led to short and simple test code that was clear and could be easily and rapidly debugged or refactored to fit changes to the test environment, including data and functions. This regular refactoring also allowed the code to be optimised, reducing the test execution cycle over the period from 16 hours to 14 hours. Though not a significant drop, it was a start that encourages us to believe that, as our coding matures, we may gain further advantages. Simplicity, through the use of stories and a simple visible dashboard, also allowed clearer and better communication with the business and development communities, which helped raise our profile and our success rate on projects by using clear and easily understood language and concepts.

Communication: Communication was vital to our success, and was eased by the simplicity of the approaches and concepts used. The frequent and highly visible communication was difficult at first, as it was not a good cultural fit for the team. We found, however, that as we had and celebrated successes we were able to improve communication, since we were no longer seen as "bad news bears" by the other areas of the business.

Feedback: Rapid and accurate feedback enabled us to be agile and successful, as all interested parties were aware of the effects of actions in a timely manner. It also enabled us to adjust and adapt when we were not meeting our own expectations for the activities we had agreed. The hardest challenge with feedback was achieving it successfully and taking criticism on board without personalising the issues. This was improving through practice, and with facilitators and coaches assisting us, but it was not fully resolved by the end of the exercise.
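The "simple visible dashboard" that carried much of this communication can be sketched as a low-tech plain-text table, in the spirit of James Bach's low-tech testing dashboard (see the bibliography). The area names and ratings below are illustrative, not the project's real data.

```python
def dashboard_row(area, coverage, confidence, comment):
    # Fixed-width columns so the dashboard reads cleanly on a printout
    # or in a plain-text status mail.
    return f"{area:<12} {coverage:<10} {confidence:<10} {comment}"

# Hypothetical product areas and assessments for illustration.
rows = [
    ("Login",    "high",   "high", "stable for three weeks"),
    ("Payments", "medium", "low",  "two blocking defects open"),
    ("Reports",  "low",    "low",  "scripts being refactored"),
]

print(dashboard_row("Area", "Coverage", "Confidence", "Comment"))
for row in rows:
    print(dashboard_row(*row))
```

The value is in the visibility and cheapness of the artefact rather than the tooling: anything the whole team can see and update daily serves the same purpose.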
Courage: We discovered the courage to apply and adopt the techniques through our early successes. The ease of adoption of some of the principles, and the benefits we gained from the three preceding principles, also removed much of the fear of failure we had suffered. Because of the rapid cycles of activity, any element of the approach that was not seen or felt to be adding benefit could be addressed quickly, before it became an issue for the ongoing projects.

Salient Learning Point: The toolkit should be created afresh for each project, with a light version of the information gathered and the roadmap used kept so that it can form the basis of the approach adopted on other projects. All lessons learnt from all areas in an agile approach should be brought back into the team's knowledge and shared, as there are valuable lessons from development and project management that should be captured and utilised in testing.

Some Additional Areas for Exploration: The following areas for this approach have yet to be explored. They are currently on our to-do list and will be implemented across projects where we believe there will be a benefit, where the cultural and technology fit is present, and where the results will be measurable.

Application to Performance Testing: At the time of this paper we had only applied these techniques to automated regression testing using vendor-specific GUI testing tools, as we prioritised the area with the highest return. Towards the end of the placement, however, the performance testing was showing similar issues with the size and complexity of scripts, and a similar approach is being considered for this area in combination with a different approach to the modelling and creation of the performance test profiles. Some of the concepts for this approach have already been created and published by Paul Down at Embarcadero in his paper on goals-based performance testing as applied with their Extreme Test tool. See the bibliography.
Application of Other Agile Methods: To further enhance the creation of additional tests and coverage for the system, agile modelling techniques are also being investigated. These will provide additional approaches and coverage for test planning and execution and, we hope, further improvements in the testing lifecycle through the combination of techniques. IBM Rational's RUP has been identified as a possible approach for a number of projects in which the test group will be involved, and it is felt there will be significant benefit in realigning some elements of these approaches to the automation effort.

For further reading on agile modelling and on approaches to model driven architectures, there are some excellent articles and links on the sites of Harry Robinson and Alan Richardson.

Application of Agile Methods for Test Scripting to Enterprise Level Programmes: It is our intention not only to roll this out to additional projects but to explore the possibilities on larger programmes at an enterprise level. The scaling of agile methods to enterprise level programmes is still in its early stages; papers on this have been published by BJSS and ThoughtWorks.

Application of the Process with Tools Including Open Source: The automation tools used during the period of this work were dictated by previous expenditure, and there was a risk aversion to the adoption of open source tools. On new projects, and with the wider adoption of Linux and other open source initiatives such as the Eclipse IDE, there are opportunities to explore the use of additional tools in this context. The additional benefit this should bring is the ability to refactor developer-created tests and carry them through the lifecycle via the Eclipse and Hyades projects.

Acknowledgements:
Alan Richardson of Compendium Developments, for years of valuable support and for acting as a sounding board on many occasions.
Mike Swaby, for being the original developer in the pub who convinced me to put my money where my mouth was.
James Bach at Satisfice, for allowing me to reference his materials and original ideas in the presentation.
Richard Durham of Citrix, for talking the original ideas through with me and assisting me in finding some meaningful metrics, for once.
Andy Schneider and all the team at BJSS, for allowing me the time to prepare and come to the conference, and for listening to this through its iterative development to the version presented here.

Bibliography:
Extreme Programming Explained, Kent Beck, Addison-Wesley
Testing Extreme Programming, Lisa Crispin and Tip House, Addison-Wesley
Agile Software Development, Alistair Cockburn, Addison-Wesley
Planning Extreme Programming, Kent Beck and Martin Fowler, Addison-Wesley
Extreme Programming Explored, William C. Wake, Addison-Wesley
Extreme Programming Installed, Ron Jeffries, Ann Anderson and Chet Hendrickson, Addison-Wesley
Extreme Programming in Practice, James Newkirk and Robert C. Martin, Addison-Wesley
Extreme Programming Applied, Ken Auer and Roy Miller, Addison-Wesley

Links:
• www.bjss.com
• www.xprogramming.com
• www.agilealliance.org
• www.ambysoft.com
• www.martinfowler.com
• www.tejasconsulting.com
• www.compendiumdevelopment.co.uk
• www.geocities.com/harry_robinson_testing
• www.embarcadero.com
• www.satisfice.com
• www.methodsandtools.com
• www.testdriven.com
• www.balagan.org.uk
• www.softwarereality.com
• www.stickyminds.com
• www.workroom-productions.com
• www.eclipse.org/hyades
• www.thoughtworks.com

Conference Materials and Papers:
Goals Based Testing, Paul Down, Embarcadero. Q-Bit Testing Expo, London, UK, 2003. The paper is available from Embarcadero on request.
Agile Test Automation, James Bach, Satisfice. BCS SIGiST, London, UK, 2003.
A Low Tech Testing Dashboard, James Bach, Satisfice. Star, US, 1999.
Selling Your Management on a New Testing Process, Harry Robinson, Microsoft. StarWest, US, 2003.
Scaling Agile Projects, Andrew Schneider, BJSS. DSDM Leeds Road Show, UK, 2004.

Table 1: 12 week plot of automated tests with interventions required

Week #  Number Scripts  Number Interventions  %
1       8000            4000                  50
2       8000            3980                  49.75
3       8000            3800                  47.5
4       8000            3200                  40
5       8000            2500                  31.25
6       8000            2000                  25
7       8000            800                   10
8       8000            600                   7.5
9       8000            550                   6.875
10      8000            540                   6.75
11      8000            530                   6.625
12      8000            500                   6.25

Graph 1: Test Interventions

Table 2: Team hours spent maintaining automated tests

Week #  Number Scripts  Hours to Maintain
1       8000            120
2       8000            119
3       8000            117
4       8000            112
5       8000            100
6       8000            80
7       8000            60
8       8000            40
9       8000            37
10      8000            35
11      8000            37
12      8000            40

Graph 2: Script Maintenance Time
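As a sanity check on Table 1, the "%" column is simply interventions as a share of the 8000 automated scripts. A short re-derivation (not part of the original analysis) using the figures transcribed from the table:

```python
# Re-derive Table 1's "%" column: interventions as a percentage of the
# 8000 automated scripts, weeks 1-12 (figures transcribed from the table).
interventions = [4000, 3980, 3800, 3200, 2500, 2000,
                 800, 600, 550, 540, 530, 500]
SCRIPTS = 8000

percentages = [100.0 * n / SCRIPTS for n in interventions]
print(percentages[0], percentages[-1])  # 50.0 6.25
```

The fall from 50% of scripts needing intervention in week 1 to 6.25% in week 12 is the headline improvement shown in Graph 1.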

Table 3: 12 week plot of scripts with agreed user stories

Week #  Number Scripts  Relevant Tests
1       8000            500
2       8000            250
3       8000            300
4       8000            350
5       8000            400
6       8000            450
7       8000            500
8       8000            600
9       8000            700
10      8000            800
11      8000            850
12      8000            1000

Graph 3: Relevant Coverage

Table 4: Number of new defects identified in production

Week #  Number Scripts  Defects
1       8000            20
2       8000            20
3       8000            20
4       8000            18
5       8000            18
6       8000            18
7       8000            18
8       8000            12
9       8000            12
10      8000            12
11      8000            12
12      8000            5

Graph 4: New Defects in Production

Table 5: Pair programmed scripts

Week #  Number Scripts  Pair Programmed
1       8000            75
2       8000            150
3       8000            225
4       8000            300
5       8000            375
6       8000            450
7       8000            525
8       8000            600
9       8000            675
10      8000            750
11      8000            825
12      8000            1000

Graph 5: Pair Programmed Automated Scripts

Table 6: Working hours since dashboard last updated

Week #  Av. Work Hours
1       1
2       2
3       2
4       2
5       3
6       4
7       5
8       6
9       8
10      8
11      4
12      4

Graph 6: Working Hours Since Dashboard Update
