Agile Test Management and Reporting—Even in a Non-Agile Project


Description

Whether you have dedicated test teams or testers distributed over Scrum teams, you have the challenge of planning, tracking, and reporting their testing not only in a meaningful way but also in a way that can adapt to the rapidly changing environment of software development projects. Many commonly used planning methods do not allow for flexibility, and reporting often relies on horribly flawed metrics including number of test cases executed or test pass percentage. Paul Holland explains a planning, tracking, and reporting method he developed during his last five years as a test manager at Alcatel-Lucent. Paul describes how he uses powerful “high-tech” tools like whiteboards and spreadsheets to create easy-to-understand visual representations of his group’s testing. Learn how you can create status reports that provide the details that upper management seeks. These status reports are effective in both waterfall and agile environments—and will stand up to management scrutiny.

Transcript of Agile Test Management and Reporting—Even in a Non-Agile Project

Page 1: Agile Test Management and Reporting—Even in a Non-Agile Project

 

 

 

T9 Concurrent Session
4/8/2014 2:00 PM

“Agile Test Management and Reporting—Even in a Non-Agile Project”

Presented by:
Paul Holland, Testing Thoughts

Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ [email protected] ∙ www.sqe.com

Page 2: Agile Test Management and Reporting—Even in a Non-Agile Project

Paul Holland, Testing Thoughts

An independent software test consultant and teacher, Paul Holland has more than sixteen years of hands-on testing and test management experience, primarily at Alcatel-Lucent where he led a transformation of the testing approach for two product divisions, making them more efficient and effective. As a test manager and tester, Paul focused on exploratory testing, test automation, and improving testing techniques. For the past five years, he has been consulting and delivering training within Alcatel-Lucent and externally to companies such as Intel, Intuit, Progressive Insurance, HP, RIM, and General Dynamics. Paul teaches the Rapid Software Testing course for Satisfice. For more information visit testingthoughts.com.

Page 3: Agile Test Management and Reporting—Even in a Non-Agile Project

Agile Test Management and Reporting – Even in a Non-Agile Project

Paul Holland, Consulting Tester and Teacher
at Testing Thoughts

My Background
• Independent S/W testing consultant since Apr 2012
• 16+ years testing telecommunications equipment and reworking test methodologies at Alcatel-Lucent
• 10+ years as a test manager
• Presenter at STAREast, STARWest, Let’s Test, EuroSTAR, and CAST
• Keynote at KWSQA conference in 2012
• Facilitator at 35+ peer conferences and workshops
• Teacher of S/W testing for the past 5 years
• Teacher of Rapid Software Testing
– through Satisfice (James Bach): www.satisfice.com
• Military helicopter pilot – Canadian Sea Kings

April, 2013 ©2013 Testing Thoughts 2

Page 4: Agile Test Management and Reporting—Even in a Non-Agile Project

Attributions

• Over the past 10 years I have spoken with many people regarding software testing. I cannot directly attribute any specific aspects of this talk to any individual, but all of these people (and more) have influenced my opinions and thoughts on metrics:
– Cem Kaner, James Bach, Michael Bolton, Ross Collard, Doug Hoffman, Scott Barber, John Hazel, Eric Proegler, Dan Downing, Greg McNelly, Ben Yaroch

April, 2013 ©2013 Testing Thoughts 3

Elements of Bad Metrics

1. Measure and/or compare elements that are inconsistent in size or composition
– Impossible to use effectively for comparison
– How many containers do you need for your possessions?
– Test cases and test steps
• Greatly vary in time required and complexity
– Bugs
• Can differ in severity and likelihood, i.e., risk

April, 2013 ©2013 Testing Thoughts 4
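To make the container analogy concrete, here is a tiny illustration (the suites and effort figures are invented, not from the talk) of how a raw test-case count hides huge differences in effort:

```python
# Two suites that are "equal" by test-case count but not by work.
# Effort figures are in half days (invented for illustration).
suite_a = {"login smoke check": 0.5, "logout smoke check": 0.5}
suite_b = {"failover under load": 6.0, "48-port soak test": 10.0}

for name, suite in (("A", suite_a), ("B", suite_b)):
    print(f"Suite {name}: {len(suite)} test cases, "
          f"{sum(suite.values())} half days of effort")

# Both suites report "2 test cases executed", yet suite B is
# sixteen times more work -- the count is useless for comparison.
```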

Page 5: Agile Test Management and Reporting—Even in a Non-Agile Project

Elements of Bad Metrics

2. Create competition between individuals and/or teams
– They typically do not result in friendly competition
– Inhibits sharing of information and teamwork
– Especially damaging if compensation is impacted
– Number of xxxx per tester
– Number of xxxx per feature

April, 2013 ©2013 Testing Thoughts 5

Elements of Bad Metrics

3. Easy to “game” or circumvent the desired intention
– Easy to improve through undesirable behaviour
– Pass rate (percentage): execute more simple tests that will pass, or break up a long test case into many smaller ones
– Number of bugs raised: raising two similar bug reports instead of combining them

April, 2013 ©2013 Testing Thoughts 6
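A quick worked example of the pass-rate game described above (the test names and results are hypothetical):

```python
def pass_rate(results):
    """Percentage of results marked 'pass' -- the metric being gamed."""
    return 100.0 * sum(r == "pass" for r in results.values()) / len(results)

# One long end-to-end test that hits a bug part-way through:
long_form = {"end_to_end_provisioning": "fail"}

# The same flow split into ten small test cases; only the last one fails:
split_form = {f"provisioning_step_{i}": "pass" for i in range(1, 10)}
split_form["provisioning_step_10"] = "fail"

print(pass_rate(long_form))   # 0.0  -- same testing, same bug...
print(pass_rate(split_form))  # 90.0 -- ...but a far prettier number
```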

Page 6: Agile Test Management and Reporting—Even in a Non-Agile Project

Elements of Bad Metrics

4. Contain misleading information or give a false sense of completeness
– Summarizing a large amount of information into one or two numbers, out of context
– Coverage (code, path)
• Misleading information based on touching the code once
– Pass rate and number of test cases

April, 2013 ©2013 Testing Thoughts 7
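A minimal sketch of the “touching the code once” problem: the single test below gives 100% line coverage of this (made-up) function and passes, yet an obvious failure case is never exercised.

```python
def average(values):
    # Bug: crashes on an empty list, a case no "covered" line reveals.
    return sum(values) / len(values)

# This one call executes every line of average() -- a coverage tool
# reports 100% -- yet average([]) still raises ZeroDivisionError.
assert average([2, 4, 6]) == 4.0
```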

Impact of Using Bad Metrics

• Promotes bad behaviour:
− Testers may create more, smaller test cases instead of creating test cases that make sense
− Execution of ineffective testing to meet requirements
− Artificially creating higher numbers instead of doing what makes sense
− Creation of tools that will mask inefficiencies (e.g., lab equipment usage)
− Time wasted improving the “numbers” instead of improving the testing

April, 2013 ©2013 Testing Thoughts 8

Page 7: Agile Test Management and Reporting—Even in a Non-Agile Project

Impact of Using Bad Metrics

• Gives Executives a false sense of test coverage:
– All they see is numbers out of context
– The larger the numbers, the better the testing
– The difficulty of good testing is hidden by large “fake” numbers
• Dangerous messages to Executives:
– Our pass rate is at 96%, so our product is in good shape
– Code coverage is at 100% – our code is completely tested
– Feature specification coverage is at 100% – Ship it!!!
• What could possibly go wrong?

April, 2013 ©2013 Testing Thoughts 9

So … Now what?

• I have to stop counting everything. I feel naked and exposed.
• Track expected effort instead of tracking test cases, using:
– Whiteboard
– Excel spreadsheet

April, 2013 ©2013 Testing Thoughts 10

Page 8: Agile Test Management and Reporting—Even in a Non-Agile Project

Whiteboard

• Used for planning and tracking of test execution
• Suitable for use in waterfall or agile (as long as you have control over your own team’s process)
• Use colours to track:
– Features, or
– Main areas, or
– Test styles (performance, robustness, system)

April, 2013 ©2013 Testing Thoughts 11

Whiteboard

• Divide the board into four areas:
– Work to be done (divided into two sections)
– Work in progress
– Cancelled or work not being done
– Completed work
• Red stickies indicate issues (not just bugs)
• Create a sticky note for each half day of work (or mark the # of half days expected on the sticky note)
• Prioritize stickies daily (or at least twice/wk)
• Finish “on time” with low-priority work incomplete

April, 2013 ©2013 Testing Thoughts 12
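For teams without wall space, the four areas above map naturally onto a few lists. The following is a minimal sketch of my own, not something from the talk; in particular, reading the two Work to be Done sections as “above and below a priority cut” is one plausible interpretation (the dotted-line slide later supports it). Charter names are borrowed from the sample report that appears later.

```python
# A minimal code model of the whiteboard (structure is my own sketch).
# Each sticky is (charter_title, half_days_of_work) -- one sticky per
# half day, or the expected number of half days marked on the note.
board = {
    "to_do_high":  [("INP vs. SHINE", 6), ("Expected throughput testing", 5)],
    "to_do_low":   [("POTS interference", 2)],  # may never run; that's fine
    "in_progress": [("ARQ Verification under different RA Modes", 2)],
    "cancelled":   [],
    "completed":   [("Traffic delay and jitter from RTX", 2)],
}

def prioritize(board, keep_high=3):
    """Daily (or at least twice-weekly) pass: keep the most important
    charters above the line; the rest drop below it."""
    todo = sorted(board["to_do_high"] + board["to_do_low"],
                  key=lambda sticky: sticky[1])  # stand-in for real judgment
    board["to_do_high"] = todo[:keep_high]
    board["to_do_low"] = todo[keep_high:]
```

Finishing “on time” then means everything still sitting in the low-priority section is consciously dropped, not silently unfinished.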

Page 9: Agile Test Management and Reporting—Even in a Non-Agile Project

Sticky Notes

• All of these items are optional – add your own elements. Use what makes sense for your situation:
– Charter title (or test case title)
– Estimated effort
– Feature area
– Tester name
– Date complete
– Effort (# of sessions or half days of work)
• Initially estimated -> replace with actual

April, 2013 ©2013 Testing Thoughts 13
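As a rough sketch, those optional sticky fields map onto a small record type. The field names below are my assumptions; the talk prescribes no schema. Values are borrowed from the sample report later in the talk.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Sticky:
    """One sticky note = one charter. Fields mirror the list above."""
    charter_title: str
    estimated_effort: int                # sessions or half days of work
    area: str = ""
    tester: str = ""
    date_complete: Optional[str] = None
    actual_effort: Optional[int] = None  # replaces the estimate once known

note = Sticky("INP vs. REIN", estimated_effort=6, area="ARQ")
note.tester, note.actual_effort, note.date_complete = "jbright", 7, "01/10/2012"
```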

Actual Sample Sticky

[Photo of a real sticky note, labelled with: Charter Title, Tester, Area, Effort]

April, 2013 ©2013 Testing Thoughts 14

Page 10: Agile Test Management and Reporting—Even in a Non-Agile Project

Whiteboard Example

[Photo of the whiteboard at the end of week 1, out of 7 weeks]

April, 2013 ©2013 Testing Thoughts 15

Whiteboard

• Consider adding a dotted line across the Work to be Done section
• Items below the line are likely not going to be executed before time expires
• Not all tests can be executed
• Allows for prioritization so the right tests get executed

April, 2013 ©2013 Testing Thoughts 16

Page 11: Agile Test Management and Reporting—Even in a Non-Agile Project

Whiteboard Example

[Photo of the whiteboard at the end of week 6, out of 7 weeks]

April, 2013 ©2013 Testing Thoughts 17

Reporting

• An Excel spreadsheet with:
– List of charters
– Area
– Estimated effort
– Expended effort
– Remaining effort
– Tester(s)
– Start date
– Completed date
– Issues
– Comments
• Does NOT include pass/fail percentage or number of test cases

April, 2013 ©2013 Testing Thoughts 18
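A minimal sketch of producing that spreadsheet as a CSV file. The column names come from the list above; the rows, the file name, and the use of Python’s csv module are my illustrative choices, not the talk’s.

```python
import csv

COLUMNS = ["Charter", "Area", "Estimated Effort", "Expended Effort",
           "Remaining Effort", "Tester", "Date Started", "Date Completed",
           "Issues", "Comments"]

# Two rows lifted from the sample report that follows.
rows = [
    ["INP vs. SHINE", "ARQ", 6, 6, 0, "ncowan",
     "12/01/2011", "12/04/2011", "", ""],
    ["Attainable Throughput", "ARQ", 1, 4, 0, "jbright",
     "01/05/2012", "01/08/2012", "", "Took longer than estimated."],
]

# Note what is deliberately absent: no pass/fail percentage and
# no count of test cases -- only effort, dates, issues, and context.
with open("test_status.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(COLUMNS)
    writer.writerows(rows)
```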

Page 12: Agile Test Management and Reporting—Even in a Non-Agile Project

Sample Report

Charter | Area | Estimated Effort | Expended Effort | Remaining Effort | Tester | Date Started | Date Completed | Issues Found | Comments
Investigation for high QLN spikes on EVLT H/W | Performance | 0 | 20 | 0 | acode | 12/10/2011 | 01/14/2012 | ALU01617032 | Lots of investigation. Problem was on 2-3 out of 48 ports, which just happened to be 2 of the 6 ports I tested.
ARQ Verification under different RA Modes | ARQ | 2 | 2 | 0 | ncowan | 12/14/2011 | 12/15/2011 | |
POTS interference | ARQ | 2 | 0 | 0 | --- | 01/08/2012 | 01/08/2012 | | Decided not to test as the H/W team already tested this functionality and time was tight.
Expected throughput testing | ARQ | 5 | 5 | 0 | acode | 01/10/2012 | 01/14/2012 | |
INP vs. SHINE | ARQ | 6 | 6 | 0 | ncowan | 12/01/2011 | 12/04/2011 | |
INP vs. REIN | ARQ | 6 | 7 | 5 | jbright | 01/06/2012 | 01/10/2012 | | To translate the files properly, had to install Python solution from Antwerp. Some overhead to begin testing (installation, config test) but was fairly quick to execute afterwards.
INP vs. REIN + SHINE | ARQ | 12 | 12 | | | | | |
Traffic delay and jitter from RTX | ARQ | 2 | 2 | 0 | ncowan | 12/05/2011 | 12/05/2011 | |
Attainable Throughput | ARQ | 1 | 4 | 0 | jbright | 01/05/2012 | 01/08/2012 | | Took longer because it was not behaving as expected and I had to make sure I was testing correctly. My expectations were wrong based on virtual noise not being exact.

April, 2013 ©2013 Testing Thoughts 19

Weekly Report

• A PowerPoint slide indicating the important issues (not a count but a list):
– “Show stopping” bugs
– New bugs found since last report
– Important issues with testing (blocking bugs, equipment issues, people issues, etc.)
– Risks (updates and newly discovered)
– Tester concerns (if different from above)
– The slide on the next page indicating progress

April, 2013 ©2013 Testing Thoughts 20
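The progress slide on the next page plots three effort figures per feature. Here is a sketch of the arithmetic behind it, aggregating spreadsheet rows per feature; the tuple layout and the values are illustrative, not the talk’s.

```python
# (feature, estimated, expended, remaining) in person half days.
charters = [
    ("ARQ", 6, 7, 5),
    ("ARQ", 2, 2, 0),
    ("H/W Performance", 0, 20, 0),
]

progress = {}
for feature, estimated, expended, remaining in charters:
    planned, spent, expected = progress.get(feature, (0, 0, 0))
    progress[feature] = (
        planned + estimated,             # original planned effort
        spent + expended,                # expended effort so far
        expected + expended + remaining  # total expected effort (re-estimated)
    )

for feature, (planned, spent, expected) in progress.items():
    trend = "effort growing" if expected > planned else "on or under plan"
    print(f"{feature}: planned={planned}, expended={spent}, "
          f"expected={expected} ({trend})")
```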

Page 13: Agile Test Management and Reporting—Even in a Non-Agile Project

Sample Report

[“Awesome Product” Test Progress as of 02/01/2012: one bar per feature (ARQ, SRA, Vectoring, Regression, H/W Performance), with effort in person half days (0 to 90) on the vertical axis, showing Original Planned Effort, Expended Effort, and Total Expected Effort. Direction of lines indicates effort trend since last report. Solid centre bar = finished. Green: no concerns; Yellow: some concerns; Red: major concerns.]

April, 2013 ©2013 Testing Thoughts 21

April, 2013 ©2013 Testing Thoughts 22