Experience Report

Huddle Your QA: Experience Report
Dawn Carvell, Pushpa Reddy

Transcript of Experience Report

Page 1: Experience Report

Huddle Your QA: Experience Report

Dawn Carvell, Pushpa Reddy

Page 2: Experience Report

Context

• We were sent into a project where the company had started to adopt agile
• They had not delivered for 2 years
• They had spent 6 months deriving an 800-page requirement specification
• We were brought in to deliver it
• One of our goals was to enable the team to continue after we left

Page 3: Experience Report

Problem Definition

• Not co-located, yet in the same building
• They had no clue what the other people were doing: Developers, Manual Testers, Automation Testers, NFR testers, BAs
• If an integration issue was found (which was very common), it took weeks to fix it
• Testers tested only after developers completed development (usually months later)

Page 4: Experience Report

• Automation scripts took 48 hours on 14 parallel machines to execute
• It took weeks to analyse results and to find whether a script had failed for genuine reasons
• Developers had no clue what the scripts contained
• QAs found hundreds of defects in each build, and were then asked not to find any more defects
• To top it all off, the QAs were threatened with losing their jobs if they could not cope with the change

Page 5: Experience Report

What we did

Not co-located, yet in the same building, so no one knew what anyone else was doing:

• We moved everyone around so that the whole team could sit together

Page 6: Experience Report

They had no clue what the other people were doing:

• QAs sat with developers
• QAs paired with devs to write automated acceptance tests
• QAs paired with BAs to write acceptance criteria
• The team did regular performance testing
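The report does not include the tests themselves. As a purely hypothetical sketch of the kind of automated acceptance test a QA/dev pair might write in the given/when/then style, where the `Cart` class and every name below are invented for illustration and not taken from the project:

```python
# Hypothetical sketch: an acceptance criterion expressed as an
# executable given/when/then test. Cart is a toy stand-in for
# the real system under test.

class Cart:
    """Toy shopping cart, invented for this illustration."""

    def __init__(self):
        self._lines = []  # (unit_price, quantity) pairs

    def add(self, name, unit_price, qty=1):
        self._lines.append((unit_price, qty))

    def total(self):
        return sum(price * qty for price, qty in self._lines)


def test_total_reflects_all_items_added():
    # Given an empty cart
    cart = Cart()
    # When two items are added
    cart.add("book", 10.0)
    cart.add("pen", 2.5, qty=2)
    # Then the total matches the agreed acceptance criterion
    assert cart.total() == 15.0
```

The point of pairing on tests like this is that the acceptance criterion the BA wrote becomes directly executable, so both the QA and the developer share one definition of "done".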

Page 7: Experience Report

Automation scripts took 48 hours on 14 parallel machines to execute:

• A tool that could integrate with the continuous integration system was chosen, for frequent execution of tests
• We used a lightweight tool which took less time to execute the tests
• Tests were carefully selected and incrementally built to cover just enough and no more than necessary
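The report does not name the tool or the selection mechanism. One common way to keep a suite "just enough" is to tag each test with the areas it covers and run only the tags a change touches; a minimal sketch, with all test names and tags invented:

```python
# Hypothetical sketch of "just enough" test selection: each test
# carries area tags, and the suite for a build is only the tests
# whose tags overlap the areas the change actually touched.

TESTS = {
    "test_checkout_happy_path": {"checkout", "smoke"},
    "test_checkout_declined_card": {"checkout"},
    "test_search_basic": {"search", "smoke"},
    "test_profile_update": {"profile"},
}

def select(changed_areas):
    """Pick only the tests whose tags overlap the changed areas."""
    return sorted(
        name for name, tags in TESTS.items()
        if tags & set(changed_areas)
    )
```

With a mapping like this, a change confined to checkout runs two tests instead of the full suite, which is one way a 48-hour run shrinks to something a CI server can execute on every check-in.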

Page 8: Experience Report

If an integration issue was found (which was very common), it took weeks to fix it:

• With continuous integration, bugs were fed back faster and issues were fixed earlier in the life cycle, where they cost less
• Continuous communication also helped in fixing defects found in manual regression and exploratory testing while they were still fresh in the devs' minds

Page 9: Experience Report

Testers tested only after developers completed development (usually months later):

• As we had continuous integration, we could get a build at any time, so we could test on a story-by-story basis
• This also gave us confidence in each build, because the tests had passed against it

Page 10: Experience Report

It took weeks to analyse results and to find whether a script had failed for genuine reasons:

• As the test suite was built continuously alongside development, it was easier to find the reason for a failing test (it is most likely a newly added test that fails)
• Because we checked in small changes, when the build failed it was much clearer that the new change had caused the failure
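The triage this enables amounts to a set comparison: with small check-ins, any test that passed on the previous green build but fails now almost certainly points at the newest change. A minimal sketch, with all names illustrative:

```python
# Hypothetical triage sketch: with small, frequent check-ins,
# diffing the last green run against the current run points
# straight at what the newest change broke.

def newly_failing(previously_passed, currently_failed):
    """Tests that passed on the last build but fail on this one."""
    return sorted(set(currently_failed) & set(previously_passed))

def first_time_failures(known_tests, currently_failed):
    """Failing tests that did not exist before this change."""
    return sorted(set(currently_failed) - set(known_tests))
```

A failure in `newly_failing` implicates the latest check-in; a failure in `first_time_failures` is a just-added test, matching the observation above that a new test is the most likely one to fail. Either way the search space is one small change, not weeks of history.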

Page 11: Experience Report

Developers had no clue what the scripts contained:

• As QAs and devs paired to write the acceptance tests, the entire team was in sync about the contents of the test suite

Page 12: Experience Report

QAs found hundreds of defects in each build, and were asked not to find any more defects:

• Far fewer defects were found per build, as we were testing continuously