Tester Performance Evaluation

Liang Gao (lgao@sigma-rt.com)

Attributes of Good Testers

• Stubborn but reasonable
• Fearless in front of developers & management
• Sense of smell (digs in the right place; profiling)
• Common sense (knowing when to trust developers)
• Learns knowledge broad and shallow (inter-feature dependencies)
• Methodically follows the process/procedure

Definition of Performance Management

• Define the performance metrics that let testers deliver quality work.

• The metrics should not be something testers can satisfy without doing real, useful work.

• Bug counts?
• Hours of work?
• Certifications?
• Peer ratings?
• Customer ratings?

Key Tasks of a Tester

• Writing bug reports?
• Designing, running, modifying test cases?
• Developing test strategies / plans / ideas?
• Editing technical documentation?
• Interacting with developers?
• Writing support materials for the help desk or field support?
• Facilitating inspections / reviews?
• Requirements analysis: meeting, interviewing, and interpreting the needs of stakeholders?
• Release management? Archiving? Configuration management?
• Developing automation scripts?

Different Roles in the Test Team

• Management

– Dev Test Group manager

– System and solution test manager

– Regression test manager

– Tools group manager

• Engineer

– New feature testing engineer

– Regression engineer

– System / Solution testing engineer

– Testing Tools developer

Performance is Relative

[Diagram: testers are ranked into a Top Performance Group and a Bottom Performance Group; each group is further divided into its own Top Sub-Group, Middle Group, and Bottom Sub-Group. A grouping sketch follows.]
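As a rough illustration of this relative ranking, below is a minimal Python sketch that sorts testers by score and splits them into top, middle, and bottom groups. The 20% cut-offs and the sample scores are assumptions for illustration; the slides do not specify group sizes.

# Hypothetical sketch: split testers into relative performance groups.
# The 20% cut-offs are an assumption, not a value from the slides.
def split_into_groups(scores, top_pct=0.2, bottom_pct=0.2):
    """Rank testers by score, then split into top / middle / bottom groups."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    n = len(ranked)
    top_n = max(1, int(n * top_pct))
    bottom_n = max(1, int(n * bottom_pct))
    return {
        "top": [name for name, _ in ranked[:top_n]],
        "middle": [name for name, _ in ranked[top_n:n - bottom_n]],
        "bottom": [name for name, _ in ranked[n - bottom_n:]],
    }

scores = {"alice": 92, "bob": 75, "carol": 88, "dave": 61, "eve": 70}
print(split_into_groups(scores))
# {'top': ['alice'], 'middle': ['carol', 'bob', 'eve'], 'bottom': ['dave']}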

Quantity Metrics of Dev Test Engineer

• Number of initiatives taken (under minimum supervision) and also delivered
• Number of defects found + defects fixed
• Number of test cases designed
• Number of test cases manually executed
• Number of lines of automation scripts developed
• Number of review meetings attended / invited to as a reviewer, and number of feedback items given
• Average defect response time
• Number of test cases missed during the design phase
• Number of defects missed during cross testing

One way to fold these metrics into a single score is sketched below.
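One plausible way to roll the quantity metrics above into a single number is a weighted sum. This is only a sketch: the metric names, weights, and sample figures are all assumptions, not a scheme from the slides. The two "missed" metrics carry negative weight so that gaps found later count against the score.

# Hypothetical weighted-sum score for a dev test engineer's quarter.
# All weights below are illustrative assumptions.
WEIGHTS = {
    "initiatives_delivered": 5.0,
    "defects_found": 2.0,
    "defects_fixed": 1.0,
    "test_cases_designed": 1.0,
    "test_cases_executed": 0.5,
    "script_lines": 0.01,
    "review_feedbacks": 0.5,
    "missed_test_cases": -2.0,  # missed during the design phase
    "missed_defects": -3.0,     # missed during cross testing
}

def quantity_score(metrics):
    """Weighted sum of the engineer's quarterly quantity metrics."""
    return sum(WEIGHTS.get(name, 0.0) * value for name, value in metrics.items())

quarter = {"initiatives_delivered": 2, "defects_found": 34, "defects_fixed": 5,
           "test_cases_designed": 120, "test_cases_executed": 300,
           "script_lines": 1500, "review_feedbacks": 18,
           "missed_test_cases": 4, "missed_defects": 1}
print(quantity_score(quarter))  # 366.0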

Quantity Metrics of Regression Test Engineer

• Number of initiatives taken and also delivered
• Number of regression defects found + defects fixed
• Number of test cases executed
• Number of testbeds integrated
• Number of review meetings attended / invited to as a reviewer, and number of feedback items given
• Average defect response time (sketched below)
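"Average defect response time" can be read as the mean time between a defect being assigned to a tester and the tester's first response (verifying, commenting, or retesting). A minimal sketch under that assumption; the record format is invented for illustration.

from datetime import datetime

def avg_response_hours(defects):
    """Mean hours between assignment and the tester's first response."""
    deltas = [(d["responded"] - d["assigned"]).total_seconds() / 3600
              for d in defects if d.get("responded")]
    return sum(deltas) / len(deltas) if deltas else None

defects = [
    {"assigned": datetime(2014, 12, 1, 9), "responded": datetime(2014, 12, 1, 15)},
    {"assigned": datetime(2014, 12, 2, 10), "responded": datetime(2014, 12, 3, 10)},
]
print(avg_response_hours(defects))  # (6 + 24) / 2 = 15.0 hours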

Quantity Metrics of Tools Engineer

• Number of initiatives taken and also delivered
• Number of defects fixed
• Number of tools developed
• Lines of code developed
• Number of users using each tool
• Number of review meetings attended / invited to as a reviewer, and number of feedback items given
• Average defect response time

Quantity Metrics of System Testing Engineer

• Number of initiatives taken and also delivered
• Number of defects reported
• Number of test cases executed
• Number of testbeds built
• Number of review meetings attended / invited to as a reviewer, and number of feedback items given
• Average defect response time
• Number of test cases missed during the design phase
• Number of defects missed during cross testing

Quality Metrics of Defects

• Severity and priority of the defects (a weighting sketch follows this list)

• Customer-related defects

• Check the Defect Quality Checklist
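One way to make severity and priority comparable across defects is a weight table. The weights below, and the doubling for customer-found defects, are illustrative assumptions rather than a scheme from the slides.

# Hypothetical defect-quality points: severity x priority, doubled when the
# defect is customer-related. All weights are assumptions.
SEVERITY_WEIGHT = {"critical": 5, "major": 3, "minor": 1}
PRIORITY_WEIGHT = {"p1": 3, "p2": 2, "p3": 1}

def defect_quality_points(defects):
    """Sum severity x priority weights, doubling customer-found defects."""
    total = 0
    for d in defects:
        points = SEVERITY_WEIGHT[d["severity"]] * PRIORITY_WEIGHT[d["priority"]]
        if d.get("customer_found"):
            points *= 2
        total += points
    return total

defects = [{"severity": "critical", "priority": "p1", "customer_found": True},
           {"severity": "minor", "priority": "p3"}]
print(defect_quality_points(defects))  # 5*3*2 + 1*1 = 31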

Bug Quality Checklist

Quality Metrics of Test Cases

• Complexity and priority of the test cases

• Check the Test Case Quality Checklist

Test Case Quality Checklist

Quality Metrics of Scripts

• Check the Script Quality Checklist

Script Quality Checklist

Quality Review Process

• Randomly pick 2 bugs, scripts, or test cases the tester developed in this quarter (a sampling sketch follows this list).

• The manager carries out a face-to-face review with the tester against the checklist and gives a score.
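A minimal sketch of the sampling-and-scoring step: random.sample picks the work items for the face-to-face review, and the score is the fraction of checklist items the work product satisfies. The checklist entries and bug IDs are invented for illustration.

import random

CHECKLIST = ["clear title", "reproduction steps", "correct severity",
             "logs attached", "one issue per report"]

def review_sample(work_items, sample_size=2):
    """Randomly pick the items for the manager's face-to-face review."""
    return random.sample(work_items, min(sample_size, len(work_items)))

def checklist_score(passed_items):
    """Score = fraction of checklist items the work product satisfies."""
    return len(passed_items) / len(CHECKLIST)

bugs = ["BUG-101", "BUG-205", "BUG-233", "BUG-310"]
for bug in review_sample(bugs):
    # In the real review the manager judges each item with the tester;
    # here the passed items are hard-coded for illustration.
    print(bug, checklist_score(["clear title", "reproduction steps", "logs attached"]))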

Quality Review Process

• Missed test case ratio (see the ratio sketch after this list)
  – During a review session, how many new test cases the reviewers propose, and their ratio to the tester's existing test cases.

• Missed defects during cross testing
  – For the same test case, if others can find bugs during cross testing, it counts against the previous tester.

• Root cause analysis of customer-found bugs
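The missed test case ratio reduces to a single division. A minimal sketch, assuming the ratio is reviewer-proposed cases over the tester's existing cases:

def missed_case_ratio(existing_cases, proposed_by_reviewers):
    """Ratio of reviewer-proposed test cases to the tester's existing cases."""
    if existing_cases == 0:
        raise ValueError("tester submitted no test cases to review")
    return proposed_by_reviewers / existing_cases

# Example: reviewers propose 6 new cases on top of the tester's 120.
print(f"{missed_case_ratio(120, 6):.1%}")  # 5.0%

A lower ratio suggests the tester's original test design was more complete.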