Transcript of Tester performance evaluation

Page 1: Tester performance evaluation

Tester Performance Evaluation

Liang Gao (lgao@sigma-rt.com)

Page 2: Tester performance evaluation

Attributes of Good Testers

• Stubborn but reasonable

• Fearless in front of developers & management

• Sense of smell (digs in the right place, profiling)

• Common sense (knows when to trust developers)

• Learns broad & shallow knowledge (inter-feature dependencies)

• Methodically follows the process/procedure

Page 3: Tester performance evaluation

Definition of Performance Management

• Define performance metrics that lead testers to deliver quality work.

• The metrics should not be something testers can satisfy without doing real, useful work.

• Bug counts?

• Hours of work?

• Certification?

• Peer ratings?

• Customer ratings?

Page 4: Tester performance evaluation

Key Tasks of a Tester

• Writing bug reports?

• Designing, running, modifying test cases?

• Developing test strategies / plans / ideas?

• Editing technical documentation?

• Interacting with developers?

• Writing support materials for the help desk or field support?

• Facilitating inspections / reviews?

• Requirements analysis: meeting, interviewing, interpreting the needs of stakeholders?

• Release management? Archiving? Configuration management?

• Developing automation scripts?

Page 5: Tester performance evaluation

Different Roles in the Test Team

• Management

– Dev Test Group manager

– System and solution test manager

– Regression test manager

– Tools group manager

• Engineer

– New feature testing engineer

– Regression engineer

– System / Solution testing engineer

– Testing Tools developer

Page 6: Tester performance evaluation

Performance is Relative

[Diagram: testers are ranked into a Top Performance Group, a Middle Group, and a Bottom Performance Group; each group is split again into its own Top Sub Group and Bottom Sub Group.]
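One way to read the diagram: rank everyone by a composite score, cut the ranking into top / middle / bottom groups, then halve each group into sub-groups. The sketch below assumes a 20/60/20 split and a single numeric score per tester; both are illustrative assumptions, not from the deck.

def rank_into_groups(scores, top_pct=0.2, bottom_pct=0.2):
    """Split testers into top / middle / bottom groups by score."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    n = len(ranked)
    top_n = max(1, int(n * top_pct))
    bottom_n = max(1, int(n * bottom_pct))
    return {
        "top": ranked[:top_n],
        "middle": ranked[top_n:n - bottom_n],
        "bottom": ranked[n - bottom_n:],
    }

def split_sub_groups(members):
    """Halve a group into its top and bottom sub-groups."""
    half = (len(members) + 1) // 2  # top sub-group gets the odd member
    return {"top_sub": members[:half], "bottom_sub": members[half:]}

groups = rank_into_groups({"ann": 91, "bo": 85, "cy": 78, "dee": 70, "ed": 62})
print({name: split_sub_groups(m) for name, m in groups.items()})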

Page 7: Tester performance evaluation

Quantity Metrics of a Dev Test Engineer

• Number of initiatives taken and delivered (under minimum supervision)

• Number of defects found + fixed defects

• Number of test cases designed

• Number of test cases manually executed

• Lines of automation scripts developed

• Number of review meetings attended / invited to as a reviewer, and number of feedback items given

• Average defect response time

• Number of test cases missed during the design phase

• Number of defects missed during cross testing
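These counts and the average response time can be pulled mechanically from a defect-tracker export. Below is a minimal sketch of that calculation; the record fields, timestamp format, and function name are assumptions for illustration, not from the deck.

from datetime import datetime
from statistics import mean

# Illustrative defect records; field names are assumed.
defects = [
    {"reported": "2024-03-01T09:00", "responded": "2024-03-01T11:30", "fixed": True},
    {"reported": "2024-03-02T10:00", "responded": "2024-03-03T10:00", "fixed": False},
]

def quantity_metrics(defects, cases_designed, cases_executed):
    """Aggregate the counting metrics listed on this slide."""
    fmt = "%Y-%m-%dT%H:%M"
    response_hours = [
        (datetime.strptime(d["responded"], fmt)
         - datetime.strptime(d["reported"], fmt)).total_seconds() / 3600
        for d in defects
    ]
    return {
        "defects_found": len(defects),
        "defects_fixed": sum(d["fixed"] for d in defects),
        "test_cases_designed": cases_designed,
        "test_cases_executed": cases_executed,
        "avg_response_hours": round(mean(response_hours), 1),
    }

print(quantity_metrics(defects, cases_designed=40, cases_executed=120))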

Page 8: Tester performance evaluation

Quantity Metrics of a Regression Test Engineer

• Number of initiatives taken and delivered

• Number of regression defects found + fixed defects

• Number of test cases executed

• Number of testbeds integrated

• Number of review meetings attended / invited to as a reviewer, and number of feedback items given

• Average defect response time

Page 9: Tester performance evaluation

Quantity Metrics of a Tools Engineer

• Number of initiatives taken and delivered

• Number of defects fixed

• Number of tools developed

• Lines of code developed

• Number of users using the tool

• Number of review meetings attended / invited to as a reviewer, and number of feedback items given

• Average defect response time

Page 10: Tester performance evaluation

Quantity Metrics of a System Testing Engineer

• Number of initiatives taken and delivered

• Number of defects reported

• Number of test cases executed

• Number of testbeds built

• Number of review meetings attended / invited to as a reviewer, and number of feedback items given

• Average defect response time

• Number of test cases missed during the design phase

• Number of defects missed during cross testing

Page 11: Tester performance evaluation

Quality Metrics of Defects

• Severity and priority of the defects

• Customer-related defects

• Check against the Defect Quality Checklist

Bug Quality Checklist
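The checklist can be turned into a numeric score during the review. Below is a minimal sketch; the checklist items and weights are illustrative assumptions, since the deck only says to check against the Defect Quality Checklist.

DEFECT_CHECKLIST = {
    "clear_reproduction_steps": 3,
    "expected_vs_actual_stated": 3,
    "severity_and_priority_set": 2,
    "logs_or_traces_attached": 2,
}

def defect_quality_score(answers):
    """answers maps checklist item -> True/False from the manager's review."""
    earned = sum(w for item, w in DEFECT_CHECKLIST.items() if answers.get(item))
    return round(100 * earned / sum(DEFECT_CHECKLIST.values()))

print(defect_quality_score({
    "clear_reproduction_steps": True,
    "expected_vs_actual_stated": True,
    "severity_and_priority_set": True,
}))  # 8 of 10 points -> 80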

Page 12: Tester performance evaluation

Quality Metrics of Test Cases

• Complexity and priority of the test cases

• Check against the Test Case Quality Checklist

Test Case Quality Checklist

Page 13: Tester performance evaluation

Quality Metrics of Scripts

• Check against the Script Quality Checklist

Script Quality Checklist

Page 14: Tester performance evaluation

Quality Review Process

• Randomly pick 2 bugs, scripts, or test cases the tester developed in this quarter.

• The manager carries out a face-to-face review with the tester against the checklist and gives a score.
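The random pick itself is simple to make reproducible and unbiased. A minimal sketch, assuming the artifact names and sample size shown here:

import random

def pick_for_review(artifacts, k=2):
    """Randomly choose k artifacts the tester produced this quarter."""
    return random.sample(artifacts, k=min(k, len(artifacts)))

quarter_artifacts = ["BUG-1041", "BUG-1077", "login_check.py", "TC-REG-17"]
print(pick_for_review(quarter_artifacts))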

Page 15: Tester performance evaluation

Quality Review Process

• Missed test case ratio

– During a review session, how many new test cases are proposed by the reviewers, and their ratio to the existing test cases.

• Missed defects during cross testing

– For the same test case, if others can find bugs during cross testing, it counts against the previous tester.

• Root cause analysis of customer-found bugs
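Both review metrics reduce to simple arithmetic. A minimal sketch follows; the function names and sample numbers are illustrative, while the ratio itself (reviewer-proposed cases over existing cases) is read directly from the slide.

def missed_test_case_ratio(existing_cases, proposed_in_review):
    """Ratio of new cases proposed by reviewers to the tester's existing cases."""
    return proposed_in_review / existing_cases if existing_cases else 0.0

def cross_testing_misses(bugs_found_by_others):
    """Bugs others found with the same test cases count against the original tester."""
    return len(bugs_found_by_others)

print(f"{missed_test_case_ratio(50, 5):.0%}")  # 5 new cases vs. 50 existing -> 10%
print(cross_testing_misses(["BUG-2001"]))      # -> 1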
