Transcript of Experiences with Data Feedback - Better Software 2004 - Ben Linders
Rev A 2004-06-28 1
Make What’s Counted Count: Experiences with Data Feedback
2004 Better Software Conference, September 30, San Jose, USA
Ben Linders, Operational Development & Quality
Ericsson R&D, The Netherlands
[email protected], +31 161 24 9885
Overview
• Why feedback?
• Experiences
  – Product Quality
  – R&D performance
• Key Success Factors & Pitfalls
• Conclusions
Feedback: Bridging the gap between data and actions!
Ericsson, The Netherlands
• Benelux Market Unit & Worldwide Main R&D Design Center
• R&D: Intelligent Networks
  – Strategic Product Management
  – Product marketing & technical sales support
  – Provisioning & total project management
  – Development & maintenance
  – Integration & Verification
  – Customization
  – Supply & support
• 1300 employees, of which 350 in R&D
Measurements on all R&D levels
Measurements: Analyze
Data, lots of data, and nothing but data … how can you get meaning out of it?

We tried:
• Historical data: takes long to build up, much effort before data can be used
• Industry data: hard to get, often too general
• “Brute force” SPC: conclusions didn’t match our perception and insight into the situation
Measurements: Actions
The purpose of measuring is … to take actions!
Insufficient actions:
• No insight into causes
• Debates on the measurement
• Insufficient responsibility for results
Effectiveness of Measurements
Change needed:
• Show the relation between measurements and daily work; people should gain insight into their own performance
• Get people involved, from definition until results
• Assure that the “vital few” actions are done
Co-operation: Line/projects – Operational Development!
Feedback: Definition
“Information about the past, delivered in the present, which may influence future behavior” (Seashore, Seashore & Weinberg, 1992)

Analyze to understand current performance
Change behavior to reach better results

“Information about collected data delivered to the people who have been doing the work, in order to support their understanding of the situation at hand and help them to take the needed actions” (Linders, 2004)
Feedback: Concepts
People are capable of analyzing their own performance and results; you just have to provide them with the right data
Empowerment: Make decisions at the lowest possible level
Assure that you have valid data, before drawing conclusions
Don’t use data to punish people
Feedback is a means to discuss data in an open atmosphere, enabling early conclusions and actions!
Feedback: Deployment
• Feedback should be:
  – on something that is considered important
  – quick and frequent
  – specific, valid, and understandable

Start small, with a team that is open to it and willing to try
Get feedback on how you are doing feedback
Experience 1: Measuring Product Quality
Old approach: the Quality Engineer gathered data, did the analysis, and presented conclusions to design and test teams (often in a big report)

Drawbacks:
• Teams didn’t understand the data
• Data became available when the project was finished: too late
• No insight into how the measurement related to their work
• Teams didn’t feel the need for changes

Result:
Hardly any improvement of product or process quality
Project Defect Model
• Project Defect Model:
  – to control quality of the product during development
  – and improve development/inspection/test processes
• Way of working:
  – Estimate the number of defects made during development, per phase
  – Estimate the defect detection rate per phase
  – Track the estimates against the actual number of defects found
Project Defect Model: Feedback
New approach: the Quality Engineer provides the model, and coaches the teams in estimating, tracking actuals, and drawing conclusions

Benefits:
• Teams develop an understanding of their way of working
• Teams have better insight into their progress/results
• Teams feel involved: it’s their data and their conclusions

Result:
• Conclusions (problems/risks) lead to early actions
• Teams get recognized for good results
Experience 2: Steering R&D Performance
Old approach: the Quality Engineer collected target data and presented conclusions to management (or reported on them)

Drawbacks:
• Much debate about the data/measurement
• No insight into causes when targets were not met
• Blame and denial

Result:
Metrics didn’t support controlled improvement of the performance
Balanced Scorecard
• Balanced Scorecard:
  – a comprehensive set of measurable targets
  – from different focus areas
• Way of working:
  – Collection of data by the Quality Engineers
  – Feedback/interview sessions with managers:
    • Show the raw data
    • Ask management for an explanation
    • Have management draw conclusions and take actions
  – Document the conclusions/actions with the data
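A scorecard of measurable targets from different focus areas can be modeled as a simple data structure with a status flag per target, so the feedback session starts from the raw data rather than pre-digested conclusions. The focus areas, target names, figures, and thresholds below are invented examples, not the scorecard Ericsson used.

```python
# Hypothetical sketch of a balanced-scorecard record with simple
# red/amber/green flagging; all targets and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class Target:
    area: str                       # focus area, e.g. "quality", "delivery"
    name: str
    goal: float
    actual: float
    higher_is_better: bool = True   # some metrics (e.g. fault rates) should be low

    def status(self) -> str:
        """Flag the target: green = goal met, amber = within 10%, red = worse."""
        ratio = (self.actual / self.goal) if self.higher_is_better else (self.goal / self.actual)
        if ratio >= 1.0:
            return "green"
        return "amber" if ratio >= 0.9 else "red"

scorecard = [
    Target("quality", "fault slip-through (%)", goal=10, actual=14, higher_is_better=False),
    Target("delivery", "milestones on time (%)", goal=95, actual=96),
]

# Show the raw data plus the flag; the explanation, conclusions,
# and actions stay with management, as the slide describes.
for t in scorecard:
    print(f"[{t.status():5}] {t.area}/{t.name}: actual {t.actual} vs goal {t.goal}")
```

Keeping the flag computation trivial and visible matters here: the point of the feedback session is for managers to explain the raw numbers themselves, not to debate a black-box score.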
Balanced Scorecard: Feedback
New approach: the Quality Engineer shows the raw data, signals trends, anomalies, and red flags, and asks critical questions, working towards actions

Benefits:
• Management knows what happened; together with the Quality Engineer they can pinpoint and analyze the causes of insufficient performance
• Management feels involved: it’s their data and their conclusions

Result:
• Managers take earlier action
• The Management Team focuses on show stoppers, while still keeping an overview of total performance
Key Success Factors
Data collected must relate to the organization’s goals
Management Support is crucial
Quality Engineers have a central role
People providing the data should be rewarded
Order of data – analysis – conclusions – actions – predictions
Communicate, communicate, communicate!
Pitfalls
People might distrust the data and say that it is wrong:
Ask them for correct data (but don’t try to be perfect: optimize!)

People sometimes do not want to participate in the analysis:
Make clear that only they can draw the conclusions

People sometimes do not want to take actions:
With top-down goal setting, assure that they accept the targets

People are wary of change:
Start with a less critical measurement, and communicate successes
Conclusions
Feedback improves effectiveness of measurements:
– Earlier insight into performance & risks
– Involvement of those whose work is measured
– Actions taken by teams and middle management
Make what’s counted count!
Further Reading
References:
– What Did You Say? The Art of Giving and Receiving Feedback. Seashore, Seashore & Weinberg, Douglas Charles Press, 1992
– Getting Things Done When You Are Not in Charge. Geoffrey M. Bellman, Fireside, 1993
– How to Talk About Work Performance: A Feedback Primer. Esther Derby, CrossTalk, December 2003, pages 13-16
Papers:
– Controlling Product Quality During Development with a Defect Model, Proceedings ESEPG 2003
– Make What’s Counted Count, Better Software magazine, March 2004

Ben Linders, Ericsson R&D, The Netherlands
[email protected], +31 161 24 9885