Using metrics to influence developers, executives, and stakeholders

Transcript of Using metrics to influence developers, executives, and stakeholders

1. What? So what? NOW WHAT? Using metrics to influence developers, executives, and stakeholders. Topic presented by: Larry Maccherone (@LMaccherone)
2. (image-only slide)
3. What? So what? NOW WHAT?
4. What?
5. (image-only slide)
6. So what?
7. What? Visualization is like photography. Impact is a function of focus, illumination, and perspective. Credit: Edward Tufte
8. NOW WHAT?
9. Don't Launch!
10. Prevent your own disastrous decisions
11. Larry Maccherone (@LMaccherone)
12. Cognitive bias works against good decisions
13. "We don't see things the way they are. We see things the way we are." ~The Talmud
14. Next slide is a movie (click to play)
15. Denying the Evidence
16. Five truths about cognitive bias:
    1. Very few people are immune to it.
    2. We all think that we are part of that small group.
    3. You can be trained to get much, much better. (Douglas Hubbard, How to Measure Anything)
    4. We do a first-fit pattern match, not a best-fit pattern match, and we use only about 5% of the information to do the matching.
    5. We evolved to be this way (a survival trait).
17. (image-only slide)
18. An example of overcoming cognitive bias
19. We are overconfident when assessing our own uncertainty. But training can calibrate people so that, of all the times they say they are X% confident, they are right X% of the time. (Chart: percent correct vs. assessed chance of being correct, comparing trained/calibrated and untrained/uncalibrated responders against ideal confidence, with statistical error bounds. Copyright HDR 2007, [email protected])
20. Equivalent Bet calibration. In what year did Newton publish the Universal Law of Gravitation? Pick a year range that you are 90% certain it falls within. You win $1,000 if: (1) the true year is within your range, or (2) you spin a wheel that is 90% green and it lands on green. Adjust your range until options 1 and 2 seem equally attractive. Even pretending to bet money works.
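
The equivalent-bet exercise can be scored after the fact. Below is a minimal Python sketch (not from the talk) of how to check calibration: given a batch of 90%-confidence ranges, it reports how often the true answer actually fell inside. The example questions and ranges are hypothetical.

```python
# Score a batch of 90%-confidence interval estimates.
# A well-calibrated estimator should capture the truth ~90% of the time.

def calibration_hit_rate(estimates):
    """estimates: list of (low, high, truth) tuples."""
    hits = sum(1 for low, high, truth in estimates if low <= truth <= high)
    return hits / len(estimates)

# Hypothetical responses to "pick a range you are 90% sure contains the answer".
responses = [
    (1650, 1700, 1687),   # Newton's Principia (universal gravitation): captured
    (1900, 1920, 1903),   # first powered flight: captured
    (1850, 1870, 1876),   # telephone patent: missed
    (1960, 1975, 1969),   # first Moon landing: captured
]

rate = calibration_hit_rate(responses)
print(f"Stated confidence: 90%, actual hit rate: {rate:.0%}")
# A hit rate well below 90% means overconfidence: widen the ranges
# until the equivalent bet feels like a toss-up.
```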
21. Agile teams, programs, and portfolios benefit from similar calibration exercises
22. How to avoid cognitive bias in decision making: Don't focus on consensus; ritual dissent is a much more successful approach. ("But that doesn't explain _______." An FBI agent knew that some folks were being trained to fly but not to take off and land.) Assign someone the role of devil's advocate (Israel's "10th man"). In other words, really consider the other ALTERNATIVES.
23. Every decision is a forecast!
24. The quality of a decision depends upon: 1. the alternatives considered, and 2. the models used to forecast the outcomes of those alternatives.
25. Group decisions
26. Why do people favor different alternatives (and disagree)? 1. Different models 2. Different values 3. Different risk tolerance. Fear-based decision making.
27. (image-only slide)
28. For a given alternative, let Pg = probability of the good thing happening and Vg = value of the good thing happening. Then: value of the alternative = Pg × Vg.
29. An agile product management example
30. Which strategy is best for your company? For your career?
    Strategy 1: worst case -$1M (25%), likely case $1M (50%), best case $8M (25%)
    Strategy 2: worst case $1M (25%), likely case $2M (50%), best case $2M (25%)
    Strategy 1: PW × VW = .25 × -$1.00M = -$0.25M; PL × VL = .50 × $1.00M = $0.50M; PB × VB = .25 × $8.00M = $2.00M; total = $2.25M
    Strategy 2: PW × VW = .25 × $1.00M = $0.25M; PL × VL = .50 × $2.00M = $1.00M; PB × VB = .25 × $2.00M = $0.50M; total = $1.75M
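
A minimal Python sketch of the expected-value arithmetic above (value of an alternative = probability × value, summed over outcomes). The strategy numbers come from the slide.

```python
# Expected value of an alternative: sum of probability * value over its outcomes.

def expected_value(outcomes):
    """outcomes: list of (probability, value_in_millions) pairs."""
    return sum(p * v for p, v in outcomes)

strategy_1 = [(0.25, -1.0), (0.50, 1.0), (0.25, 8.0)]   # worst, likely, best
strategy_2 = [(0.25,  1.0), (0.50, 2.0), (0.25, 2.0)]

print(f"Strategy 1 expected value: ${expected_value(strategy_1):.2f}M")  # $2.25M
print(f"Strategy 2 expected value: ${expected_value(strategy_2):.2f}M")  # $1.75M
```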
31. If you get only one project, then Strategy 2 is better 75% of the time. If you get many projects, then Strategy 1 is better 100% of the time. How many projects do you need for Strategy 1 to be better more often than not?
32. (image-only slide)
33. Play with it yourself at: http://jsfiddle.net/lmaccherone/j3wh61r7/
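
The linked jsfiddle is the presenter's own simulation. As an independent stand-in, here is a minimal Python Monte Carlo sketch that estimates how often Strategy 1 beats Strategy 2 as the number of projects grows, using the outcome distributions from slide 30.

```python
import random

# Outcome distributions (probability, value in $M) from the earlier example.
STRATEGY_1 = [(0.25, -1.0), (0.50, 1.0), (0.25, 8.0)]
STRATEGY_2 = [(0.25,  1.0), (0.50, 2.0), (0.25, 2.0)]

def sample_total(strategy, n_projects):
    """Sum of n_projects independent draws from the strategy's outcome distribution."""
    values = [v for _, v in strategy]
    probs = [p for p, _ in strategy]
    return sum(random.choices(values, weights=probs, k=n_projects))

def prob_strategy_1_wins(n_projects, trials=10_000):
    wins = sum(
        sample_total(STRATEGY_1, n_projects) > sample_total(STRATEGY_2, n_projects)
        for _ in range(trials)
    )
    return wins / trials

for n in (1, 2, 3, 5, 10):
    print(f"{n:2d} project(s): Strategy 1 wins {prob_strategy_1_wins(n):.0%} of the time")
```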
34. (image-only slide)
35. Did any of you get emotional about the $1M loss? Did any of you want to question the $8M number? We've totally eliminated fear from the equation and changed the nature of the conversation.
36. One more thing about group decision making: argument is about who is right; decision making is about what is right.
37. An agile delivery date forecast example
38. Monte Carlo forecasting: what it looks like. (Credit for graphic: Rally Software)
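
A minimal sketch of the Monte Carlo forecasting idea, assuming you have historical throughput samples (completed items per iteration): repeatedly simulate how many iterations it takes to burn down the remaining backlog, then read forecasts off the resulting distribution. The numbers here are hypothetical.

```python
import random

# Hypothetical historical throughput: completed backlog items per iteration.
historical_throughput = [6, 4, 9, 5, 7, 3, 8, 6]
remaining_items = 60

def iterations_to_finish(throughput_samples, remaining):
    """One simulated future: draw past throughputs at random until the backlog is done."""
    iterations, done = 0, 0
    while done < remaining:
        done += random.choice(throughput_samples)
        iterations += 1
    return iterations

def forecast(trials=10_000):
    results = sorted(
        iterations_to_finish(historical_throughput, remaining_items) for _ in range(trials)
    )
    # Report forecasts at several confidence levels rather than a single date.
    for pct in (0.50, 0.85, 0.95):
        print(f"{pct:.0%} confident of finishing within {results[int(pct * trials) - 1]} iterations")

forecast()
```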
39. Seek to change the nature of the conversation
40. Why do people favor different alternatives? 1. Different models 2. Different values 3. Different risk tolerance
41. (image-only slide)
42. Models and values: models calculate probability in terms of proxy variables; values translate those probabilities into money. The Monte Carlo forecast didn't give us dollars; it gave us a probability curve for project duration. To translate that into money, we need to know how we value time-to-market, quality, etc. Cost-of-delay analysis should work here.
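
One hedged way to do the translation the slide suggests is a cost-of-delay calculation: combine the simulated durations with an assumed cost per week of delay past a target date. Both the duration samples and the $75k/week figure below are hypothetical assumptions.

```python
# Translate a duration probability curve into money via cost of delay.
# `simulated_weeks` would come from a Monte Carlo run like the one above;
# the values and the $75k/week cost of delay are assumed for illustration.

simulated_weeks = [10, 11, 11, 12, 12, 12, 13, 13, 14, 16]
target_week = 12
cost_of_delay_per_week = 75_000  # assumed value of one week of time-to-market

expected_delay_cost = sum(
    max(0, weeks - target_week) * cost_of_delay_per_week for weeks in simulated_weeks
) / len(simulated_weeks)

print(f"Expected cost of delay: ${expected_delay_cost:,.0f}")
```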
43. Example model and values system: team performance feedback; duplicate great performance; bend the curve early on declining performance; decide what process improvements to implement next. Dimensions of performance: 1. Productivity 2. Predictability 3. Time-to-market 4. Responsiveness 5. Quality 6. Customer satisfaction 7. Employee engagement 8. Build-the-right-thing 9. Code quality
44. Economic value weighting of team performance: proxy variables are translated to something unit-less, and weights (adding to 100%) are applied to the chosen dimensions. Examples: a medical device manufacturer values QUALITY: 60% for quality, 10% each for productivity, predictability, time-to-market, and responsiveness. A mobile game team values TIME-TO-MARKET: 40% for time-to-market, 20% each for productivity and responsiveness, 10% each for quality and predictability.
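
A small sketch of the economic-value weighting described above: each dimension's proxy metric is first normalized to a unit-less 0-1 score, then combined with weights that add to 100%. The weights mirror the medical-device example from the slide; the normalized scores are hypothetical.

```python
# Weighted, unit-less composite of team performance dimensions.

def composite_score(scores, weights):
    """scores and weights are dicts keyed by dimension; weights must sum to 1.0."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must add to 100%"
    return sum(weights[dim] * scores[dim] for dim in weights)

# Medical device manufacturer weighting from the slide (quality dominates).
weights = {
    "quality": 0.60,
    "productivity": 0.10,
    "predictability": 0.10,
    "time_to_market": 0.10,
    "responsiveness": 0.10,
}

# Hypothetical proxy metrics already normalized to unit-less 0-1 scores.
scores = {
    "quality": 0.9,
    "productivity": 0.6,
    "predictability": 0.7,
    "time_to_market": 0.5,
    "responsiveness": 0.8,
}

print(f"Composite performance score: {composite_score(scores, weights):.2f}")
```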
45. Avoid milquetoast values
46. Criteria for great visualization. Credits: Edward Tufte (mostly), Stephen Few, the Gestalt school of psychology
47. 1. Answers the question, "Compared with what?" (So what?) Trends. Benchmarks.
48. 2. Shows causality, or is at least informed by it. The primary chart used by the NASA scientists showed O-ring failure indicators by launch date. Tufte's alternative shows the same data by the critical factor, temperature. The fateful shuttle launch occurred at 31 degrees. Tufte's visualization makes it obvious that there is great risk for any launch at temperatures below 66 degrees.
49. 3. Tells a story with whatever it takes: stills, motion, numbers, graphics, and maybe some fun.
50. 4. Is credible: calculations explained, sources, assumptions, who (name drop?), drill-down, how, etc.
51. 5. Has business value (or value in its social context). The ODIM framework (Outcome, Decision, Insight, Measure): think from outcomes down to measures; effect change from measures up to outcomes. Like Vic Basili's Goal-Question-Metric (GQM), but without the ISO/IEC 15939 baggage.
52. 6. Shows differences easily (aka: save the pie for dessert). Credit: Stephen Few (Perceptual Edge), http://www.perceptualedge.com/articles/08-21-07.pdf
53. 6. Shows differences easily. (continued) Can you compare the market share from one year to the next? Quickly: which two companies are growing share the fastest? One pie chart is bad. Multiple pie charts are worse!
54. 6. Shows comparisons easily. (continued) How about now? Can you compare the market share from one year to the next?
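
As a hedged illustration of the "save the pie for dessert" point, here is a minimal matplotlib sketch that plots market share over several years as one line per company, which makes year-over-year comparisons and growth visible at a glance. The companies and numbers are made up.

```python
import matplotlib.pyplot as plt

# Made-up market share data: one line per company instead of one pie per year.
years = [2011, 2012, 2013, 2014]
share = {
    "Company A": [35, 33, 30, 27],
    "Company B": [25, 27, 30, 34],
    "Company C": [20, 21, 22, 22],
    "Company D": [20, 19, 18, 17],
}

fig, ax = plt.subplots()
for name, values in share.items():
    ax.plot(years, values, marker="o", label=name)
    # Leave the numbers in: label the final value of each line.
    ax.annotate(f"{values[-1]}%", (years[-1], values[-1]),
                textcoords="offset points", xytext=(5, 0))

ax.set_xticks(years)
ax.set_ylabel("Market share (%)")
ax.set_title("Market share by year (compare trends, not pie slices)")
ax.legend()
plt.show()
```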
55. 7. Allows you to see the forest AND the trees.
56. 8. Informs along multiple dimensions.
57. 8. Informs along multiple dimensions. (continued)
58. 9. Leaves in the numbers where possible.
59. 10. Leaves out glitter. (Examples of how NOT to do it.)
60. Top 10 criteria for great visualization:
    1. Answers the question, "Compared with what?" (SO WHAT?)
    2. Shows causality, or is at least informed by it. (NOW WHAT?)
    3. Tells a story with whatever it takes.
    4. Is credible.
    5. Has business value or impact in its social context.
    6. Shows differences easily.
    7. Allows you to see the forest AND the trees.
    8. Informs along multiple dimensions.
    9. Leaves in the numbers where possible.
    10. Leaves out glitter.
    Credits: Edward Tufte, Stephen Few, the Gestalt school of psychology
61. Influencing with data
62. The rider and the elephant: direct the rider, motivate the elephant, shape the path. Jonathan Haidt, The Happiness Hypothesis (also mentioned in Switch)
63. Seven Deadly Sins of Agile Measurement (if you do metrics wrong, you will harm your agile transformation):
    1. Sin: measurement as a lever. Virtue: measurement as feedback.
    2. Sin: unbalanced metrics. Virtue: one metric each for do-it-fast, do-it-right, do-it-on-time, and keep-doing-it.
    3. Sin: believing metrics can replace thinking.
    4. Sin: expensive metrics. Virtue: first work with the data you are already passively gathering.
    5. Sin: using a convenient metric. Virtue: Outcomes, Decisions, Insight, Metric (ODIM).
    6. Sin: bad analysis. Virtue: simple statistics and simulation.
    7. Sin: single-outcome forecasts. Virtue: forecasts with probability.
64. NOW WHAT? Come to the AgileCraft booth: questions answered, the Seven Deadly Sins of Agile Measurement, and a demo of how AgileCraft answers NOW WHAT? and is the best way to scale agile.
65. "They say nobody knows what's gonna happen next: not on a freeway, not in an airplane, not inside our own bodies, and certainly not on a racetrack with 40 other infantile egomaniacs." ~Days of Thunder. "Trying to predict the future is like trying to drive down a country road at night with no lights while looking out the back window." ~Peter Drucker. "Never make predictions, especially about the future." ~Casey Stengel
66. "When you come to a fork in the road, take it!" ~Yogi Berra. What? The metrics and analysis. So what? How it compares/trends and what it means. NOW WHAT? Every decision is a forecast.
67. NOW WHAT? Come to the AgileCraft booth: questions answered, the Seven Deadly Sins of Agile Measurement, and a demo of how AgileCraft answers NOW WHAT? and is the best way to scale agile.
68. Extra Slides