Cause and Effect: What You Need to Know About Desktop Support KPIs
Session #408: April 18, 2013 Jeff Rumburg, Managing Partner, MetricNet LLC
Observations From 25 Years of Desktop Support Data
More than 900 Desktop Support Benchmarks
Global Database
25+ Key Performance Indicators
Nearly 60 Industry Best Practices
Characteristics of World-Class Desktop Support

- Desktop Support consistently exceeds customer expectations
  - The result is high levels of customer satisfaction
  - MTTR is below average for Incidents and Service Requests
- Costs are managed at or below industry average levels
  - Cost per Ticket, per Incident, and per Service Request is below average
  - Total Cost of Ownership (TCO) is minimized
- Desktop Support follows industry best practices
  - Industry best practices are defined, documented, and followed
- Every transaction adds value
  - A positive customer experience drives a positive view of IT overall
© 2013 MetricNet, LLC, www.metricnet.com
Tickets, Incidents, and Service Requests

Incident Volume + Service Request Volume = Ticket Volume

Incidents: unplanned work that requires a physical touch to a device
  - Hardware break/fix
  - Device failure
  - Connectivity failure

Service Requests: planned work that requires a physical touch to one or more devices
  - Move/Add/Change
  - Hardware or software upgrade
  - Device refresh
  - Device set-up
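The taxonomy above is simple but worth encoding, because the volume identity (Incidents + Service Requests = Tickets) underpins every per-ticket metric that follows. A minimal sketch; the `Ticket` class and the sample data are illustrative, not from MetricNet:

```python
from dataclasses import dataclass

@dataclass
class Ticket:
    description: str
    planned: bool  # True = service request (planned), False = incident (unplanned)

# Hypothetical month of desktop support work
tickets = [
    Ticket("Hardware break/fix", planned=False),    # incident
    Ticket("Connectivity failure", planned=False),  # incident
    Ticket("Move/Add/Change", planned=True),        # service request
    Ticket("Device refresh", planned=True),         # service request
    Ticket("Device set-up", planned=True),          # service request
]

incident_volume = sum(1 for t in tickets if not t.planned)
service_request_volume = sum(1 for t in tickets if t.planned)
ticket_volume = len(tickets)

# The identity from the slide: Incident Volume + Service Request Volume = Ticket Volume
assert incident_volume + service_request_volume == ticket_volume
print(incident_volume, service_request_volume, ticket_volume)  # 2 3 5
```

Keeping the planned/unplanned flag on every ticket is what lets the later per-incident and per-service-request cost and time metrics be computed from one ticket log.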
The Premise Behind Desktop Support KPIs

We've all heard the expression: "If you're not measuring it, you're not managing it!" But there's more to the story… lots more!
Two Paradigms for Desktop Support KPIs

The Historical Approach: Measurement (75%), Analysis (15%), Prescription (7.5%), Action (2.5%)
The Holistic Approach: Measurement (5%), Analysis (20%), Prescription (30%), Action (45%)

Value increases as the emphasis shifts from measurement toward analysis, prescription, and action.
An Industry MegaTrend: The Holistic Use of KPIs

The model is a continuous improvement cycle (Measure, Diagnose, Prescribe, Implement) that drives higher levels of customer enthusiasm with each pass.

Component | Description
1. Measure | Measure help desk performance on an ongoing basis
2. Diagnose | Benchmark performance and conduct a gap analysis
3. Prescribe | Define actions to close the gap
4. Implement | Implement your action plan and improve performance
The Most Common Desktop Support KPIs

Cost
  Cost per Ticket
  Cost per Incident
  Cost per Service Request
Productivity
  Technician Utilization
  Tickets per Technician-Month
  Incidents per Technician-Month
  Service Requests per Technician-Month
  Ratio of Technicians to Total Headcount
Service Level
  Average Incident Response Time (min)
  % of Incidents Resolved in 24 Hours
  Mean Time to Resolve Incidents (hours)
  Mean Time to Complete Service Requests (days)
Quality
  Customer Satisfaction
  First Contact Resolution Rate (Incidents)
  % Resolved Level 1 Capable
  % of Tickets Re-opened
Technician
  Technician Satisfaction
  New Technician Training Hours
  Annual Technician Training Hours
  Annual Technician Turnover
  Technician Absenteeism
  Technician Tenure (months)
  Technician Schedule Adherence
Ticket Handling
  Average Incident Work Time (min)
  Average Service Request Work Time (min)
  Average Travel Time per Ticket (min)
Workload
  Tickets per Seat per Month
  Incidents per Seat per Month
  Service Requests per Seat per Month
  Incidents as a % of Total Ticket Volume

And there are hundreds more!
Controllable vs. Non-Controllable KPIs

Cost and Quality are the Macro Measures
- The Macro Measures tell the story of your performance
- They are good for communicating the performance of Desktop Support
- But you cannot control them directly

Workload metrics are driven by Causal Factors
- The Causal Factors define the volume and mix of work performed by Desktop Support
- Desktop Support has very little control over the Causal Factors; they are a function of your IT environment

Productivity, Service Level, Technician, and Ticket Handling metrics are the underlying drivers of performance
- You can control these metrics directly
- It is through these metrics that you influence the Macro Measures and improve your performance
Causal Factors: The Workload Drivers

Causal Factors include:
- Device count and mix
  - Mix of desktop vs. laptop computers
  - Number of mobile devices
  - Average age of devices
- Standardization of the desktop environment
- User population density
  - High-rise vs. campus vs. field
- User work location
  - Office vs. home vs. field
- And numerous other factors…
The Most Common Desktop Support KPIs

Cost
  Cost per Ticket
  Cost per Incident
  Cost per Service Request
Productivity
  Technician Utilization
  Tickets per Technician-Month
  Incidents per Technician-Month
  Service Requests per Technician-Month
  Ratio of Technicians to Total Headcount
Service Level
  Average Incident Response Time (min)
  % of Incidents Resolved in 24 Hours
  Mean Time to Resolve Incidents (hours)
  Mean Time to Complete Service Requests (days)
Quality
  Customer Satisfaction
  First Contact Resolution Rate (Incidents)
  % Resolved Level 1 Capable
  % of Tickets Re-opened
Technician
  Technician Satisfaction
  New Technician Training Hours
  Annual Technician Training Hours
  Annual Technician Turnover
  Technician Absenteeism
  Technician Tenure (months)
  Technician Schedule Adherence
Ticket Handling
  Average Incident Work Time (min)
  Average Service Request Work Time (min)
  Average Travel Time per Ticket (min)
Workload
  Tickets per Seat per Month
  Incidents per Seat per Month
  Service Requests per Seat per Month
  Incidents as a % of Total Ticket Volume

And there are hundreds more!
Desktop Support KPIs: Which Ones Really Matter?

Metric Type | KPI
Cost | Cost per Ticket
Productivity | Technician Utilization
Quality | Customer Satisfaction
Quality | First Contact Resolution Rate (Incidents)
Technician | Technician Job Satisfaction
Service Level | Mean Time to Resolve
TCO | % Resolved Level 1 Capable
Aggregate | Balanced Scorecard
Benchmarking Case Study: Optimizing Desktop Support Performance
The Benchmarking Methodology

COMPARE your Desktop Support performance against the performance of a benchmarking peer group, then:
1. Determine how best-in-class organizations achieve superiority
2. Adopt selected practices of the best-in-class
3. Build a sustainable competitive advantage (the ultimate objective of benchmarking)

Read MetricNet's whitepaper on Desktop Support Benchmarking. Go to www.metricnet.com to download your copy!
The Goal of Benchmarking

[Chart: Customer Satisfaction vs. Cost per Ticket. The starting point (before benchmarking) sits on the average performance curve; the goal is to move onto the best-in-class performance curve (after benchmarking), delivering higher customer satisfaction at a lower cost per ticket.]
Benchmarking Data Summary

Deskside Support KPIs | Your Performance | Peer Group Average | Peer Group Min | Peer Group Max

Cost
Cost per Ticket | $81 | $68 | $55 | $117
Cost per Incident | $57 | $45 | $34 | $85
Cost per Service Request | $144 | $128 | $107 | $196

Quality
Customer Satisfaction | 71% | 81% | 71% | 89%
First Contact Resolution Rate (Incidents) | 49% | 55% | 44% | 82%
% Resolved Level 1 Capable | 9% | 17% | 4% | 23%
% of Tickets Re-opened | 2.0% | 6.0% | 1.0% | 8.0%

Productivity
Technician Utilization | 46% | 54% | 36% | 84%
Tickets per Technician per Month | 65 | 81 | 58 | 93
Incidents per Technician per Month | 81 | 100 | 72 | 131
Service Requests per Technician per Month | 42 | 53 | 35 | 71
Ratio of Technicians to Total Headcount | 73% | 84% | 69% | 88%

Service Level
Average Incident Response Time (hours) | 9.1 | 6.3 | 4.2 | 12.1
% of Incidents Resolved in 24 Hours | 33% | 51% | 33% | 62%
Mean Time to Resolve Incidents (days) | 2.6 | 1.9 | 0.7 | 4.3
Mean Time to Complete Service Requests (days) | 12.7 | 7.4 | 4.6 | 12.7

Technician
Technician Satisfaction | 71% | 84% | 66% | 87%
New Technician Training Hours | 10 | 46 | 10 | 80
Annual Technician Training Hours | 0 | 21 | 0 | 28
Annual Technician Turnover | 53% | 24% | 8% | 53%
Technician Absenteeism | 19% | 14% | 9% | 22%
Technician Tenure (months) | 11 | 47 | 11 | 64
Technician Schedule Adherence | 64% | 87% | 64% | 91%

Ticket Handling
Average Incident Work Time (min) | 65 | 49 | 38 | 77
Average Service Request Work Time (min) | 119 | 94 | 80 | 110
Average Travel Time per Ticket (min) | 18 | 23 | 12 | 49

Workload
Tickets per Seat per Month | 0.79 | 0.90 | 0.70 | 1.68
Incidents per Seat per Month | 0.57 | 0.65 | 0.5 | 1.2
Service Requests per Seat per Month | 0.22 | 0.25 | 0.2 | 0.48
Incidents as a % of Total Ticket Volume | 72% | 72% | 71% | 71%
The Two Foundation Metrics: Cost and Customer Satisfaction

Cost per Ticket and Customer Satisfaction are the two foundation metrics of Desktop Support.
Cost and Quality: Nothing Else Matters!

[Chart: Customer Satisfaction (Effectiveness) vs. Cost per Ticket (Efficiency), with your peer group plotted in four quadrants]
- Top quartile on both dimensions: Efficient and Effective, i.e., World-Class Desktop Support
- Middle quartiles: Effective but not Efficient, or Efficient but not Effective
- Lower quartile: neither efficient nor effective
Technician Utilization: The Key Driver of Cost

Of the two foundation metrics, Cost per Ticket is driven primarily by Technician Utilization.
Technician Utilization vs. Cost per Ticket

[Scatter chart: Cost per Ticket ($0–$300) against Technician Utilization (30%–90%); Cost per Ticket declines as Technician Utilization increases]
Desktop Support Technician Utilization Defined

Technician Utilization =
[(Average number of Incidents handled by a technician in a month) × (Average Incident Work Time)
+ (Average number of Service Requests handled by a technician in a month) × (Average Service Request Work Time)
+ (Average number of Tickets handled by a technician in a month) × (Average Travel Time per Ticket)]
÷ [(Average number of days worked in a month) × (Number of work hours in a day) × (60 minutes/hr)]

- Technician Utilization is a measure of technician work and travel time, divided by total time at work during the month
- It takes into account both incidents and service requests handled by the technicians
- But it does not make adjustments for sick days, holidays, training time, project time, or idle time
Example: Desktop Support Technician Utilization

Incidents per Technician per Month = 60
Service Requests per Technician per Month = 24
Average Tickets per Technician per Month = 84
Average Incident Work Time = 32 minutes
Average Service Request Work Time = 59 minutes
Average Travel Time per Ticket = 41 minutes

Technician Utilization = [(60 Incidents × 32 min) + (24 Service Requests × 59 min) + (84 Tickets × 41 min)]
÷ [(21.5 working days per month) × (7.5 work hours per day) × (60 minutes/hr)]
= 6,780 minutes ÷ 9,675 minutes ≈ 70%
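The formula translates directly into code. A small sketch using the worked example's own inputs; the function name and signature are mine, and the defaults of 21.5 days and 7.5 hours per day match the example:

```python
def technician_utilization(incidents, incident_work_min,
                           service_requests, sr_work_min,
                           tickets, travel_min_per_ticket,
                           days_worked=21.5, hours_per_day=7.5):
    """Work + travel minutes in a month, divided by total minutes at work."""
    work_minutes = (incidents * incident_work_min
                    + service_requests * sr_work_min
                    + tickets * travel_min_per_ticket)
    available_minutes = days_worked * hours_per_day * 60  # 60 minutes per hour
    return work_minutes / available_minutes

# The worked example from the slide
util = technician_utilization(
    incidents=60, incident_work_min=32,
    service_requests=24, sr_work_min=59,
    tickets=84, travel_min_per_ticket=41,
)
print(f"{util:.0%}")  # 70%
```

Note that, per the definition above, the denominator is total scheduled time: sick days, holidays, training, project work, and idle time are deliberately not subtracted out.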
The Drivers of Customer Satisfaction

Customer Satisfaction is driven by First Contact Resolution Rate (Incidents), Technician Satisfaction, and Service Levels (MTTR), while Cost per Ticket is driven by Technician Utilization.
Technician Satisfaction vs. Customer Satisfaction

[Scatter chart: Customer Satisfaction (0%–100%) against Technician Satisfaction (30%–100%)]
Incident FCR vs. Customer Satisfaction

[Scatter chart: Customer Satisfaction (0%–100%) against First Contact Resolution Rate for Incidents (30%–90%)]
Incident MTTR vs. Customer Satisfaction

[Scatter chart: Customer Satisfaction (0%–100%) against Incident MTTR (0–8 days)]
The Drivers of Technician Satisfaction

Technician Satisfaction, one of the drivers of Customer Satisfaction, is itself driven by Coaching, Career Path, and Training Hours.
New Technician Training Hours vs. Technician Job Satisfaction

[Scatter chart: Technician Satisfaction (20%–100%) against New Technician Training Hours (0–250)]
Annual Training Hours vs. Technician Job Satisfaction

[Scatter chart: Technician Satisfaction (40%–100%) against Annual Technician Training Hours (0–140)]
Technician Experience vs. Incident FCR

[Scatter chart: Incident FCR (20%–90%) against Technician Time on Job (0–70 months)]
Measuring Defects: % Resolved Level 1 Capable

% Resolved Level 1 Capable (the share of Desktop Support tickets that could have been resolved at Level 1) adds a defect measure to the cause-and-effect model alongside Technician Utilization, FCR (Incidents), Technician Satisfaction, and Service Levels (MTTR).
Cost of Resolution: North American Averages

Support Level | Cost per Ticket
Vendor | $471
Field Support | $196
Level 3 IT (apps, networking, NOC, etc.) | $85
Level 2: Desktop Support | $62
Level 1: Service Desk | $22
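One way to read this table: every Level 1-capable ticket that Desktop Support works anyway costs the difference between the two levels. A hedged sketch of the potential savings, assuming the $22 Service Desk and $62 Desktop Support averages above; the monthly ticket volume is hypothetical, and the 9% Level 1-capable rate is the case study's figure:

```python
# North American average cost per ticket, from the table above (assumed pairing)
COST_LEVEL_1 = 22   # Level 1: Service Desk
COST_LEVEL_2 = 62   # Level 2: Desktop Support

# Illustrative inputs: a hypothetical desktop ticket volume, combined with
# the case study's 9% of tickets resolved that were Level 1 capable
monthly_desktop_tickets = 2_000
level_1_capable_rate = 0.09

shiftable = monthly_desktop_tickets * level_1_capable_rate       # 180 tickets/month
monthly_savings = shiftable * (COST_LEVEL_2 - COST_LEVEL_1)      # $40 saved per ticket
annual_savings = monthly_savings * 12
print(f"${annual_savings:,.0f} per year")  # $86,400 per year
```

The same arithmetic applies at every boundary in the table, which is why % Resolved Level 1 Capable is treated as a TCO metric.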
A Summary of the Major KPI Correlations

- Cost per Ticket is driven by Technician Utilization, which in turn is driven by Work/Travel Time, the ratio of Technicians to Total FTEs, Absenteeism/Turnover, and Scheduling Efficiency
- Customer Satisfaction is driven by First Contact Resolution Rate (Incidents), Technician Satisfaction, and Service Levels (MTTR)
- Technician Satisfaction is in turn driven by Coaching, Career Path, and Training Hours
- % Resolved Level 1 Capable is the model's defect measure
Benchmarking Case Study: The Diagnosis

Deskside Support KPIs | Your Performance | Peer Group Average | Peer Group Min | Peer Group Max

Cost
Cost per Ticket | $81 | $68 | $55 | $117
Cost per Incident | $57 | $45 | $34 | $85
Cost per Service Request | $144 | $128 | $107 | $196

Quality
Customer Satisfaction | 71% | 81% | 71% | 89%
First Contact Resolution Rate (Incidents) | 49% | 55% | 44% | 82%
% Resolved Level 1 Capable | 9% | 17% | 4% | 23%
% of Tickets Re-opened | 2.0% | 6.0% | 1.0% | 8.0%

Productivity
Technician Utilization | 46% | 54% | 36% | 84%
Tickets per Technician per Month | 65 | 81 | 58 | 93
Incidents per Technician per Month | 81 | 100 | 72 | 131
Service Requests per Technician per Month | 42 | 53 | 35 | 71
Ratio of Technicians to Total Headcount | 73% | 84% | 69% | 88%

Service Level
Average Incident Response Time (hours) | 9.1 | 6.3 | 4.2 | 12.1
% of Incidents Resolved in 24 Hours | 33% | 51% | 33% | 62%
Mean Time to Resolve Incidents (days) | 2.6 | 1.9 | 0.7 | 4.3
Mean Time to Complete Service Requests (days) | 12.7 | 7.4 | 4.6 | 12.7

Technician
Technician Satisfaction | 71% | 84% | 66% | 87%
New Technician Training Hours | 10 | 46 | 10 | 80
Annual Technician Training Hours | 0 | 21 | 0 | 28
Annual Technician Turnover | 53% | 24% | 8% | 53%
Technician Absenteeism | 19% | 14% | 9% | 22%
Technician Tenure (months) | 11 | 47 | 11 | 64
Technician Schedule Adherence | 64% | 87% | 64% | 91%

Ticket Handling
Average Incident Work Time (min) | 65 | 49 | 38 | 77
Average Service Request Work Time (min) | 119 | 94 | 80 | 110
Average Travel Time per Ticket (min) | 18 | 23 | 12 | 49

Workload
Tickets per Seat per Month | 0.79 | 0.90 | 0.70 | 1.68
Incidents per Seat per Month | 0.57 | 0.65 | 0.5 | 1.2
Service Requests per Seat per Month | 0.22 | 0.25 | 0.2 | 0.48
Incidents as a % of Total Ticket Volume | 72% | 72% | 71% | 71%
Desktop KPIs: Which Ones Really Matter?

Metric Type | KPI
Cost | Cost per Ticket
Productivity | Technician Utilization
Quality | Customer Satisfaction
Quality | First Contact Resolution Rate (Incidents)
Technician | Technician Satisfaction
Service Level | Mean Time to Resolve
TCO | % Resolved Level 1 Capable
Aggregate | Balanced Scorecard
Aggregate Metrics: The Balanced Scorecard

Step 1: Eight critical performance metrics have been selected for the scorecard
Step 2: Each metric has been weighted according to its relative importance
Step 3: For each performance metric, the highest and lowest performance levels in the benchmark are recorded
Step 4: Your actual performance for each metric is recorded
Step 5: Your score for each metric is calculated: (worst case - actual performance) / (worst case - best case) × 100
Step 6: Your balanced score for each metric is calculated: metric score × weighting

Performance Metric | Weighting | Worst Case | Best Case | Your Actual Performance | Metric Score | Balanced Score
Cost per Incident | 15.0% | $312.00 | $19.00 | $48.00 | 90.1% | 13.5%
Cost per Service Request | 15.0% | $556.00 | $41.00 | $113.00 | 86.0% | 12.9%
Customer Satisfaction | 25.0% | 67.0% | 94.0% | 83.0% | 59.3% | 14.8%
Technician Utilization | 15.0% | 36.0% | 84.0% | 59.0% | 47.9% | 7.2%
First Contact Resolution Rate (Incidents) | 15.0% | 38.0% | 84.0% | 61.0% | 50.0% | 7.5%
% of Incidents Resolved in 24 Hours | 5.0% | 19.0% | 71.0% | 58.0% | 75.0% | 3.8%
Mean Time to Complete Service Requests (days) | 5.0% | 18.4 | 1.8 | 5.8 | 75.9% | 3.8%
Technician Satisfaction | 5.0% | 59.0% | 93.0% | 84.0% | 73.5% | 3.7%
Total | 100.0% | N/A | N/A | N/A | N/A | 67.1%
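Steps 5 and 6 are mechanical, so the whole scorecard reduces to a few lines of code. A sketch reproducing the case study's table; note that the worst/best pair encodes each metric's direction, so one formula handles both lower-is-better (cost, MTTC) and higher-is-better (satisfaction, FCR) metrics:

```python
def metric_score(worst, best, actual):
    # Step 5: (worst case - actual) / (worst case - best case).
    # Works unchanged for lower-is-better and higher-is-better metrics.
    return (worst - actual) / (worst - best)

# (name, weighting, worst case, best case, your actual performance)
scorecard = [
    ("Cost per Incident",                  0.15, 312.00, 19.00, 48.00),
    ("Cost per Service Request",           0.15, 556.00, 41.00, 113.00),
    ("Customer Satisfaction",              0.25, 0.67,   0.94,  0.83),
    ("Technician Utilization",             0.15, 0.36,   0.84,  0.59),
    ("First Contact Resolution (Inc.)",    0.15, 0.38,   0.84,  0.61),
    ("% Incidents Resolved in 24 Hours",   0.05, 0.19,   0.71,  0.58),
    ("Mean Time to Complete SRs (days)",   0.05, 18.4,   1.8,   5.8),
    ("Technician Satisfaction",            0.05, 0.59,   0.93,  0.84),
]

# Step 6: weight each metric score, then sum
balanced_score = sum(w * metric_score(worst, best, actual)
                     for _, w, worst, best, actual in scorecard)
print(f"{balanced_score:.1%}")  # 67.1%
```

The same function scores the technician-level scorecard later in the deck; only the metric list and weightings change.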
Desktop Support Balanced Scorecard Benchmark

[Chart: distribution of Desktop Support balanced scores across the benchmark, 0%–100%]

Key Statistics:
High: 86.2%
Average: 51.9%
Median: 50.1%
Low: 16.0%
Your Score: 67.1%
Desktop Support Scorecard Trend

[Chart: monthly Desktop Support balanced score and 12-month average, January through December, ranging from 40% to 85%]
Establishing Performance Targets

Desktop Support Metric | Current Performance | Performance Target
Cost per Incident | $48.00 | $42.50
Cost per Service Request | $113.00 | $98.00
Customer Satisfaction | 83.0% | 90.0%
Technician Utilization | 59.0% | 65.0%
Incident First Contact Resolution | 61.0% | 70.0%
% of Incidents Resolved in 24 Hours | 58.0% | 70.0%
Mean Time to Complete Service Requests (days) | 5.8 | 3.5
Technician Satisfaction | 84.0% | 90.0%
Desktop Support Balanced Score | 67.1% | 85.6%

Achieving the performance targets recommended above will improve the Desktop Support Balanced Score from 67.1% to 85.6%.
Driving Individual Accountability: The Technician Scorecard

The technician scorecard is built the same way as the organizational balanced scorecard: eight weighted metrics, each scored as (worst case - actual performance) / (worst case - best case) × 100, then multiplied by its weighting.

Performance Metric | Weighting | Worst Case | Best Case | Your Actual Performance | Metric Score | Balanced Score
Customer Satisfaction | 20.0% | 67.0% | 94.0% | 83.0% | 59.3% | 11.9%
First Contact Resolution Rate (Incidents) | 20.0% | 38.0% | 84.0% | 61.0% | 50.0% | 10.0%
Incidents Closed this Month | 15.0% | 19.9 | 76.9 | 59.8 | 69.9% | 10.5%
Service Requests Closed This Month | 15.0% | 15.6 | 42.2 | 28.5 | 48.6% | 7.3%
Unplanned Absenteeism | 7.5% | 0.0% | 22.0% | 19.0% | 86.4% | 6.5%
Teamwork | 7.5% | 0.0 | 5.0 | 3.5 | 69.9% | 5.2%
Initiative | 7.5% | 0.0 | 5.0 | 5.0 | 100.0% | 7.5%
Mentoring | 7.5% | 0.0 | 5.0 | 4.5 | 90.0% | 6.7%
Total | 100.0% | N/A | N/A | N/A | N/A | 43.7%
Monthly Technician Performance Postings

(Each row: Technician Number, then monthly scorecard performance for Sep '10, Aug '10, Jul '10, Jun '10, May '10, and Apr '10, then the six-month average)
11 95.8% 98.0% 97.1% 95.7% 98.3% 97.3% 97.0%
32 92.8% 92.1% 90.3% 89.3% 84.6% 92.2% 90.2%
21 91.5% 88.5% 83.2% 94.0% 93.7% 93.5% 90.7%
35 91.0% 86.8% 85.2% 78.5% 80.5% 68.2% 81.7%
14 89.5% 89.1% 90.0% 90.1% 92.3% 92.1% 90.5%
26 83.8% 84.4% 90.2% 86.5% 77.8% 63.9% 81.1%
25 83.0% 73.6% 81.9% 72.1% 84.8% 87.9% 80.5%
15 70.4% 66.6% 53.3% 56.3% 56.6% 39.0% 57.0%
20 64.9% 66.5% 70.1% 56.9% 40.9% 72.7% 62.0%
31 62.3% 47.4% 22.7% 38.4% 26.0% 93.0% 48.3%
16 61.0% 62.8% 54.5% 45.9% 41.7% 62.7% 54.8%
17 57.9% 42.1% 32.3% 71.6% 60.3% 60.3% 54.1%
33 56.8% 75.5% 64.8% 80.3% 79.7% 73.5% 71.8%
13 52.2% 34.9% 61.0% 52.8% 58.9% 48.7% 51.4%
24 48.9% 66.7% 86.9% 87.7% 83.6% 74.5% 74.7%
28 46.4% 45.5% 19.3% 40.3% 28.8% 32.4% 35.4%
27 43.7% 26.5% 31.5% 24.3% 22.2% 17.2% 27.6%
19 41.5% 28.4% 50.1% 48.1% 71.1% 81.0% 53.4%
23 39.1% 52.3% 57.1% 86.4% 87.7% 88.9% 68.6%
22 36.8% 18.7% 19.3% 52.9% 66.4% 64.3% 43.1%
12 36.6% 43.2% 33.1% 65.7% 69.0% 86.0% 55.6%
30 36.3% 22.6% 23.5% 85.8% 81.5% 70.3% 53.3%
29 34.1% 44.9% 50.2% 28.3% 48.9% 36.9% 40.5%
34 33.4% 37.9% 23.1% 21.7% 29.7% 22.6% 28.0%
18 32.6% 68.4% 80.4% 88.4% 83.8% 91.6% 74.2%
Monthly Average 59.3% 58.5% 58.0% 65.5% 66.0% 68.4% 62.6%
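The six-month average column (and the league-table ranking it supports) can be reproduced in a few lines. A sketch using two technicians' rows from the posting above:

```python
# Monthly balanced scores (Sep '10 back through Apr '10), from the posting
scores = {
    11: [95.8, 98.0, 97.1, 95.7, 98.3, 97.3],
    27: [43.7, 26.5, 31.5, 24.3, 22.2, 17.2],
}

# Six-month average for each technician, rounded to one decimal as posted
averages = {tech: round(sum(s) / len(s), 1) for tech, s in scores.items()}
print(averages)  # {11: 97.0, 27: 27.6}

# Rank technicians by six-month average, best first
ranking = sorted(averages, key=averages.get, reverse=True)
print(ranking)  # [11, 27]
```

Both computed averages match the posted six-month average column, which is a useful sanity check when rebuilding a posting like this from raw monthly scores.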
The Big Picture on Desktop Support KPIs
Factors Contributing to IT Customer Satisfaction

[Bar chart: % of respondents saying each factor is very important to their overall satisfaction with corporate IT]
Service Desk: 84%
Desktop Support: 47%
Network Outages: 31%
VPN: 29%
Training: 22%
Enterprise Applications: 19%
Desktop Software: 8%

n = 1,044 respondents at global large cap companies; multiple-choice survey, 3 responses allowed per survey.

84% cited the service desk as a very important factor in their overall satisfaction with corporate IT, and 47% cited desktop support.
Because Desktop Support is a Major Driver of Customer Satisfaction
Because Quality of Support Drives End-User Productivity

[Chart: Productive Hours Lost per Employee per Year (0–70) by performance quartile]

Support Function | Key Performance Indicator | Quartile 1 (top) | Quartile 2 | Quartile 3 | Quartile 4 (bottom)
Service Desk | Customer Satisfaction | 93.5% | 84.5% | 76.1% | 69.3%
Service Desk | First Contact Resolution Rate | 90.1% | 83.0% | 72.7% | 66.4%
Service Desk | Mean Time to Resolve (hours) | 0.8 | 1.2 | 3.6 | 5.0
Desktop Support | Customer Satisfaction | 94.4% | 89.2% | 79.0% | 71.7%
Desktop Support | First Contact Resolution Rate | 89.3% | 85.6% | 80.9% | 74.5%
Desktop Support | Mean Time to Resolve (hours) | 2.9 | 4.8 | 9.4 | 12.3
Average Productive Hours Lost per Employee per Year | 17.1 | 25.9 | 37.4 | 46.9

n = 60
Because Support Has an Opportunity to Minimize TCO

Support Level | Cost per Ticket
Vendor | $471
Field Support | $196
Level 3 IT (apps, networking, NOC, etc.) | $85
Level 2: Desktop Support | $62
Level 1: Service Desk | $22
Some Final Thoughts on Desktop Support KPIs

- When it comes to Desktop Support KPIs, the 80/20 Rule applies: less really is more!
- Seven KPIs, plus the Balanced Scorecard, are all you need to holistically measure and manage your Desktop Support organization
- Understand the cause-and-effect relationships between KPIs; this gives you the power to achieve desired outcomes in Desktop Support!
- Not all Desktop Support KPIs are controllable, so focus on the controllable KPIs
- Effective application of Desktop Support KPIs can help minimize the Total Cost of Ownership for end-user support and drive high levels of customer satisfaction for all of IT
Thank You!

Thank you for attending this session. Please don't forget to complete an evaluation form!

Question and Answer
About MetricNet
25 Years of Desktop Support Data
More than 900 Desktop Support Benchmarks
Global Database
25+ Key Performance Indicators
Nearly 60 Industry Best Practices
Benchmarking is MetricNet’s Core Business
Call Centers
  Technical Support
  Customer Service
  Telemarketing/Telesales
  Collections
Information Technology
  Service Desk
  Desktop Support
  Field Support
Telecom
  Price Benchmarking
Satisfaction
  Customer Satisfaction
  Employee Satisfaction
Your Presenter: Jeff Rumburg
Jeff Rumburg is a co-founder and Managing Partner at MetricNet,
LLC. Jeff is responsible for global strategy, product development,
and financial operations for the company. As a leading expert in
benchmarking and re-engineering, Mr. Rumburg authored a best-selling
book on benchmarking, and has been retained as a
benchmarking expert by such well-known companies as American
Express, Hewlett-Packard, and GM. Prior to co-founding
MetricNet, Mr. Rumburg was president and founder of The Verity
Group, an international management consulting firm specializing in
IT benchmarking. While at Verity, Mr. Rumburg launched a number
of syndicated benchmarking services that provided low cost
benchmarks to more than 1,000 corporations worldwide.
Mr. Rumburg has also held a number of executive positions at META Group, and Gartner, Inc. As
a vice president at Gartner, Mr. Rumburg led a project team that reengineered Gartner's global
benchmarking product suite. And as vice president at META Group, Mr. Rumburg's career was
focused on business and product development for IT benchmarking. Mr. Rumburg's education
includes an M.B.A. from the Harvard Business School, an M.S. magna cum laude in Operations
Research from Stanford University, and a B.S. magna cum laude in Mechanical Engineering. He
is author of A Hands-On Guide to Competitive Benchmarking: The Path to Continuous Quality
and Productivity Improvement, and has taught graduate-level engineering and business courses.
Mr. Rumburg serves on the Strategic Advisory Board for HDI.