1
IT Benchmarking – Stanford/MIT Help Desk
Senior Management Update
July 14, 2003
2
Goals for Today's Meeting

Provide deliverables promised in November
Update on progress and delays
Shift focus for Help Desk Benchmarking
– Conclude as a project
– Implement as a practice
Engender cultural shift
– Management by anecdote
– Management by measurements
3
Major phases: Where are we now?

Prepare (Spring 02): Scope, team, goals, plans
Define (Summer 02): Collect data, develop benchmarks
Interpret & Test (Nov 02): Develop high-level interpretation; try initial changes
Operationalize (July-Dec 03): Get it right, make measurement more easily repeatable; refine the dashboard
Leverage: Use the measurements on a daily basis; use on a tactical and strategic basis
4
When we last met…

Data & Metrics: Fully defined data; well-planned set of benchmarks
Assessment: Initial data, mostly cost-oriented
Investments required: $450K updates to Stanford systems; $450K updates to MIT systems
Focus next: Developing the actual dashboard
Future areas, other schools: Wait 6-12 months
5
Summary progress since November

Dashboard: Highly iterative work
» What will we really use to manage ourselves?
» Stanford has better data; MIT has better auto-reporting
Customers: Weekly spot surveys & follow-up (MIT)
Communications
» Ivy+, EDUCAUSE prep
» IT team leaders, management, some staff
» Extend methods to other IT activities
Software
» Negotiating for expanded Remedy support in FY04
» Secured funds for CaseTracker improvements (MIT)
Other efforts
» Planning for joint customer satisfaction survey in Nov 03
» Joint efforts for IT procurement
6
We linked goals to specific metrics and then created a dashboard

Invest Appropriately
• % of budget
• Clients served/FTE
Be Cost Effective
• Cost per case by topic
• Total costs by topic
• Cases by media, including self-help
Be Responsive
• Elapsed time per case
• Call abandonment
• Hold time
• Time to answer
Support Customer Needs with High Quality Service
• Annual customer survey
• Spot-surveys on selected transactions
Develop Effective, Mature Processes
• # of contacts vs. # of days to resolve
• Origin of Help Desk cases
Develop High Performing, Competent Teams
• Employee satisfaction survey
• Individual performance metrics
• Team performance metrics
• Training $ / FTE
• % Help Desk certification
• Case volume compared to staff skills mix
Support Rollout of New Systems
• Case volume by topic 3 months before and after launch
• Minutes per case
7
Our focus has shifted to operational data to manage the work

Working drafts of dashboard for each campus
– Version for senior management
– Version for line managers
– Replaces many current reports
Ideally will have drill-down capability
Need software investments to track most important data (case topic, minutes, etc.)
8
Stanford Support Dashboard - DRAFT (05/26/2003)

Client Satisfaction
– % of clients reporting Satisfied or Very Satisfied (4 or 5 on a 5-point scale) on per-case surveys, charted per week; average overall satisfaction: 4.56
– Staff competency contributions to client satisfaction: trend panels for Professionalism, Communication, Availability, Timeliness, Expertise, and Resolution

Problem Solving
– Cases closed by time elapsed, per day: < 24 hours, 1 to 5 days, 6 or more days
– Cases created per day, by medium used (web, voice/other): graph of cases created daily for the quarter to date, subdivided by method, with a total-cases count overlaid; average cases created per work day: 377

Cases Open "Backlog"
– Number remaining open at the end of the day; average cases open charted
– Open Cases by Topic -- placeholder (snapshot of the current 1-day results; stacked column chart, perhaps). NOTE: Cases by Topic is currently unavailable.

Phones (ACD standard statistics, for all teams that use ACD)
– Abandon rate chart, with a band showing goal levels (Good / Marginal / Check into It) and trend lines; goal is 10%
– Calls offered, avg per day: 138
– Wait time, avg per day: 39 sec
– Length of call, avg per day: 4:27 m:s

Miscellaneous Statistics
– Consultant ACD logged hours and cases created by phone: Stanford doesn't have this capability yet
– Cases created outside working hours: Stanford is still compiling these numbers

[Chart axis ticks and data labels from the original slide omitted.]
9
MIT Support Dashboard - DRAFT E (06/24/2003)
NOTE: All data is for all of FCC except Athena -- that is, Computing Help Desk, Business Help, and the MIT Computer Connection.

Client Satisfaction
– % of clients reporting Satisfied or Very Satisfied (4 or 5 on a 5-point scale) on per-case surveys, charted per week; N = 218; average overall satisfaction: 4.53
– Staff competency contributions to client satisfaction: trend panels for Professionalism, Communication, Availability, Timeliness, Expertise, and Resolution

Problem Solving
– Age in days of cases closing per day: < 24 hours (66%), 1 to 5 days (11%), 6 or more days (22%)
– Cases created per day, by medium used (walk-in, web, voicemail, voice): graph of cases created daily for the quarter to date, subdivided by method, with a total-cases count overlaid

Residual Open Cases (open minus closed plus previous balance)
– Average cases open: 423; team components of residual open cases charted
– Open Cases by Topic -- placeholder (snapshot of the current 1-day results; stacked column chart, perhaps). NOTE: Cases by Topic is currently unavailable.

Phones (ACD standard statistics, for all teams that use ACD: CHD and Business Help)
– Abandon rate chart, with a band showing goal levels (Good / Marginal / Check into It) and trend lines; goal is 10%
– Consultant ACD logged hours, avg per day: 62
– Calls offered, avg per day: 113
– Wait time, avg per day: 40 sec
– Length of call, avg per day: 6:24 m:s

Miscellaneous Statistics
– Cases created by phone: 46%
– Average cases created per work day: (spreadsheet error: #NAME?)
– Cases created outside working hours: 9%

[Chart axis ticks and data labels from the original slide omitted.]
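The backlog series on the dashboard follows the stated recurrence (open = created minus closed plus previous balance). A minimal sketch with hypothetical daily counts, since the actual daily figures are not reproduced here:

```python
# Minimal sketch of the dashboard's "Residual Open Cases" series:
# each day's balance = previous balance + cases created - cases closed.
# All counts below are hypothetical, for illustration only.
created = [110, 95, 130, 120, 100]   # cases created per day
closed  = [100, 100, 115, 125, 105]  # cases closed per day

balance = 420                        # hypothetical starting backlog
residual = []
for c, d in zip(created, closed):
    balance = balance + c - d        # open minus closed plus previous balance
    residual.append(balance)

print(residual)  # [430, 425, 440, 435, 430]
```

Charting this series daily is what lets managers see whether the backlog is actually shrinking rather than just busy.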
10
Updated financial data also provide insight

Standardized "ticket" definition
– Eliminated tickets for administrative overhead
– Stanford's ticket count decreased 25%
– Effect: Stanford's cost/case increases
Impact of new system rollouts
– Stanford has 8x the # of tickets on biz apps
– Huge total cost of biz apps support
– Expense of password resets
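The cost-per-case effect of the ticket redefinition is plain division: with the budget held fixed, a 25% smaller ticket count raises cost per case by about a third. A sketch using a hypothetical budget figure (the actual budgets are not given in these slides):

```python
# Illustrative arithmetic: removing administrative-overhead tickets from the
# count raises cost per case even though spending is unchanged.
budget = 1_000_000                            # hypothetical annual budget ($)
tickets_before = 55_000                       # hypothetical count incl. admin tickets
tickets_after = tickets_before * (1 - 0.25)   # count falls 25%, as at Stanford

cost_before = budget / tickets_before
cost_after = budget / tickets_after

print(f"cost/case before: ${cost_before:.2f}")        # $18.18
print(f"cost/case after:  ${cost_after:.2f}")         # $24.24
print(f"increase: {cost_after / cost_before - 1:.0%}")  # 33%
```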
11
Based on this operational data, several efforts are underway

Stanford
– Refine the dashboard!
– Clean up backlog data
– Investigate:
» Increase in call abandon rate and voicemail rates
» The "mesa" effect of web submission
– Organizational change:
» Scale back student help desk for the summer
» Consolidate Help Desk and Bus Apps into single unit

MIT
– Refine the dashboard!
– Clean up backlog data
» Investigate 22% cases > 6 days
» Focus on "actionable" cases
– Customer: Respond to ratings < 4 & to specific negative comments
12
Most "quick wins" are complete

[Status matrix with Stanford and MIT columns; per-campus statuses included Quick Win, In Place, Ongoing, Hiring, Beginning, and Not quick. Legend: Complete / Underway / Stalled.]

Communications
– Update line managers
– Integrate line managers into benchmarking
Reconfigure space
Proactively use data
– Generate & use weekly metric reports
– Use weekly "standards" in individual performance management
Customer feedback
– Weekly spot surveys
Tracking
– Track tickets at student-staffed Unix desk
– Track internal hand-offs or "tiers/escalations"
– Standardize work reporting categories
– Track type of media for each case
– Consolidate reporting functions into one ticket system
– Examine excess ticket counts in specific categories
13
Key actions required

Fully dynamic dashboard for roll-up, drill-down
– Dependent on software investments
– Key to manage backlog, responsiveness, key operations
Waiting for 2 key hires at Stanford
Better support systems
– Installation of Knowledge Management System
– Password reset tools
– Replacement of ACD
Focus on impact of specific roll-outs, new projects
– Stanford roll-out of Oracle Financials
– MIT roll-out of SAP HR in the next six months
14
Many longer term plans called for investments

[Status matrix with Stanford and MIT columns; per-campus statuses included In Place, In Design, Draft, Next Phase, and Long Term. Legend: Complete or On track / Moving / Stalled or at risk.]

High Performing Team
– Solicit employee feedback
– Track % of HD certifications and training $ per employee
Management
– Create cross-functional ITSS team for Delphi rollout
– Institute regular review of metrics with finance
– Create Help Desk "Standard Operating Procedures" & Handbook
– Bi-annual joint customer survey
Software or Hardware Investments
– Remedy changes; engage consultant
– Casetracker: touch minutes per case and escalations (tiers)
– Knowledge Management System
– ACD "call-board"
– Create "dashboard"
– Self-serve password reset tools
15
Environments have changed.
Investments are still required for meaningful improvements.

Software or hardware investments:
– Dashboard: $15K
– Creation of Standard Operating Procedures: $32K
– Joint customer satisfaction survey: $10K per campus
– Self-serve password reset tools: $60K
– ACD "call-board": $15K
– Purchase Knowledge Management System: $300K per campus
– Explore open-source knowledge management: $?
– Casetracker consultant (internal funding in FY04): $125K
– Remedy consultant: $60K

Status legend: Complete or On track / Moving / Stalled or at risk
16
Joint customer satisfaction benchmarks will add tremendous depth and credibility

Joint survey in Nov 2003
– Results available by January
– Both campuses will complete large ERPs in Aug/Sep
– 60-80% of questions in common
Scope:
– ITSS and IS services
– A complement to, not a replacement of, the spot surveys of weekly HD transactions
17
Next steps: HD Benchmarking

Assess
– All HD benchmarks
– Meet with "C's" in December; 6-month review
External communication
– Ivy+ Support workshop on 10/27
– EDUCAUSE November 4-7
Communicate
– Need to educate & develop buy-in!
– Linking front-line staff across campuses/coasts (video conference?)
Software investments
– Casetracker & Remedy enhancements
– Knowledge Management System
Use it!
– Manage daily operations with data (dashboard, data)
18
Next Steps: HD Benchmarking

[Timeline chart, months J A S O N D (2003) through J F M (2004); markers indicate on-site visits.]

– Use dashboard; conduct 6-month review
– Identify operational changes
– Internal communications
– Casetracker & Remedy enhancements
– Conduct Ivy+ workshop
– Present EDUCAUSE & the "C's"
– Consider inviting others to benchmark
19
Evaluate the bottom-line results twice a year

Analyze topics, media, teams, and costs
– Elapsed time, touch minutes, customer satisfaction, volumes, quality of resolution, cost per case
Assess teams and employees
– Customer satisfaction, training, hours logged, productivity, quality of resolution of cases
20
Discussion

Additional "metrics" efforts in IS on each campus
Joint efforts within IS
Additional metrics efforts beyond IS
Beyond MIT and Stanford?
21
Efforts within IS

Joint (MIT & Stanford)
– Customer survey
– PC procurement
Each campus
– Consistent reporting across Support Dept
– Total Economic Impact
22
Future IS benchmarking?

Support of ERP systems
Data center management
Training
Network security
Service level agreements
Others?
23
Broaden beyond IS

A cultural shift across the administration
– Management by fact
– Dashboard for each area
– Collaboration across departments
» At senior management meetings, joint efforts
» Measuring the same things the same way (e.g., customer calls)
– Other departments to consider
» Controller, Sponsored Research, Facilities?
Including other institutions
24
Benchmarking to improve administration overall

Benefits
– Management by fact: creating a cultural value; mitigating ad hoc myths through use of valid qualitative and quantitative data
– Meaningful, consistent data & metrics that tell a story
– Jolts to each system; quick start to implement
– Good friendships and fun!
Good fit in general for higher education
– Open to sharing
– Ideas have already been tested on another campus
– Does not require IT systems to be identical
– Immediate benefits!
25
Context for Comparison: FY 03 Sizing

Demographics                        MIT         Stanford      Variance
Faculty & Staff                     9,400       10,884        16%
Students                            10,317      14,339        39%
Total Population                    19,717      25,223        28%
University Consolidated Budget      $1,554,000  $2,134,000    37%

IT Department Information
Annual Base Budget                  $x          $x [1]        79%
Staff (FTE)                         268         431 [1]       60%

Help Desk Information
Annual Base Budget                  $x          $x            42%
Staff (FTE)                         26.7        25.0          -8%
Tickets Processed                   43,819      41,832 [2]    -5%

[1] Includes providing telecommunications for Stanford's hospital.
[2] Annual ticket count estimated based on 8 months of actual data.
26
Gauging investment and effectiveness

                                    MIT       Stanford    Variance
Help Desk Budget / Ticket           $41.83    $64.18      53%
Tickets / Help Desk FTE             1,595     1,675       5%
Population per HD Employee          712       1,010       42%
Tickets / School Population         2.2       1.7         -48%
Help Desk Budget / IT Budget        4.2%      3.1%
IT Dept Budget / University Budget  2.9%      4.0%
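The slides do not state how the variance column was computed; under the assumption that it expresses Stanford's figure relative to MIT's, the first two rows of the table above check out:

```python
# Sketch of the effectiveness comparison, using figures from the table.
# Assumption (not stated on the slide): variance = (Stanford - MIT) / MIT.
def variance(mit: float, stanford: float) -> float:
    """Stanford's figure expressed relative to MIT's."""
    return (stanford - mit) / mit

budget_per_ticket_var = variance(41.83, 64.18)  # Help Desk Budget / Ticket
tickets_per_fte_var = variance(1595, 1675)      # Tickets / Help Desk FTE

print(f"Budget/Ticket variance: {budget_per_ticket_var:.0%}")  # 53%
print(f"Tickets/FTE variance:   {tickets_per_fte_var:.0%}")    # 5%
```

Not every row matches this formula (Tickets / School Population does not), so the original variance basis may have differed row to row.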
27
IT Benchmarking – Stanford/MIT Help Desk

a. A surveyor's mark cut in some durable material, as a rock, wall, gate-pillar, face of a building, etc., to indicate the starting, closing, or any suitable intermediate, point in a line of levels for the determination of altitudes over the face of a country. It consists of a series of wedge-shaped incisures, in the form of the 'broad-arrow' with a horizontal bar through its apex. When the spot is below sea-level, as in mining surveys, the mark is inverted.

1842 FRANCIS Dict. Arts, Bench marks, in surveying, fixed points left on a line of survey for reference at a future time, consisting of cuts in trees, pegs driven into the ground, etc. 1883 G. J. SYMONS Brit. Rainf. 134 A series of levels has been taken from the gauge to an Ordnance bench mark.

b. transf. and fig. A point of reference; a criterion, touchstone.

1884 Science IV. 202/1 These star-places..are the reference-points and bench-marks of the universe. 1957 R. K. MERTON Student-Physician III. 195 Standards represent 'benchmarks' with which students compare their ability and performance. 1963 Economist 18 May 663/2 Foreign firms have failed to get..orders unless they have offered a price advantage of at least 50 per cent. This is the 'bench-mark'.

Source: Oxford English Dictionary, 2nd Edition, 1989.