Evaluating HRD Programs


Transcript of Evaluating HRD Programs

Page 1: Evaluating HRD Programs

Werner & DeSimone (2006)

Evaluating HRD Programs

Chapter 7

Page 2: Evaluating HRD Programs


Learning Objectives

- Define evaluation and explain its role/purpose in HRD.
- Compare different models of evaluation.
- Discuss the various methods of data collection for HRD evaluation.
- Explain the role of research design in HRD evaluation.
- Describe the ethical issues involved in conducting HRD evaluation.
- Identify and explain the choices available for translating evaluation results into dollar terms.

Page 3: Evaluating HRD Programs


Effectiveness

- The degree to which a training (or other HRD) program achieves its intended purpose
- Measures are relative to some starting point
- Measures how well the desired goal is achieved

Page 4: Evaluating HRD Programs


Evaluation

Page 5: Evaluating HRD Programs


HRD Evaluation

It is “the systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities.”

Page 6: Evaluating HRD Programs


In Other Words…

Are we training:

- the right people
- the right “stuff”
- the right way
- with the right materials
- at the right time?

Page 7: Evaluating HRD Programs


Evaluation Needs

- Descriptive and judgmental information needed
  - Objective and subjective data
- Information gathered according to a plan and in a desired format
- Gathered to provide decision-making information

Page 8: Evaluating HRD Programs


Purposes of Evaluation

- Determine whether the program is meeting the intended objectives
- Identify strengths and weaknesses
- Determine the cost-benefit ratio
- Identify who benefited most or least
- Determine future participants
- Provide information for improving HRD programs

Page 9: Evaluating HRD Programs


Purposes of Evaluation – 2

- Reinforce major points to be made
- Gather marketing information
- Determine if the training program is appropriate
- Establish a management database

Page 10: Evaluating HRD Programs


Evaluation Bottom Line

- Is HRD a revenue contributor or a revenue user?
- Is HRD credible to line and upper-level managers?
- Are the benefits of HRD readily evident to all?

Page 11: Evaluating HRD Programs


How Often are HRD Evaluations Conducted?

- Not often enough!
- Frequently, only end-of-course participant reactions are collected
- Transfer to the workplace is evaluated less frequently

Page 12: Evaluating HRD Programs


Why HRD Evaluations are Rare

- Reluctance to have HRD programs evaluated
- Evaluation needs expertise and resources
- Factors other than HRD can cause performance improvements – e.g., the economy, equipment, policies, etc.

Page 13: Evaluating HRD Programs


Need for HRD Evaluation

- Shows the value of HRD
- Provides metrics for HRD efficiency
- Demonstrates a value-added approach for HRD
- Demonstrates accountability for HRD activities

Page 14: Evaluating HRD Programs


Make or Buy Evaluation

- “I bought it, therefore it is good.”
- “Since it’s good, I don’t need to post-test.”
- Who says it’s:
  - Appropriate?
  - Effective?
  - Timely?
  - Transferable to the workplace?

Page 15: Evaluating HRD Programs


Models and Frameworks of Evaluation

- Table 7-1 lists six frameworks for evaluation
- The most popular is that of D. Kirkpatrick:
  - Reaction
  - Learning
  - Job Behavior
  - Results

Page 16: Evaluating HRD Programs


Kirkpatrick’s Four Levels

- Reaction: focus on trainees’ reactions
- Learning: did they learn what they were supposed to?
- Job Behavior: was it used on the job?
- Results: did it improve the organization’s effectiveness?
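Purely as an illustration (not from the text), the four levels can be tabulated with the question each answers and a typical measure; the example measures below are assumptions, not prescriptions from the chapter:

```python
# Hypothetical mapping of Kirkpatrick's levels to the question each
# level answers and an example measure (measures are illustrative only).
kirkpatrick_levels = {
    "Reaction":     ("How did trainees react to the program?",
                     "end-of-course reaction survey"),
    "Learning":     ("Did they learn what they were supposed to?",
                     "pre/post knowledge test"),
    "Job Behavior": ("Was it used on the job?",
                     "supervisor observation or ratings"),
    "Results":      ("Did it improve the organization's effectiveness?",
                     "productivity and quality records"),
}

for level, (question, measure) in kirkpatrick_levels.items():
    print(f"{level}: {question} (e.g., {measure})")
```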

Page 17: Evaluating HRD Programs


Issues Concerning Kirkpatrick’s Framework

- Most organizations don’t evaluate at all four levels
- Focuses only on post-training
- Doesn’t treat inter-stage improvements
- WHAT ARE YOUR THOUGHTS?

Page 18: Evaluating HRD Programs


Data Collection for HRD Evaluation

Possible methods:

- Interviews
- Questionnaires
- Direct observation
- Written tests
- Simulation/performance tests
- Archival performance information

Page 19: Evaluating HRD Programs


Interviews

Advantages:
- Flexible
- Opportunity for clarification
- Depth possible
- Personal contact

Limitations:
- High reactive effects
- High cost
- Face-to-face threat potential
- Labor intensive
- Trained observers needed

Page 20: Evaluating HRD Programs


Questionnaires

Advantages:
- Low cost to administer
- Honesty increased
- Anonymity possible
- Respondent sets the pace
- Variety of options

Limitations:
- Possible inaccurate data
- Response conditions not controlled
- Respondents set varying paces
- Uncontrolled return rate

Page 21: Evaluating HRD Programs


Direct Observation

Advantages:
- Nonthreatening
- Excellent way to measure behavior change

Limitations:
- Possibly disruptive
- Reactive effects are possible
- May be unreliable
- Trained observers needed

Page 22: Evaluating HRD Programs


Written Tests

Advantages:
- Low purchase cost
- Readily scored
- Quickly processed
- Easily administered
- Wide sampling possible

Limitations:
- May be threatening
- Possibly no relation to job performance
- Measures only cognitive learning
- Relies on norms
- Concern for racial/ethnic bias

Page 23: Evaluating HRD Programs


Simulation/Performance Tests

Advantages:
- Reliable
- Objective
- Close relation to job performance
- Includes cognitive, psychomotor, and affective domains

Limitations:
- Time consuming
- Simulations often difficult to create
- High cost to develop and use

Page 24: Evaluating HRD Programs


Archival Performance Data

Advantages:
- Reliable
- Objective
- Job-based
- Easy to review
- Minimal reactive effects

Limitations:
- Criteria for keeping/discarding records
- Information system discrepancies
- Indirect
- Not always usable
- Records prepared for other purposes

Page 25: Evaluating HRD Programs


Choosing Data Collection Methods

- Reliability: consistency of results, and freedom from collection-method bias and error
- Validity: does the device measure what we want to measure?
- Practicality: does it make sense in terms of the resources used to get the data?
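To make the reliability criterion concrete, here is a minimal Python sketch with hypothetical scores (not from the text) of a test-retest check: the same trainees complete the same instrument twice, and a high correlation between administrations suggests consistent measurement.

```python
from statistics import correlation  # Python 3.10+

# Hypothetical scores from the same six trainees taking the
# same instrument on two occasions.
first_administration = [72, 85, 90, 64, 78, 88]
second_administration = [70, 84, 93, 60, 80, 85]

# Pearson r between administrations is a common test-retest
# reliability estimate (closer to 1.0 means more consistent).
r = correlation(first_administration, second_administration)
print(f"Test-retest reliability estimate: r = {r:.2f}")
```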

Page 26: Evaluating HRD Programs


Type of Data Used/Needed

- Individual performance
- Systemwide performance
- Economic

Page 27: Evaluating HRD Programs


Individual Performance Data

- Individual knowledge
- Individual behaviors

Examples:
- Test scores
- Performance quantity, quality, and timeliness
- Attendance records
- Attitudes

Page 28: Evaluating HRD Programs


Systemwide Performance Data

- Productivity
- Scrap/rework rates
- Customer satisfaction levels
- On-time performance levels
- Quality rates and improvement rates

Page 29: Evaluating HRD Programs


Economic Data

- Profits
- Product liability claims
- Avoidance of penalties
- Market share
- Competitive position
- Return on investment (ROI)
- Financial utility calculations

Page 30: Evaluating HRD Programs


Use of Self-Report Data

- Most common method
- Pre-training and post-training data

Problems:
- Mono-method bias
- Desire to be consistent between tests
- Socially desirable responses
- Response-shift bias: trainees adjust expectations to training

Page 31: Evaluating HRD Programs


Research Design

Specifies in advance:

- the expected results of the study
- the methods of data collection to be used
- how the data will be analyzed
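For example, with a pretest-posttest control-group design (the kind of design alluded to later in the chapter), the analysis can be planned in advance as a comparison of gain scores. A minimal Python sketch with hypothetical data:

```python
from statistics import mean

# Hypothetical pre/post scores for a trained group and an
# untrained control group.
trained_pre, trained_post = [60, 55, 70, 65], [80, 78, 85, 82]
control_pre, control_post = [62, 58, 68, 66], [64, 60, 70, 66]

trained_gain = mean(post - pre for pre, post in zip(trained_pre, trained_post))
control_gain = mean(post - pre for pre, post in zip(control_pre, control_post))

# The difference in average gains estimates the effect of training,
# assuming the control group captures non-training influences.
print(f"Estimated training effect: {trained_gain - control_gain:.1f} points")
```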

Page 32: Evaluating HRD Programs


Assessing the Impact of HRD

- Money is the language of business.
- You MUST talk dollars, not HRD jargon.
- No one (except maybe you) cares about “the effectiveness of training interventions as measured by an analysis of formal pretest, posttest control group data.”

Page 33: Evaluating HRD Programs


HRD Program Assessment

- HRD programs and training are investments
- Line managers often see HR and HRD as costs – i.e., revenue users, not revenue producers
- You must prove your worth to the organization – or you’ll have to find another organization…

Page 34: Evaluating HRD Programs


Two Basic Methods for Assessing Financial Impact

- Evaluation of training costs
- Utility analysis
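Utility analysis translates training outcomes into dollar terms; a widely cited formulation is the Brogden-Cronbach-Gleser equation, ΔU = (N)(T)(d_t)(SD_y) − (N)(C). A minimal Python sketch, with all figures hypothetical:

```python
# Utility analysis sketch (Brogden-Cronbach-Gleser formulation):
#   delta_U = N * T * d_t * SD_y - N * C
# All figures below are hypothetical.
N = 50         # number of employees trained
T = 2.0        # years the training effect is expected to last
d_t = 0.5      # effect size: performance gain, in standard deviations
SD_y = 10_000  # dollar value of one SD of job performance, per year
C = 1_200      # cost of training one employee

delta_U = N * T * d_t * SD_y - N * C
print(f"Estimated utility of the program: ${delta_U:,.0f}")  # $440,000
```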

Page 35: Evaluating HRD Programs


Evaluation of Training Costs

- Cost-benefit analysis: compares the cost of training to benefits gained, such as improved attitudes, reduction in accidents, reduction in employee sick days, etc.
- Cost-effectiveness analysis: focuses on increases in quality, reduction in scrap/rework, productivity, etc.

Page 36: Evaluating HRD Programs


Return on Investment

Return on investment = Results/Costs

Page 37: Evaluating HRD Programs


Calculating Training Return On Investment

| Operational Results Area | How Measured | Results Before Training | Results After Training | Difference (+ or –) | Expressed in $ |
| --- | --- | --- | --- | --- | --- |
| Quality of panels | % rejected | 2% rejected (1,440 panels per day) | 1.5% rejected (1,080 panels per day) | 0.5% (360 panels per day) | $720 per day; $172,800 per year |
| Housekeeping | Visual inspection using 20-item checklist | 10 defects (average) | 2 defects (average) | 8 defects | Not measurable in $ |
| Preventable accidents | Number of accidents | 24 per year | 16 per year | 8 per year | |
| | Direct cost of each accident | $144,000 per year | $96,000 per year | $48,000 | $48,000 per year |

Total savings: $220,800.00

ROI = Return / Investment = Operational Results / Training Costs = $220,800 / $32,564 = 6.8

SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission.
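The arithmetic behind this table is easy to verify; a minimal Python sketch using only the figures shown above (the 240 working days per year is implied by $720 per day yielding $172,800 per year):

```python
# ROI arithmetic from the Robinson & Robinson example above.
panel_savings = 720 * 240            # $720/day over ~240 working days = $172,800/year
accident_savings = 144_000 - 96_000  # direct accident costs: $48,000/year saved
total_savings = panel_savings + accident_savings  # $220,800

training_costs = 32_564              # total program cost from the slide

roi = total_savings / training_costs
print(f"ROI = {roi:.1f}")            # ROI = 6.8
```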

Page 38: Evaluating HRD Programs


Measuring Benefits

- Change in quality per unit, measured in dollars
- Reduction in scrap/rework, measured in dollar cost of labor and materials
- Reduction in preventable accidents, measured in dollars
- ROI = Benefits / Training costs

Page 39: Evaluating HRD Programs


Ways to Improve HRD Assessment

- Walk the walk, talk the talk: MONEY
- Involve HRD in strategic planning
- Involve management in HRD planning and estimation efforts – gain mutual ownership
- Use credible and conservative estimates
- Share credit for successes and blame for failures

Page 40: Evaluating HRD Programs


HRD Evaluation Steps

1. Analyze needs.
2. Determine explicit evaluation strategy.
3. Insist on specific and measurable training objectives.
4. Obtain participant reactions.
5. Develop criterion measures/instruments to measure results.
6. Plan and execute evaluation strategy.

Page 41: Evaluating HRD Programs


Summary

- Training results must be measured against costs
- Training must contribute to the “bottom line”
- HRD must justify itself repeatedly as a revenue enhancer, not a revenue waster