Evaluating HRD Programs
Professor Jayashree Sadri and Dr. Sorab Sadri

Evaluating HRD interventions


Page 1: Evaluating HRD interventions

Evaluating HRD Programs

Professor Jayashree Sadri and Dr. Sorab Sadri

Page 2: Evaluating HRD interventions

Effectiveness

The degree to which a training or other HRD program achieves its intended purpose
Measures are relative to some starting point
Measures how well the desired goal is achieved

Page 3: Evaluating HRD interventions

Evaluation

Page 4: Evaluating HRD interventions

HRD Evaluation

Textbook definition:
“The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities.”

Page 5: Evaluating HRD interventions

In Other Words…

Are we training:
– the right people
– the right “stuff”
– the right way
– with the right materials
– at the right time?

Page 6: Evaluating HRD interventions

Evaluation Needs

Descriptive and judgmental information needed
– Objective and subjective data
Information gathered according to a plan and in a desired format
Gathered to provide decision-making information

Page 7: Evaluating HRD interventions

Purposes of Evaluation

Determine whether the program is meeting the intended objectives
Identify strengths and weaknesses
Determine the cost-benefit ratio
Identify who benefited most or least
Determine future participants
Provide information for improving HRD programs

Page 8: Evaluating HRD interventions

Purposes of Evaluation – 2

Reinforce major points to be made
Gather marketing information
Determine if the training program is appropriate
Establish a management database

Page 9: Evaluating HRD interventions

Evaluation Bottom Line

Is HRD a revenue contributor or a revenue user?
Is HRD credible to line and upper-level managers?
Are the benefits of HRD readily evident to all?

Page 10: Evaluating HRD interventions

How Often Are HRD Evaluations Conducted?

Not often enough!
Frequently, only end-of-course participant reactions are collected
Transfer to the workplace is evaluated less frequently

Page 11: Evaluating HRD interventions

Why HRD Evaluations Are Rare

Reluctance to have HRD programs evaluated
Evaluation needs expertise and resources
Factors other than HRD cause performance improvements, e.g.:
– Economy
– Equipment
– Policies, etc.

Page 12: Evaluating HRD interventions

Need for HRD Evaluation

Shows the value of HRD
Provides metrics for HRD efficiency
Demonstrates a value-added approach for HRD
Demonstrates accountability for HRD activities
Everyone else has it… why not HRD?

Page 13: Evaluating HRD interventions

Make or Buy Evaluation

“I bought it, therefore it is good.”
“Since it’s good, I don’t need to post-test.”
Who says it’s:
– Appropriate?
– Effective?
– Timely?
– Transferable to the workplace?

Page 14: Evaluating HRD interventions

Evolution of Evaluation Efforts

1. Anecdotal approach – talk to other users
2. Try before buy – borrow and use samples
3. Analytical approach – match research data to training needs
4. Holistic approach – look at the overall HRD process, as well as individual training

Page 15: Evaluating HRD interventions

Models and Frameworks of Evaluation

Table 7-1 lists six frameworks for evaluation
The most popular is that of D. Kirkpatrick:
– Reaction
– Learning
– Job Behavior
– Results

Page 16: Evaluating HRD interventions

Kirkpatrick’s Four Levels

Reaction
– Focus on trainees’ reactions
Learning
– Did they learn what they were supposed to?
Job Behavior
– Was it used on the job?
Results
– Did it improve the organization’s effectiveness?
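To make the four levels concrete, one possible way to line them up with the data-collection methods listed later in this deck is sketched below; the pairings are illustrative assumptions, not part of Kirkpatrick’s framework itself.

```python
# Illustrative only: a hypothetical evaluation plan mapping each Kirkpatrick level
# to data-collection methods that appear later in this deck.
kirkpatrick_plan = {
    "Reaction":     ["end-of-course questionnaires"],
    "Learning":     ["written tests", "simulation/performance tests"],
    "Job Behavior": ["direct observation on the job", "interviews"],
    "Results":      ["archival/systemwide performance data", "economic data (e.g., ROI)"],
}

for level, methods in kirkpatrick_plan.items():
    print(f"{level}: {', '.join(methods)}")
```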

Page 17: Evaluating HRD interventions

Issues Concerning Kirkpatrick’s Framework

Most organizations don’t evaluate at all four levels
Focuses only on post-training
Doesn’t treat inter-stage improvements
WHAT ARE YOUR THOUGHTS?

Page 18: Evaluating HRD interventions

Other Frameworks/Models

CIPP: Context, Input, Process, Product (Galvin, 1983)

Brinkerhoff (1987):
– Goal setting
– Program design
– Program implementation
– Immediate outcomes
– Usage outcomes
– Impacts and worth

Page 19: Evaluating HRD interventions

Other Frameworks/Models – 2

Kraiger, Ford, & Salas (1993):
– Cognitive outcomes
– Skill-based outcomes
– Affective outcomes

Holton (1996), five categories:
– Secondary influences
– Motivation elements
– Environmental elements
– Outcomes
– Ability/enabling elements

Page 20: Evaluating HRD interventions

Other Frameworks/Models – 3

Phillips (1996):
– Reaction and Planned Action
– Learning
– Applied Learning on the Job
– Business Results
– ROI

Page 21: Evaluating HRD interventions

A Suggested Framework – 1

Reaction
– Did trainees like the training?
– Did the training seem useful?
Learning
– How much did they learn?
Behavior
– What behavior change occurred?

Page 22: Evaluating HRD interventions

A Suggested Framework – 2

Results
– What were the tangible outcomes?
– What was the return on investment (ROI)?
– What was the contribution to the organization?

Page 23: Evaluating HRD interventions

Data Collection for HRD Evaluation

Possible methods:
– Interviews
– Questionnaires
– Direct observation
– Written tests
– Simulation/performance tests
– Archival performance information

Page 24: Evaluating HRD interventions

Interviews

Advantages:
– Flexible
– Opportunity for clarification
– Depth possible
– Personal contact

Limitations:
– High reactive effects
– High cost
– Face-to-face threat potential
– Labor intensive
– Trained observers needed

Page 25: Evaluating HRD interventions

Questionnaires

Advantages:
– Low cost to administer
– Honesty increased
– Anonymity possible
– Respondent sets the pace
– Variety of options

Limitations:
– Possible inaccurate data
– Response conditions not controlled
– Respondents set varying paces
– Uncontrolled return rate

Page 26: Evaluating HRD interventions

Direct Observation

Advantages:
– Nonthreatening
– Excellent way to measure behavior change

Limitations:
– Possibly disruptive
– Reactive effects are possible
– May be unreliable
– Need trained observers

Page 27: Evaluating HRD interventions

Written Tests

Advantages:
– Low purchase cost
– Readily scored
– Quickly processed
– Easily administered
– Wide sampling possible

Limitations:
– May be threatening
– Possibly no relation to job performance
– Measures only cognitive learning
– Relies on norms
– Concern for racial/ethnic bias

Page 28: Evaluating HRD interventions

Simulation/Performance Tests

Advantages:
– Reliable
– Objective
– Close relation to job performance
– Includes cognitive, psychomotor, and affective domains

Limitations:
– Time consuming
– Simulations often difficult to create
– High cost to develop and use

Page 29: Evaluating HRD interventions

Archival Performance Data

Advantages:
– Reliable
– Objective
– Job-based
– Easy to review
– Minimal reactive effects

Limitations:
– Criteria for keeping/discarding records
– Information system discrepancies
– Indirect
– Not always usable
– Records prepared for other purposes

Page 30: Evaluating HRD interventions

Choosing Data Collection Methods

Reliability
– Consistency of results, and freedom from collection-method bias and error

Validity
– Does the device measure what we want to measure?

Practicality
– Does it make sense in terms of the resources used to get the data?
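As a hypothetical illustration of the reliability criterion, test-retest consistency can be estimated by correlating scores from two administrations of the same instrument; the scores below are invented, and this is only one of several ways to assess reliability.

```python
# Hypothetical test-retest check: the same written test given to the same ten
# trainees on two occasions. A high correlation suggests consistent (reliable) scores.
import statistics  # statistics.correlation requires Python 3.10+

first_administration  = [72, 85, 90, 66, 78, 88, 95, 70, 82, 76]
second_administration = [75, 83, 92, 68, 80, 85, 94, 72, 84, 74]

r = statistics.correlation(first_administration, second_administration)
print(f"Test-retest reliability (Pearson r): {r:.2f}")
```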

Page 31: Evaluating HRD interventions

Type of Data Used/Needed

– Individual performance
– Systemwide performance
– Economic

Page 32: Evaluating HRD interventions

Individual Performance Data

Individual knowledge
Individual behaviors
Examples:
– Test scores
– Performance quantity, quality, and timeliness
– Attendance records
– Attitudes

Page 33: Evaluating HRD interventions

Systemwide Performance Data

– Productivity
– Scrap/rework rates
– Customer satisfaction levels
– On-time performance levels
– Quality rates and improvement rates

Page 34: Evaluating HRD interventions

Economic Data

– Profits
– Product liability claims
– Avoidance of penalties
– Market share
– Competitive position
– Return on investment (ROI)
– Financial utility calculations

Page 35: Evaluating HRD interventions

Use of Self-Report Data

Most common method
Pre-training and post-training data
Problems:
– Mono-method bias: desire to be consistent between tests
– Socially desirable responses
– Response shift bias: trainees adjust expectations to training

Page 36: Evaluating HRD interventions

Research Design

Specifies in advance:
– The expected results of the study
– The methods of data collection to be used
– How the data will be analyzed

Page 37: Evaluating HRD interventions

Research Design Issues

Pretest and posttest
– Shows the trainee what training has accomplished
– Helps eliminate pretest knowledge bias

Control group
– Compares the performance of the group with training against the performance of a similar group without training

Page 38: Evaluating HRD interventions

Recommended Research Design

Pretest and posttest with control group

Whenever possible:
– Randomly assign individuals to the test group and the control group to minimize bias
– Use a “time-series” approach to data collection to verify that performance improvement is due to training
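A minimal sketch of how the recommended design might be summarized, assuming invented pretest and posttest scores: the gain of the trained group is compared with the gain of the untrained control group, so improvement common to both groups is not credited to training.

```python
# Hypothetical pretest/posttest scores for a trained group and an untrained control group.
import statistics

trained_pre, trained_post = [60, 55, 70, 65, 58], [78, 74, 85, 80, 76]
control_pre, control_post = [62, 57, 68, 66, 59], [64, 58, 70, 67, 61]

def mean_gain(pre, post):
    # Average improvement from pretest to posttest for one group.
    return statistics.mean(after - before for before, after in zip(pre, post))

# Gain attributable to training = trained group's gain minus control group's gain.
training_effect = mean_gain(trained_pre, trained_post) - mean_gain(control_pre, control_post)
print(f"Estimated training effect: {training_effect:.1f} points (before significance testing)")
```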

Page 39: Evaluating HRD interventions

Ethical Issues Concerning Evaluation Research

– Confidentiality
– Informed consent
– Withholding training from control groups
– Use of deception
– Pressure to produce positive results

Page 40: Evaluating HRD interventions

Assessing the Impact of HRD

Money is the language of business.
You MUST talk dollars, not HRD jargon.
No one (except maybe you) cares about “the effectiveness of training interventions as measured by an analysis of formal pretest, posttest control group data.”

Page 41: Evaluating HRD interventions

HRD Program Assessment

HRD programs and training are investments
Line managers often see HR and HRD as costs – i.e., revenue users, not revenue producers
You must prove your worth to the organization –
– Or you’ll have to find another organization…

Page 42: Evaluating HRD interventions

Two Basic Methods for Assessing Financial Impact

– Evaluation of training costs
– Utility analysis

Page 43: Evaluating HRD interventions

Evaluation of Training Costs

Cost-benefit analysis
– Compares the cost of training to benefits gained, such as improved attitudes, reduction in accidents, reduction in employee sick days, etc.

Cost-effectiveness analysis
– Focuses on increases in quality, reduction in scrap/rework, productivity, etc.

Page 44: Evaluating HRD interventions

Return on Investment

Return on investment = Results / Costs

Page 45: Evaluating HRD interventions

Calculating Training Return on Investment

Operational Results Area | How Measured | Before Training | After Training | Difference (+ or –) | Expressed in $
Quality of panels | % rejected | 2% rejected (1,440 panels per day) | 1.5% rejected (1,080 panels per day) | 0.5% (360 panels) | $720 per day; $172,800 per year
Housekeeping | Visual inspection using a 20-item checklist | 10 defects (average) | 2 defects (average) | 8 defects | Not measurable in $
Preventable accidents | Number of accidents | 24 per year | 16 per year | 8 per year |
 | Direct cost of each accident | $144,000 per year | $96,000 per year | $48,000 | $48,000 per year

Total savings: $220,800.00

ROI = Return / Investment = Operational results / Training costs = $220,800 / $32,564 = 6.8

SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission.
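The arithmetic behind the table can be reproduced directly from the figures reported by Robinson & Robinson; a minimal sketch:

```python
# Reproduces the ROI arithmetic from the Robinson & Robinson example above.
annual_savings = {
    "quality_of_panels":     172_800,  # 0.5% fewer rejected panels = $720/day over a year
    "preventable_accidents":  48_000,  # 8 fewer accidents/year ($144,000 - $96,000 direct cost)
}
training_costs = 32_564                # total training costs in the published example

total_savings = sum(annual_savings.values())   # $220,800
roi = total_savings / training_costs           # operational results / training costs
print(f"Total savings: ${total_savings:,}; ROI = {roi:.1f}")   # ROI = 6.8
```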

Page 46: Evaluating HRD interventions

Types of Training Costs

– Direct costs
– Indirect costs
– Development costs
– Overhead costs
– Compensation for participants

Page 47: Evaluating HRD interventions

Direct Costs

Instructor
– Base pay
– Fringe benefits
– Travel and per diem
Materials
Classroom and audiovisual equipment
Travel
Food and refreshments

Page 48: Evaluating HRD interventions

Indirect Costs

Training management
Clerical/administrative support
Postal/shipping, telephone, computers, etc.
Pre- and post-learning materials
Other overhead costs

Page 49: Evaluating HRD interventions

Development Costs

Fee to purchase the program
Costs to tailor the program to the organization
Instructor training costs

Page 50: Evaluating HRD interventions

Overhead Costs

General organization support
Top management participation
Utilities, facilities
General and administrative costs, such as HRM

Page 51: Evaluating HRD interventions

Compensation for Participants

Participants’ salary and benefits for time away from the job
Travel, lodging, and per-diem costs

Page 52: Evaluating HRD interventions

Measuring Benefits

– Change in quality per unit, measured in dollars
– Reduction in scrap/rework, measured in the dollar cost of labor and materials
– Reduction in preventable accidents, measured in dollars
– ROI = Benefits / Training costs
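A minimal sketch of this benefits-over-costs calculation, combining the cost categories from the preceding slides with dollar-valued benefits; every figure below is hypothetical and exists only to show the arithmetic.

```python
# Hypothetical figures illustrating ROI = benefits / training costs.
costs = {
    "direct":        40_000,  # instructor pay and benefits, materials, facilities, food
    "indirect":       8_000,  # training management, clerical support, shipping/telephone
    "development":   15_000,  # program purchase, tailoring, instructor training
    "overhead":       5_000,  # general organizational support, utilities share
    "compensation":  32_000,  # participants' salary and benefits for time away from the job
}
benefits = {
    "quality_improvement":   60_000,  # change in quality per unit, in dollars
    "scrap_rework_savings":  45_000,  # labor and materials no longer wasted
    "accident_reduction":    30_000,  # fewer preventable accidents, in dollars
}

roi = sum(benefits.values()) / sum(costs.values())
print(f"ROI = {roi:.2f}")   # 1.35 with these made-up numbers
```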

Page 53: Evaluating HRD interventions

Utility Analysis

Uses a statistical approach to support claims of training effectiveness:
– N = number of trainees
– T = length of time benefits are expected to last
– d_t = true performance difference resulting from training
– SD_y = dollar value of untrained job performance (in standard deviation units)
– C = cost of training

U = (N)(T)(d_t)(SD_y) – C
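A worked example of the utility formula, using invented values for every symbol:

```python
# Hypothetical inputs for U = (N)(T)(d_t)(SD_y) - C.
N    = 50        # number of trainees
T    = 2.0       # years the benefit is expected to last
d_t  = 0.6       # true performance difference, in standard-deviation units
SD_y = 10_000    # dollar value of one SD of untrained job performance
C    = 100_000   # total cost of training all N trainees

U = N * T * d_t * SD_y - C   # estimated dollar payoff of the program
print(f"Estimated utility: ${U:,.0f}")   # $500,000 with these made-up inputs
```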

Page 54: Evaluating HRD interventions

Critical Information for Utility Analysis

d_t = the difference in units produced between trained and untrained employees, divided by the standard deviation in units produced by the trained group

SD_y = the standard deviation in dollars, or overall productivity of the organization
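A short sketch of the d_t calculation described above, using invented production counts:

```python
# Hypothetical units produced by trained and untrained employees.
import statistics

trained_units   = [52, 55, 49, 58, 54]
untrained_units = [45, 48, 44, 47, 46]

difference = statistics.mean(trained_units) - statistics.mean(untrained_units)
sd_trained = statistics.stdev(trained_units)   # SD in units produced by the trained group
d_t = difference / sd_trained
print(f"d_t = {d_t:.2f}")
```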

Page 55: Evaluating HRD interventions

Ways to Improve HRD Assessment

Walk the walk, talk the talk: MONEY
Involve HRD in strategic planning
Involve management in HRD planning and estimation efforts
– Gain mutual ownership
Use credible and conservative estimates
Share credit for successes and blame for failures

Page 56: Evaluating HRD interventions

HRD Evaluation Steps

1. Analyze needs.
2. Determine an explicit evaluation strategy.
3. Insist on specific and measurable training objectives.
4. Obtain participant reactions.
5. Develop criterion measures/instruments to measure results.
6. Plan and execute the evaluation strategy.

Page 57: Evaluating HRD interventions

Summary

Training results must be measured against costs
Training must contribute to the “bottom line”
HRD must justify itself repeatedly as a revenue enhancer