
A PROJECT ON THE IMPACT OF EVALUATION OF TRAINING AND DEVELOPMENT

SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENT FOR MASTER OF

COMMERCE

BY

AMEY MILIND PATIL

M.COM SEM II

SEAT NO : 15034

UNDER THE GUIDANCE OF

PROF SONI HASSANI

UNIVERSITY OF MUMBAI

MULUND COLLEGE OF COMMERCE

S.N. ROAD, MULUND (WEST) – 400080

2015-2016


Declaration:-

I, AMEY MILIND PATIL, a student of “MCOM – BUSINESS MANAGEMENT - I” Semester

II (2014-2015), hereby declare that I have completed the project on “EVALUATION OF

TRAINING AND DEVELOPMENT” under the guidance of Mrs. SONI HASSANI.

The information submitted is true and original to the best of my knowledge.

Place: Mulund

Date :


CERTIFICATE

This is to certify that the undersigned have assessed and evaluated the project on

“EVALUATION OF TRAINING AND DEVELOPMENT” submitted by AMEY MILIND

PATIL, a student of M.COM Part I. This project is original to the best of our knowledge and has

been accepted for Internal Assessment under the guidance of Mrs. SONI HASSANI.

Internal Examiner                                        Principal

External Examiner                                        Co-ordinator


ACKNOWLEDGMENT

The success and final outcome of this project required a great deal of guidance and assistance from

many people, and I am extremely fortunate to have received this support throughout the completion of my project

work. This was possible only because of such guidance, and I would like to thank all those who provided it.

I respect and thank Mrs. SONI HASSANI for giving me the opportunity to do this project work

and for providing all the guidance and support which helped me to complete my project.


INDEX

SR. NO.    PARTICULARS

1. INTRODUCTION

2. BRIEF REVIEW OF LITERATURE

3. EVALUATION OF TRAINING AND DEVELOPMENT

4. CONCLUSION

5. WEBLIOGRAPHY


CHAPTER 1

INTRODUCTION

Evaluation is hardly a new subject for discussion in the UN system and it is increasingly

becoming a critical one. The UNDP, for instance, recently reviewed the topic in depth during

regional workshops in 2006-07. Among the conclusions drawn was that: “The enabling

conditions for evaluation across UNDP remain weak. These include the variable quality of

frameworks for monitoring and evaluation, the lack of systematic results monitoring, the

strategic allocation of human and financial resources, and the mechanisms for quality assurance.

Evaluation itself continues to be poorly organized and funded across the organization.” It was

also noted that evaluation was not adequately carried out throughout the UN system. A

recommendation was made accordingly:

“All efforts should be made to ensure that the monitoring support and evaluation function is

harmonized with other UN agencies, and aligned with national systems. UNDP should promote

joint evaluations with UN agencies in particular.”

In a similar vein the 16th Meeting of Senior Fellowship Officers (Paris, 6-8 November 2006)

recommended the creation of a Task Force on Impact Assessment of Fellowships. Following a

meeting with consultants at WHO in Geneva, 28-30 April 2008, the specific objectives of the

Task Force were identified as follows:

1) To undertake a literature search, document analysis and a critical review of methods and

processes of training impact evaluation, with a view to determining the relevance of these

approaches to the assessment of the impact of UN Fellowships programmes;

2) With reference to the review of the literature and in consultation with the Task Force, to

draft a generic evaluation framework that defines the scope, dimensions and core

indicators for evaluating the impact of UN Fellowships programmes;


3) To identify necessary conditions and supportive measures to enable implementation of

the impact evaluation framework in the context of the UN Fellowships programmes and

to present the findings of this review for discussion and review at the 17th Meeting of

SFOs.

The present report, derived from a variety of sources on evaluation, takes prime responsibility for

the review and analysis of the literature on the subject (1st objective) and will also attempt to

identify organizational “measures” which could support and enhance an evaluation framework

for UN agency fellowships. Its sources are identified through a considerable number of

footnotes. The report makes little claim to original authorship and should be viewed as a

compilation and synopsis of the work of evaluation specialists.

A fairly obvious note of caution: evaluation refers to a wide range of activities, processes,

products, etc. An astounding volume of literature is devoted to training evaluation but not much

of it concerns “fellowships”. Our fellows are not trainees in the typical organizational sense and

they leave our environments after their fellowships/training so we are compelled to try to

measure their reaction, learning, behaviour, change, etc. from afar. Hence our evaluative task is

made far more difficult than it is for, say, the typical corporation, which trains its

employees and can measure the training results on the spot, as it were.


OBJECTIVES

• Goal-based evaluation begins with goals in mind and seeks to determine if those goals were

achieved;

• Goal-free evaluation does not seek to confirm or deny a pre-determined outcome or goal.

Rather, it seeks to discover any benefits that result from the intervention;

• Responsive evaluation is an approach that is based on client requirements. This can present

unique challenges for the evaluator, but it is a common approach;

• The systems approach to evaluation focuses on whether the intervention was efficient and

effective;

• Professional review evaluation uses external expert appraisal to evaluate instead of other

commonly used and accepted methods;

• The quasi-legal approach is infrequently practiced, but it uses an actual court-of-inquiry format

to present evidence, take testimonials, and evaluate an intervention or product.


METHODOLOGY:-

1. The basic theory about the fundamentals was sourced from various notes.

2. The internet proved to be of good assistance for studying the essentials of successful firms and

also for various statistics.


CHAPTER 2

REVIEW OF LITERATURE

Ramachandran (2010) has made an analytical study on the effectiveness of training programmes for

different cadres of employees working in a public sector organization. The result reveals that

employees differed in the effectiveness of the training programme on the basis of demographic

characteristics. It is also inferred that the experience and education of the employees of the organization

are predominant and determining factors in the training programme.

Nagar (2009) has viewed that training budgets are growing at a phenomenal pace as organizations

use training to build required skills, indoctrinate new employees, transform banking culture,

merge new acquisitions into the organization and build multiple skills for radically changing

jobs. The scholar has made an attempt to study the effectiveness of training programmes being

conducted by the commercial banks in public as well as in the private sector based on the

responses of their clerical level staff. The results of the study reveal that training programmes of

the respondent organizations are generally effective with respect to course duration, library

facilities, trainers, teaching and computer-aided programmes, and infrastructure facilities.

Saharan (2011) highlighted that most organizations take feedback from employees on

training effectiveness to maximize its benefits. In the ceaseless drive for a competitive edge,

companies subscribe to the belief that smarter, better trained workers increase chances for

success. The study expounds the perspectives of employees with different qualifications and

experience towards the objectives behind imparting training in organizations.

Smith (1990) viewed that evaluation of management training courses is a subject much discussed

but superficially carried out. The study finds that there is too great an emphasis on providing an

objective evaluation report and too little recognition of subjective and peculiar issues which do

not necessarily fit the frame.

Hashim (2001) has made an intensive study showing that training evaluation is an elusive concept,

especially when it comes to practice. The practice of evaluation in training has received a lot of


criticism. This criticism is largely explained by the unsystematic, informal and ad hoc evaluation

that has been conducted by training institutions.

Griffin (2010) finds that there is a mismatch between organizations' desire to evaluate training

and the extent and effectiveness of actual evaluation. There are a number of reasons for this,

including the inadequacy of current methods. The author has proposed a productivity based

framework to focus data collection and the utilization of a metric to present results. A metric

provides an ideal tool to allow stakeholders to make an informed judgment as to the value of a programme,

whether it has met its objectives and what its impact is. Most importantly, the approach focuses

on the bottom line and draws evaluators' attention to what the ultimate overall impact

of learning is.

Al-Ajlouni, Athammuh & Jaradat (2010) viewed that the evaluation of any training programme

has certain aims to fulfil. These are concerned with the determination of change in the

organizational behaviour and the changes needed in the organizational structure. The scholars assert

that evaluation of any training programme must inform whether the training programme has been

able to deliver the goals and objectives in terms of cost incurred and benefit achieved; the

analysis of the information is the concluding part of any evaluation programme. They also

stressed that the analysis of data should be summarized and then compared with the data of other

training programmes of a similar nature. On the basis of these comparisons, problems and strengths

should be identified which would help the trainer in his future training programme.

Ogundejl (1991) viewed that evaluation is increasingly being regarded as a powerful tool to

enhance the effectiveness of training. Three major approaches to training evaluation are highlighted: quality

ascription, quality assessment and quality control. In order to enhance the

effectiveness of training, evaluation should be integrated with organizational life.

Hunt & Baruch (2003) highlighted that some organizations invest a great deal of time and effort

in elaborate training programmes designed to improve the so-called soft skills of managing. Yet

assessing the effectiveness of such initiatives has been rare. Recent developments in the use of

survey feedback have provided a technique for pre and post training assessment. A study, at a


leading business school, was designed to assess the impact of interpersonal skills training on top

managers. The evaluation of training was based on subordinate feedback conducted before, and

six months after, the training programme took place. The result indicates a significant impact on some

but not all of the competencies and skills under study.

Al-Athari & Zairi (2002) have examined the current training evaluation activity and challenges

that face Kuwaiti organizations. The study reveals that the majority of respondents, both in

government and in private sectors, only evaluate their training programme occasionally. The

most popular evaluation tools and techniques used by the government and private sectors were

questionnaires. The most common model used by Kuwaiti organizations is the Kirkpatrick

model, while the most common level of evaluation for both the government and private sectors is

the reaction type.

Iyer, Pardiwalla & Bathia (2009) briefly explore the various methods of training evaluation to

understand the need for training evaluation, with emphasis on Kirkpatrick's model. The scholars

concluded that although there are different methods to evaluate training, training evaluation

is still the weakest and most underdeveloped aspect of training. Although evaluation remains a grey

area, every organization has to move towards evaluating return on investment and behaviour in

its training programme in order to justify the investment made in training as well as to improve

the training process.

Gopal (2008) examines the evaluation of effectiveness of executive training programmes in

Electronic of India Ltd. The scholar carried out evaluation of training in two ways: (1) individual

programme-wise evaluation and (2) overall evaluation of all programmes. The evaluation of

training provides useful feedback to the training professional and to management, helping them

make the next programme more appropriate and effective. Therefore evaluation of

training is not exactly the end point in the training function. In fact it is a starting point.

Blanchard et al. (2000) studied training evaluation practices at both management and non-

management levels in Canada through a survey. The survey data indicated that only one-fifth of the


Canadian organizations evaluated their training as suggested by academic standards. The

researchers presented the practitioner perspective as a supporting rationale for the survey results.

Ogunu (2002) in his study titled “Evaluation of Management Training and Development

Programme of Guinness Nigeria PLC” examined the management training and development

programme of Guinness Nigeria PLC, Benin City with a view to ascertaining its relevance,

adequacy, and effectiveness. Hypotheses testing in the study revealed that facilities for staff

training were adequate for effective training of management staff, training programme for

management staff were relevant to the jobs they performed, and the training programme

undergone by staff did indeed improve their performance and effectiveness at work.

Srivastava et al. (2001) evaluated the effectiveness of various training programmes offered by

the in-house training centre of Tata Steel, Shavak Nanavati Training Institute (SNTI), India.

Effectiveness of training was measured in terms of various outcomes such as satisfaction level,

reaction and feedback of participants, and change in performance and behaviour as perceived by

participants, their immediate supervisors, and departmental heads. It was found that the

satisfaction level of participants, their superiors and divisional heads were above average for all

types of programmes. The participants benefited from the programmes, but the transfer of

learning was not as expected by the supervisors.


CHAPTER 3

II. What is evaluation?

1. Definition

Evaluation of training and fellowships in the UN system has apparently not been carried out at a

level that will adequately measure the impact of training/fellowships or other performance

improvement interventions, at least if one is to judge from the frequently expressed frustration by

the UN agencies’ senior management and by the agencies’ constituents. Yet, systematic

evaluation can provide the information needed for continuous improvement. Moreover, today

managers are no longer satisfied with knowing how many fellows underwent training, how they

liked it, and what they learned. Increasingly managers want to know if the fellows are using what

they learned, and – most importantly – what, if any, institutional results were improved.

In any review of evaluation it is first essential to define the term itself, as well as its stakeholders

and its goals. Then an analysis can be made of the various types of evaluation and major

models/methodologies commonly applied to measure impact.

Probably the most frequently given definition is:

Evaluation is the systematic assessment of the worth or merit of some object.

The definition is hardly perfect. There are many types of evaluations that do not necessarily

result in an assessment of worth or merit – descriptive studies, implementation analyses, and

formative evaluations, to name a few. Better perhaps is a definition that emphasizes the

information-processing and feedback functions of evaluation.

For instance, one might say:

Evaluation is the systematic acquisition and assessment of information to provide useful

feedback about some object.


For the American Evaluation Association, evaluation involves assessing the strengths and

weaknesses of programmes, policies, personnel, products and organizations to improve

their effectiveness.

2. Dimensions

Evaluation literature refers to the “dimensions of evaluation” as process, outcome and impact.

These concepts are fundamental and we will return to them in other contexts more fully.

• Process evaluations

Process Evaluations describe and assess programme materials and activities. Establishing the

extent and nature of programme implementation is an important first step in studying programme

outcomes; that is, it describes the interventions to which any findings about outcomes may be

attributed. Outcome evaluation assesses programme achievements and effects.

• Outcome evaluations

Outcome Evaluations study the immediate or direct effects of the programme on participants.

The scope of an outcome evaluation can extend beyond knowledge or attitudes, however, to

examine the immediate behavioural effects of programmes.

• Impact evaluations

Impact Evaluations look beyond the immediate results of policies, instruction, or services to

identify longer-term as well as unintended programme effects. Very useful reports on this subject

have notably been made by the Center for Global Development, and by Deloitte Insight

Economics.


3. Goals

The generic goal of most evaluations is thus to provide useful feedback to a variety of audiences

including sponsors, donors, client-groups, administrators, staff, and other relevant constituencies.

Most often, feedback is perceived as “useful” if it aids in decision-making. But the relationship

between an evaluation and its impact is not a simple one – studies that seem critical sometimes

fail to influence short-term decisions, and studies that initially seem to have no influence can

have a delayed impact when more congenial conditions arise. Despite this, there is broad

consensus that the major goal of evaluation should be to influence decision-making or policy

formulation through the provision of empirically-driven feedback.

4. Approaches

An evaluation approach is a general way of looking at or conceptualizing evaluation; the main

evaluation approaches according to Paul Duignan (“Introduction to Strategic Evaluation”)

include notably:

• Utilisation-focused evaluation – determines methods on the basis of what is going to be most

useful to different audiences;

• Empowerment evaluation – emphasises that the evaluation process and methods should be

empowering to those who are being evaluated;

• Stakeholder evaluation – looks at the differing perspectives of all of a programme’s

stakeholders (those who have an interest in it);

• Goal-free evaluation – in which the evaluator’s task is to examine all of the outcomes of a

programme, not just its formal outcomes as identified in its objectives;

• Naturalistic or 4th generation evaluation – emphasises the qualitative uniqueness of

programmes and is a reaction against the limitations of quantitative evaluation approaches;


• Theory based evaluation – puts an emphasis on detailing the assumptions on which a

programme is based (intervention logic) and follows those steps to see if they occur;

• Strategic evaluation – emphasises that evaluation design decisions should be driven by the

strategic value of the information they will provide for solving social problems.

5. Purposes

There are various ways of describing the purposes of evaluation activity, e.g. design,

developmental, formative, implementation, process, impact, outcome and summative. The

evaluation purpose is best understood as identifying what evaluation activity is going to be used

for. Recent years have seen evaluation move to develop types of evaluation that are of use right

across a programme lifecycle. It should be noted that any particular evaluation activity can have

more than one purpose.

These evaluation terms are used in various ways in the evaluation literature. A common

way of defining them is as follows (see Duignan):

• Design, developmental, formative, implementation – evaluative activity designed to improve

the design, development, formation and implementation of a programme;

• Process – evaluation to describe the process of a programme. Because the term process could

conceivably cover all of a programme from its inception to its outcomes, it is conceptually useful

to limit the term process evaluation to activity describing the programme during the course of the

programme, i.e. once it has been initially implemented;

• Impact, outcome and summative – looking at the impact and outcome of a programme, and in

the case of summative, making an overall evaluative judgment about the worth of a programme.

The purposes of evaluation also relate to the intent of the evaluation:


• Gain insight – provide the necessary insight to clarify how programme activities should be

designed to bring about expected changes;

• Change practice – improve the quality, effectiveness, or efficiency of programme activities;

• Assess effects – examine the relationship between programme activities and observed

consequences;

• Affect participants – use the processes of evaluation to affect those who participate in the

inquiry. The systematic reflection required of stakeholders who participate in an evaluation can

be a catalyst for self-directed change. Evaluation procedures themselves will generate a positive

influence.

6. Types

The most basic difference is between what is known as the formative and the summative types of

evaluation. In more recent years the concepts of confirmative and meta evaluation have received

much attention as well.

Formative evaluations strengthen or improve the object being evaluated – they help form it by

examining the delivery of the programme or technology, the quality of its implementation, and

the assessment of the organizational context, personnel, procedures, inputs, and so on.

Summative evaluations, in contrast, examine the effects or outcomes of some object – they

summarize it by describing what happens subsequent to delivery of the programme or

technology; assessing whether the object can be said to have caused the outcome; determining

the overall impact of the causal factor beyond only the immediate target outcomes; and,

estimating the relative costs associated with the object.

i. Formative

Formative evaluation includes several evaluation types:

• Needs assessment determines who needs the programme, how great the need is, and what

might work to meet the need;


• Evaluability assessment determines whether an evaluation is feasible and how stakeholders can

help shape its usefulness;

• Structured conceptualization helps stakeholders define the programme or technology, the

target population, and the possible outcomes;

• Implementation evaluation monitors the fidelity of the programme or technology delivery;

• Process evaluation investigates the process of delivering the programme or technology,

including alternative delivery procedures.

ii. Summative

Summative evaluation can also be subdivided:

• Outcome evaluations investigate whether the programme or technology caused demonstrable

effects on specifically defined target outcomes;

• Impact evaluation is broader and assesses the overall or net effects – intended or unintended –

of the programme or technology as a whole;

• Cost-effectiveness and cost-benefit analysis address questions of efficiency by standardizing

outcomes in terms of their dollar costs and values;

• Secondary analysis re-examines existing data to address new questions or use methods not

previously employed;

• Meta-analysis integrates the outcome estimates from multiple studies to arrive at an overall or

summary judgement on an evaluation question.

iii. Confirmative

Confirmative evaluation goes beyond formative and summative evaluation; it moves traditional

evaluation a step closer to full-scope evaluation. During confirmative evaluation, the evaluation


and training practitioner collects, analyzes, and interprets data related to behaviour,

accomplishment, and results in order to determine the continuing competence of learners or the

continuing effectiveness of instructional materials and to verify the continuous quality

improvement of education and training programmes. While formative and summative

evaluations comprise two initial levels, confirmative evaluation assesses the transfer of learning

to the “real world”:

a) Level one: evaluate programmes while they are still in draft form, focusing on the needs

of the learners and the developers;

b) Level two: continue to monitor programmes after they are fully implemented, focusing on

the needs of the learners and the programme objectives;

c) Level three: assess the transfer of learning to the real world. Even level four of

Kirkpatrick’s four levels of evaluation is confirmative evaluation by another name. Level

four measures the results of training in terms of change in participant behaviour and

tangible results that more than pay for the cost of training.

iv. Meta

Formative, summative, and confirmative evaluation are all fodder for meta evaluation. Meta

evaluation is all about evaluating the evaluation. The evaluator literally zooms in on how the

evaluation was conducted. The purpose of meta evaluation is to validate the evaluation inputs,

process, outputs, and outcomes. It serves as a learning process for the evaluator and makes the

evaluators accountable.

There are two types of meta evaluation: type one and type two. Type one meta evaluation is

conducted concurrently with the evaluation process. It is literally a formative evaluation of

evaluation. Type two meta evaluation is the more common approach. It is conducted after


formative, summative, and at least one cycle of confirmative evaluation is completed. Some

evaluation specialists have also defined the several types of evaluation more thematically, as

below.

v. Goal-based

Goal-based evaluations assess the extent to which programmes are meeting

predetermined goals or objectives. Questions to ask when designing an evaluation to see if the

goals have been reached include:

- How were the programme goals (and objectives, if applicable) established?

- Was the process effective?

- What is the status of the programme’s progress toward achieving the goals?

- Will the goals be achieved according to the timelines specified in the programme

implementation or operations plan? If not, then why?

- Do personnel have adequate resources (money, equipment, facilities, training, etc.) to achieve

the goals?

vi. Process-based

Process-based evaluations are geared to fully understanding how a programme works – how does

it produces the results that it does. These evaluations are useful if programmes are long-standing

and have changed over the years, employees or customers report a large number of complaints

about the programme, there appear to be large inefficiencies in delivering programme services

and they are also useful for accurately portraying to outside parties how a programme operates.

There are numerous questions that might be addressed in a process evaluation. These questions

can be selected by carefully considering what is important to know about the programme.

Examples of questions include:

- On what basis do employees and/or the customers decide that products or services are needed?


- What is required of employees in order to deliver the product or services?

- How are employees trained about how to deliver the product or services?

- How do customers or clients come into the programme?

- What is required of customers or clients?

vii. Outcomes-based

Evaluation with an outcomes focus is increasingly important for nonprofits and asked for by

funders. An outcomes-based evaluation tries to ascertain if the organization is really doing the

right programme activities to bring about the outcomes it believes to be needed by its clients.

Outcomes are benefits to clients from participation in the programme. Outcomes are usually in

terms of enhanced learning (knowledge, perceptions/attitudes or skills) or conditions, e.g.

increased literacy, self-reliance, etc. Outcomes are often confused with programme outputs or

units of services, e.g. the number of clients who went through a programme.


3.2 Training Evaluation: Purpose And Need

An evaluation of a training programme can help an organisation meet different goals during the

life of a training programme. Evaluation of a training programme has two basic aims –

assessing training effectiveness, and using it as a training aid. The primary aim of evaluation is to

improve training by discovering which training processes are successful in achieving their stated

objectives. Since evaluation affects learning, it can also be put to use as a training aid

(knowledge of results facilitates good learning). The other purposes of training evaluation include

the following:

To determine the effectiveness of the different components of training and development

programme (e.g. contents, training aids, facilities and environment, programme schedule,

presentation style, the instructor etc.)

To decide who (number and type of potential participants) should participate in future

programmes.

To gather information that will help in designing more effective future

programmes.

To assess the extent to which the learning acquired is transferred to the

job.

To determine whether the training programme maps to the needs of the trainees.

Bramley and Newby (1984) identify four main purposes of evaluation.

to relate training to organisational activities and to consider cost

effectiveness;

to examine the relationships between learning, training and transfer of training to the

job;


3.3 Models For Training Evaluation

There are different models to evaluate training; still, training evaluation is the weakest and most

underdeveloped aspect of training. There are a number of issues which lead to the neglect of

evaluation, as well as issues faced in the course of evaluation. It causes expenses that can be ill afforded

in a constrained financial environment, and it also takes time to practise (Iyer, 2009). There are several

reasons for underdeveloped evaluation. They are: evaluation means different things to different

people; it is perceived to be a difficult, tedious and time-consuming task which trainers do not like

to pursue; people tend to assume the training will simply work; and trainers feel threatened by the

prospect of an objective evaluation of training and its outcomes (Sims, 1993). The scholar states that

the main reasons for failure of evaluations are: inadequate planning, lack of objectivity,

evaluation errors of some sort, improper interpretation and inappropriate use of results. Other

issues are failure to train the evaluators in the techniques of evaluation, inappropriate data

gathering instruments and a focus on unimportant details.

Different models are used by organisations to evaluate training effectiveness according to the

nature and budgets of the business. Some of the commonly used models are as follows:

A. Kirkpatrick Model: This model was introduced in 1959 by Donald Kirkpatrick. It

is a very popular model which focuses on measuring four kinds of outcomes, or

outcomes at four levels, that should result from a highly effective training programme.

Kirkpatrick (1977) divided the evaluation model into four parts: reaction; learning; behaviour

and results. Reaction would evaluate how participants feel about the programme they attended.

The learning would evaluate the extent to which the trainees learned the information and skills,

the behaviour would evaluate the extent to which their job behaviour had changed as a result of

attending the training. The results would evaluate the extent to which the results have been

affected by the training programme. According to a survey by the American Society for Training

and Development (ASTD), the Kirkpatrick four-level evaluation approach is still the most

commonly used evaluation framework among Benchmarking Forum companies (Bassi &

Cheney, 1997). The main strength of the Kirkpatrick evaluation approach is the focus on

behavioural outcomes of the learners involved in the training (Mann & Robertson, 1996).


Reaction level: Programme evaluation involves two general approaches – formative evaluation,

also known as internal evaluation, and summative evaluation, also known as external evaluation. Likewise,

reaction evaluation is a type of formative evaluation when the results are used for programme

modification and the redesign of contents, course material and presentations (Antheil & Casper,

1986; Robinson & Robinson, 1989). Reaction can also be summative in nature. In such cases, the

goal of reaction evaluation is to determine the value, effectiveness or efficiency of a training

programme (Smith & Brandenburg, 1991) and to make decisions concerning programme

continuation, termination, expansion, modification or adoption (Worthen & Sanders, 1987).

Summative evaluation provides programme decision-makers and potential customers with

judgements about a programme's worth or merit (Worthen, Sanders & Fitzpatrick, 1997).

The main purpose of reaction evaluation is to enhance the quality of training programmes, which

in turn leads to improved performance, by measuring the participants' reactions to the training

programme. This should be measured immediately after the programme. Level one evaluation

should not just include reactions towards the overall programme (e.g. did you like the

programme?); it should also include measurement of participants' reactions or attitudes towards

specific components of the programme, such as the topics, contents, methodology, instructor etc.

Examples of evaluation tools and methods suggested by scholars are typically happy sheets,

feedback forms based on subjective reactions to the training experience, verbal reactions which

can be noted and analysed, post-training surveys or questionnaires, online evaluation or grading

by delegates, and subsequent verbal or written reports given by delegates to managers once back at

their jobs, etc.

Learning level: Evaluation at this level seeks to differentiate between what participants already knew

prior to training and what they actually learned during the training programme (Jeng & Hsu, n.d.).

In other words it can be said that learning evaluation is the measurement of the increase in the

knowledge or intellectual capability from before to after the learning experience. Learning

outcomes can include changes in knowledge, skills or attitudes. Some training events will

emphasize knowledge, some will emphasize skills, some will emphasize attitudes and some will

emphasize multiple learning outcomes. The evaluation should focus on measuring what was

covered in the training events, i.e. the learning objectives. So this level's questions will have a pre-

test before the practicum and a post-test after the practicum. Tools and methods which can be used in

evaluating the learning level are assessments or tests before and after the training; interviews or

observations can also be used before or after, although these are time-consuming and can be

inconsistent.
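
As a simple illustration of how such pre- and post-test scores are commonly read (hypothetical figures used purely for illustration, not drawn from any programme discussed in this project): if a trainee scores 45 out of 100 on the pre-test and 75 out of 100 on the post-test, the raw learning gain is 75 - 45 = 30 marks, and the gain as a share of what could still have been learned is 30 / (100 - 45), roughly 55 per cent, which gives a rough indication of how much of the remaining learning gap the programme closed.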

Behaviour level: Behaviour evaluation is the extent to which the trainees applied the learning

and changed their behaviour; this can be immediately or several months after the training,

depending on the situation. This level of evaluation seeks to measure the transfer that has occurred

in the learner's job behaviour/job performance due to the training programme. This performance

testing is meant to indicate the learner's ability to apply what he has learned in the classroom. It involves

testing the participants' capabilities to perform learned skills while on the job, rather than in the

classroom. Change in job behaviour is difficult to measure because people change in

different ways at different times, and it is also more difficult to quantify and interpret than reaction and

learning evaluation. Observation and interviews over time are required to assess the change, relevance

of change and sustainability of change in the behaviour of participants. The opinions of the trainees, in

the form of self-assessment and 360-degree feedback, are useful methods to evaluate this level.

Result level: Result level evaluation is the effect on the business or environment resulting from

the improved performance of the trainee. Level four outcomes are not limited to return on training

investment (ROI). They can also include other major results that contribute to the well-functioning

of an organisation; they include any outcome that most people would agree is “good for the

business”. Outcomes are either changes in financial outcomes (such as positive ROI or increased

profits) or changes in the variables that should have a reliably direct effect on financial

outcomes at some point in the future.

The intention at this level is to assess the cost vs. benefits of the training programme, i.e.

organisational impact in terms of reduced costs, improved quality of work, higher productivity,

reduction in turnover, improved human relations, increased sales, fewer grievances, lower

absenteeism, higher work morale, fewer accidents, greater job satisfaction, etc. Collecting,

organising and analysing level four information can be difficult, time consuming and more costly

than the other three levels, but the results are often quite worthwhile when viewed in the full

context of its value to the organisation.


B. CIPP Evaluation Model: The CIPP model of programme evaluation was developed by Daniel L.

Stufflebeam (1983). It refers to the four phases of evaluation: context evaluation, input

evaluation, process evaluation and product evaluation. It is based upon the view that the most

important purpose of evaluation is to improve the functioning of a programme.

Context evaluation: It involves evaluation of training and development needs analysis and

formulating objectives in the light of these needs. It is aimed at determining the extent to which

the goals and objectives of the programme matched the assessed needs of the organisation, and

whether the needs assessment accurately identified an actual and legitimate need of the organisation

and the relevant work culture. Context evaluation is part and parcel of the work undertaken by

employees of an organisation.

Input Evaluation: Input evaluation involves an examination of the intended content of the

programme. It is designed to assess the extent to which programme strategies, procedures, and

activities support the goals and objectives identified in the needs assessment and context

evaluation. An input evaluation is therefore an assessment of the programme's action plan. Such

an evaluation helps in prescribing the specific activities, strategies and procedures, and helps to

ensure that the best approach has been chosen in terms of the assessed needs and the goals and

objectives that have been identified. It involves evaluation of the policies, budgets,

schedules and procedures determined for organising the programme.

Process Evaluation: A process evaluation is a critical aspect of programme implementation. It

involves evaluation of preparation of reaction sheets, rating scales and analysis of relevant

records (Prasad, 2005). Process evaluation is a continual assessment of the implementation of the

action plan that has been developed by organisation. It is an ongoing and systematic monitoring

of the programme. A process evaluation provides information that can be used to guide the

implementation of programme strategies, procedures and activities as well as a means to identify

successes and failures. The objectives of process evaluation are:

to determine whether activities are on schedule, are being carried out as planned, and are using time and resources in an

efficient manner;

to provide a basis for modifying the programme plan where necessary, particularly since not all aspects of the plan can be anticipated or planned in advance;

to assess the extent to which programme personnel are performing their roles and

carrying out their responsibilities;

to determine how the actual implementation of the programme

compares to what was intended.

Product evaluation: It involves measuring and interpreting the attainment of training and

development objectives. In other words it can be said that the purpose of product evaluation is to

measure, interpret and judge the extent to which an organisation's improvement efforts have

achieved their short term and long term goals. It also examines both intended and unintended

consequences of improvement efforts.

C. CIRO approach: In 1970, the CIRO model for the evaluation of managerial training was

proposed (Warr, Bird & Rackham, 1970). This model was based on the evaluation of four aspects

of training: context, input, reaction and outcomes. According to Tennant, Boonkrong and

Roberts (2002), the CIRO model focuses on measurement both before and after the training has

been carried out. The main strength of the CIRO model is that the objectives (context) and the

training equipment (input) are considered. Context Evaluation focuses on factors such as the

correct identification of training needs and the setting of objectives in relation to the

organisation's culture and climate. Input evaluation is concerned with the design and delivery of

the training activity. Reaction evaluation looks at gaining and using information about the quality

of training experience. Outcome evaluation focuses on the achievement gained from the activity

and is assessed at three levels: immediate, intermediate and ultimate evaluation. Immediate

evaluation attempts to measure changes in knowledge, skills or attitude before a trainee returns


to the job. According to Santos and Stuart (2003) intermediate evaluation refers to the impact of

training on the job performance and how learning is transferred back into the workplace. Finally,

ultimate evaluation attempts to assess the impact of training on departmental or organisational

performance in terms of overall results.

D. Phillip’s Evaluation approach: In the past decade, training professionals have been

challenged to provide evidence of how training financially contributes to business. Phillips

(1996) suggested adding another level to Kirkpatrick's four-level evaluation approach to

calculate the return on investment (ROI) generated by training. According to James and Roffe

(2000), Phillips's five-level evaluation approach translates the worth of training into monetary

value which, in effect, addresses ROI. Phillips's framework provides trainers with a logical framework

to view ROI both from human performance and business outcomes perspectives. However, the

measurement goes further, comparing the monetary benefit from the programme with its costs.

Although the ROI can be expressed in several ways, it is usually presented as a percent or

cost/benefit ratio. While almost all HRD organisations conduct evaluations to measure

satisfaction, very few actually conduct evaluations at the ROI level, perhaps because ROI is

often characterised as a difficult and expensive process.
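
As an illustration of the cost/benefit arithmetic involved (hypothetical figures used only for illustration, not taken from any programme discussed here): if a training programme costs Rs. 2,00,000 in total and the monetary benefits attributed to it over the evaluation period amount to Rs. 3,00,000, the benefit-cost ratio is 3,00,000 / 2,00,000 = 1.5, while the ROI expressed as a percentage is the net benefit divided by the cost, i.e. ((3,00,000 - 2,00,000) / 2,00,000) x 100 = 50 per cent.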

Since Kirkpatrick established his original model, other theorists, and indeed Kirkpatrick himself,

have referred to a fifth level, namely ROI (Return on Investment). But ROI can easily be included

in Kirkpatrick's original fourth level, 'Results'. The inclusion of a separate fifth level is

therefore arguably justified only if the assessment of return on investment might otherwise be

ignored or forgotten when referring simply to the 'Results' level.

There are some other training evaluation approaches and models. As discussed earlier,

training evaluation itself is a relatively neglected part of training and development, and these methods have

a theoretical side but little practical application. So these models are not discussed in detail.

These are: the Training Validation System (TVS) approach (Fitz-Enz, 1994), the Input, Process, Output/

Outcome (IPO) Model (Bushnell, 1990), Hassett's training investment analysis approach,

Kaufman's five-level evaluation model, Mahapatra and Lai (2005) and Sadri and Snyder (1995).


Hassett's training investment analysis approach focuses on four important areas and

measures training effectiveness through needs analysis, information gathering, analysis and

dissemination. Kaufman's five-level evaluation model extends the scope of training impact

evaluation beyond the organisation; it includes how training benefits society and the

environment surrounding the organisation.


3.4 EMPLOYEE TRAINING IN DHL

Employee training is essential for an organization’s success. Despite the importance of training,

a trainer can encounter resistance from both employees and managers. Both groups may claim

that training is taking them away from their work. However, a trainer can combat this by

demonstrating that training is actually a crucial part of employees’ and managers’ work.

Why Employee Training Is Important

Training is crucial because it:

Educates workers about the effective use of technology,

Ensures competitive edge in the market,

Promotes safety and health among employees,

Creates opportunities for career development and personal growth, an important factor in

retaining workers

Helps employers comply with laws and regulations, and

Improves productivity and profitability.

Laws that Require Employee Training

There are several federal laws for which employee training is either required or recommended.

One law under which there are a series of training requirements is the Occupational Safety and

Health Act. Two areas of federal law in which training is recommended are sexual harassment

and ethics.

One reason training employees and supervisors on the subject of sexual harassment is

recommended is because of a recent Supreme Court ruling. In the decision, the court said an

employer can be held liable for sexual harassment if the organization failed to exercise


reasonable care to prevent and promptly correct any such behavior in the workplace. An

employer's responsibility to exercise reasonable care includes ensuring that its supervisors and

managers understand their responsibilities under the organization's anti-harassment policy and

complaint procedure.

Training can also reduce an employer’s liability if an employee is found guilty of criminal

misconduct. Under the Federal Sentencing Guidelines, providing employees with compliance

and ethics training is one of the 7 requirements for an employer to demonstrate that it has an

effective compliance and ethics program. An organization that has an effective compliance and

ethics program can reduce its fines for a criminal conviction by as much as 90 percent, according

to the Federal Sentencing Commission.

Besides greater legal exposure, employers with thin or nonexistent training programs often see

other negative results. The Bureau of Labor Statistics, for example, has found that employers

with high employee turnover train less and spend less on training than other organizations.

BLR’s Employee Training Center has more than 60 courses to help you train employees on a

wide range of HR topics. Training subjects include:

Sexual Harassment (Employees and Supervisors)

Business Ethics

Diversity

The Family and Medical Leave Act

Managing Challenging Employees

Customer Service Skills

Workplace Safety


Training Courses

Becoming a Leader: How to Prepare for a Leadership Role

If you want to be a leader in the workplace, you need to prepare to assume leadership roles. You

must go from being a follower to being the one who guides, directs, motivates and manages. This

transition takes time, experience, skills, and commitment. But if you want to lead others, you

can. This session will tell you how to be ready when the right opportunity comes along.

Blasting and Explosives Safety

Don’t miss this all-new TrainingToday course on the essentials of working safely with

explosives and blasting guidelines. You’ll learn OSHA’s rules and how to keep yourself and your

worksite out of harm’s way.

Crash Course in Leadership Skills

This online course in leadership skills conveys that, to be an effective leader, you need to learn

and practice certain skills addressed in this training session. The bottom line is—anyone can be a

leader—and this course shows you how.

Dipping, Coating, and Cleaning Operations

Do you know the signs of potentially dangerous damage to a dip tank? Are you wearing

appropriate PPE when handling flammable liquids? With this all-new TrainingToday course,

you'll learn precautions to protect health and safety.

Diversity for All Employees

Diversity in the workplace means having a group of employees with a wide range of different

backgrounds in terms of race, age, gender, and other characteristics. This online diversity

training course will teach employees to support diversity in the organization. At the end of the

training session, you will be able to identify how we are diverse, understand the challenges and

opportunities of workplace diversity, help avoid discrimination, and follow company policy.

Excavation Safety for Construction Workers


Are you following preventative measures legally required for excavation work? Learn your

employer's obligations and your responsibilities to ensure a safe working environment with this

all-new course from TrainingToday.

Lean Project Management

Easier... better... faster... cheaper... These are the guiding principles of lean project management.

Learn the secrets to eliminating project waste, while creating value, with the new Lean Project

Management course from TrainingToday.

New Employee Safety Orientation Training

This online new employee safety training course will teach employees to understand their role in

the company's safety and health program, including security procedures, and safety information.

This course covers topics important to employee orientation including company safety

newsletters, bulletin boards, safety committee members, and labels or material safety data sheets.

Powered Platform Safety

Working on a platform is just as safe as any day in the office, IF you follow proper safety

procedures. Learn how to prevent, and if necessary, respond to emergencies, with this all-new

TrainingToday course.

Project Management Planning

Now there's one go-to resource for project management planning for managers and employees:

The Project Management Planning course from TrainingToday's all-new online Project

Management for Business library.

Project Management Stakeholders

Do you know how to identify your project’s key stakeholders? After you've completed the

Project Management Stakeholders course from TrainingToday, you'll be able to use stakeholder

analysis to spot potential issues, motivate participants, and ensure your project's success.

Project Management: Troubleshooting


Can you spot potential project roadblocks before they occur? Learn effective troubleshooting

techniques and make sure your project gets done within its timeline with this new online course

from TrainingToday.

Project Quality Management

Unless you have the right pieces in place, there's no way to ensure the quality of your new

project. With TrainingToday's new Project Quality Management course, you'll learn the right

steps for incremental and continuous improvements -- ensuring your project exceeds

stakeholders' expectations.

Project Risk Management

Does your new project include a risk analysis? A thorough risk assessment is key to ensuring

your project’s success. Learn effective risk management best practices with this all-new course

from TrainingToday.

Refueling Equipment

The costs of accidents from improper refueling can be monumental. While the financial costs can

be huge, the cost of human life and to the surrounding environment can be irreparable. The safe

refueling procedures covered in this TrainingToday course will help eliminate these costs.

Stress Management

This online stress management training course helps employees better manage stress. Too much

stress is one of the most common causes of health problems. It can also cause mental distress that

leads to serious illness and to distractions that can jeopardize safety on and off the job. This

course helps trainees identify the causes of stress, recognize the different types of stress,

understand how stress affects them, and manage stress effectively both on and off the job.

Training the Trainer: Effective Techniques for Dynamic Training

This course discusses effective training in all its stages, from assessing the needs at your

workplace to developing a culture where training is ongoing and seen as an essential part of

every job.


Welding and Cutting Safety For Construction Workers

Don't wait for an accident to implement proper welding and cutting safety procedures. Learn

how to identify hazards and prevent fires and injuries with this all-new TrainingToday course.

Working Safely Near Power Lines

Before you begin work near power lines, learn preventative measures to stay safe and get tips on

what to do if there is an emergency with this all-new course from TrainingToday.

Training Resources

Developing a Training Plan for Legal Compliance

Training topics may include general skills such as literacy, technical skills, orientation about the

organization, as well as programs designed to prevent lawsuits, audits, and fines, such as sexual

harassment training, safety training, and ethics training.

How to Prepare for Training

There are several major steps in preparing a training session, including the importance of

promoting the program to top management, preparing training materials, the training space,

trainers, and trainees. The most successful training sessions are well-planned and well-prepared.

When and How to Outsource Training

Do you have the resources and the qualified personnel to accomplish your training goals? You

may find that after analyzing your company’s training needs you don’t have the best training

materials or most qualified personnel in house. The United States has many employment laws

designed to protect employees and to guarantee a workplace that is fair and equal for all. Most of

these laws spell out what employers can and cannot do with regard to employment decisions, but

they do not include specific


Related Training Topics

Online Employee Training

A growing number of employers are turning to online employee training for a hands-on,

interactive way for employees to learn. More economical in both time and money than

conventional training, this form of training has become more and more popular as Internet

technology has improved.

Online Safety Training

OSHA believes that computer-based training programs can be used as part of an effective safety

and health training program to satisfy OSHA training requirements, provided that the program is

supplemented by the opportunity for trainees to ask questions of a qualified trainer, and provides

trainee


CHAPTER 4

Conclusions

Training evaluation is the most important aspect of training and development. It is a subject

which has been much discussed but superficially carried out. There are various reasons for this,

which have been discussed earlier. One of the main reasons is that all the models are descriptive and

subjective in nature; their indicators for evaluating training and development are not clearly given

and explained. From the above discussion it has been found that the Kirkpatrick model is the most widely

used model at the reaction level, but here too what the main indicators should be at the reaction

level and at the other levels is not explained properly. So, after discussing the models for

evaluating training and development at length, it can be suggested that there are enough models for training

evaluation. They should be further refined by specifying their main indicators and explaining each

issue properly, so that evaluation of training and development can be carried out with

greater effectiveness.


CHAPTER 5:

WEBLIOGRAPHY:-

WWW.GOOGLE.COM

http://www.ips.org.pk/