Tieka Wilkins Chapter 12 Designing and Conducting Summative Evaluations




Tieka Wilkins Chapter 12

Designing and Conducting Summative Evaluations

Background

Objective

• Designers need to know whether the skills learned in instruction are actually transferred to the performance setting.

• If the skills learned are transferred, they need to know whether using them solves the original problem. If not, and without other benefits to the organization (unanticipated), the utility of the instruction should be questioned.

Two phases of summative evaluation

• Expert judgment: determines whether currently used instruction or other candidate instruction has the potential for meeting an organization's defined instructional needs.

• Field trial: documents the effectiveness of promising instruction with target group members in the intended setting.

Expert Judgment Phase of Summative Evaluation

1. Congruence Analysis

2. Content Analysis

3. Design Analysis

4. Feasibility Analysis

5. Current User Analysis

Congruence Analysis

• To perform the congruence analysis, you should first obtain a clear description of the organization's needs, which includes an accurate description of the entry skills and characteristics of the target learners. After obtaining this information, you should locate instructional materials that have potential for meeting the organization's needs.

Content Analysis

• One strategy would be to provide the experts with copies of all candidate materials and ask them to judge the accuracy and completeness of the materials for the organization's stated goals.

• A better, more cost-effective strategy would be to work with the expert(s) to produce an instructional analysis of the stated goal.

• The document the expert(s) produces should include both the goal analysis and the subordinate skills analysis.

• A framework that identifies and sequences the main steps and subordinate skills in the goal would be a valuable standard against which you can evaluate the accuracy and completeness of any candidate materials.

Design Analysis

• Evaluate the adequacy of the components of the instructional strategy included in the candidate materials.

• Although the basic components of the strategy do not change, you may want to adopt criteria related to each component based on the type of learning outcomes addressed in the materials.

• The evaluator's response format can also be expected to vary based on the nature of the instruction.

Feasibility Analysis

• The third area of questions about the instructional materials relates to the utility of the candidate materials.

• For each set, you should consider such factors as the availability of a learner guide or syllabus and an instructor's manual.

• Factors related to the durability of the materials are another consideration.

• Another is any special resources, such as instructor capabilities, equipment, or environments (e.g., learning centers) that are required.

• A utility concern is whether the materials require group or individual pacing.

Current User Analysis

• One other analysis that you may wish to include in your design is to seek additional information about the candidate materials from organizations that are experienced in using them.

• The names of current users can often be obtained from the publishers of the materials.

• Another type of information relates to the instructor's perceptions of the materials.

The Field Trial Phase

Types:

1. Outcomes Analysis

2. Management Analysis

Outcomes Analysis

Impact on Learners – are learners achieving satisfactory results?

Impact on Job – are learners able to connect what they are learning with how they can apply it on the job? Is it applicable?

Impact on Organization – are learners’ attitudes and behaviors helping the organization achieve a positive difference?

Management Analysis

Are instructor/manager attitudes satisfactory?

Is implementation doable? Realistic?

Are costs (time, personnel, equipment, and resources) realistic?

Phases

• Planning for the evaluation

• Preparing for the implementation

• Implementing instruction and collecting data

• Summarizing and analyzing data

• Reporting results

Reporting Results

Prepare a report of the summative evaluation findings that includes

Summary

Background

Description of Evaluation Study

Results

Discussion

Conclusion and Recommendations

• The purpose of formative evaluation is to improve instruction by gathering data for revisions.

• The purpose of summative evaluation is to prove the worth of the instruction, given that it will not be revised.

• Summative evaluations are conducted to make decisions about whether to maintain or adopt instruction.

• The primary evaluator in a summative evaluation is rarely the designer or developer of the instruction; the evaluator is frequently unfamiliar with the materials, the organization requesting the evaluation, or the setting in which the materials are evaluated.

• Such evaluators are referred to as external evaluators. They are preferred for summative evaluations because they have no personal investment in the instruction and are likely to be more objective about its strengths and weaknesses.