Assessing Learning Objects: Do They Make Learning and Teaching Better, Faster, Cheaper?

Diane J. Goldsmith
Dean of Planning, Research, and Assessment
Connecticut Distance Learning Consortium




Carol Twigg’s Critique

“MERLOT claims to have 7,000 or so learning objects in a database. But if these learning objects haven’t been evaluated in terms of whether or not they increase student learning, you then just have 7,000 sort of mildly interesting things collected in a database.” Carol Twigg in an interview http://www.educause.edu/pub/er/erm04/erm0443.asp?bhcp=1

Assessment is Value Driven; Assessment Exists in a Context

Assessment compares what exists with what ought to exist.

Therefore, you have to know what ought to exist: why are you creating learning objects?

Principles of Assessment

• AAHE’s 9 Principles of Good Practice for Assessing Student Learning.

1. Begins with Educational Values

2. Multi-dimensional, integrated, over time

3. Program to be improved has clear goals

4. Includes outcomes and experiences

5. Ongoing

Principles of Assessment

9 Principles, cont'd:

6. Involves people from across the institution

7. Begins with issues of use and the questions people care about

8. Will lead to improvement if part of a set of conditions that promote change

9. Meets our responsibilities to students and the public

Assessment/Evaluation Model

Utilization-Focused Evaluation (M.Q. Patton)

– Looks at utilization (important for learning objects).

– Based on 6 questions.

“Utilization-Focused Evaluation (U-FE) begins with the premise that evaluations should be judged by their utility and actual use; therefore, evaluators should facilitate the evaluation process and design any evaluation with careful consideration of how everything that is done, from beginning to end, will affect use.”

Utilization-Focused Evaluation Checklist
Michael Quinn Patton, January 2002

“Use concerns how real people in the real world apply evaluation findings and experience the evaluation process. Therefore, the focus in utilization-focused evaluation is on intended use by intended users. Since no evaluation can be value-free, utilization-focused evaluation answers the question of whose values will frame the evaluation by working with clearly identified, primary intended users who have responsibility to apply evaluation findings and implement recommendations.”

Utilization-Focused Evaluation Checklist
Michael Quinn Patton, January 2002

Basic Assessment Questions

1. Who wants learning objects?
– IT folks
– Instructional designers
– Faculty
– Students
– Administrators

Basic Assessment Questions

2. What do they want them for?

3. By what criteria will they be judged?
– are reusable
– meet learning objectives
– can be searched
– save money
– are self-contained and can be aggregated
– are fun
– are easily used
– are used

Basic Assessment Questions

4. How will you collect the necessary data as evidence of how well you are meeting the criteria you have established?
– How much of this assessment can be embedded in the learning object?
– Assessment requires staff time, money, and resources.
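The idea of embedding assessment in the object itself can be sketched in code. This is a purely hypothetical illustration (the class, method names, and event fields are invented, not drawn from any repository mentioned here): a learning object logs its own usage and quiz events, so evaluators collect evidence without extra staff effort.

```python
import json
import time

# Hypothetical sketch: a learning object that records its own assessment
# data (opens, quiz submissions, scores) for later use by evaluators.
class EmbeddedAssessmentLog:
    def __init__(self):
        self.events = []

    def record(self, user_id, event, **details):
        # Store one timestamped event, e.g. "opened" or "quiz_submitted".
        self.events.append({
            "user": user_id,
            "event": event,
            "time": time.time(),
            **details,
        })

    def export(self):
        """Serialize the log for the intended users of the evaluation."""
        return json.dumps(self.events)

log = EmbeddedAssessmentLog()
log.record("student42", "opened")
log.record("student42", "quiz_submitted", score=4, out_of=6)

# A quick summary an evaluator might pull from the log:
scores = [e["score"] for e in log.events if e["event"] == "quiz_submitted"]
print(f"{len(scores)} quiz submission(s), mean score {sum(scores)/len(scores):.1f}")
```

Even a log this simple answers parts of questions 3 and 4 (is the object used? does it meet its learning objective?) at near-zero marginal cost once built.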

Basic Assessment Questions

5. When will you evaluate?

• Formative (during the building and testing stages)

• Summative (after deployment)

• Continuous

Basic Assessment Questions

6. What will you do with the evidence?

• Assessment shouldn’t be an end in itself.

• Should lead to improvement, changes, and further assessment

Assessing Learning

Issue of Causality

• Experimental Design – random assignment, control groups, blind

• Quasi-experimental – used in education
– Non-equivalent groups, post-test only
– Non-equivalent groups, pre/post-test
– Non-equivalent groups, time-series designs
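The non-equivalent-groups pre/post design can be illustrated with a small sketch. All scores below are invented for illustration: one intact class section uses the learning object, one does not, and we compare mean gains with a rough effect size. A real study would also have to address the threats to causality listed on the next slide.

```python
from statistics import mean, stdev

# Hypothetical exam scores for two non-equivalent (intact) sections.
used_lo = {"pre": [62, 70, 58, 75, 66, 71], "post": [74, 82, 69, 85, 75, 80]}
no_lo   = {"pre": [64, 68, 60, 72, 67, 70], "post": [70, 73, 66, 78, 71, 74]}

def gains(group):
    """Individual pre-to-post improvements for one group."""
    return [post - pre for pre, post in zip(group["pre"], group["post"])]

gains_t, gains_c = gains(used_lo), gains(no_lo)
gain_treat, gain_comp = mean(gains_t), mean(gains_c)

# Pooled standard deviation of the gain scores gives a rough effect size
# (Cohen's d on gains). Only suggestive: with non-equivalent groups,
# selection and history differences may explain part of the gap.
pooled_sd = ((stdev(gains_t) ** 2 + stdev(gains_c) ** 2) / 2) ** 0.5
effect = (gain_treat - gain_comp) / pooled_sd

print(f"Mean gain with learning object:    {gain_treat:.1f}")
print(f"Mean gain without learning object: {gain_comp:.1f}")
print(f"Approximate effect size:           {effect:.2f}")
```

A pre/post design like this is stronger than post-test only, because it at least controls for where each group started.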

Some Threats to Causality

• Hawthorne effect
• Newness
• Small sample sizes
• Reliability of measures
• Standard usage
• History
• Mortality
• Selection

Models – User Evaluation

University of Wisconsin Online Resource Center: a repository of learning objects

“The goals of the project are to accelerate the development of quality online courses while, at the same time, minimizing the cost of course development by identifying and sharing best practices.”

Models – User Evaluation

Bloom's Taxonomy for Cognitive Learning and Teaching
Author: Terri Langan
School: Fox Valley Technical College
Date: 12/10/2002
Description: The users of this learning object read a brief introduction to the six levels of Bloom's Cognitive Taxonomy and quiz themselves on a basic understanding of the levels.

Model – Outcomes

The CT Distance Learning Consortium created a learning objects repository for the CT Voc-Tech schools.

The searchable database is at: http://www.ctdlc.org/votech/

Model – Outcomes

Subject: Math
Core Topic: G. Calculating heights with trigonometry concepts
Course: Geometry
Abstract: This unit requires students to use trigonometry concepts to calculate the heights of buildings and objects. Students are expected to construct a clinometer and use it to calculate heights of objects.
Objective: The unit objectives are aligned with the CT Framework, Math, Grades 9-12, Standard 6, Spatial Relationships and Geometry.

Model – Cost Evaluation

Center for Academic Transformation

Program in Course Redesign – Carol Twigg

– Learning objects as a cost-reduction strategy

Tool for calculating costs
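A minimal sketch of the kind of comparison such a cost tool supports. All figures here are made up for illustration; the actual Program in Course Redesign cost spreadsheet is far more detailed, and the amortization of development cost is simplified to a single year.

```python
# Illustrative only: invented figures, not from the Center for
# Academic Transformation's actual cost tool.
def cost_per_student(dev_cost, delivery_cost_per_section, sections, students):
    """Total cost of one course format divided by students served.
    dev_cost is the one-time cost of building learning objects,
    amortized here over a single year for simplicity."""
    total = dev_cost + delivery_cost_per_section * sections
    return total / students

# Traditional course: no development cost, all instructor-led sections.
traditional = cost_per_student(dev_cost=0, delivery_cost_per_section=12_000,
                               sections=10, students=300)

# Redesigned course: up-front learning-object development, but fewer
# instructor-led sections because the objects carry part of the instruction.
redesigned = cost_per_student(dev_cost=40_000, delivery_cost_per_section=12_000,
                              sections=5, students=300)

print(f"Traditional: ${traditional:.2f} per student")
print(f"Redesigned:  ${redesigned:.2f} per student")
```

The point of the exercise is that learning objects only "save money" once development costs are amortized across enough students and sections; a tool like this makes that break-even point explicit.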

Model – Design Evaluation

AliveTek Inc.
• Learning Object Checklist
– Questions are mostly about instructional design:
  • Navigation
  • Learner interaction
  • Learner control of the object
  • Contains assessment
  • Amount covered is appropriate

Example

• Wesleyan’s Learning Object Project Assessment
• Multi-campus assessment
• Based on a non-equivalent-groups post-test design, using:
– Student background
– Student usage logs
– Student performance on exams
– Student and faculty surveys
– Targeted focus groups

Conclusion

• Who is advocating for learning objects?
• What “opposition” is there? Why?
• What features of learning objects are most important to them?
• What evidence will they need to show that learning objects “work” (are worth staff time and/or other resources)?
• How and when will you gather that evidence?

Q&A

Diane J. Goldsmith

Dean of Planning, Research, and Assessment

CT Distance Learning Consortium

[email protected]

860-832-3893