Criteria for Good Measuring Instruments

Adapted from Educational Research: Competencies for Analysis and Applications (Gay, L.R., Mills, G. E., & Airasian, P., 2006)

M. Stewart, M.Ed., 2011


Validity

“Validity is the degree to which a test measures what it is supposed to measure and, consequently, permits the appropriate interpretation of results” (p. 134).

Validity depends on the context in which the assessment is used. In other words, a test is designed for a specific purpose and a specific group of people, so a test that is valid one quarter might not be valid the next.

Types of Validity

“Content validity is the extent to which a test measures the intended content area” (p. 134).

Item validity: how relevant the individual test items are to the intended content. If it wasn’t explicitly taught, it shouldn’t be tested.

Sampling validity*: how well the test samples the total content area being tested. In other words, the questions on the test should be proportionate to the amount of coverage the content received in the classroom (a rough sketch of this check follows below).

*The most common area for instructors to have issues.
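As an illustration of that proportionality idea, here is a minimal Python sketch of a sampling-validity check. The topic names and numbers are hypothetical, not from the source text; the threshold of 10 percentage points is an arbitrary choice for the example.

# A minimal sketch of a sampling-validity check, using hypothetical topic
# names and numbers: compare each topic's share of exam items against its
# share of classroom coverage and flag large mismatches.

class_minutes = {"circulatory": 120, "endocrine": 60, "nervous": 120}  # minutes of instruction
exam_items = {"circulatory": 10, "endocrine": 15, "nervous": 5}        # questions on the exam

total_minutes = sum(class_minutes.values())
total_items = sum(exam_items.values())

for topic in class_minutes:
    taught = class_minutes[topic] / total_minutes   # share of class time
    tested = exam_items[topic] / total_items        # share of exam questions
    flag = "  <-- disproportionate" if abs(taught - tested) > 0.10 else ""
    print(f"{topic:12s} taught {taught:4.0%}  tested {tested:4.0%}{flag}")

In this made-up example, the endocrine unit received 20% of class time but 50% of the exam questions, so it would be flagged as disproportionate.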

Factors that threaten validity:

Unclear test directions
Confusing or ambiguous test items
Vocabulary too difficult for test takers
Overly difficult and complex sentence structures*
Inconsistent and subjective scoring methods
Untaught items included on achievement tests

*Expanded on the next slide.

Factors that threaten validity: Overly difficult and complex sentence structures

A question should only be as complex as the subject matter demands. Making questions unnecessarily complicated does not mean that you are testing critical thinking, nor does it mean that you are “challenging” students.

In fact, an unnecessarily complicated question actually reduces the validity of the test, because you are no longer testing the subject matter, but rather the students’ ability to decipher the complex sentence structure or instructions.

Simple, straightforward subject matter should be tested with a simple, straightforward question. Either students know it or they don’t.

Helpful Hints

For second-language students and students with some types of learning disabilities, some question types can be more difficult to understand than others. This can interfere with validity.

Whenever possible, phrase test questions in a positive rather than a negative structure, especially for true/false questions.

For example, the true/false statement “Hormones are part of the endocrine system.” is much easier to comprehend than “Hormones are not a component of the endocrine system.”

Helpful Hints

Expose students to test-like questions before the exam. Including a few review questions in your PowerPoint lectures is a great way to check comprehension of current material, and it gives students the opportunity to learn what is expected when answering certain kinds of questions.

Helpful Hints

Collect all questions of the same type into their own section: all of the true/false questions together, all of the multiple-choice questions together, and so on.

Don’t assume that students will know what to do. Every section of questions needs its own set of clear directions, even if it seems obvious.

Give each question a point value, so students can apply test-taking strategies.

Collaborating on Exams

Often, one of the duties of the basic skills instructor is to review exams for issues with validity. To facilitate this, the expectations are as follows:

1) There will be adequate time to collaborate and comment on the exam (two weeks before the study guide is to be given).

2) Once comments have been given, a final draft incorporating all changes will be sent to the basic skills instructor.

3) NO changes will be made to the exam after the final draft has been agreed upon and the study guide created.

Works Cited

Gay, L.R., Mills, G. E., & Airasian, P. (2006). Selecting measuring instruments. In Educational research: Competencies for analysis and applications (8th ed., pp. 120–157). Upper Saddle River, NJ: Pearson Merrill Prentice Hall.