Test/Assessment Design Best Practices
Bethann Guenther-Misunas, EDU 652, Dr. Rhia Roberts, May 2, 2011


1. Test/Assessment Design Best Practices
Bethann Guenther-Misunas
EDU 652
Dr. Rhia Roberts
May 2, 2011
2. Types of Assessment
Formative Assessments are on-going assessments, reviews, and observations in a classroom. Teachers use formative assessment to improve instructional methods and student feedback throughout the teaching and learning process. For example, if a teacher observes that some students do not grasp a concept, she or he can design a review activity or use a different instructional strategy. Likewise, students can monitor their progress with periodic quizzes and performance tasks. The results of formative assessments are used to modify and validate instruction (http://fcit.usf.edu/assessment/basic/basica.html).
3. Types of Assessment
Summative Assessments are typically used to evaluate the effectiveness of instructional programs and services at the end of an academic year or at a pre-determined time. The goal of summative assessments is to make a judgment of student competency after an instructional phase is complete. For example, in Florida, the FCAT is administered once a year; it is a summative assessment used to determine each student's ability at pre-determined points in time. Summative evaluations are used to determine whether students have mastered specific competencies and to identify instructional areas that need additional attention (http://fcit.usf.edu/assessment/basic/basica.html).
4. Examples of Formative and Summative Assessments
http://fcit.usf.edu/assessment/basic/basica.html
5. Best Practices in Assessments
Create clear, appropriate learning targets in order to assess students':
Knowledge: what students need to know (Santrock, p. 597).
Reasoning/Thinking: an important learning goal is for students not just to acquire knowledge but also to be able to think about it, using problem solving, inductive and deductive reasoning, strategies, and critical thinking (p. 597).
6. Best Practices in Assessment
Products: samples of students' work; essays, term papers, oral reports, and science reports reflect students' ability to use knowledge and reasoning (Santrock, p. 597).
Affect: affective targets are students' emotions, feelings, and values. Help students develop self-awareness, manage emotions, and handle relationships (p. 598).
7. Reasons to Assess
Let learners gauge progress toward their goals.
Emphasize what is important, and thereby motivate learners to focus on it.
Let learners apply what they have been learning, and thereby learn it more deeply.
Certify that learners have mastered certain knowledge or skills as part of a legal or licensing requirement.
Diagnose learners' skills and knowledge so they can skip unnecessary learning (Horton, p. 216).
8. How to design an effective assessment
First, decide what you want to measure.
Next, select the type of questions you want to use: subjective or objective.
Decide how the assessment will be scored: by a human or by a computer.
Make sure you write clear and concise questions; design meaningful questions that ask what needs to be learned.
Give quick and positive feedback.
9. Types of questions
True/false questions are used to measure a learner's ability to make categorical, either/or judgments (Horton, p. 220).
When to use: to ask whether a statement is right or wrong, whether a procedure will work or not, whether a procedure is safe or unsafe, whether an example complies with standards, whether to approve or reject a proposal, or which of two alternatives to pick (p. 221).
Click to view an example: http://screencast.com/t/2vQSwdJkT
10. True/False continued
How to design: require thought; ask more than one true/false question on a subject, phrasing the questions in different ways; keep the numbers of true and false answers about equal; and phrase questions in neutral terms (Horton, p. 222).
How to score: penalize guessing, require high passing scores, and ask a lot of questions (p. 223).
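The advice to penalize guessing can be sketched as a small scoring function. This is an illustrative example, not Horton's own formula; it uses the standard correction-for-guessing rule, in which each wrong answer subtracts wrong / (choices - 1) points, so for true/false each wrong answer cancels one right answer:

```python
def corrected_score(correct: int, wrong: int, choices: int = 2) -> float:
    """Correction-for-guessing: wrong answers are penalized so that
    random guessing yields an expected score of about zero.
    For true/false (choices=2), each wrong answer cancels one right one."""
    return correct - wrong / (choices - 1)

# A 10-question true/false quiz: 7 right, 3 wrong
print(corrected_score(7, 3))  # 4.0
print(corrected_score(5, 5))  # 0.0 -- pure guessing averages out to zero
```

With this rule, a learner who guesses at random on every question scores about zero on average, which is the point of the penalty.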
11. Types of questions continued
Pick-one questions are used to measure the learner's ability to recognize the one correct answer in a list, to identify a member of a category, or to assign an item to a concept (Horton, p. 220).
When to use: rating along a scale, recognizing a member of a specific category, recognizing the main cause of a problem, picking superlatives, and selecting the best course of action (p. 225).
Click to view an example: http://screencast.com/t/k1nt2kiQ3PW
12. Types of questions continued
Pick-multiple questions are used to measure the learner's ability to recognize multiple correct answers in a list and to recognize characteristics that apply to an object or concept (Horton, p. 220).
When to use: when picking items that meet a criterion, making a quick series of yes-no decisions, or picking examples and non-examples of a principle (p. 229).
Click to view an example: http://screencast.com/t/0aHApTNv
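Because pick-multiple questions have several correct answers, a computer has to compare sets of selections rather than a single choice. A minimal all-or-nothing grading sketch (illustrative only; the option labels are made up):

```python
def grade_pick_multiple(selected: set[str], correct: set[str]) -> bool:
    """All-or-nothing grading: the learner must select every correct
    option and no incorrect ones."""
    return selected == correct

# Hypothetical question whose correct options are "b" and "d"
print(grade_pick_multiple({"b", "d"}, {"b", "d"}))       # True
print(grade_pick_multiple({"d"}, {"b", "d"}))            # False: missed an answer
print(grade_pick_multiple({"a", "b", "d"}, {"b", "d"}))  # False: extra answer
```

Partial-credit schemes are also possible, but all-or-nothing keeps the scoring rule easy to explain to learners.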
13. Types of questions Continued
Fill-in-the-blank questions are used to measure the learner's ability to recall names, numbers, and other specific facts (Horton, p. 220).
When to use: to verify that learners have truly learned the names of things; for example, to recall technical or business terms, part numbers, abbreviations, commands and statements in a programming language, or vocabulary in a foreign language (p. 231).
Click to view an example: http://screencast.com/t/dcl0Inqn
14. Fill in the blank questions continued
How to design fill-in-the-blank questions: make sure the context provides enough clues so that the learner can fill in the blank; phrase the question to limit the number of correct answers; phrase the question so that answers can be evaluated; accept synonyms; tell learners how to phrase their answers; if a question is complex, split it into separate questions; and tell learners the length, format, required parts, and other constraints on free-form input (p. 232).
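The advice to accept synonyms and tell learners how to phrase their answers implies some normalization when a computer scores the blank. A minimal sketch, with a hypothetical question and a hand-picked synonym list:

```python
def check_blank(answer: str, accepted: set[str]) -> bool:
    """Accept the answer if its trimmed, lowercased form matches
    any of the accepted synonyms."""
    return answer.strip().lower() in {a.lower() for a in accepted}

# Hypothetical item: "The HTML tag that creates a hyperlink is the ___ tag."
accepted = {"a", "anchor", "<a>"}
print(check_blank(" Anchor ", accepted))  # True: case and spacing ignored
print(check_blank("link", accepted))      # False: not an accepted synonym
```

The accepted-synonym list is exactly where "phrase the question to limit the number of correct answers" pays off: the shorter the list, the easier the question is to score fairly.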
15. Types of questions continued
Matching-list questions are used to measure the learner's ability to identify associations between items in two lists, such as between events and their causes or terms and their definitions (Horton, p. 220).
They require students to specify which items in one list correspond to items in another (p. 234).
Use matching-item questions to "measure knowledge of the relationships among concepts, objects and components" (p. 235).
Click to view an example: http://screencast.com/t/2GtcqS6GCk
16. Matching Questions continued
How to design matching questions: write list items clearly; keep the lists short; do not mix categories within a list; let learners indicate matches simply; and eliminate the process of elimination (Horton, p. 236).
17. Types of questions continued
Sequence questions are used to measure the learner's ability to identify the order of items in a sequence, such as a chronological order or ranking scheme (Horton, p. 220).
When to use: to measure the learner's ability to put items into a meaningful order; for example, historical events, steps of a procedure in the order performed, phases of a process in the order they occur, logical arguments in inductive or deductive order, rankings of value, and remedies by probability of success (p. 237).
Click to view an example: http://screencast.com/t/bPsuDFjQHGt
18. Sequence questions continued
How to create sequence questions: do not use sequence questions if there is more than one right sequence; use only distinct items familiar to learners; specify the criterion for the sequence; and specify only one criterion for the sequence (Horton, p. 237).
Score fairly: give partial scores for items near their correct location, score each item individually, and use sequence questions for practice when scores are not recorded (p. 238).
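The "score fairly" advice (score each item individually, with partial credit for items near their correct location) can be sketched as a simple distance-based rule. This is one possible interpretation, not Horton's own formula, and the item names are made up:

```python
def sequence_score(submitted: list[str], correct: list[str]) -> float:
    """Score each item individually: full credit (1.0) in the right
    position, decreasing linearly with distance from that position,
    then average across items."""
    n = len(correct)
    total = 0.0
    for position, item in enumerate(submitted):
        distance = abs(position - correct.index(item))
        total += 1.0 - distance / (n - 1)
    return total / n  # 1.0 means a perfect ordering

steps = ["plan", "draft", "revise", "publish"]
print(sequence_score(["plan", "draft", "revise", "publish"], steps))  # 1.0
print(sequence_score(["draft", "plan", "revise", "publish"], steps))  # partial credit
```

Scoring each item individually this way means one misplaced item costs only a little, while a completely shuffled answer still scores near zero.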
19. Types of questions continued
Composition questions are used to measure the learner's ability to create an original explanation, story, sketch, or other piece of work (Horton, p. 220).
Use composition questions to evaluate complex knowledge, higher-order skills, and creativity; for example, to synthesize an original solution to a problem, recognize and express complex or subtle relationships, analyze a complex object or situation, form or justify an opinion by weighing evidence, or resolve conflicting opinions and contrary evidence (p. 239).
Click to view an example: http://screencast.com/t/xlnQbuz6
20. Composition questions continued
How to design: ask questions that require breadth and depth; require original thinking; disallow copy-and-paste responses; let learners respond in the medium of their choice; be specific; give feedback on responses; and limit the number of composition questions (Horton, p. 240).
Scoring: be specific about the characteristics the answer needs to have, the items it must include, the facts it must mention, the media it must use, the conclusions the learner should draw, and the recommendations it should make (p. 240).
21. Types of questions continued
Performance questions are used to measure the learner's ability to perform a step of a procedure, typically in a simulation (Horton, p. 220).
When to use: performance questions help us test whether someone can perform a task; for example, when you are testing the ability to perform a procedure rather than abstract knowledge about a subject, when the procedure is complex and requires learners to make decisions rather than merely follow a sequence of steps, when the speed of performing the task is important to its success, or when you are qualifying people to perform a task in the real world (p. 243).
Click to view an example: http://screencast.com/t/8I47mTkFZN
22. Performance questions continued
How to design: simplify the test, state the goal clearly, explain the question, and spell out the scoring rules (Horton, pp. 243-44).
23. Conclusion
When creating an effective assessment, a teacher has to keep in mind what they want to test. First, they need to decide why and what they are testing. Then they should focus on writing questions that reflect what they want their students to know. The more clearly a teacher defines what students should know, the better the questions they will create to assess it.