4 September 2013
STRATEGIES TO REDUCE TEST FRAUD
Preventing, detecting, and investigating test security irregularities
Gerard Seinhorst
Head, Language Office
Defence Intelligence and Security Institute
Ministry of Defence
INTRODUCTION
PRESENTATION OVERVIEW
• Test Fraud – What is it and why is it a concern?
• Cheating – What do we know about it?
• Strategies to minimize test fraud
o Prevention
o Detection
o Follow-Up Actions
TEST FRAUD
1. Cheating:
“any action taken before, during, or after the administration of a test or assignment, that is intended to gain an unfair advantage or produce inaccurate results” (Cizek, 2012)
2. Test Piracy:
stealing, copying, or memorizing test forms, items, prompts, or other secure testing materials, usually with the intention of making these available to future test takers.
TEST FRAUD – Why is it a concern?
• Threat to validity (construct-irrelevant variance)
• Reputation of testing programme
• Fiscal implications
o redevelopment of testing materials
o test retake sessions
o investments in test security
• Negative impact on validity of scores obtained by other candidates
• Denying opportunities to other candidates
CHEATING – When?
Cheating tends to occur when:
• there is a good opportunity (cheating is easy to do)
• there is a small likelihood of getting caught
• if caught, the penalties are light
• faculty appear to take a casual or lenient attitude towards cheating
• the stakes are high
A thing worth having is a thing worth cheating for
W.C. Fields
CHEATING – How?
• Variety of forms
CHEATING – Resources
Cheating Sites:
http://www.schoolsucks.com/
http://www.academon.com/?cf=uk
http://www.writework.com/
http://www.academicintegrity.com/
CHEATING – How much?
Cheating is on the rise
Research studies:
• On average, 3–5% of test takers engage in cheating on any given occasion
• All 14 educational organizations participating in one survey found firm evidence of teachers or school administrators cheating on behalf of their students
WHEN TEACHERS ARE TEST ADMINISTRATORS
Overtly and covertly giving help to test takers:
• changing responses on answer sheets after testing
• leaking test questions before testing
• applying non-standard testing conditions
• cueing students on incorrect answers
• giving students extra time on tests
• filling in answers left blank by students
• suspending low-ability students on testing days
• giving inappropriate instruction: “teaching to the test”
CHEATING – Why?
Top-5 reasons why test candidates cheat:
1. competitiveness: pressure to perform / grade pressures
2. poor time management / exam preparation
3. lack of self confidence: anxiety about test content and test format
4. “cheating is easier than studying”
5. cheating culture (thrill / collective cheating)
CHEATING BY EDUCATORS
• Material rewards (bribes)
• Pressure to promote strong performance by their students
• Indifference
• Belief that cheating is a justifiable response to standardized testing
• Compassion for their students
• Ignorance
TEST FRAUD PREVENTION
If a man defrauds you one time, he is a rascal; if he does it twice, you are a fool.
Author unknown
PREVENTING TEST FRAUD
Focus on prevention, rather than remediation
Acknowledge that cheating is going to occur and is problematic
6 STRATEGIES TO REDUCE TEST FRAUD
1. Develop faculty and student integrity
2. Develop and implement a Security Plan
3. Ensure that administration staff are properly trained
4. Protect testing materials against piracy
5. Administer tests in controlled environments
6. Maximize probability for detection
1. Develop Faculty & Student Integrity
• Create culture where it is inappropriate to engage in any form of cheating or piracy
• Respond to cheating when it does occur
• Refrain from inappropriate test preparation activities
• Reduce test anxiety: familiarize candidates with
o Test format, length, etc.
o Scoring criteria
o Test administration procedures
o Re-test policy
o Appeal procedure
• Examination honour code / non-disclosure agreement
Examination Honour Code
Please read the following and provide your signature:
I agree to answer the questions on this assessment without using aids that are not permitted and without obtaining assistance from another person or via electronic means. I agree not to share my answers or any information about the assessment content with anyone during or after the assessment. I accept that if any of these conditions are violated, the result will be no credit (0 points) for this test.
Signature
2. Develop & Implement Security Plan
• Roles and responsibilities
• Secure management of testing materials
• Prevention
• Test administration procedures
• (Im)permissible behaviour
• Detection and investigation of irregularities
• Sanctions for misconduct
3. Ensure admin staff are properly trained
• Lack of training results in distrust of testing and a misunderstanding of the reliability and validity of standardized testing
• Training should include:
o common cheating methods
o test administration protocols
o how to act in case of cheating
o impact of test security irregularities
o role modelling
4. Protect testing materials against piracy
a. Safeguard all secure materials
b. Limit exposure of test items
c. Use appropriate (not easily corruptible) test construction and delivery methods
a. Safeguarding testing materials
• Account for all secure materials before, during, and after testing
• Store all testing materials (including draft items and test results) in multiple locked cabinets
• Password protect all electronic files
• Number test booklets AND answer sheets
• Shred all obsolete testing materials
• Limit access
• Pre-package test booklets (shrink-wrapped or in sealed envelopes)
• Maintain a clean-desk policy
Clean-desk policy…
b. Limit exposure of test items
• Reduce amount of testing
• Narrow testing windows
• Use screening tests
• Use unique make-up tests
• Use an item bank
• Periodically introduce new test forms
• Use computer-adaptive tests
c. Test construction & delivery strategies
Design tests with security in mind:
• Include as many items/prompts as feasible
• Use more constructed response items
• Develop multiple versions and forms (→ validity issue)
• Randomize answer choices (in MC testing)
• Use as many plausible answer choices as feasible
• Use computer-based / computer-adaptive testing
Possible tradeoffs in terms of psychometric disadvantages, lower efficiency, and added costs
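The answer-choice randomization suggested above can be sketched as follows. This is a minimal illustration, not material from the presentation: the item layout, function names, and seed-per-form scheme are assumptions. Note that the answer key must be re-indexed after shuffling so that scoring stays correct.

```python
import random

def shuffle_choices(item, rng):
    """Return a copy of a multiple-choice item with its answer choices
    shuffled, re-indexing the key to the new position of the correct
    answer. Assumes choice texts within an item are unique."""
    choices = list(item["choices"])
    key_text = choices[item["key"]]
    rng.shuffle(choices)
    return {
        "stem": item["stem"],
        "choices": choices,
        "key": choices.index(key_text),  # new index of the correct answer
    }

def build_form(items, seed):
    """Build one test form; each seed yields a differently shuffled form."""
    rng = random.Random(seed)
    return [shuffle_choices(it, rng) for it in items]
```

Generating each printed or delivered form with its own seed makes forms reproducible for scoring while ensuring that copying a neighbour's letter choices no longer works across forms.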
5. Administer tests in controlled environments
Keep high-stakes testing separate from teaching
• Avoid having teachers test their own students
Establishing a controlled environment…
Standardized administration procedures
• Seating plan
Randomly assign seats
Beware of the “flying V” answer copying formation (Cizek, 1999)
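Random seat assignment can be sketched as a small routine; the grid dimensions, names, and seed-based reproducibility below are illustrative assumptions rather than part of the presentation.

```python
import random

def assign_seats(candidates, rows, cols, seed=None):
    """Randomly assign candidates to seats in a rows x cols grid.

    Returns a dict mapping candidate -> (row, col). Randomizing seats
    breaks up pre-arranged copying formations such as the 'flying V'.
    A seed makes the plan reproducible for the administration record.
    """
    seats = [(r, c) for r in range(rows) for c in range(cols)]
    if len(candidates) > len(seats):
        raise ValueError("more candidates than seats")
    rng = random.Random(seed)
    rng.shuffle(seats)
    return dict(zip(candidates, seats))
```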
Monitoring
Monitoring is still the most effective way to prevent and detect misconduct during testing
Ministry of Defence314 September 2013
Standardized administration procedures
• Seating plan
• Monitoring
• Prohibited behaviour / items / aids:
o talking, walking, gesturing, electronic devices, reference materials, crib sheets, hats, sunglasses, bags
• Account for all secure materials (incl. scrap paper and blank answer sheets)
Special considerations for CBT / CAT
Security risks:
• Inappropriate proctor assistance
• Test piracy (hacking)
• Lack of familiarity
• Absence of a secure browser
• Technology dependence
6. Maximize probability for detection
a. Begin monitoring for irregularities prior to test administration
• Monitor the Internet for pirated test items
b. Qualified proctors should monitor for irregularities during the test administration
c. Double marking to prevent educator cheating after testing
d. Routinely conduct data forensics:
• Ratio analysis / erasure analysis
• Item-response pattern analysis
• Test-score analysis
See Van der Linden (2011) for a list of psychometric techniques
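As a minimal illustration of item-response pattern analysis, the sketch below counts identical wrong answers between candidate pairs, which is the intuition behind formal answer-copying statistics such as Van der Linden and Sotaridona (2004). The data layout and the flagging threshold are assumptions; operational indices model the probability of such matches rather than using raw counts.

```python
from itertools import combinations

def identical_wrong(answers_a, answers_b, key):
    """Count items on which two candidates chose the same wrong answer.

    Matching wrong answers are far more suspicious than matching right
    ones, since correct answers are expected to coincide."""
    return sum(
        1 for a, b, k in zip(answers_a, answers_b, key)
        if a == b and a != k
    )

def flag_pairs(responses, key, threshold):
    """Return candidate pairs whose identical-wrong count reaches an
    (illustrative, locally chosen) threshold for follow-up review."""
    return [
        (p, q) for p, q in combinations(sorted(responses), 2)
        if identical_wrong(responses[p], responses[q], key) >= threshold
    ]
```

A flagged pair is never proof of copying on its own; it is a trigger for the kind of follow-up investigation described in the next section.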
FOLLOW-UP ACTIONS
Five Fs of follow-up: fast, firm, fair, frequent, famous
Response and Investigation
Security Incident Response Plan
Establish a standard or trigger for an investigation
Use appropriate sanctions:
• formal reprimand
• re-taking the test
• informing stakeholders
• withdrawal of certificate/diploma
• legal action
CONCLUSION
Teacher shows ultimate cheating technique…
THANK YOU FOR LISTENING
Proctoring – not an easy job…
REFERENCES / FURTHER READING
Cizek, G.J. (1999). Cheating on Tests: How to Do It, Detect It, and Prevent It. Mahwah, NJ: Lawrence Erlbaum Associates.
Cizek, G.J. (2012). Ensuring Integrity in Testing: Context, Responsibilities, and Recommendations. Retrieved August 21, 2013 from http://www.gomiem.org/files/handouts/keynote_thursday-cizek_cheating_2012-02-23.pdf.
Impara, J.C., Kingsbury, G., Maynes, D., and Fitzgerald, C. (2005). Detecting Cheating in Computer Adaptive Tests Using Data Forensics. Retrieved August 21, 2013 from http://www.caveon.com/articles/NCME-05.pdf.
Noah, H.J. and Eckstein, M.A. (2001). Fraud and Education: The Worm in the Apple. Maryland, USA: Rowman and Littlefield.
Van der Linden, W.J. and Sotaridona, L. (2004). A Statistical Test for Detecting Answer Copying on Multiple-Choice Tests. Journal of Educational Measurement, Vol. 41, No. 4, pp. 361–377.
Whitley, B.E. and Keith-Spiegel, P. (2002). Academic Dishonesty: An Educator's Guide. Mahwah, NJ: Lawrence Erlbaum Associates.
Wollack, J.A. and Fremer, J.J. (Eds.). (2013). Handbook of Test Security. New York, NY: Routledge.