Technology supported assessment
Martin Hawksey, e-Learning Advisor (Higher Education)
Increasing learner success with technology supported assessment
April 11, 2023 | slide 1
Plan
Defining assessment and feedback
Outlining the REAP Project and its ‘principled’ approach to course redesign
Some examples of REAP pilots
Questions and discussion
Defining assessment and feedback
Assessment – the narrow meaning is an exam for external accreditation
Feedback – the narrow sense is what the tutor writes about/on a finished piece of student work
REAP was also interested in:
– Self-assessment and reflection – a form of internally generated feedback
– Peer dialogue as feedback, and peer assessment
Definitions
Who is involved in formative assessment and feedback?
– Tutor
– Peers
– External (e.g. placement supervisor)
– Computer generated
– Self
Why take formative assessment and feedback seriously?
Assessment is a key driver of student learning
Assessment is a major cost in HE (staff time)
Widely reported that students don’t read the feedback
Dropout/retention – linked to academic experience
First year experience – students need regular and structured feedback opportunities
National Student Survey (NSS) – students are dissatisfied with feedback
QAA reports – main area of criticism in England
NSS 2007 - Assessment and Feedback Results
• Nationally, only 55% of students think feedback is prompt and has helped to clarify things they did not understand [Scotland: 48%]
• Nationally, only 63% of students agree that they have received detailed comments on their work [Scotland: 49%]
Key messages
Formative assessment and feedback by others can only have an impact on learning when it influences a student’s own self-regulatory processes (adapted from Boud, 1995)
Students are already self-assessing and generating feedback so we should build on this capacity (Nicol and Macfarlane-Dick, 2004)
REAP
3 HEIs (Strathclyde, Glasgow Caledonian Business School, Glasgow University)
Targeting large 1st year classes
Multi-disciplinary as well as faculty-wide (19 pilots, ~6000 students)
Range of technologies: online tests, simulations, discussion boards, e-voting, e-portfolios, peer feedback software, admin systems, VLEs, offline-online
A ‘principled’ approach to designing and embedding assessment practices
Scaffolding self regulation: 7 principles of good feedback (assessment design)
1. Clarify what good performance is (goals, criteria, standards)
2. Facilitate reflection and self-assessment in learning
3. Deliver high quality feedback to students: feedback that enables students to monitor and self-correct
4. Encourage peer and tutor dialogue around learning
5. Encourage positive motivational beliefs and self-esteem through assessment
6. Provide opportunities to close the feedback loop
7. Use feedback information to shape teaching
Source: Nicol and Macfarlane-Dick (2006)
Two super principles
Super-principle 1: developing learner self-regulation (empowerment), i.e. steers to encourage ownership of learning – the seven principles discussed above.
Super-principle 2: time on task and effort (engagement), i.e. steers on how much work to do and when – Gibbs and Simpson’s four conditions.
Case examples from REAP – applying these principles/conditions
REAP Pilots (1)
Department of Mechanical Engineering
– Students: 250
– Technology: Commercial online homework packages, electronic voting system (EVS)
– Assessment Activities: Weekly tests provide on-demand feedback on student performance for both students and tutors. Just-in-time teaching supported by interactive classes using EVS.
– Efficiencies: 60% reduction in assessment workload, saving 102 staff hours. Licence cost of commercial packages £12.95 and £7 per student per annum.
– Learning Gains: Results from the 2006/07 diet indicate strong class attainment maintained (90% pass rate, 65% average).
"I think it’s managed to save a lot of time for ourselves and the tutors and given them more time to develop what they are going to talk about and give more time for them to speak to people individually if they need it." Student comment
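The weekly-test workflow on this slide – students answer online, receive immediate corrective feedback, and tutors see the scores – can be sketched in a few lines. Everything below (the questions, answers, feedback strings, and the `mark_attempt` helper) is illustrative only; the pilot used commercial homework packages, not custom code.

```python
# Minimal sketch of an auto-marked online test with immediate,
# corrective feedback (principle 3: feedback that enables self-correction).
# All question content and names here are illustrative assumptions.
QUIZ = [
    {"q": "SI unit of force?", "answer": "newton",
     "note": "Force = mass x acceleration, so the SI unit is the newton (N)."},
    {"q": "SI unit of work?", "answer": "joule",
     "note": "Work = force x distance, so the SI unit is the joule (J)."},
]

def mark_attempt(responses):
    """Mark one student's responses; return (score, per-question feedback)."""
    score, feedback = 0, []
    for item, response in zip(QUIZ, responses):
        if response.strip().lower() == item["answer"]:
            score += 1
            feedback.append("Correct.")
        else:
            # On-demand feedback: the student can self-correct immediately,
            # and the tutor can see which questions the class missed.
            feedback.append("Incorrect. " + item["note"])
    return score, feedback

score, feedback = mark_attempt(["Newton", "watt"])
print(score)        # 1
print(feedback[1])  # corrective note for the missed question
```

The same marking pass serves both audiences the slide mentions: the per-question notes go to the student, the aggregate scores to the tutor.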
REAP Pilots (2)
Department of Psychology
– Students: 560
– Technology: Online collaborative group tasks supported by VLE message-board
– Assessment Activities: Regular collaborative tasks support peer feedback processes and student engagement.
– Efficiencies: 50% of lectures replaced with online tasks. Staff time re-directed to support online tasks.
– Learning Gains: Significant overall improvement in average exam pass mark (51.1% in the 2005/06 diet rising to 57.4% in 2006/07). Exam failure rate reduced from 13% to 5%. Course failure rate reduced from 12.1% to 2.8%.
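As a quick arithmetic check, the slide's figures can be re-derived as percentage-point differences (reading the reported reductions as absolute point changes is my interpretation, not a stated REAP definition):

```python
# Re-deriving the Psychology pilot deltas from the slide's figures,
# expressed as percentage-point changes between the two diets.
avg_mark_2006, avg_mark_2007 = 51.1, 57.4
exam_fail_2006, exam_fail_2007 = 13.0, 5.0
course_fail_2006, course_fail_2007 = 12.1, 2.8

mark_gain = round(avg_mark_2007 - avg_mark_2006, 1)               # gain in pass mark
exam_fail_drop = round(exam_fail_2006 - exam_fail_2007, 1)        # fewer exam failures
course_fail_drop = round(course_fail_2006 - course_fail_2007, 1)  # fewer course failures
print(mark_gain, exam_fail_drop, course_fail_drop)  # 6.3 8.0 9.3
```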
JISC CETIS Assessment SIG
September 2007
REAP Pilots (3)
Department of Hospitality & Tourism Management
– Students: 200
– Technology: Podcasts, electronic voting system (EVS)
– Assessment Activities: Regular podcast releases support changes to lectures to include interactive discussion using EVS.
– Efficiencies: 50% reduction in lectures (however, lectures delivered twice to smaller groups). Licensing and development costs associated with podcasts (c. £1k per 30-minute podcast).
– Learning Gains: Significant gain in overall exam mark in the 2006/07 diet (+12.2%) compared with 2005/06. Significant reduction (-25%) in students receiving non-qualification.
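The EVS pattern behind the interactive discussion in these pilots is essentially a tally-and-decide loop: collect the class's votes on a question, then revisit the topic when too few chose the correct option. The function name, threshold, and votes below are illustrative assumptions, not REAP specifics.

```python
# Sketch of the tallying step behind EVS-style in-class questions:
# count votes per option and flag whether the topic needs reteaching.
from collections import Counter

def summarise_votes(votes, correct, reteach_below=0.7):
    """Return per-option counts and whether to revisit the topic."""
    counts = Counter(votes)
    share_correct = counts[correct] / len(votes)
    return counts, share_correct < reteach_below

# A hypothetical show of hands: option "A" is correct.
votes = ["A", "B", "A", "C", "A", "A", "B"]
counts, reteach = summarise_votes(votes, correct="A")
print(counts["A"])  # 4
print(reteach)      # True (4/7 is below the 0.7 threshold)
```

The threshold is the lecturer's judgment call; the point is that the decision to reteach is made just in time, from live class data.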
REAP Pilots (4)
Department of Modern Languages
– Students: 200
– Technology: Online homework and tests supported by VLE, electronic voting system (EVS)
– Assessment Activities: Diagnostic tests of student knowledge at the start of the year to inform teaching. Formative feedback from regular online testing and EVS classroom use.
– Efficiencies: Effective delivery of the course made possible despite significant cuts in funding and staffing. Tutorials reduced by 50% and replaced with online tasks. Listening classes reduced from 360 to 160 hours. Saving of 200 staff hours.
– Learning Gains: Failure rate in the final exam reduced from 24% to 4.6% compared with the 2005/06 diet.
"Having almost immediate feedback on marks was very useful as I was aware at every point as to how well I was coping" Student comment
REAP Pilots (5)
School of Pharmacy – Pharmacy Practice 3
– Students: 240
– Technology: Online simulation tutorial
– Assessment Activities: Regular online tasks to improve student engagement, multiple opportunities for self-testing and regular feedback.
– Efficiencies: Savings in staff time due to reduced need for remedial revision work with individual students.
– Learning Gains: Significant gain in overall exam mark in 2006/07 (+16%) compared to the 2005/06 diet.
“The tutorial was an excellent resource and learning tool to supplement our class”
“Simulated exactly what it would be like to carry out a check on a prescription, allowing us to experience the difficulties involved and discover where we needed improvement” Student comments
REAP Pilots (6)
School of Pharmacy – Foundation Pharmacy
– Students: 120
– Technology: e-portfolio, electronic feedback form
– Assessment Activities: Improved tutor feedback to students supported by the feedback form, enhanced opportunities for reflection activities.
– Efficiencies: Staff workload increased during test implementation but reductions anticipated in future years.
– Learning Gains: No significant improvements in academic performance reported. 89% of students receiving feedback via the feedback form agreed the feedback was ‘useful’ or ‘very useful’.
“The feedback form was especially useful, I found it easier to work from as it was segmented into the different aspects of the report I had written and had comments on both the strong and weak elements of my report” Student comment
Slight detour
A more detailed example: Audio/Video Feedback
Question for you
What are the opportunities for change at UHI?
Change
Challenge to Change
JISC Typology of technology use in assessment and feedback doc: http://jiscdesignstudio.pbworks.com/f/typology_tech_assess.doc
Do you have the policies in place to back change?
Strathclyde Policies: http://www.strath.ac.uk/staff/policies/academic/
Feedback is a Dialogue site: http://www.strath.ac.uk/learnteach/teaching/staff/assessfeedback/
University of Dundee Online Assessment Policy and Procedures: http://www.dundee.ac.uk/academic/learning/archive/200910/mtg2/papero.rtf
Relevant papers
Nicol, D. (in press), Laying the foundation for lifelong learning: case studies of technology supported assessment processes in large first year classes, British Journal of Educational Technology (to be published July 2007).
Nicol, D. (2007), E-assessment by design: using multiple-choice tests to good effect, Journal of Further and Higher Education.
Nicol, D. & Milligan, C. (2006), Rethinking technology-supported assessment in relation to the seven principles of good feedback practice. In C. Bryan and K. Clegg (eds), Innovations in Assessment, Routledge.
Nicol, D. (2006), Increasing success in first year courses: assessment redesign, self-regulation and learning technologies, paper prepared for the ASCILITE conference, Sydney, Australia, 3-6 December.
Nicol, D. J. & Macfarlane-Dick, D. (2006), Formative assessment and self-regulated learning: a model and seven principles of good feedback practice, Studies in Higher Education, 31(2), 199-218.
Nicol, D. J. & Boyle, J. T. (2003), Peer instruction versus class-wide discussion in large classes, Studies in Higher Education, 28(4), 457-473.
Boyle, J. T. & Nicol, D. J. (2003), Using classroom communication systems to support interaction and discussion in large class settings, Association for Learning Technology Journal, 11(3), 43-57.