This is a draft of the presentation to be given at the HEA Social Sciences annual conference – Teaching forward: the future of the Social Sciences. For further details of the conference: http://bit.ly/1cRDx0p. Bookings open until 14 May 2014: http://bit.ly/1hzCMLR or [email protected].

ABSTRACT

This paper describes the development of a teaching methodology to enhance academic self-efficacy, defined as students’ confidence in their ability to accomplish specific academic tasks or attain specific academic goals. Our approach makes intensive use of student response systems to create an interactive environment in which students reflect on their academic skills as they progress with their learning. The methodology builds on a blend of teaching, assessment, and self-reflection components. It makes use of widely adopted teaching technologies, but expands their potential with an innovative application to the formation of academic self-efficacy beliefs. Preliminary results and evidence-based validation are discussed.
1
When Student Confidence Clicks
Academic Self-Efficacy
and Learning in HE
Fabio R. Aricò, Chris Thomson
** Preliminary version **
2
ACKNOWLEDGEMENTS
HEA – Teaching Development Grant Scheme
UEA-HEFCE Widening Participation Teaching Fellowship
3
OUTLINE
1. Motivation: student experience vs. self-efficacy
2. Teaching protocol: embedding self-efficacy via SRS
3. Preliminary results: learning, self-efficacy, satisfaction
4. Summary: feedback welcome!
4
MOTIVATION
Student: Are my answers correct? I’m so confused…
Lecturer: Yes, we checked them together already.
Student: Is this going to be in the exam? Are you sure?
Lecturer: Yes, we spoke about it in class and practiced.
Student: But what if money supply contracts rather than increasing?
Lecturer: You know how to do the reverse, you showed me. Relax.
5
MOTIVATION
Typical problems analysed in the recent pedagogic literature:
• Students may encounter difficulties with the course material → support sessions, office hours, targeted support interventions.
• Students may display low levels of engagement → revision of the curriculum, innovations in teaching, teaching technologies, lecturer-student partnership.
6
MOTIVATION
Additional problem:
• Students may experience low confidence levels → anxiety over preparation; peer pressure and competition; inability to self-assess and detect problems.
• Recent changes in HE practice exacerbate this problem → the ‘student experience’ model targets support and satisfaction; students run the risk of being put ‘at the heart of the system’ as passive receivers, rather than confident owners, of their learning.
7
REACTION
Re-visit the concept of Academic Self-Efficacy:
students’ confidence in their ability to accomplish specific academic tasks or attain specific academic goals (Bandura, 1997).
Teach students how to become confident and independent learners:
→ help them to self-assess and diagnose problems;
→ enable them to seek appropriate forms of support;
→ increase the rate of retention of widening access students;
→ enhance employability skills all along the academic journey.
8
REACTION in practice
Develop a teaching protocol embedding Academic Self-Efficacy as an independent learning outcome, parallel to the curriculum.
Stage 1: Investigation and assessment of student Self-Efficacy
- experiment with Student Response Systems (clickers);
- explore correlation between attainment and confidence.

Stage 2: Extension of the dataset (add student record data);
extension to qualitative analysis (e.g. focus groups and interviews);
targeted intervention to increase Self-Efficacy levels.
9
TEACHING PROTOCOL – the module
Introductory Macroeconomics: Level 1 – compulsory year-long module – 170 students
Lectures → traditional frontal teaching (10 per sem.)
Seminars → small-group, pre-assigned problem sets (4 per sem.)
Workshops → large-group, problem-solving sessions (4 per sem.)
Support Sessions → non-compulsory drop-in sessions (4 per sem.)
10
TEACHING PROTOCOL – clicker technology
11
TEACHING PROTOCOL – the innovations
Lectures → interaction via clicker technology
Seminars → revision questions + understanding questions
Workshops → closing questions: was the lecture enjoyable? was the material difficult?
Support Sessions → online report of clicking session + feedback
12
TEACHING PROTOCOL – the innovations
Preliminary Seminar Quizzes (paper-based):
- 3 revision/understanding questions
- 2 confidence/self-assessment questions
- open-answer comments

Online report of the Seminar Quiz:
- solutions and overall performance
- individual performance available
- response to open-answer comments
13
TEACHING PROTOCOL – the innovations
Extra-Curricular Activities to promote engagement and Self-Efficacy:
- Module Facebook Page + Blackboard pages: ‘challenges’ to encourage further study; interaction and participation
- Voluntary in-lecture presentations (5 minutes): to exploit demonstration effects
- Campus Vouchers (for engagement, not attainment)
14
TEACHING PROTOCOL – the innovations
Workshops → peer-instructed flipped classroom approach

Standard algorithm:
1. Quiz questions + confidence questions (no solutions)
2. Peer-instruction learning
3. Quiz questions + solutions
4. Problem-set questions
5. Feedback questions:
   - what was the cause of mistakes/problems?
   - did you enjoy using clickers?
   - were clickers useful to your learning?

Support Sessions → online report of clicking session + feedback
15
TEACHING PROTOCOL – the methodology
Focus → attainment, engagement, and academic self-efficacy; the role of the SRS (clicker) technology

Learning analytics → rich dataset: clicker and paper-based responses, matched with demographics from student records → uncover correlation patterns

Qualitative data → focus groups, individual interviews, and feedback from students → provide the narrative to interpret the analytics
16
An example using data from the 2012-13 cohort
17
EXAMPLE – workshop structure
1. Set of quiz questions – collect responses via clickers. Students can see the distribution of answers, but no solution given.
2. Ask students to rate their confidence in giving a correct answer (question by question) – collect responses via clickers.
3. Allow 20 minutes’ discussion of the quiz questions.
4. Ask the same questions again – collect responses via clickers. Provide a solution and a discussion for each question.
18
EXAMPLE – dataset
2 workshop sessions: Week 5 and Week 9 – Autumn Semester
attainment → number of correct responses per question / per student
confidence → “How confident do you feel about the answer given to Question X?” [4 levels]
learning → “I feel that the clickers technology has contributed to my learning experience in today’s workshop” [4 levels]
satisfaction → “I enjoyed using clickers in today’s workshop” [4 levels]

4-level responses collapsed into a binary dummy [0,1].
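The collapsing step might be sketched as follows. The data are hypothetical, and the cut-point — treating the top two of the four levels as a positive response — is an assumption, since the slides do not state where the split falls:

```python
# Collapse 4-level Likert responses into a binary dummy.
# Assumption: levels 3-4 map to 1 ("confident"/"agree"), levels 1-2 map to 0.
def to_dummy(response: int, threshold: int = 3) -> int:
    """Return 1 if the 1-4 Likert response is at or above the threshold."""
    return 1 if response >= threshold else 0

# Hypothetical confidence ratings for one quiz question across six students.
responses = [1, 2, 3, 4, 4, 2]
dummies = [to_dummy(r) for r in responses]
print(dummies)                      # [0, 0, 1, 1, 1, 0]
print(sum(dummies) / len(dummies))  # 0.5 — share of confident responses
```

The same mapping would apply to the learning and satisfaction items.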
19
EXAMPLE – dataset
Student   Q1   Q2   Q3   …
1         0    1    1
2         1    0    0
3         1    1    …
…

Column averages → performance per question
Row averages → performance per student
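From a matrix of this kind, the two performance measures are simply row and column averages; a minimal sketch with hypothetical data:

```python
# Students-by-questions matrix of 0/1 responses (hypothetical data).
matrix = {
    1: [0, 1, 1],  # student 1: wrong on Q1, correct on Q2 and Q3
    2: [1, 0, 0],
    3: [1, 1, 1],
}

# Performance per student: row average (share of questions answered correctly).
per_student = {s: sum(row) / len(row) for s, row in matrix.items()}

# Performance per question: column average (share of students answering correctly).
n_questions = len(next(iter(matrix.values())))
per_question = [
    sum(row[q] for row in matrix.values()) / len(matrix)
    for q in range(n_questions)
]

print({s: round(v, 2) for s, v in per_student.items()})  # {1: 0.67, 2: 0.33, 3: 1.0}
print([round(v, 2) for v in per_question])               # [0.67, 0.67, 0.67]
```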
20
EXAMPLE – attainment by question
[Chart: Week 5 – % correct responses per question, 1st round vs 2nd round]
21
EXAMPLE – attainment by student
[Chart: Week 5 – % correct responses per student, 1st round vs 2nd round]
22
EXAMPLE – student confidence by question
[Chart: Week 5 – % 1st-round correct responses vs % confident responses, per question]
23
EXAMPLE – student confidence by student
[Chart: Week 5 – % confident responses vs % correct responses (1st round), per student]
24
EXAMPLE – Comparing Week 5 to Week 9
[Diagram relating the measured quantities to:]
- initial preparation
- final learning outcome
- peer-instruction effect
- self-efficacy indicator
- problem set difficulty indicator
- student self-assessment skills
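One way a self-assessment indicator of the kind listed above might be operationalised — an illustrative assumption on my part, not the authors' stated formula — is the share of answers where a student's confidence dummy matches their 1st-round correctness:

```python
# Hypothetical per-question dummies for one student.
correct   = [1, 0, 1, 1, 0]  # 1st-round correctness (0/1)
confident = [1, 0, 0, 1, 1]  # confidence response (0/1)

# Calibration: share of questions where confidence matched correctness
# (confident-and-correct, or unconfident-and-wrong).
matches = sum(c == k for c, k in zip(confident, correct))
calibration = matches / len(correct)
print(calibration)  # 0.6 — matched on 3 of 5 questions
```

A well-calibrated student would score close to 1; systematic over- or under-confidence pushes the score down.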
25
EXAMPLE – Comparing Week 5 to Week 9
[Charts: Week 5 vs Week 9 comparison]
→ more of a learning/preparation problem than a confidence problem!
26
EXAMPLE – Student perception of SRS
Student satisfaction → 91% / 56%
Student learning perception → 88% / 73%
27
SUMMARY
• Teaching protocol with interventions to assess/enhance Academic Self-Efficacy and self-assessment skills.
• Mixed-methods approach to disentangle the relationship between engagement, attainment, and academic self-efficacy using student demographics.
• Assessment of the role of SRS technology (clickers) in promoting ASE (a research question not yet covered in the related literature).
28
FEEDBACK NEEDED
• More effort needed to map this project into the literature on ASE.
Academic references? Resources? Interesting readings?
• Explore the role of ASE questionnaires to facilitate comparisons with related studies and validate results.
Which additional instruments would be useful?
• Further opportunities for dissemination and discussion.
I would be glad to engage with these!
29
Tweet from a student:
Time to play who wants to be a millionaire in my economics lecture #FunLearning.