How do university teachers make decisions about assessment?


Phillip Dawson
Office of the PVC (Learning & Teaching)
Monash University

The ‘Assessment Decisions’ team

• Dr Phillip Dawson, Monash (Co-lead)
• A/Prof Margaret Bearman, Monash (Co-lead)
• A/Prof Liz Molloy, Monash
• Prof David Boud, UTS
• A/Prof Gordon Joughin, UQ
• A/Prof Sue Bennett, UOW
• Dr Matt Hall, Monash

The Assessment Decisions project

What we mean by ‘assessment decisions’

• Formulation of assessment policy

• Decisions in the design and implementation of assessment

• Judgements about student work

‘assessment decisions’

What is an assessment decision? (what we thought we were looking for)

• I’ve got a blank canvas: a new unit! What assessment will I choose?

• This unit has been assessed the same way for years but it isn’t working… what can we do differently?

• Will I put the paperwork in to replace that examination with a project, even though it won’t come into effect until next year?

Recent related research

• “achieving a balance between summative and formative assessment requires complex, contextual thinking” (Price, et al., 2011)

• Changing assessment ‘thinking’ in academics doesn’t necessarily change practice (Offerdahl & Tomanek, 2011)

• Trust might play a part in our choice of different assessment types (Carless, 2009)

We know a lot about ideal practice

Hattie, 2009

Actual practice is different

How do university teachers make decisions about assessment?

Research design

• Arts – Professions
• Arts – Non-professions
• Sciences – Professions
• Sciences – Non-professions

• 30 semi-structured interviews

• Gritty, coalface, ‘actual’ not ‘ideal’

• Thematic analysis; coding against framework

• Meaning-making from coded data

What we can say

• Improving assessment is more than just a problem of knowledge transmission/translation

• Rarely about rationally selecting from options
• Assessment decisions are complex, situated and pragmatic

Impetus

• Opportunity or driver for change

Influences

• Environment
• Educator

Activities

• Making it work

“I think a lot of us have good intentions, we just don't have the time” – science lecturer

Time

“I was redeveloping a unit, it had already had a particular format of assessment. I elected to run with that rather than to go through the processes of trying to alter it … I wasn't gonna [jump through] those other hoops.” – humanities lecturer

Committees and paperwork

“I don't think an assessment should be painful for the students or painful for the staff that assess it” – science lecturer

Beliefs

“technology becomes really critical where assessment is concerned. If you set something up and it doesn't work, they don't trust you. Getting them on board again is a killer … students can be very hostile to you making mistakes. They're not very forgiving” – arts lecturer

Technology

Improving assessment

Requested supports

• Exemplars
• Time, money, sessionals
• Someone to help
• Involvement from senior academics

Our analysis adds

• Understanding of freedom to move

More information

http://assessmentdecisions.org

A short paper: sclr.li/19

This research is supported by an Australian government Office for Learning and Teaching grant titled “Improving assessment: understanding educational decision-making in practice” (ID12-2254)

References

• Carless, D. (2009). Trust, distrust and their impact on assessment reform. Assessment & Evaluation in Higher Education, 34(1), 79-89. doi: 10.1080/02602930801895786

• Hattie, J. (2009). The black box of tertiary assessment: An impending revolution. In L. H. Meyer et al. (Eds.), Tertiary assessment & higher education student outcomes: Policy, practice & research. Wellington, New Zealand: Ako Aotearoa.

• Offerdahl, E. G., & Tomanek, D. (2011). Changes in instructors' assessment thinking related to experimentation with new strategies. Assessment & Evaluation in Higher Education, 36(7), 781-795. doi: 10.1080/02602938.2010.488794

• Price, M., Carroll, J., O'Donovan, B., & Rust, C. (2011). If I was going there I wouldn't start from here: a critical commentary on current assessment practice. Assessment & Evaluation in Higher Education, 36(4), 479-492. doi: 10.1080/02602930903512883

Extra slides

Research and ideas from the literature

• “there's some interesting papers that look at the use of wikis and peer learning and those sorts of things” – science lecturer (early-career)

Research and ideas from the literature

• “That was really quite confronting to me and reading education literature, and I'd still find it difficult because it's completely different to my discipline … I just find it really, really difficult. Because I’m going, "My god, that's just their feelings."” – science lecturer (late-career)

Time and workload

• “I think a lot of us have good intentions, we just don't have the time” – science lecturer

• “But the main thing is that, it has to be feasible from a resource point of view as well. And if at a resource point of view, as well, as that educational perspective as well.” – health professions lecturer

Being strategic

• “one of the other things that we learned as well was not to put too much data into the FEC documents, which I remember in the early days, again, we had 1500-word essay on topic X and then afterwards when you go 'that's crazy' we need to change it, we've got to go back to FEC and that load of paperwork and things. So, we do... You know, we are much more general about what we're putting into the FEC documents”

Policy and flexibility

• “I was redeveloping a unit, it had already had a particular format of assessment. I elected to run with that rather than to go through the processes of trying to alter it … I wasn't gonna [jump through] those other hoops.” – humanities lecturer

• “there's supposed to be between 10 and 20% HDs and no fewer... No more than 5% fails, or 10%, I think … or more than probably 10% fails. I don't know would you call that a bell curve or what, but it is a prescribed range of results that you should have.” – humanities lecturer

Policy and flexibility

• “In regards to formal channels of approval, like education committees and the like, turns out that the, what the paperwork that currently exists with the education committees, is actually very, extremely non-specific.” – health professions lecturer

• “But I don't feel that the committee structures are really designed to be able to give... feedback or you know contribute, it's really more tick-cross. So in some of these cases, if what I was doing wasn't a change to the paperwork they already had, then we just carried on.” – health professions lecturer

Teamwork (or not)

• “The process of how this was developed was kind of the most amazing process, because basically, <colleague> said, "Would you develop the unit, write a unit guide", and I did, with no help at all. I just thought ‘what would be needed?’, and I invented it…

• …the process has been a bit out of necessity, less consultative than what it needs to be. I mean, we had to get a <topic> course up and running really quickly.” – humanities lecturer

Teamwork (or not)

• “Yeah, I had to learn a fair bit, with that, which I learned off colleagues. So, again, yeah, that the sort of the theoretical background to teaching. Which sort of was provided to me. Through exposure to colleagues. And, that made a big change. That helped me do assessments better.” – health professions lecturer

Technology

• “technology becomes really critical where assessment is concerned. If you set something up and it doesn't work, they don't trust you. Getting them on board again is a killer … students can be very hostile to you making mistakes. They're not very forgiving” – arts lecturer