
Evidence-based Practice (EBP) – an introduction

Three objectives:

1. What is EBP?

2. Where has EBP come from (and why)?

3. Risks and Rewards of EBP?

What is EBP?

Where has EBP come from (and why)?

EBP – a definition

The term evidence-based practice (EBP) refers to preferential use of mental and behavioural health interventions for which systematic empirical research has provided evidence of statistically significant effectiveness as treatments for specific problems. Wikipedia

Box of shocks

This hand-cranked electrotherapeutic machine was designed in the early 1900s so that patients could give themselves shock therapy in the comfort of their own home.

Some used it for toothache, while others used it to ease nerve pain and tics.

EBP – the intent

Evidence-based practice (EBP) is an approach which tries to specify the way in which professionals or other decision-makers should make decisions, by identifying such evidence as there may be for a practice and rating it according to how scientifically sound it may be. Its goal is to eliminate unsound or excessively risky practices in favour of those that have better outcomes.

Wikipedia

“Facts are meaningless. You could use facts to prove anything that's even remotely true!”

Homer Simpson

EBP in other contexts

Defence

Human resources (HR)

EBP in NZ schools

Legislative context: NAG 1, NAG 2, Education Standards Act

Ministry advice: Planning for better student outcomes; Consider the evidence

Consider the Evidence: Evidence-driven decision making for secondary schools

A resource to assist schools to review their use of data and other evidence

Evidence-driven decision making

The evidence-driven decision making cycle

Trigger: Some of our students are poor at writing.

Question: What are the characteristics of students who are poor at writing?

Analyse NQF/NCEA results by standard. A teacher has a hunch - poor writers might spend little time on homework.

Explore data: A survey of students shows that this is only partially true.

Assemble more data and other evidence: asTTle reading, homework, extracurricular activities, attendance, etc.

Analyse non-NQF/NCEA data and evidence.

Interpret information: Poor writers are likely to play sport, speak well, read less and do little homework.

Intervene: Create multiple opportunities for writing; include topics that can use sport as context; connect speaking and writing. PD for staff.

Evaluate: Has writing improved?

Reflect: How will we teach writing in the future?
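To make the "explore data" and "interpret information" steps concrete, here is a minimal sketch of how a school analyst might test the teacher's hunch against a spreadsheet export. The column names, cut-off and figures are invented for illustration; they are not taken from the Ministry resource.

```python
# Hypothetical sketch of the "explore data" step: does homework time really
# distinguish poor writers? All column names and values are invented.
import pandas as pd

students = pd.DataFrame({
    "asttle_writing": [2, 3, 4, 2, 5, 3, 2, 4],                   # asTTle writing curriculum level
    "homework_hours": [1.0, 4.5, 6.0, 0.5, 5.0, 2.0, 3.5, 4.0],   # self-reported weekly hours
    "plays_sport":    [True, False, True, True, False, True, True, False],
})

# Flag "poor writers" (here: below curriculum level 4, an assumed cut-off)
students["poor_writer"] = students["asttle_writing"] < 4

# Compare homework time for poor writers vs the rest (the teacher's hunch)
print(students.groupby("poor_writer")["homework_hours"].describe())

# Bring in another variable assembled later in the cycle (sport participation)
print(pd.crosstab(students["poor_writer"], students["plays_sport"], normalize="index"))
```

A summary like this covers only the "explore data" step; the cycle still requires interpreting it alongside other evidence before intervening.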

Evidence-driven strategic planning

INDICATORS FROM DATA: asTTle scores show a high proportion of Yr 9 achieving below curriculum level; NCEA results show high non-achievement in transactional writing; poor results in other language NCEA standards; etc.

STRATEGIC GOAL: To raise the levels of writing across the school.

STRATEGIC ACTION: Develop a writing development plan which addresses writing across subjects and levels, including targets, professional development and other resourcing needs, etc.

ANNUAL PLAN: Develop and implement a plan to raise levels of writing at Year 9. Development plan to be based on an analysis of all available data and to include a range of shared strategies; etc.

YEAR TARGET: Raise writing asTTle results for Yr 9 boys from 3B to 3A; etc.

Linked processes: appraisal, PD, self review, school charter.

EVALUATION DATA: asTTle writing results improve by …; perception data from Year 9 staff indicates …; evaluation of effectiveness of range of shared strategies, barriers and enablers …; etc.
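As a companion sketch for the evaluation data step, the snippet below shows one hypothetical way to check the year target (Yr 9 boys moving from 3B towards 3A) by mapping asTTle sub-levels onto an ordinal scale and comparing medians across years. The mapping, years and results are assumptions for illustration only.

```python
# Hypothetical check of the year target: have Yr 9 boys' asTTle writing
# sub-levels shifted from 3B towards 3A? Data and scale are invented.
import pandas as pd

SUBLEVEL_ORDER = {"3B": 1, "3P": 2, "3A": 3, "4B": 4, "4P": 5, "4A": 6}

results = pd.DataFrame({
    "year":     [2005, 2005, 2005, 2006, 2006, 2006],
    "gender":   ["M", "M", "F", "M", "M", "F"],
    "sublevel": ["3B", "3B", "3P", "3P", "3A", "4B"],
})
results["score"] = results["sublevel"].map(SUBLEVEL_ORDER)

boys = results[results["gender"] == "M"]
print(boys.groupby("year")["score"].median())  # did the median sub-level rise?
```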

EBP – a definition

The term evidence-based practice (EBP) refers to preferential use of mental and behavioural health interventions for which systematic empirical research has provided evidence of statistically significant effectiveness as treatments for specific problems. Wikipedia

What is EBP?

Where has EBP come from (and why)?

Risks and Rewards of EBP

For students

For teachers

For schools

Not everything that counts can be counted, and not everything that can be counted counts!

Albert Einstein

Do we measure what we value or are we valuing what we can measure?

Julia Atkin, 2004

ASSESSMENT TENSIONS and DILEMMAS:

quantity – quality

objective – subjective

standardise – raise standards

individual – collaborative

material – spiritual

technical – ‘soulful’

formal – informal

simple – complex

Julia Atkin, 2004

A rationalist, positivist ‘world view’ focuses on the left-hand side of the previous slide: on aspects which are tangible and that can be quantified and measured on linear scales.

Our challenge is that deep learning, learning with spirit, dynamic learning, transformative learning embraces both sides … it is all about the integration of the two columns. It is not ‘either-or’ - it’s ‘both - and’.

I encourage you to re-think assessment by clarifying what we value and believe about learning, teaching and assessing - by starting with WHY. Why assess? What do we value and believe about learning & teaching, and what are the implications for assessment?

Julia Atkin, 2004

‘The right hand side of the tension point list demands qualitative forms of assessment. In addition to performance (as in authentic assessment), qualitative forms of assessment involve holistic media such as narrative and image.’

Julia Atkin, 2004

What did I learn today? My mother will want to know.

Can assessment raise standards?

Recent research has shown that the answer to this question is an unequivocal ‘yes’. Assessment is one of the most powerful educational tools for promoting effective learning.

Kay Hawk, 2006

But it must be used in the right way. There is no evidence that increasing the amount of testing will enhance learning. Instead the focus needs to be on helping teachers use assessment, as part of teaching and learning, in ways that will raise pupils’ achievement.

Kay Hawk, 2006

The research tells us that successful learning occurs when learners:

• have ownership of their learning

• understand the goals they are aiming for

• are motivated and have the skills to achieve success.

Kay Hawk, 2006

Earl, L. (2006)

Three types of assessment:

Diagnostic

Formative

Summative

Earl, L. (2006)

Done for students

Done with students

Done to students

Earl, L. (2006)

Earl, L. (2006)

What do schools need to do to maximise the benefits of good assessment practice?

1. Understand theory and principles

2. Use current research

3. Cut out unnecessary or unused assessment

4. Embed good formative practice

5. Visit other schools

6. Explode the myths

7. Empower students

Kay Hawk, 2006

Risks and Rewards of EBP?

For students?

For teachers?

For schools?

Good News

• range of effective tools

• good research on what makes a difference

• MOE assessment contracts

• cluster initiatives to help bridge the educational islands

• NCEA systems which allow flexibility

Kay Hawk, 2006

A final thought:

Earl, L. (2006)

Evidence-based Practice (EBP) – an introduction

What is EBP?

Where has EBP come from (and why)?

Risks and Rewards of EBP?

References:

Atkin, J. (2004). Reassessing assessment: Beyond benchmarking the benchmarks. NavCon 2k4 Conference.

Earl, L. (2006). Rethinking classroom assessment with purpose in mind. Aporia Consulting Ltd, OISE/UT.

Hammersley, M. (2001). Some questions about evidence-based practice in education. Annual Conference of the British Educational Research Association.

Hawk, K. (2006). Assessment in 2006: What? and how? ULearn06 Conference.

Ministry of Education. (n.d.). Consider the evidence. Retrieved from http://www.tki.org.nz/r/governance/consider/index_e.php

Ministry of Education. (2003). Planning for better student outcomes. Retrieved from http://www.minedu.govt.nz/~/media/MinEdu/Files/EducationSectors/PrimarySecondary/SchoolOpsPlanningReporting/PlanningBetterStudentOutcomesSept2003.pdf
