
Wi-Fi

Login – 20nconference

Password – meeting2015

Outcomes Assessment


November 9, 2015

Chicago, Illinois

JRCERT Mission Statement

The JRCERT promotes excellence in education and elevates quality and safety of patient care through the accreditation of educational programs in radiography, radiation therapy, magnetic resonance, and medical dosimetry.

Board of Directors

Laura S. Aaron, Ph.D., R.T.(R)(M)(QM), FASRT

• Chair

Stephanie Eatmon, Ed.D., R.T.(R)(T), FASRT

• 1st Vice Chair

Tricia Leggett, D.H.Ed., R.T.(R)(QM)

• 2nd Vice Chair

Darcy Wolfman, M.D.

• Secretary/Treasurer

Board of Directors


Laura Borghardt, M.S., CMD

Susan R. Hatfield, Ph.D.

Bette A. Schans, Ph.D., R.T.(R)

Jason L. Scott, M.B.A., R.T.(R)(MR), CRA, FAHRA

Loraine D. Zelna, M.S., R.T.(R)(MR)

Executive Staff

Leslie F. Winter – CEO

Jay Hicks – Executive Associate Director

Traci Lang – Assistant Director

Professional Staff

Tom Brown – Accreditation Specialist

Jacqueline Kralik – Accreditation Specialist

Brian Leonard – Accreditation Specialist

Program Statistics (November 2015)

Radiography – 613

Radiation Therapy – 74

Magnetic Resonance – 10

Medical Dosimetry – 17

2014 Accreditation Actions

Total Considerations – 378

Initial – 9
Continuing – 80
Interim Reports – 151
Progress Reports – 29
Other – 109

2014 Accreditation Awards

8 Year – 59
5 Year – 13
3 Year – 6
Probation – 5
2 Year – 2
Involuntary Withdrawal – 3

OUTCOMES ASSESSMENT

“Educational values should drive not only what we choose to assess but also how we do so. Where questions about educational mission and values are skipped over, assessment threatens to be an exercise in measuring what is easy, rather than a process of improving what we really care about.”


New Leadership Alliance, 2012

A process that provides information to participants, allowing clear evaluation of the process, the ability to understand the overall quality of the process and the opportunity to identify areas for improvement.

(New Leadership Alliance, 2012)

What is Assessment?

Assessment should reflect an understanding of student learning over time so as to reveal change, growth, and increasing degrees of integration. This approach aims for a more accurate picture of learning and therefore a firmer base for improving our students' educational experience.

The ongoing process of:

1. Establishing clear, measurable, expected SLOs
2. Systematically gathering, analyzing, and interpreting evidence to determine how well students' learning matches expectations
3. Using the resulting information to understand and improve student learning
4. Reporting on processes and results

What is Student Learning Outcomes Assessment?

Making your expectations explicit and public

Using the resulting information to document, explain, and improve performance

Assessment Involves:


Information-based decision making

“The end of assessment is action”

Do not attempt to achieve the perfect research design… gather enough data to provide a reasonable basis for action.

Walvoord (2010)

Goal of Assessment?

Compliance with external demands

Gathering data no one will use

Making the process too complicated

Pitfalls of Assessment

A course grade cannot pinpoint the concepts that students have or have not mastered

Grading criteria: attendance, participation, bonus points

Inter-rater reliability issues or vague grading standards

Not holistic

Do grades have a place in an assessment program?

Course Grades

Courses / Student Learning Outcomes (SLO 1 – SLO 4)

RAD 150:
RAD 153: I, I, I
RAD 154: R, I, I
RAD 232: R, R, R, R
RAD 234: R, R
RAD 250: M, M, M&A, M
RAD 255: M&A, M&A, A, M&A

Curriculum Map

“I” = introduce
“R” = reinforce, practice
“M” = mastery
“A” = assessed for program assessment
0 = no emphasis
1 = minor emphasis
2 = moderate emphasis
3 = significant emphasis
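A curriculum map like the one above can also be kept as a simple data structure so that coverage gaps are easy to spot. The Python sketch below is illustrative only; the course numbers, SLO labels, and dictionary layout are hypothetical examples, not a JRCERT requirement.

# Minimal sketch of a curriculum map as a dictionary.
# Keys are courses; values map each SLO to its emphasis codes
# ("I" introduce, "R" reinforce, "M" mastery, "A" assessed).
# Course numbers and SLO labels are hypothetical examples.
curriculum_map = {
    "RAD 232": {"SLO 1": "R", "SLO 2": "R", "SLO 3": "R", "SLO 4": "R"},
    "RAD 250": {"SLO 1": "M", "SLO 2": "M", "SLO 3": "M&A", "SLO 4": "M"},
    "RAD 255": {"SLO 1": "M&A", "SLO 2": "M&A", "SLO 3": "A", "SLO 4": "M&A"},
}
ALL_SLOS = ["SLO 1", "SLO 2", "SLO 3", "SLO 4"]

def coverage(slo):
    """Collect every emphasis code recorded for one SLO across all courses."""
    codes = set()
    for emphases in curriculum_map.values():
        codes.update(emphases.get(slo, "").split("&"))  # "M&A" counts as M and A
    codes.discard("")
    return codes

for slo in ALL_SLOS:
    codes = coverage(slo)
    missing = {"I", "R", "M", "A"} - codes
    status = "complete" if not missing else "missing " + ", ".join(sorted(missing))
    print(f"{slo}: {sorted(codes)} -> {status}")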

Types of Assessment

Student Learning – what students will do or achieve:
• Knowledge
• Skills
• Attitudes

Program Effectiveness – what the program will do or achieve:
• Certification Pass Rate
• Job Placement Rate
• Program Completion Rate
• Graduate Satisfaction
• Employer Satisfaction

Types of Assessment

Formative Assessment – gathering of information during the progression of a program; allows for student improvement prior to program completion.

Summative Assessment – gathering of information at the conclusion of a program.

Assessment hierarchy: Mission Statement → Goals → SLOs

Our program is an integral part of the School of Allied Health Professions and shares its values. The program serves as a national leader in the education of students in the radiation sciences and provides learning opportunities that are innovative and educationally sound. In addition to exhibiting technical competence and the judicious use of ionizing radiation, graduates provide high quality patient care and leadership in their respective area of professional practice.

Consideration is given to the effective use of unique resources and facilities. Strong linkages with clinical affiliates and their staff are vital to our success. Faculty and staff work in a cooperative spirit in an environment conducive to inquisitiveness and independent learning to help a diverse student body develop to its fullest potential. The faculty is committed to the concept of lifelong learning and promotes standards of clinical practice that will serve students throughout their professional careers.

Mission Statement


Mission Statement

The mission of our program is to produce competent entry-level radiation therapists.


Goals
• Broad statements of student achievement that are consistent with the mission of the program
• Should address all learners and reflect clinical competence, critical thinking, communication skills, and professionalism

Goals should not:
• Contain assessment tools
• Contain increases in achievement
• Contain program achievements

Goals?
• The program will prepare graduates to function as entry-level ___.
• The faculty will assure that the JRCERT accreditation requirements are followed.
• Students will accurately evaluate images for diagnostic quality.
• 85% of students will practice age-appropriate patient care on the mock patient care practicum.

Program Effectiveness Measures

Outcome → Measurement Tool

• Graduates will pass the national certification exam on the 1st attempt. → ARRT or MDCB 1st Time Pass Rates
• Of those pursuing employment, graduates will be gainfully employed within 12 months post-graduation. → Graduate Survey (Question 18)
• Students will complete the program within 150% of the stated program length. → Retention Rate
• Students will be satisfied with their education. → Graduate Survey (Question 1)
• Employers will be satisfied with the graduate's performance. → Employer Survey (Question 5)
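Each of these measures reduces to simple counting once the underlying records are available. The Python sketch below shows one illustrative way to compute a first-attempt pass rate, a 12-month job placement rate, and completion within 150% of the stated program length; the student records, field names, and program length are hypothetical.

# Illustrative calculation of common program effectiveness measures.
# Student records, field names, and program length are hypothetical.
PROGRAM_LENGTH_MONTHS = 24                    # stated program length
MAX_MONTHS = PROGRAM_LENGTH_MONTHS * 1.5      # 150% of stated length

cohort = [
    {"completed": True, "months": 24, "passed_first_attempt": True,
     "sought_employment": True, "employed_within_12_months": True},
    {"completed": True, "months": 30, "passed_first_attempt": False,
     "sought_employment": True, "employed_within_12_months": True},
    {"completed": True, "months": 24, "passed_first_attempt": True,
     "sought_employment": False, "employed_within_12_months": False},
    {"completed": False, "months": None, "passed_first_attempt": None,
     "sought_employment": None, "employed_within_12_months": None},
]

def rate(numerator, denominator):
    """Return a percentage, guarding against an empty denominator."""
    return 100.0 * numerator / denominator if denominator else 0.0

graduates = [s for s in cohort if s["completed"]]
job_seekers = [g for g in graduates if g["sought_employment"]]

completion_rate = rate(sum(g["months"] <= MAX_MONTHS for g in graduates), len(cohort))
pass_rate = rate(sum(g["passed_first_attempt"] for g in graduates), len(graduates))
placement_rate = rate(sum(g["employed_within_12_months"] for g in job_seekers), len(job_seekers))

print(f"Completion within 150% of program length: {completion_rate:.1f}%")
print(f"Certification 1st-attempt pass rate:      {pass_rate:.1f}%")
print(f"Job placement rate (12 months):           {placement_rate:.1f}%")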

STUDENT LEARNING OUTCOMES

Specific, Measurable, Attainable, Realistic, Targeted

Student Learning Outcomes

Students will ______ (action verb) ________ (something).

We suggest no more than 8-9 SLOs (total).

Bloom’s Taxonomy

Lower-level to higher-level thinking skills: Knowledge → Comprehension → Application → Analysis → Synthesis → Evaluation

Knowledge: Cite, Count, Define, Draw, Identify, List, Name, Point, Quote, Read, Recite, Record, Repeat, Select, State, Tabulate, Tell, Trace, Underline

Comprehension: Associate, Classify, Compare, Compute, Contrast, Differentiate, Discuss, Distinguish, Estimate, Explain, Express, Extrapolate, Interpolate, Locate, Predict, Report, Restate, Review, Tell, Translate

Application: Apply, Calculate, Classify, Demonstrate, Determine, Dramatize, Employ, Examine, Illustrate, Interpret, Locate, Operate, Order, Practice, Report, Restructure, Schedule, Sketch, Solve, Translate, Use, Write

Analysis: Analyze, Appraise, Calculate, Categorize, Classify, Compare, Debate, Diagram, Differentiate, Distinguish, Examine, Experiment, Inspect, Inventory, Question, Separate, Summarize, Test

Synthesis: Arrange, Assemble, Collect, Compose, Construct, Create, Design, Formulate, Integrate, Manage, Organize, Plan, Prepare, Prescribe, Produce, Propose, Specify, Synthesize, Write

Evaluation: Appraise, Assess, Choose, Compare, Criticize, Determine, Estimate, Evaluate, Grade, Judge, Measure, Rank, Rate, Recommend, Revise, Score, Select, Standardize, Test, Validate

Lower-division course outcomes


Upper-division course / program outcomes (drawn from the same taxonomy verb lists)

Students will be clinically competent.

Students will complete 10 competencies with a grade ≥75% in RAD 227.

Graduates will be prepared to evaluate and interpret images for proper evaluation criteria and quality.

Students will demonstrate ability to operate tube locks.

Student Learning Outcomes?


The most important criterion when selecting an assessment method is whether it will provide useful information - information that indicates whether students are learning and developing in ways faculty have agreed are important.

(Palomba & Banta, 2000)

Assessment Measurements


Types of Assessment Measurements

Direct Assessment Measurements
◦ Demonstrate learning
◦ Performance-based: allow students to demonstrate their skills through activities

Indirect Assessment Measurements
◦ Provide reflection about learning

Measurements

Direct:
◦ Rubrics
◦ Unit or Final Exams
◦ Capstone Courses
◦ Portfolios
◦ Case Studies
◦ Embedded Questions

Indirect:
◦ Surveys (Graduate, Employer)
◦ Self-Evaluations
◦ Exit Interviews
◦ Focus Groups
◦ Reflective Essays

Change Your Focus on Assessment

From (unconsciously): making the program look good on paper.

To: How can I diagnose learning problems so that I can improve student learning?

Goal of Assessment

You cannot determine how to improve the program until you know how well the students have learned.

MUST measure the outcome

Should represent your students’ achievements as accurately as possible

Should assess not only whether the students are learning, but how well

What should measurement tools do for you?


a point of reference from which measurements may be made

something that serves as a standard by which others may be measured or judged

Benchmarks


Standard of Performance

Realistic yet Attainable

External or Internal

No Double Quantifiers (Qualifiers)

Rating Scale

Benchmarks

Benchmarks (Examples)

Tool | Benchmark | Timeframe
Capstone Course – Final Portfolio | ≥ 94.5 pts (100 scale) | 5th Semester
Clinical Evaluation Form (Section 2) | ≥ 4.0 (5.0 scale) | 5th Semester
Debate Rubric | ≥ 31.5 points (35 possible points) | 4th Semester
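Recording each benchmark alongside its scale and the measured result turns "met / not met" into a mechanical comparison. The Python sketch below is a generic illustration; the tools, thresholds, and results are hypothetical values in the style of the examples above.

# Minimal sketch: store each benchmark with its scale and compare the result.
# Tools, thresholds, and results are hypothetical examples.
benchmarks = [
    # (measurement tool, benchmark, scale maximum, actual result)
    ("Capstone course - final portfolio",    94.5, 100.0, 95.2),
    ("Clinical evaluation form (section 2)",  4.0,   5.0,  3.8),
    ("Debate rubric",                        31.5,  35.0, 33.0),
]

for tool, threshold, scale_max, result in benchmarks:
    status = "Benchmark met" if result >= threshold else "Benchmark NOT met"
    print(f"{tool}: {result:g} of {scale_max:g} (benchmark >= {threshold:g}) -> {status}")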

Group Exercise 1 and Assessment Committee Work

Report the actual data
◦ On the assessment plan
◦ On a separate document

Should facilitate comparison
◦ Comparison of cohorts
◦ Comparison of students attending certain clinical settings

Show dates

Collect and Trend the Data


What do the data say about your students’ mastery of subject matter, of research skills, or of writing and speaking?

What do the data say about your students’ preparation for taking the next career step?

Do you see areas where performance is okay, but not outstanding, and where you’d like to see a higher level of performance?

Data Analysis


UMass-Amherst, OAPA: http://www.umass.edu/oapa/oapa/publications/

How can assessment data be used?

Primary Uses
◦ Curriculum review
◦ Requests to Curriculum Committee
◦ Accreditation reports and reviews

Secondary Uses
◦ Recruiting
◦ Alumni newsletter
◦ Other publications
◦ Grants and other funding

UMass-Amherst, OAPA

Identify benchmarks met
◦ Sustained effort
◦ Monitoring
◦ Evaluate benchmarks

Identify benchmarks not met
◦ Targets for improvement
◦ Study the problem before trying to solve it!!
◦ Evaluate benchmark

Identify 3 years of data (trend)

Data Analysis


Outcome: Students will demonstrate radiation protection.
Benchmark: 85% of students will average a score of ≥ 5.0 (6.0 scale).
Results: 100% of students scored 5 or better.
Analysis/Action: Benchmark met.

Outcome: Students will select appropriate technical factors.
Benchmark: 75% of students will average a score of 85% or better.
Results: 100% of students scored 85% or better.
Analysis/Action: Continue to monitor.

Outcome: Employers will find our graduates proficient in radiation protection and safety.
Benchmark: 80% of employer surveys will rate grads as Above Average or Excellent.
Results: 100% of employers rate our grads as Above Average or Excellent in proficiency of radiation protection skills.
Analysis/Action: No action needed.

Data Collection and Analysis?

Outcome: Students will demonstrate radiation protection.
Benchmark: ≥ 5.0 (6.0 scale)
Results: 5.28
Analysis/Action: Benchmark met. For the past 2 years this result has increased (07/08: 4.90; 08/09: 5.15). This may be attributed to an increased emphasis on radiation protection throughout this semester.

Outcome: Graduates will manipulate the 'typical' examination protocol to meet the needs of a trauma patient.
Benchmark: ≥ 4.0 (5.0 scale)
Results: 3.40
Analysis/Action: Benchmark not met. This result continues to improve with each cohort (07/08: 3.25; 08/09: 3.33). The increase may be attributed to the greater amount of lab time throughout the curriculum. Continue to monitor.

2009–2010 Data Collection and Analysis Example
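Trending a measure simply means comparing the same outcome across successive cohorts, as the example above does. The Python sketch below is illustrative; the benchmark and yearly values are hypothetical and echo the example, and the three-year window is a common convention rather than a requirement.

# Illustrative trend check: compare one outcome measure across cohorts.
# Benchmark and yearly results are hypothetical, echoing the example above.
BENCHMARK = 4.0                      # target on a 5.0 scale
results_by_year = {"2007-08": 3.25, "2008-09": 3.33, "2009-10": 3.40}

years = sorted(results_by_year)
values = [results_by_year[y] for y in years]

met = values[-1] >= BENCHMARK
improving = all(a < b for a, b in zip(values, values[1:]))

print(f"Most recent result: {values[-1]} vs benchmark {BENCHMARK} -> {'met' if met else 'not met'}")
print(f"Trend over {len(years)} cohorts: {'improving' if improving else 'not consistently improving'}")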

Analysis Group Exercise


NOW WHAT?

Assessment works best… when it is ongoing.

◦ Is cumulative
◦ Is fostered when assessment involves a linked series of activities undertaken over time
◦ May involve tracking progress of individuals or cohorts
◦ Is done in the spirit of continuous improvement

Ongoing Assessment


The process of drawing conclusions should be open to all those who are likely to be affected by the results – the communities of interest.

Analysis of the assessment data needs to be shared and formally documented, for example, in meeting minutes from the Assessment or Advisory Committee.

Closing the Cycle


Evaluate the assessment plan itself to assure that assessment measures are adequate.

Evaluation should assure that assessment is effective in measuring student learning outcomes.

Document with meeting minutes.

Evaluation of the Assessment Plan

Objective 5.5

Is our mission statement still applicable for what the program is trying to achieve?

Are we measuring valuable outcomes that are indicative of the graduate we are trying to produce?

Do we like the plan?

Does the plan provide us with the data that we are seeking?

Are the student learning outcomes still applicable?

Are the SLOs measurable?

Answer These Questions


“Do we want to use new tools to collect data for the SLOs?”

“Are our benchmarks appropriate?”

“Do our benchmarks need adjustment?”

“Are the appropriate personnel collecting the data?”

“Are the data collection timeframes appropriate?”

Make recommendations based on the answers

Answer These Questions


Keeping Your Documentation
For each year:

1. Copy of Assessment Plan

2. Actual tools for each one identified in the plan – do calculations on this tool.

3. Example of each tool (blank).

4. Meeting minutes that document analysis and sharing of the SLO and PED data.

5. Documentation of examples of changes that were implemented as a result of data gleaned from assessment process.

6. Meeting minutes documenting that the assessment plan has been evaluated to assure that measures are adequate and that the process is effective in measuring SLOs.


mail@jrcert.org www.jrcert.org

JRCERT Contact Information


20 North Wacker Drive, Suite 2850

Chicago, IL 60606-3182
(312) 704-5300

THANK YOU!!

for supporting excellence in education and quality patient care through programmatic accreditation.

Allen, M. J. (2004). Assessing academic programs in higher education. Bolton, MA: Anker Publishing Company, Inc.

New Leadership Alliance. (2012). Assuring quality: An institutional self-assessment tool for excellent practice in student learning outcomes assessment. Washington, DC: New Leadership Alliance for Student Learning and Accountability.

Suskie, L. (2009). Assessing student learning: A common sense guide. San Francisco, CA: Jossey-Bass.

Walvoord, B. E. (2010). Assessment clear & simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass.

Wonderlic Direct Assessment of Student Learning. (n.d.). YouTube. Retrieved May 6, 2014, from http://www.youtube.com/watch?v=JjKs8hsZosc

References and Bibliography
