Plenary - Eye on the Prize: Sharing and Using Program Review Results
Marilee J. Bresciani, Ph.D.
Professor, Postsecondary Education Leadership and
Co-Director of the Center for Educational Leadership, Innovation, and Policy
San Diego State University
3590 Camino Del Rio North
San Diego, California, U.S.A.
619-594-8318
What is the purpose of your program review?
What kinds of decisions do your program review findings allow you to make?
Visioning the Result of your Program Review Process
Visioning requires:
the setting of intentions
the communication of those intentions
the sharing of those intentions
the changing of past behaviors and beliefs
the implementation of new ways of doing and thinking
Visioning Questions
What kinds of decisions do your program review findings allow you to make so that the purpose of your program review is fulfilled?
What kinds of data do you need to inform the kinds of decisions that affirm the purpose of your program review process?
Visioning Questions
Who needs to be involved…
in the collecting of that data?
in the interpretation of that data?
in the synthesizing of that data?
in the brainstorming around decisions that could be made?
in the prioritization of the implementation plan and the resources to move that plan to fruition?
What professional development is needed to accomplish this?
Uses of OBPR (Outcomes-Based Program Review) Results (WASC Program Review Guidelines, 2009)
Developing program learning outcomes and identifying appropriate means for assessing their achievement
Better aligning department, college and institutional goals
Refining departmental access and other interventions to improve retention/attrition and graduation rates
Bresciani, M.J.
Uses, Cont. (WASC Program Review Guidelines, 2009)
Designing needed professional development programs, especially for faculty to learn how to develop and assess learning outcomes
Reorganizing or refocusing resources to advance specific research agendas
Re-assigning faculty/staff or requesting new lines
Uses, Cont. (WASC Program Review Guidelines, 2009)
Illuminating potential intra-institutional synergies
Developing specific plans for modifications and improvements
Informing decision making, planning and budgeting, including resource re/allocation
Linking and, as appropriate, aggregating program review results to the institution’s broader quality assurance/improvement efforts
How do you envision using your results?
In Order for These Uses to Occur, an Institution Needs… (Bresciani, 2006)
To set priorities around institutional values
To communicate a shared conceptual framework and common language
To systematically gather data that actually evaluate these priorities
Sharing & Using Program Review Results
Challenges occur because:
Results are not linked to outcomes, college/institutional goals, or strategic priorities (a template fix is the solution here, p. 144)
Those who receive results are unsure what to do with them: presentation of the data, interpretation of the data, professional development
Sharing & Using Program Review Results, Cont.
Challenges occur because:
The routing for discussion of results and decisions is not clear (clarified by roles and responsibilities, and by communication routing)
Expectations for use of results are not clear (clarify how results will be used, by whom, and when)
What are your Challenges and how could they be readily addressed?
Real-Life Considerations for the Write-Up
The Audience
For whom is the data? Change language for different audiences if necessary.
The Story
What point are you trying to make and for whom?
What decision needs to be made and who needs to make it?
The Format
Depends on the audience
Reporting Strategies
from Gary Hanson, Ph.D.
Know your data
Know your audience
Tell the story
Identify meaningful indicators to shape the story
Examine indicators for patterns
Begin with the end in mind
Tie the data to the outcomes
Involve the end users in the process
Reporting Strategies, Cont.
from Bresciani, Zelna, and Anderson, 2004
Identify the values of your constituents and find out how your constituents prefer to see data and reports. This is especially important for IR people who are the "keepers of the data." It is a continual process of refinement.
Students (or those whom you evaluated) can be extremely helpful in your writing and dissemination of results and the decisions made.
Be sure to link the data and decisions made to the outcome and the program being assessed (Maki, 2001).
Reporting Strategies, Cont.
from Bresciani, Zelna, and Anderson, 2004
Timing is everything when delivering results and decisions.
Prepare to defend your outcome, your evaluation method, your results, and the decisions made based on those results.
If you need help interpreting the data, get it.
Consider multiple layers of reporting (broad and general to detailed and specific)
Sharing Examples - Reporting
What “need” generated the data requested?
What is the purpose of the data request? What story needs to be told?
Who will be using the data? What are their values? …preferences for receipt of data?
Who needs to be involved in preparing the data? …presenting the data?
What story are you trying to tell? What key points are you trying to make?
Sharing Examples - Providing Data
What data do your constituents want available to inform their decisions? How often do they want it? In what format?
How transparent do they want the data to be? What is the communication flow?
Do they need comparisons? What type of comparisons will be most meaningful to them?
Some Examples
1. Various Ways to Tell the Story
2. Using Dashboard Indicators to Inform OBPR Focus
Example Results
SLO 12: Apply research to practice, particularly in their area of specialization and focus.
Student feedback and the evaluation of learning artifacts from 610, 795 A&B, as well as the learning portfolio, reflect a misalignment in the curriculum. We have made slight changes to the ED 690 course to better align it and prepare students for ED 795 A&B by offering a stronger transition to writing a literature review and reflecting on a problem statement and purpose statement. Faculty will continue to discuss further opportunities to improve this alignment over the coming year.
The faculty explored offering ED 690 in spring only, with one section just for SA students and one just for rehab students, but this solution was not financially feasible.
Students reported wanting more control over the selection of their research topic. Student groups will be given more autonomy in selecting their research project topic.
Students working in groups will continue to evaluate their peers. However, due to student feedback, the peer evaluation rubric will be revised to include aspects that students wanted to evaluate their peers on and students will be educated more frequently on how the rubric scores will influence their peer grades.
Results indicate that many students felt that they learned about each different aspect of a research project better in a group environment rather than individually. Students commented that they would prefer to be in a smaller group in the future, with a recommended group size of 4-5 students per project. The group size will be decreased from 8-9 students per group to a group size ranging from 3-5.
We will re-evaluate this outcome in 2008 to see if we made any improvement.
Student Exit Survey: We will continue the exit survey each year but explicitly ask students to respond to the extent that they learned each program SLO, rather than ask them for satisfaction with each course.
How do you perceive your {Primary Academic Advisor}?
                 AG    AR    BA    ED    EN    GE    GS    LA    SC    VM   total
efficient       180   116   153   139   192    92    99   171   163   141    1446
                 97    89    93    89    95    89   100    93    94    96      94
realistic       180   128   157   150   186    97    99   178   167   143    1485
                 97    98    96    96    92    93   100    97    97    97      96
knowledgeable   182   121   159   141   197   102    98   179   168   144    1491
                 98    92    97    90    97    98    99    97    97    98      96
caring          181   115   153   143   183    98    94   174   162   139    1442
                 98    88    93    91    90    94    95    95    94    95      93
maintains high  181   113   154   137   186    93    93   160   152   139    1408
standards        98    86    94    87    92    89    94    87    88    95      91
demanding       110    70    71    80   113    59    53    90    79    82     807
                 60    53    43    51    56    57    54    49    46    56      52
responsive      178   116   150   144   187    99    98   172   167   139    1450
                 96    89    92    92    92    95    99    94    97    95      94
flexible        177   116   141   139   169    94    98   167   161   139    1401
                 96    89    86    89    83    90    99    91    93    95      91
total           185   131   164   157   203   104    99   184   173   147    1547
(The row beneath each count gives the percent of the college total, shown in bold in the original.)
Overall, rate the level of satisfaction of the assistance you have received from your {Primary Academic Advisor}
Scale: 1=poor, 2=fair, 3=good, 4=excellent
[Bar chart: mean satisfaction by college, on the 1.00-4.00 scale, for AG AR BA ED EN GE GS LA SC VM]

College   Mean   Std. Dev.
AG        3.51     0.68
AR        3.08     0.82
BA        3.23     0.75
ED        3.13     0.88
EN        3.05     0.77
GE        3.13     0.83
GS        3.35     0.58
LA        3.25     0.74
SC        3.31     0.69
VM        3.37     0.71
Total     3.24     0.76
There was a significant main effect of college on satisfaction.
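The significance claim above can be checked from summary statistics alone: a one-way ANOVA F statistic needs only each college's sample size, mean, and standard deviation. A minimal Python sketch; note that the per-college ns are an assumption taken from the perception table's "total" row and may not exactly match the satisfaction item's actual response counts:

```python
# One-way ANOVA computed from summary statistics (n, mean, SD per group).
# The ns are assumed from the perception table's "total" row; the
# satisfaction item's actual response counts may differ slightly.
groups = {
    # college: (n, mean, sd)
    "AG": (185, 3.51, 0.68), "AR": (131, 3.08, 0.82),
    "BA": (164, 3.23, 0.75), "ED": (157, 3.13, 0.88),
    "EN": (203, 3.05, 0.77), "GE": (104, 3.13, 0.83),
    "GS": (99, 3.35, 0.58),  "LA": (184, 3.25, 0.74),
    "SC": (173, 3.31, 0.69), "VM": (147, 3.37, 0.71),
}

N = sum(n for n, m, s in groups.values())
grand_mean = sum(n * m for n, m, s in groups.values()) / N

# Between-group and within-group sums of squares.
ss_between = sum(n * (m - grand_mean) ** 2 for n, m, s in groups.values())
ss_within = sum((n - 1) * s ** 2 for n, m, s in groups.values())

df_between = len(groups) - 1   # 9
df_within = N - len(groups)    # 1537
F = (ss_between / df_between) / (ss_within / df_within)

print(f"grand mean = {grand_mean:.2f}, F({df_between}, {df_within}) = {F:.2f}")
```

With F of roughly 6.3 on (9, 1537) degrees of freedom, well above the approximate 1.88 critical value at alpha = .05, the summary statistics are consistent with the reported main effect.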
Advisors by College (continued)
          College   Departmental   Faculty   None
AG          13.1        68.6        15.2      3.14
AR          19.6        46.2        25.9      8.39
BA          36.4        34.4        8.13     21.1
ED          29          59.8        4.73      6.51
EN          7.31        72.1        13.7      6.85
GE          15.1        59.4        23.6      1.89
GS          30.6        46.3        4.96     18.2
LA          12.6        63.6        13.6     10.2
SC          20.7        57.5        11.4     10.4
VM          23          65.2        4.35      7.45
median      20.15       59.6        12.5      7.92
Percent of students indicating the type of advisor they go to most often, by college.
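These rows can be sanity-checked in code: each college's four advisor-type percentages should sum to about 100. A small Python sketch (the dictionary layout and the one-character-per-two-percent text rendering are mine) that verifies this and roughs out the stacked presentation used on the next slide:

```python
# Percent of students by advisor type, per college (values from the table).
# Segment order matches the stacked chart: College, Departmental, Faculty, None.
advisor_pct = {
    "AG": (13.1, 68.6, 15.2, 3.14), "AR": (19.6, 46.2, 25.9, 8.39),
    "BA": (36.4, 34.4, 8.13, 21.1), "ED": (29.0, 59.8, 4.73, 6.51),
    "EN": (7.31, 72.1, 13.7, 6.85), "GE": (15.1, 59.4, 23.6, 1.89),
    "GS": (30.6, 46.3, 4.96, 18.2), "LA": (12.6, 63.6, 13.6, 10.2),
    "SC": (20.7, 57.5, 11.4, 10.4), "VM": (23.0, 65.2, 4.35, 7.45),
}

# Each row should sum to ~100%; small drift comes from rounding in the source.
row_sums = {college: sum(shares) for college, shares in advisor_pct.items()}
assert all(abs(total - 100) < 1.5 for total in row_sums.values())

# Crude text version of the stacked bar chart: one character per 2 percent,
# using C/D/F/N for College/Departmental/Faculty/None.
for college, shares in advisor_pct.items():
    bar = "".join(ch * round(p / 2) for ch, p in zip("CDFN", shares))
    print(f"{college} |{bar}")
```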
Same Information
[Stacked bar chart: the same advisor-type percentages, 0-100% per college (AG through VM), with segments for College, Departmental, Faculty, and None]
More focused Information
[Bar chart: median percent of students by advisor type (College, Departmental, Faculty, None)]
Real–Life Reporting Reminders
Keep your audience in mind
If you have to draft varying reports/summaries of results for your varying audiences… do it
Report data in context of issues or outcomes
Provide a detailed version and an “executive summary”
Use graphs wisely
Real–Life Reporting Reminders, Cont.
Timing IS really Everything
Don't underestimate the power of "trying out" drafts on key decision-makers
Interpret your data so that it informs program improvement, budgeting, planning, decision-making, or policies.
Report limitations honestly
Sharing & Using Program Review Results Questions
In an ideal world, who would you want to review the results of program review? Does your answer vary by type of program (professionally accredited or not) or level of program (undergraduate, graduate)?
What reflection questions would you provide those reviewing the results to guide their interpretation, and therefore use, of the results?
Sharing & Using Program Review Results Questions, Cont.
What are the articulated expectations for use of the results?
How would results be disseminated?
Who would be involved in interpreting the results, and what is their role?
Are there clear paths for communication flow of results, interpretation of results, decisions, and recommendations?
Sharing & Using Program Review Results Questions, Cont.
Is one type of data more influential than another in the interpretation of the results?
Is it clear who is involved in formulating decisions that are based on interpretation of results?
Is it clear on which level the decision resides?
Mentor Group Reflection Questions
Is there anything I need to change on my OBPR template so that it is clear how results align with outcomes at each level?
Am I clear about who needs to see the results from program review and how they prefer to see it in order to inform the necessary decisions?
Do I need to provide any professional development for anyone so they know how to use the results to inform decisions?