1 | P a g e 8/15/2016
Merritt College
2016-2017 Annual Program Update Template
Final Version: Approved PCCD May 20, 2016
Radiologic Science Program
Jennifer Yates, Ed.D., RT(R)(M)(BD)
Introduction, Directions and Timeline
The Peralta Community College District has an institutional effectiveness process which consists of the following components: a District-wide
Strategic Plan which is updated every six years; Comprehensive Program Reviews which are completed every three years; and Annual Program
Updates (APUs) which are completed in non-program review years. While there are individualized Program Review Handbooks for Instructional
units, Counseling, CTE, Library Services, Student Services, Administrative units, and District Service Centers, there is one Annual Program Update
template for use by everyone at the colleges which is completed in the Fall semester of non-program review years.
The Annual Program Update is intended primarily to focus on planning and institutional effectiveness by asking everyone to report on the progress they are making toward the goals (outcomes) and program improvement objectives described in the most recent program review document. The Annual Program Update is therefore a document that reflects continuous quality improvement. Additionally, the Annual Program Update provides a vehicle through which to identify and request additional resources that support reaching the stated goals (outcomes) and program improvement objectives in the unit's program review.
Throughout this document, the term “program” is used to refer to all of these terms: discipline, department, program, administrative unit, or unit.
If you have questions regarding data, please contact Samantha Kessler, Research and Planning Officer, at skessler@peralta.edu. If you have questions regarding other material in the APU, please contact your Dean or Manager.
You will need the following items in order to complete the Annual Program Update document at the colleges, many of which are provided for you in
this document:
The most recently completed comprehensive Program Review document.
Any comments or feedback provided during the program review validation process.
College Goals and Peralta District Goals
Institution Set Standards (Institutional Standards that are reported annually to ACCJC)
College Institutional Effectiveness Indicators (reported to the State Chancellor’s Office annually)
College Educational Master Plan
College SSSP plan, Equity and Basic Skills Plans
Data profiles, Taskstream and Curricunet reports
Background and Contextual Information
Purpose: Throughout the APU, you will be asked to link your program planning, goals, activities and/or data to those of the District and College. The information in this section provides an overview of the necessary background information. You can view additional information or complete plans using the links provided.
Merritt College Strategic Goals 2016-2017
The following are the Peralta Community College District's Strategic Goals and Merritt College Strategic Goals for the Academic Year 2015-2016, which will be evaluated prior to the start of the next academic year.
Merritt College Institution-Set Standards 2015-2016
Institution-set standards are used by a college to evaluate student outcomes relative to the College's mission. The evaluation of student achievement may draw on a variety of measures, including program-specific measures. These standards are reported in the ACCJC Annual Report. More information can be found on the ACCJC website: http://www.accjc.org/wp-content/uploads/2015/11/Test_Your_Knowledge_ACCJC_News_Fall_2015.pdf
Institutional Effectiveness Indicators (Scorecard data – CCCCO Datamart)
*Note: Most of these measures are cohort measures with different definitions than the college metrics. Please refer to the definitions page for the complete definitions, or the website below. Complete Scorecard data specifications can be found here: http://datamart.cccco.edu/App_Doc/Scorecard_Data_Mart_Specs.pdf
Merritt College Data Profile: Fall 2015 and Spring 2016
*Note: Headcount is the unduplicated number of students per term. Retention and Success are based on enrollments, which are duplicated.
2015-2020 Educational Master Plan
The Educational Master Plan (EMP) serves as a key part of the College’s integrated planning process and will be implemented during the next five years through
action-oriented strategic plans. It is the foundation and primary reference for guiding program planning and reviews, managing student learning outcomes, and
coordinating College resources.
http://www.merritt.edu/wp/emp/
Strategic Directions
Student Success
The College will engage in integrated planning related to student success, student equity, distance education, foundation skills, career technical education and
transfer curriculum.
Partnerships
The College will enhance, pursue and increase partnerships with educational, nonprofit and community employers to enhance and create viable and timely
programs.
Non-Credit to Credit Pathways
The College will increase non-credit pathways leading to credit programs for native and non-native English speakers, focused on developing self-advocacy, civic engagement and self-sufficiency.
Engagement and College Culture
The College will implement strategies to increase student, faculty and staff equity and engagement and will create a culture of inclusiveness that demonstrates
value of diversity across the campus.
Institutional Stability
The College will utilize data driven decision making based on learning assessments in the Integrated Planning and Budgeting Model to advocate for adequate
human, technological, facility and fiscal resources to support successful achievement of the Educational Master Plan strategic initiatives.
2015-2016 SSSP Plan Goals
The Student Success and Support Program (SSSP) is a state mandated program that provides critical support services to students on the front-end of their
educational experience to increase student success. The 5 Components of SSSP that help ensure educational success are: Admission, Orientation, Assessment,
Counseling and Advisement and Follow-up.
http://www.merritt.edu/wp/studentsuccess/
Access: Increase enrollment of under-represented populations within the College service area, specifically Latino and African American male
students.
Course Success: Increase overall college successful completion rate (students earning grade C or better in the course).
ESL and Basic Skills Completion: Increase course success rates and ensure that students succeed at the same rate as the overall College percentage of
students who successfully complete courses.
Degree and Certificate Completion: Increase the number of students obtaining a degree or certificate, specifically number of degrees earned by
African American and American Indian/Alaskan Native, and number of certificates earned by African American, American Indian/Alaskan Native,
and Hispanic/Latino students.
Transfer: Increase transfers to CSU and UC, specifically African American and Hispanic/Latinos.
2015-2018 Student Equity Plan Goals:
The Student Equity Plan uses campus-based research and data analysis to identify target groups in need of academic performance improvement. The plan
outlines goals and activities to decrease performance gaps for disproportionately impacted student groups.
http://web.peralta.edu/pbi/files/2010/11/Merritt-College-Student-Equity-Plan.pdf
The 2015-16 Student Equity Plan is centered on the main purpose of achieving equity throughout the student body that is reflective of the diversity of
the community served by Merritt College while striving to ensure student access, retention and success across student equity indicators and target
groups. The overall goals of the 2015-16 Student Equity Plan are based on the following principles:
1. Improve student access to college programs and services;
2. Increase and balance student equity and diversity in college programs and services;
3. Improve success by closing the performance gap and mitigating disproportionate impact for identified target groups.
Overall goals are based on local and state data requirements, institutional data analysis and key findings from 2006 to 2014 and are grounded in
moving the college toward achieving stated goals and activities identified in the Student Equity Plan. The target groups identified for each indicator
in the “Goals and Activities” section are considered a priority. Below is a summary of goals under each indicator:
Access: Improve access of under-represented populations within the college service area:
o Increase the African American population;
o Increase the Hispanic/Latino population;
o Increase the male student population;
o Increase the foster youth population.
Course Completion (Retention): Increase the overall college retention rate:
o Improve course completion for African Americans in Mathematics and English;
o Improve Fall to Spring course completion rates, particularly for African American and Hispanic/Latino students;
o Improve course completion for Native Hawaiian/Pacific Islander students;
o Improve course completion for foster youth.
ESL and Basic Skills Completion: Increase completion rates and ensure that students succeed at the same rate as the overall percentage of students who successfully complete courses with a grade of A, B, or C or Credit, as follows:
o Improve ESL course completion for Hispanic/Latino, American Indian/Alaska Native, and foster youth students;
o Improve Basic Skills course completion in English;
o Improve Basic Skills course completion in Mathematics.
NOTE: Per the 2014 Basic Skills Initiative (BSI) End-of-the-Year Report, this Equity Report reaffirms pre-established goals to
a. Increase the successful course completion rate for credit Basic Skills and ESL courses by 2% per year (10% over five years);
b. Increase the persistence of Basic Skills and ESL students by 2% per year (10% over five years);
c. Increase the percentage of students who progress from basic skills to transfer-level Mathematics or English by 2% per year (10% over five years).
Degree and Certificate Completion: Increase the number of students obtaining a degree or certificate among groups below the .85 proportionality level:
o Degree: African American, American Indian/Alaskan Native, Foster Youth;
o Certificates: African American, American Indian/Alaskan Native, Hispanic/Latino, Foster Youth.
Transfer: Increase the overall college transfer rate, aiming to reach the 1.0 level for groups not achieving it, with a focus on under-represented populations transferring to CSU and UC:
o African American;
o Hispanic/Latino;
o American Indian/Alaska Native;
o Low income.
Basic Skills Plan Goals
The Basic Skills Initiative supports academic achievement and personal development of students who are building their reading, writing, critical thinking and
mathematical skills to succeed in college-level work through excellent academic programs and comprehensive support services.
Basic Skills Initiative: http://www.merritt.edu/wp/basicskillsinitiative/
Increase the placement of students directly in transfer-level English and Mathematics courses through the adoption of placement tests, other
student assessment indicators and related policies that include multiple measures.
Accelerate student completion of transfer-level English and Mathematics courses by shortening course sequences for underprepared students.
Increase student completion of basic skills and gateway transfer-level courses by providing pro-active student support services that are
integrated with instruction.
Accelerate student progression through CTE pathways by contextualizing remedial instruction in foundational skills.
I. Program Information
Purpose: This section will identify basic information about your program. Program reviews can be found at: http://www.merritt.edu/wp/institutional-research/program-review/
Program Name: Radiologic Science
Date: 9/15/16
Program Type (circle or highlight one): Instructional Non-Instructional Student Services or Special Programs Administrative Unit
College Mission Statement: The mission of Merritt College is to enhance the quality of life in the communities we serve by helping students to
attain knowledge, master skills, and develop the appreciation, attitudes and values needed to succeed and participate responsibly in a democratic
society and a global economy.
Program Mission: The purpose of the Radiologic Science Program at Merritt College is to prepare qualified
practitioners for competency in the art and science of diagnostic medical imaging. The goals of the program are:
1. Students will be clinically competent
2. Students will demonstrate effective communication skills
3. Students will develop critical thinking and problem solving skills
4. Students will demonstrate professionalism
Date of Last Comprehensive Program Review: 9/19/15
Date of Comprehensive Program Review Validation: 12/2/15
II. Reporting Progress on Attainment of Program Goals
Purpose: In this section, you will look at your goals stated in the last program review, align the program goals with the District and College Goals, and report on
the progress, revision, or completion of the program goals.
Each goal below is reported with four elements: the Program Goal (copied from Program Review Question 10 or Appendix B, or input as a revised goal); the institutional goals that will be advanced upon completion (PCCD and MC Goal Mapping); Progress on Goal (status with date); and Goal Detail (describing how the goal was met or is still being pursued; if a goal is new or revised, the revision is explained).
Assessment
Goal: Complete SLO and PLO assessment in Taskstream for the current cycle (assess every course SLO at least every three years; assess every PLO every year).
Institutional Goals: PCCD Goal D; Merritt Goal D
Progress: Ongoing (9-15-16)
Goal Detail: Our program faculty have made significant progress in course SLO assessment over the past year and are now caught up. The Division II SLO coordinator, Heather Casale, has been an invaluable resource in achieving this goal. We understand the importance of compliance for course improvement as well as for both programmatic and college accreditation. We will continue to work with Heather each Spring and Fall semester to stay on track with our course SLO assessment plan, and we hope that SLO assessment activities will continue to be supported by the SLO coordinators. Clifton Coleman has also been an excellent resource; he helps us identify which SLOs in which courses are up for assessment each semester so we can meet with Heather and stay up to date in our three-year cycle.
Curriculum
Goal: Update all course outlines in Curricunet.
Institutional Goals: PCCD Goal C; Merritt Goal C
Progress: Ongoing (9-15-16)
Goal Detail: At our faculty meeting on 9-14-16, faculty were presented with the list of courses needing updating in Curricunet. Instructors were asked to make an appointment with Clifton Coleman for Curricunet support if needed. Many courses need to be updated.
Instruction
Goal: Hire a part-time instructor to replace the adjunct who left in August 2015.
Institutional Goals: PCCD Goal C; Merritt Goal C
Progress: Completed (8-22-16)
Goal Detail: W. Scott Wilson was hired and began teaching Positioning I on August 22, 2016.
Student Success and Student Equity
Goal: Implement additional preparation programs to ready students for the ARRT certification exam.
Institutional Goals: PCCD Goal A; Merritt Goal A
Progress: Ongoing (9-15-16)
Goal Detail: We tried a new program, RadReview Easy, as well as the HESI practice exams and Exit Exam. The two faculty members who worked with both programs found RadReview Easy difficult to navigate from the instructor side, though the student side seemed to work well. The HESI practice exams worked well, and the Exit Exam was an accurate predictor of success on the licensing exam: one student failed the HESI and subsequently failed the licensing exam, and another student barely passed the Exit Exam and subsequently failed the licensing exam with a score 2 points lower than the exit score. This spring we will begin working with the HESI practice exams earlier (we will seek Perkins funding for this) and will drop the instructor-controlled RadReview Easy exams (students will still be able to pay for their own subscription to access those practice exams). Last year we had a 100% pass rate on the ARRT exam; this year we dropped to 88%. We will continue to explore better and earlier preparation options to return to 100%.
Professional Development, Institutional and Professional Engagement, and Partnerships
Goal: We are exploring adding Stanford ValleyCare's Livermore site as a clinical affiliate.
Institutional Goals: PCCD Goal B; Merritt Goal B
Progress: Revised (9-15-16); New Goal (9-15-16)
Goal Detail: Due to the long wait list for our program, we are exploring the creation of new clinical partnerships. Stanford ValleyCare Livermore was mentioned by clinical faculty as a possible affiliation site; we began this discussion at the faculty meeting on 9-14-16. Jerry Hollister, clinical coordinator, will begin the imaging department survey needed to start the approval process with the JRCERT (our programmatic accrediting body) and the California Department of Public Health Radiologic Health Branch.
Other Goals
Goals: 1. Provide support for new students by augmenting instruction with Instructional Aides and Peer Tutors. 2. Purchase equipment to improve safety in the Positioning Laboratory, enhance learning by updating image receptors to current technology, and purchase test tools to improve the QA laboratory experience.
Institutional Goals: PCCD Goal A; Merritt Goal A
Progress: Ongoing (9-15-16); New Goal (9-20-16)
Goal Detail: We hired 3 Instructional Aides this year who are recent program graduates. They are an important part of our Learning Community, which includes students, faculty, Instructional Aides, volunteer peer tutors, and industry partners. The Instructional Aides provide supervised hands-on practice and serve as role models for student success. Planned equipment purchases:
a. Improve safety in the Positioning Lab by purchasing 3 new step stools with high handles.
b. Purchase CR cassettes to replace obsolete film-screen cassettes.
c. Purchase QA test tools to enhance learning in the QA Fluoroscopy course laboratory experience.
III. Data Trend Analysis
Purpose: In this section, you will report, review and reflect on your program data since the last program review (Fall 2015 and Spring 2016). You
may copy and paste the tables that were provided to you in your data packet via email.
Please review and reflect upon the data for your program that was sent via email. You will be asked to comment on significant changes in
the data and/or achievement gaps. Focus upon the most recent academic year and/or the years since your last comprehensive program
review. *If you have questions or concerns regarding your data, please contact Samantha Kessler, Research and Planning Officer:
skessler@peralta.edu.
Student Enrollment Demographics: (Copy/paste enrollment tables from data file)
1. What changes have occurred in enrollment since the 2015-2016 program review?
Because these data include students enrolled in the prerequisite course, RADSC 1A (Survey of Radiologic Technology), they do not reflect what is happening in the program itself. The class that began in Fall 2014 and graduated in 2016 had a retention rate of 70% (16/23). The class that began in Fall 2015 (now second-year students) has a retention rate so far of 61% (16/26). As a result, we accepted a class of 31 in Fall 2016 to address likely attrition. So far, this has resulted in some crowded labs, and we have more students placed at each clinical site than is ideal; fortunately, clinical during this first semester is mostly observation. We typically lose at least 4 students to failure or personal reasons at the end of Fall semester, and we will reassign students at the beginning of Spring semester to even out placements and optimize student learning.
These are the demographics of students currently enrolled in program courses:
Program Students                   Class of 2017   Class of 2018
Female                             8 (50%)         17 (57%)
Male                               8 (50%)         13 (43%)
Non-native speakers of English     9 (56%)         15 (50%)
African/African American           2 (13%)         7 (23%)
Asian                              5 (31%)         13 (43%)
Caucasian                          2 (13%)         5 (17%)
Hispanic                           2 (13%)         3 (10%)
Pacific Islander                   2 (13%)         0 (0%)
Persian                            1 (6%)          0 (0%)
Two or more races                  2 (13%)         1 (3%)
Unknown                            0 (0%)          1 (3%)
Course Sections and Productivity: (Copy/paste Fall 15 and Spring 16 tables from data file)
1. Please comment on changes that have occurred in productivity since the 2015-2016 program review (e.g., increase, decrease or no change).
Increases in productivity are likely due to the large-capacity Survey 1A courses. Enrollment in the program itself actually decreased during that period as students left due to course failure or personal reasons. We are also limited by the number of clinical spaces available in any given Fall semester; we usually have 23-26 spaces available for each new class of students.
Student Success: (Copy/paste the course retention and successful course completion tables.)
1. Describe the course retention and successful course completion rates and any changes since the 2015-2016 program review. (Note: Course retention is the percentage of students who finish the course with any grade other than W. Successful course completion is the percentage of students earning a grade of C or better in the course.)
The class that began in Fall 2015 (now second-year students) has a retention rate so far of 61% (16/26). As a result, we accepted a class of 31 in Fall 2016 to address likely attrition. So far, this has resulted in some crowded labs, and we have more students placed at each clinical site than is ideal; at this point, clinical is mostly observation. We typically lose at least 4 students to failure or personal reasons at the end of Fall semester, and we will reassign students at the beginning of Spring semester to even out placements and optimize student learning.
Student Success in Distance Education/Hybrid classes versus face-to-face classes: (copy/paste the Distance Ed retention and
course completion data here.)
1. Describe any difference in the Retention and Success of face-to-face and distance education courses.
The program does not offer any hybrid or 100% DE courses.
Other program-specific data. Other data could include departmental research via surveys or special projects that significantly support the goals or future plans of the program.
IV. Aligning Program Goals, Activities and Planning
Purpose: In this section, you will align your program, department or unit goals with the Educational Master Plan goals. You will also be asked to
comment on how your department, unit or program is helping the College to achieve the targets set by the Equity, SSSP and Basic Skills Plans.
1. Educational Master Plan Alignment: Please use the following matrix to demonstrate how your program goals align with the 2015-2020
Educational Master Plan Goals.
2015-2020 EMP Goals
Foundations:
1. Assess students' strengths and needs thoroughly to accelerate completion of certificates, degrees and transfer readiness.
2. Support and develop programs, curriculum and services that increase completion of courses, certificates, degrees and transfer.
3. Establish an organizational structure that promotes coordination, innovation, and accountability, and which embeds basic skills development across the campus.
Career Technical Education:
1. Develop opportunities for CTE students to engage in campus and community experiences that enhance learning and student success (program-level clubs/enterprises, activities that develop soft skills, etc.) by contextualizing and proactively engaging students.
2. Create a Merritt-wide infrastructure that streamlines and develops employer partnerships, including offering high-quality internships, serving on advisory boards, and engaging in curriculum development.
3. Strengthen Merritt College's "on ramps" to our CTE pathways by enhancing distance education, dual enrollment, adult education, contract education, etc., and provide differentiated supports that ensure student success for targeted populations.
4. Create proactive strategies to engage faculty, students, and employers to support program success and sustainability that increase student-level academic and career outcomes.
Transfer:
1. Establish a fully functioning transfer center.
2. Acquire more and better data (higher granularity) on transfer rates; collect transfer data to include UC, State, and private institutions.
3. Augment and strengthen specific partnerships with academic departments in CSUs, UCs, and private institutions to develop transfer pipelines.
4. Augment and strengthen support services for transfer students campus-wide.
5. Augment and strengthen support for transfer students within academic programs.
Directions: 1) Input your program and department goals. 2) Identify which area of the Ed Master Plan this goal aligns to (Foundations, Transfer and/or CTE). 3) Identify the goal number in that area the department goal aligns to (Goal 1-5). 4) Describe the activities your department or program will complete to meet the goal. 5) Identify which standard or goal the activities will help the college achieve as a measurable outcome (completion rate, degree/certificate completion, transfer, remedial rates). Place an X in the standard(s) and/or goal(s) your program activity will impact.
Matrix columns: Program/department or unit Goal | EMP area (Foundations, Transfer, CTE) and goal number | How does this goal or the program activities align with the Educational Master Plan Strategic Directions and Goals? | Measurable Outcomes: Institution-Set Standards and IE Goals (Successful Course Completion Rate; Retention Rate (F to F Persistence); Degree or Certificate Completion; Transfer; Remedial Rate Math (Basic Skills Success); Remedial Rate English (Basic Skills Success)).
Assessment
Goal: Complete SLO and PLO assessment in Taskstream for the current cycle (assess every course SLO at least every three years; assess every PLO every year).
EMP Goal: Goal 1
Alignment: In our assessment process, we are continually assessing and making changes based on student performance and feedback to ensure CTE students' engagement in experiences that enhance learning and student success.
Measurable Outcomes: X X X
Curriculum
Goal: Update all course outlines in Curricunet.
EMP Goal: Goal 4
Alignment: In our course outline update process, we are continually updating textbooks and content to reflect current technology and trends in medical imaging practice.
Measurable Outcomes: X X X
Instruction
Goal: Hire a part-time instructor to replace the adjunct who left in August 2015.
EMP Goal: Goal 4
Alignment: First-year students are benefitting greatly from the hire of an experienced and competent instructor for the Positioning I course. This has also enabled the program director to carry a reasonable load, ensuring that she has time to complete tasks such as the annual program update.
Measurable Outcomes: X X X
Student Success and Student Equity
Goal: Implement additional preparation programs to ready students for the ARRT certification exam.
EMP Goal: Goal 4
Alignment: Better prepare students for passing the national licensing examination.
Measurable Outcomes: X X
Professional Development, Institutional and Professional Engagement, and Partnerships
Goal: We are exploring adding Stanford ValleyCare's Livermore site as a clinical affiliate.
EMP Goal: Goal 2
Alignment: Adding a clinical affiliate would provide greater capacity for students, increasing productivity and providing greater access to this very impacted program (the wait list is 1-3 years for 2016 applicants).
Measurable Outcomes: X X X
Other Goals
Goals: Provide support for new students by augmenting instruction with Instructional Aides and Peer Tutors. Purchase equipment to improve safety and enhance learning in the radiologic science laboratory.
EMP Goal: Goal 4
Alignment: Instructional Aides and Peer Tutors are an important part of our Learning Community. They provide enhanced supervised hands-on practice and serve as role models for student success.
Measurable Outcomes: X X X
2. Student Equity, Student Success and Support Program (SSSP), and Basic Skills Target Groups: These plans analyzed student success
outcomes and disproportionately impacted student populations. The chart below outlines the results of this analysis, and is a summary of the
student populations and focused outcomes that the College indicated it would like to increase as a result of the Student Equity Plan (E), SSSP
Plan (S), and Basic Skills Plan (B).
a. As a program, department or unit, review your data and describe any activities you are doing to address student equity gaps and special populations in the table below. Describe the target or focused student population, the problem/observation, the activity/intervention, and the intended outcome. How does your activity align with the College's Equity, SSSP and Basic Skills Goals (list the target group and indicator in the last box below)? In your description, please note if the activity or intervention was funded by one of these grants in the past academic year (15-16).
2015-16 Student Equity Plan, Student Success and Support Program Plan (SSSP), and Basic Skills Goal Summary
*S = SSSP, E=EQUITY, B=BASIC SKILLS
Directions: 1) Describe a challenge, achievement gap or observation you made in your program data. 2) Describe an activity or intervention your program does to address the data. 3) Note which student populations this activity or intervention targets. 4) Describe the intended measurable outcome of the activity, considering which indicator from the summary chart above this activity will help to impact. 5) Note which plan and goal this activity aligns to (SSSP, Equity, or Basic Skills).
To be completed by the Program, Department or Unit:
Table columns: Problem, Achievement Gap or Observation | Activity/Intervention | Target Student Population | Outcome (or intended outcome from the list of indicators above: access, course completion, retention, BS course completion, degree, cert., transfers) | Relevant College Equity/SSSP/BS Goal
Problem/Observation: Attrition for the class graduating in 2016 was 34%. The class was 30% African or African American, 30% Caucasian, 17% Asian, 13% Latino, and 4% East Indian. Of the eight students who left the program, five left for personal reasons (one Latino male, one white male, one white female, and one African American female) and three left for failing grades (one white male, one white female, and one African American male). The program is quite diverse, and we do not see a pattern of failure or of leaving for personal reasons in any particular group. However, we would like to provide as much support as possible to students, particularly in the first program semester, where we see the highest attrition.
Activity/Intervention: Provide peer tutors (second-year students) and instructional aides (recent program graduates) to offer extra instruction and hands-on practice sessions and to serve as role models for student success.
Target Student Population: All program students, including African American and Latino students.
Intended Outcome: Increase retention from the first Fall to Spring semester.
Relevant Equity/SSSP/BS Goal: SSSP and Equity Plan – Access for African Americans and Latinos.
b. Are additional resources required to facilitate the activities or interventions related to this area? If yes, make sure to discuss with your
Dean.
This year, with the help of Dr. Delia, Dr. Cedillo, and Dr. Rosario, we were able to access SSSP and Equity funds to pay our Instructional Aides. We plan to do this again next year.
3. Student Equity, Student Success and Support Program (SSSP), and Basic Skills Activities: In addition to identifying focused student populations and targets for improving student outcomes, these plans outlined activities the College would engage in to improve the indicators above. Please note whether your program has participated in any activities related to each of these plans, if applicable to your program.
To be completed by the Program, Department or Unit: How did you participate in the plan activities outlined above? (Use N/A if not
applicable)
Student Equity Plan: Worked with Dean Delia, Dr. Cedillo, and Dr. Rosario to identify funds. Recruited top recent graduates, worked with them on the hiring process, and had them processed and in place for the first day of classes.

SSSP Plan: Worked with Dean Delia, Dr. Cedillo, and Dr. Rosario to identify funds. Recruited top recent graduates, worked with them on the hiring process, and had them processed and in place for the first day of classes.
Basic Skills Plan
V. Curriculum and Assessment Status
Purpose: In this section, you will review your curriculum changes and improvements and assessment plans and findings. If your Program,
Department or Unit does not have a curriculum component, please put N/A. You should reference the Assessment Completion Report, Curriculum
Update Report, CurricUNET META, and TaskStream. If you have questions about curriculum or assessment, please contact Clifton Coleman,
Curriculum and Assessment Specialist, ccoleman@peralta.edu.
1. Use the following table to document the curricular, pedagogical or other changes your department made since the most recent program
review, and the planned changes for the upcoming year. Note, curriculum updates are required every two years for CTE, every three
years for non-CTE. Identify if the changes were based on course or program level assessment, or other data/evidence collected by the
program, or by other requirements such as Title 5, certification, or accreditation. Attach evidence (the Curriculum Update report, the
assessment report from TaskStream, departmental meeting notes, etc.).
Change or Planned Improvement: We plan to have students work with HESI practice exams beginning in early Spring 2017, rather than waiting until the final summer. We are changing the required practice testing because the class of 2016 did not achieve a 100% pass rate.

Data, Assessment Results, or Evidence Supporting the Change: The ARRT Exam pass rate for the class of 2016 was 88%, trending down from the class of 2015's 100% pass rate.

Status: Planned date of completion: Summer 2017.
2. Attach the Assessment Completion Report (Clifton provides this report at Flex Day) and the completed Fall Schedule Assessment
Planning Template (due to CDCPD mid-September). Please evaluate your program's progress on assessment.
3. What meaningful dialogue takes place in both shaping and assessing course and program level outcomes? Where can one find the
evidence of the dialogue? Dialogue primarily takes place at Program Advisory Committee meetings (Fall and Spring). The program is blessed with a very active, vocal, and participatory group of industry partners, student representatives, and faculty. Sample meeting minutes from Fall 2015 are included in this APU. The Fall 2016 meeting will take place on October 4th.
VI. Additional Questions for CTE, Counseling, Library and Student Services/Admin Units
Purpose: In this section, certain programs or departments will answer questions specific to the program. Leave the section blank if your program,
department or unit is not CTE, Counseling, Library or Student Services/Administration.
For CTE:
1. Please describe any recommendations resulting from advisory committee meetings that have occurred since your last program
review.
Policy change regarding rules of student supervision while on outside rotations (one extra competency is required at the new site before a student can perform exams previously achieved at the "home" site). All exams on pediatric patients must be performed under direct supervision, regardless of competency status.
Merritt College
Radiologic Science Program
Program Advisory Committee Meeting Minutes
10-6-15 12:00 PM
Present:
Jennifer Yates, Merritt College Program Director
Melissa Ramirez, Merritt College Faculty
Jerry Hollister, Merritt College Clinical Coordinator
Carolyn Rangle, Contra Costa Regional Medical Center
Katie Gilbreth, Sutter Solano Medical Center
Erin Haywood, John Muir Concord
Tosca Bridges, John Muir Walnut Creek
Pat Rafferty, Alta Bates Summit Medical Center, 350 Hawthorne Campus
Ginny Carpenter, Alta Bates Summit Medical Center, Ashby Campus
Mohammed Mojaddedi, Washington Hospital
Graciela Paredes, Valley Care Medical Center
Sabrina Martinez, Sutter Delta Medical Center
Justin Guevarra, Children's Hospital Oakland
Art Murcia, Eden Castro Valley
Kristina Campomanes, 2nd Year Student Representative
Justin Tarnowski, 2nd Year Student Representative
Jenny Phong, 1st Year Student Representative
Zach Chiaro, 1st Year Student Representative
Attendees received a meeting packet consisting of the meeting agenda, the 2015 Student Handbook, Clinical Performance Data analyzed by item and class, Program Effectiveness Data, the 9-17-15 Assessment Plan/Report, the Graduate Exit Survey, the Alumni Survey, the Employer Survey, information about the upcoming CSRT conference, and language from Title 17 regarding student supervision.
1. Open Session (Introductions): Yates welcomed CIs and students to the open session portion of the meeting.
2. Student Concerns (2nd Year: Kristina Campomanes and Justin Tarnowski; 1st Year: Jenny Phong and Zach Chiaro): Campomanes asked when second-year students could begin C-Arm sign-offs. Hollister stated any time during the second year, and reminded everyone that direct supervision is required for OR work even after the student achieves competency. Tarnowski asked for clarification on the sequencing of first-year students' competencies. Yates stated that students must first have instruction and a practical examination at the school before they may attempt a competency at the clinical site. She will send the Positioning I syllabus, and Hollister will send the Positioning II syllabus, to the CIs. First-year students did not have any questions or concerns.
3. First-Year Student Hospital Orientation (Hollister): Hollister inquired about hospital practices for new student orientation.
- Sutter Delta: takes place on the first clinical day.
- St. Rose: takes place prior to start.
- Washington: orientation must be completed prior to the clinical start date.
- John Muir Concord: done on paper each day until complete (information and quizzes).
- Sutter Castro Valley: orientation is a self-directed activity.
- CCRMC: an 8-hour orientation is set up for next week.
- Summit: open-book test; 2-3 months for hospital orientation.
- Alta Bates: orientation offered twice per month; employees and students attend at Sutter Castro Valley.
- Sutter Solano: hospital orientation takes place in Sacramento. Students do not attend but are oriented by the CI (reading a packet and taking a test).
- JM Walnut Creek: Intranet Learning Points, Epic training, half-day orientation.
Hollister added that students must still complete the 12-week imaging department orientation documentation on Trajecsys. They do not have to complete the items in order, but all must be done by the end of the Fall 2015 semester.
4. Second-Year Student Rotations (Hollister): In response to an item brought up at the last meeting by the (then) second-year student rep, Hollister asked what people thought about longer rotations, up to an entire semester. Pros and cons were discussed. Current second-year students thought two months sounded about right, so a greater number of rotations could be scheduled for each student. Hollister stated that he would individually evaluate each student's request and work with the CIs to determine the best rotation length. Bridges (JMW) stated that having to directly supervise rotation students for all exams created a burden on the CIs. Gilbreth (Sutter Solano) and Ramirez (Merritt) asked whether students could perform a "4th sign-off" at the rotation site, after which they could perform the exam under indirect supervision there. Members voted unanimously to accept this change, so Yates will revise the policy to reflect it, effective January 2016. Campomanes also asked when Fall Semester rotations would begin for second-year students. Hollister stated that he schedules rotations based on students' competency status.
5. Student Handbook Revisions (Yates): Yates pointed out changes in the 2015 version of the Radiologic Science Program Student Handbook:
- pp. 1-2: revised the Program Learning Outcomes as advised by JRCERT staff to improve our ability to assess them effectively.
- p. 2: added information about how students can make a complaint to JRCERT; also added the current "Standards" to the appendix of the handbook, so students can know what is expected of the school.
- p. 11: added a FERPA overview explaining privacy laws pertaining to student records.
- p. 20: Dean Mansur has left the position of Dean for our division. The interim Dean is Dr. Rosemary Delia. The Vice President of Instruction position is currently open.
- pp. 39-44: some changes in the Radiation Safety Policy; the Radiation Safety Officer is identified (Hollister); minor updates to the Radiation Monitoring Policy; the pregnancy policy now requires that "undeclaration" of pregnancy be in writing (new form in appendix).
- p. 47: new procedure for non-emergency student injury.
6. Review Mission Statement, Goals, Program Learning Outcomes (Yates): All members reviewed the program's Mission Statement, Goals, and Program Learning Outcomes. Unanimous approval to keep the current set.
7. Program Assessment (Yates): Group members reviewed the current Program Assessment Plan/Report. All were in favor of accepting the plan in its current incarnation. The group reviewed the individual sets of analyzed data, including Clinical Performance Trends by class and item, Clinical Competency Evaluation Summaries by class, Program Effectiveness Data, the Alumni Survey, the Graduate Exit Survey, and the Employer Survey (these items were emailed ahead of the meeting and provided in printed form at the meeting). Benchmarks are being met for all program goals and outcomes.

Clinical Performance Trends by class and item met benchmarks, but demonstrated a slight downward trend on almost all items for both classes. Faculty and Advisory Committee members discussed the reason for this. One possible explanation is that College Instructor evaluations are now being included in these data (a change since last year); faculty may be grading harder, or more realistically, than hospital staff. We will monitor trends during our assessment period in Spring 2016, when data will be comparable to Spring 2015.

Program Satisfaction: We discussed some of the comments that appeared on the Alumni Survey (Class of 2014). Equipment, computer, and facility upgrades have taken place since this group graduated: we have moved to the new Science Building, and the renovated Library and Learning Center are now open. We have updated the Fluoro curriculum to follow the ARRT CA Fluoroscopy Examination outline. Communication between faculty and classroom preparedness (for instructors) continue to show up on the survey. Custard stated that students don't realize that instructors are often on campus on alternate days and are rarely on campus all at the same time. We will try to increase communication via email regarding issues that affect specific students and the program in general. Faculty are asked to plan ahead for classes, with lectures and handouts ready at class start time.

Program Effectiveness Data demonstrate an improvement in ARRT exam pass rates, attributable to the use of new resources and earlier preparation by program faculty (Spring Semester). Employment continues to be at 100%. The program completion rate is down to 61% for the Class of 2015. Students left the program for a variety of reasons: personal, financial, failing a class, etc. We will continue to provide support for our students with peer tutors, as well as by hiring recent graduates as Instructional Aides.
8. Clinical Instructor Evaluations (Yates): Yates stated that clinical instructor evaluations were completed in November 2014. Individual summaries were sent to each CI in January. We will do this annually, as now required by JRCERT. Both first- and second-year students fill out evaluations online via SurveyMonkey for the CI at their site.
9. Student Supervision on Repeats (Signatures) (Hollister): Hollister reminded CIs that students need to log repeats and obtain the signatures of the techs supervising them on repeats. Currently, students are logging only their own repeats, not the techs'.
10. Equipment Sign-offs (Yates): Yates reminded everyone that at the last meeting we agreed to have students complete equipment sign-offs in Fall of the first year, then repeat them in Spring after students return from winter break. Equipment sign-offs for second-year students are required only for students rotating to a new site. Yates confirmed that duplicate equipment does not need to be separately signed off.
11. Clinical Portfolios (Hollister): Hollister stated that only a few hard-copy documents are being used in the Clinical Portfolio: the repeat log and patient exam logs. The third sign-off was being kept as well, but students' privacy could be compromised if copies are not kept locked up. The program will no longer require a hard copy of the third sign-off to be kept; however, if individual CIs wish to retain hard copies, they must be kept in a secure (locked) location. Gilbreth and Rangel asked that student reflections (on Clinical Performance Evaluations) be made visible to CIs on Trajecsys. Campomanes echoed that CIs want to be able to read these before completing the evaluation. Hollister will arrange for this with Trajecsys.
Merritt College Radiologic Science Program
Program Assessment Plan/Report for Fall 2015: Student Learning Outcomes
Revision Date: 9-17-15
Mission Statement
The purpose of the Radiologic Science Program at Merritt College is to prepare qualified practitioners for
competency in the art and science of diagnostic medical imaging. The goals of the program are:
1. Students will be clinically competent.
2. Students will demonstrate effective communication skills.
3. Students will develop critical thinking and problem solving skills.
4. Students will demonstrate professionalism.
Goal 1: Students will be clinically competent.

Outcome 1: Produce diagnostic-quality medical images in a competent, safe, and compassionate manner for all basic radiography examinations in a hospital work environment.
a. Students will competently position patients.
b. Students will select appropriate technical factors.
c. Students will practice good patient care.

Assessment Tool 1: Clinical Performance Evaluation.
Benchmark: Students will pass the Clinical Performance Evaluation with an overall average score of 1.75 for first-year students and 1.85 for second-year students, on a scale of 0-2 (0 = Unsatisfactory, 1 = Needs Improvement, 2 = Satisfactory). Each item on the evaluation will also be averaged separately, to get a "snapshot" of class scores and identify problem areas that need to be addressed (for example, Item #3 addresses positioning, Item #5 technique factors, and Item #8 patient care).
Timeframe: Clinical Performance Evaluation data for each class will be entered and analyzed using Trajecsys at the end of Spring Semester.
Person Responsible: Clinical Coordinator and Program Director. Data collected Spring 2015.
Results:
1st-Year Students: average overall score = 1.98 (range 1.93-2.0); Positioning 1.94; Technical Factors 1.93; Patient Care 1.99.
2nd-Year Students: average overall score = 1.99 (range 1.96-2.0); Positioning 1.97; Technical Factors 1.96; Patient Care 2.0.
Benchmark met for each item for both classes.
Goal 1 (continued): Students will be clinically competent.

Outcome 1: Produce diagnostic-quality medical images in a competent, safe, and compassionate manner for all basic radiography examinations in a hospital work environment (positioning, technical factors, patient care).
Assessment Tool 2: Clinical Competency Evaluation.
Benchmark: Average overall score of 80% (out of 100%) or higher for each class.
Timeframe: Evaluations for each class will be averaged at the end of Spring Semester.
Person Responsible: Clinical Coordinator and Program Director. Data collected Spring 2015.
Results: First-Year Students average: 98.01; Second-Year Students average: 99.49. Benchmark met for each class.
Goal 2: Students will demonstrate effective communication skills.

Outcome 2: Communicate effectively with patients and family members by taking appropriate histories, giving clear instructions, and providing information as needed.
Assessment Tools: 1) Clinical Performance Evaluation, Patient Communication (Item #1). 2) Communications practicals for the Radsci 1B course.
Benchmarks: 1) Average score of 1.75 for first-year students and 1.85 for second-year students, on a scale of 0-2, for this item (0 = Unsatisfactory, 1 = Needs Improvement, 2 = Satisfactory). 2) Average score of at least 75% on both Communications Practical Exams.
Timeframe: 1) Clinical Performance Evaluation data for each class will be entered and analyzed using Trajecsys at the end of Spring Semester. 2) Communications practicals take place at the end of the Radsci 1B course (September), prior to students moving to the clinical phase of the program.
Persons Responsible: 1) Clinical Coordinator and Program Director; data collected Spring 2015. 2) Instructor for the Radsci 1B course; data to be collected at the end of September 2015.
Results: 1) First-Year Students average score on this item: 1.97; Second-Year Students: 1.99. Benchmark met for both classes. 2) Fall 2014 students passed 47 of 48 practicals.

Outcome 3: Communicate in a professional manner with hospital staff, instructors, and peers.
Assessment Tools: 1) Clinical Performance Evaluation, Staff and Peer Communication (Item #2). 2) Image Evaluation Oral Presentation and Written Assignment.
Benchmarks: 1) Average score of 1.75 for first-year students and 1.85 for second-year students, on a scale of 0-2, for this item (0 = Unsatisfactory, 1 = Needs Improvement, 2 = Satisfactory). 2) Average score of 80% for first- and second-year students on both the Oral Presentation and the Written Assignment.
Timeframe: 1) Clinical Performance Evaluation data for each class will be entered and analyzed using Trajecsys at the end of Spring Semester. 2) Evaluations for each class will be averaged at the end of Spring Semester.
Persons Responsible: 1) Clinical Coordinator and Program Director; data collected Spring 2015. 2) Clinical Coordinator and Program Director; Spring 2015.
Results: 1) First-Year Students average score on this item: 1.98; Second-Year Students: 1.99. Benchmark met for both classes. 2) First-year: Oral 98.66, Written 88.25; Second-year: Oral 97.71, Written 88.51.
Goal 3: Students will develop critical thinking and problem solving skills.

Outcome 4: Exercise critical thinking and problem solving skills by adapting radiographic examinations to individual patient needs and conditions.
Assessment Tools: 1) Clinical Performance Evaluation, Critical Thinking Skills (Section C, Items 15-18). 2) Employer Survey, Critical Thinking Skills section (Items 6-13).
Benchmarks: 1) Average score of 1.75 for first-year students and 1.85 for second-year students, on a scale of 0-2, for each item in this section (0 = Unsatisfactory, 1 = Needs Improvement, 2 = Satisfactory). 2) 80% of respondents will indicate a 4 or 5 for these items on a 5-point Likert scale (4 = "Usually," 5 = "Always").
Timeframe: 1) Clinical Performance Evaluation data for each class will be entered and analyzed using Trajecsys at the end of Spring Semester. 2) The survey will be administered and analyzed via Trajecsys in August each year for the class having graduated in August of the previous year.
Persons Responsible: 1) Clinical Coordinator and Program Director; data collected Spring 2015. 2) Program Director; data collected August 2015.
Results: 1) First-Year Students average scores: Item 15: 1.97; Item 16: 1.99; Item 17: 2.0; Item 18: 1.99. Second-Year Students average scores: Item 15: 1.97; Item 16: 1.99; Item 17: 2.0; Item 18: 1.99. Benchmark met for both classes. 2) Q6, Q7, Q8, Q10, Q11, Q12, and Q13: 4 = 33%, 5 = 67%; Q9: 5 = 100%. Benchmark met.
Goal 3 (continued): Students will develop critical thinking and problem solving skills.

Outcome 4a: Critique images and take corrective action when images are not of diagnostic quality.
Assessment Tool: Image Evaluation Oral Presentation and Written Assignment.
Benchmark: Average score of 80% for first- and second-year students on both the Oral Presentation and the Written Assignment.
Timeframe: Evaluations for first- and second-year students will be averaged at the end of Spring Semester; data will be entered into Trajecsys for analysis.
Person Responsible: Clinical Coordinator and Program Director. Data collected Spring 2015.
Results: First-year: Oral 98.66, Written 88.25; Second-year: Oral 97.71, Written 88.51. Benchmark met for both classes.
Goal 4: Students will demonstrate professionalism.

Outcome 5: Establish and maintain satisfactory professional relationships with other members of the health care team.
Assessment Tools: 1) Clinical Performance Evaluation, Professionalism and Teamwork (Section B, Items 10-14). 2) Employer Survey, Questions 16-17 (Professional Communications, Conflict Resolution).
Benchmarks: 1) Average score of 1.75 for first-year students and 1.85 for second-year students, on a scale of 0-2, for each item in this section (0 = Unsatisfactory, 1 = Needs Improvement, 2 = Satisfactory). 2) 80% of respondents will indicate a 4 or 5 for these items on a 5-point Likert scale (4 = "Usually," 5 = "Always").
Timeframe: 1) Clinical Performance Evaluation data for each class will be entered and analyzed using Trajecsys at the end of Spring Semester. 2) The survey will be administered and analyzed via Survey Monkey in August each year for the class having graduated in August of the previous year.
Persons Responsible: 1) Clinical Coordinator and Program Director; data collected Spring 2015. 2) Program Director; data collected August 2015.
Results: 1) First-Year Students: Q10: 2.0; Q11: 1.98; Q12: 1.97; Q13: 1.98; Q14: 2.0. Second-Year Students: Q10: 2.0; Q11: 1.99; Q12: 2.0; Q13: 1.98; Q14: 2.0. Benchmark met for both classes. 2) Q16: 4 = 33%, 5 = 67%; Q17: 4 = 33%, 5 = 67%. Benchmark met.
Goal 4 (continued): Students will demonstrate professionalism.

Outcome 6: Function as an effective health care team member by providing services in a manner that complements those performed by other team members.
Assessment Tools: 1) Clinical Performance Evaluation, Professionalism and Teamwork (Section B, Items 10-13). 2) Employer Survey, Question 19 (Healthcare Team Member).
Benchmarks: 1) Average score of 1.75 for first-year students and 1.85 for second-year students, on a scale of 0-2, for these items (0 = Unsatisfactory, 1 = Needs Improvement, 2 = Satisfactory). 2) 80% of respondents will indicate a 4 or 5 for this item on a 5-point Likert scale (4 = "Usually," 5 = "Always").
Timeframe: 1) Clinical performance evaluations for first- and second-year students will be averaged at the end of Spring Semester. 2) The survey will be administered and analyzed via Survey Monkey in August each year for the class having graduated in August of the previous year.
Persons Responsible: 1) Clinical Coordinator and Program Director; data collected Spring 2015. 2) Program Director; data collected August 2015.
Results: 1) First-Year Students: Q10: 2.0; Q11: 1.98; Q12: 1.97; Q13: 1.98. Second-Year Students: Q10: 2.0; Q11: 1.99; Q12: 2.0; Q13: 1.98. Benchmark met for both classes. 2) Q19: 4 = 67%, 5 = 33%. Benchmark met.
Goal 4 (continued): Students will demonstrate professionalism.

Outcome 7: Demonstrate a commitment to professional development.
Assessment Tools: 1) Professional society meeting reflection papers. 2) Four-Year Plans.
Benchmarks: 1) All students will attend at least one professional society meeting during the two-year program and submit a reflection paper describing their experiences. Papers will be scored using a rubric; students must achieve a score of at least 80%. 2) All students will identify a career path and goals for the four years following graduation. Four-year plans will be scored using a rubric; students must achieve a score of at least 80%.
Timeframe: 1) Students are required to attend a professional society meeting at least once during the program and submit a reflection paper describing sessions and experiences; these papers will be evaluated in August at the end of each academic year. 2) Advanced Imaging course (Radsci 7), 5th semester.
Persons Responsible: 1) Clinical Coordinator and Program Director; current data collected November 2014. 2) Course instructor; data collected May 2015.
Results: 1) Students will attend the CSRT conference in November 2015; a scoring rubric will be used to assess successful completion. 2) 100% of students in the Radsci 7 course completed the final exam assignment satisfactorily; scores ranged from 87% to 100%.
Student Learning Outcomes Action/Analysis
Clinical Performance Evaluations: 2014 and 2015 Comparison, Item-by-Item Summaries by Class
Pertains to Goals 1-4: At the last faculty meeting, on 9-9-15, faculty examined trends in individual item scores, which are generally trending downward or holding steady compared to the previous year's analyzed data. However, the benchmark was met for all items in both classes. Faculty believe that the college-faculty-initiated Clinical Performance Evaluations now entered into Trajecsys were graded somewhat harder than the CIs' evaluations. We will discuss this at the 10-6-15 PAC meeting to get input from hospital representatives, and we will continue to monitor trends as we move forward.
One item trending up was Communication skills for first-year students. Last year we identified this as an area of concern, as the score was significantly lower than all other items. Faculty believe that the Intro 1B course communications practical is at least partly responsible for improving the outcome for Goal 2, Item #2.
Merritt College Radiologic Science Program
Program Assessment Plan/Report: Program Effectiveness
Revision Date: 9-16-15
Item: ARRT Examination Pass Rate
Assessment Tool: ARRT Exam Score Report.
Benchmark: At least 80% of program graduates will pass the ARRT credentialing exam on the first attempt within 6 months of graduation.
Timeframe: Annually in January.
Person Responsible: Program Director.
Results: 2011: 100%; 2012: 81.3%; 2013: 82%; 2014: 89%; 2015: 100%. Five-year average: 90.6%. Benchmark met for the year and for the 5-year average.

Item: California Fluoroscopy Examination
Assessment Tool: RHB Exam Score Report.
Benchmark: At least 80% of program graduates will pass the California Fluoroscopy Examination on the first attempt.
Timeframe: Annually in March.
Person Responsible: Program Director.
Results: 2014: 93%. Benchmark met.
Item: Graduate Employment Rate
Assessment Tool: Alumni Survey, followed up by individual e-mails.
Benchmark: At least 75% of surveys returned by alumni will indicate that they were employed within 12 months of graduating from the program.
Timeframe: 12 months after graduation, in August.
Person Responsible: Program Director (administered via Survey Monkey, followed up with individual e-mails).
Results: 2010: 65%; 2011: 100%; 2012: 100%; 2013: 100%; 2014: 100%. Five-year average: 93%. Benchmark met for the year and for the 5-year average.
Item: Program Completion
Assessment Tool: Class rosters for the first and last program semesters are compared for each class; the completion rate is calculated from the difference in class size.
Benchmark: At least 70% of students who begin the program successfully complete it.
Timeframe: Annually in August.
Person Responsible: Program Director.
Results: 2011: 82%; 2012: 72%; 2013: 92%; 2014: 76%; 2015: 61%. Five-year average: 76.6%. Benchmark not met for the Class of 2015, but the 5-year average is above the benchmark.
Item: Graduate Satisfaction
Assessment Tools: 1) Alumni Survey. 2) Graduate Exit Survey.
Benchmarks: 1) Of surveys returned by alumni, at least 80% of respondents will select 5 ("Satisfied") or 6 ("Very Satisfied") on item #11, "Overall satisfaction with the Merritt College Radiologic Science Program." 2) Of surveys returned by graduates, at least 80% of respondents will select 4 ("Satisfied") or 5 ("Very Satisfied") on item #30, "Overall satisfaction with the Merritt College Radiologic Science Program."
Timeframe: Annually in August.
Person Responsible: Program Director (administered via Survey Monkey).
Results: 1) Class of 2014, Q11: 2 = 7.14%; 5 = 28.57%; 6 = 64.29%. Benchmark met: 92.86% of respondents selected 5 ("Satisfied") or 6 ("Very Satisfied"). This is an improvement over last year, but one person is NOT SATISFIED. 2) Class of 2015, Q30: 2 = 22.22%; 4 = 22.22%; 5 = 55.56%. Benchmark NOT met: 77.78% of respondents selected 4 ("Satisfied") or 5 ("Very Satisfied").
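The Likert-scale benchmark used for these satisfaction items (at least 80% of respondents choosing the top two response categories) can be checked the same way. A small sketch, using the Class of 2015 Q30 and Class of 2014 Q11 percentages reported above (the function name is illustrative):

```python
# Illustrative check: does the combined share of respondents choosing
# the top two Likert categories reach the 80% benchmark?
def top_two_benchmark_met(pct_by_rating, top_two, threshold=80.0):
    return sum(pct_by_rating.get(r, 0.0) for r in top_two) >= threshold

q30_class_2015 = {2: 22.22, 4: 22.22, 5: 55.56}  # percent of respondents
q11_class_2014 = {2: 7.14, 5: 28.57, 6: 64.29}

print(top_two_benchmark_met(q30_class_2015, top_two=(4, 5)))  # False (77.78%)
print(top_two_benchmark_met(q11_class_2014, top_two=(5, 6)))  # True (92.86%)
```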
Item: Employer Satisfaction
Assessment Tool: Employer Survey.
Benchmark: Of returned Employer Surveys, at least 80% of respondents will select a 4 or 5 (on a scale of 0-5) on item #21, "Overall satisfaction with graduates of the Merritt College Radiologic Science Program."
Timeframe: Annually in August.
Person Responsible: Program Director (administered via Survey Monkey).
Results: 2014: 60% of respondents indicated a "4" and 40% indicated a "5"; benchmark met. 2015: 33% of respondents indicated a "4" and 67% indicated a "5"; an improvement over last year.
Program Effectiveness Action/Analysis
ARRT Exam Pass Rate: Benchmark consistently met for the past 5 years. 100% pass rate for the Class of 2015 (first time in 4
years)! We attribute this to prepping students beginning in the Spring of 2015 to take the HESI exit exam and the ARRT exam.
We also used additional test prep products this year (RadReview Easy, added an additional set of HESI practice exam
materials). We will continue to explore new test prep materials, and will continue to prep students beginning in Spring
Semester of the second year.
Graduate Employment Rate: The benchmark was met in 4 of the past 5 years, trending upward as we continue to come out of
the recession. The Program Advisory Committee will continue to monitor patient load and employment trends, to adjust class
size as appropriate in the changing economic climate.
Program Completion: Completion rates have been very inconsistent over the past 5 years, ranging from a low of 61% this year (2015 graduates) to a high of 92% in 2013. The benchmark was met in 4 of the past 5 years, with a 5-year average of 76.6%. The benchmark was not met for the current year, but the 5-year average is above the benchmark. We will continue to provide support services for students, including peer tutoring, scheduled practice sessions for Positioning outside of class hours, and extra help from instructional aides and instructors. We will continue to maintain high standards for the program by removing students who do not meet academic or performance standards or who display behavior that is not acceptable in a professional setting.
Graduate Satisfaction:
Alumni Surveys -
Benchmark met: 92.86% of respondents selected 5 ("Satisfied") or 6 ("Very Satisfied"). This represents an improvement over last year, but one respondent was NOT SATISFIED. However, 100% of respondents said they would recommend the program to a friend or relative. Sample of comments made on the survey (areas that need strengthening):
1. “Equipment in class”
2. “Focus more on lectures rather than group/individual presentations. Students should learn from the instructor,
not learning on their own.”
3. “Computer lab and updated classroom facilities.”
4. “More clinical rotations.”
5. “Didactic fluoro instruction needs to be more in line with state exam.”
6. "Communication between faculty. Classroom preparedness (lecture)" (This is a repeat from last year's Alumni Survey.)
Faculty addressed these items in their discussion at the 9-9-15 meeting. Since the Class of 2014 graduated, the renovated Library, Learning Center, and Computer Labs have opened, affording students upgraded study spaces and computers. The move to the new Science Building and equipment upgrades address #1 and #3. The request in #2 does not reflect current pedagogical thinking; we will continue to encourage students to become independent thinkers and learners with support and assistance from faculty. For #4, Hollister began offering clinical rotations in the Fall of 2014 for the first time, allowing for rotations during all 3 semesters of the second year. This change was in response to student feedback presented at the Spring 2014 PAC meeting. For #5, Yates redesigned the fluoroscopy curriculum taught in the Advanced Imaging course to reflect the outline from the CA Fluoroscopy Examination Handbook. For #6, Custard stated that students do not realize that instructors are not on campus at the same time, which inhibits communication. All faculty will try to increase communication via e-mail regarding issues that affect specific students and the program in general.
Graduate Exit Surveys – Benchmark was NOT met: only 77.78% of respondents selected 4 ("Satisfied") or 5 ("Very Satisfied"). This represents a decline from last year, when the benchmark WAS met (92.3% of respondents selected 4 or 5). Some comments:
1. “Not saying it has to be lovey-dovey, but more support for the student and less threatening to kick them out of
the program.”
2. “Longer rotations to 1 other site (full semester).”
3. “Instructors who actually care about what it is that they do.”
Faculty discussed these items. For #1, we will consider ways to support students while informing them that they must follow all policies and perform according to our standards or risk being dismissed from the program. For #2, Hollister is exploring the possibility of longer rotations; we will place this item on the agenda for the upcoming PAC meeting on 10/6/15. For #3, all faculty will consider why a student would say this and what we can do to avoid this perception in the future.
Employer Satisfaction
Benchmark met: all respondents indicated that they were "satisfied" or "very satisfied" with program graduates in all areas. All respondents stated that they would recommend the program to a friend or relative. There were no suggestions for improvement. Compared to last year, we improved on critical thinking with regard to equipment set-up. Faculty attribute the improvement to the institution of equipment competencies for each x-ray machine. The PAC decided to conduct equipment sign-offs in both the Fall and Spring semesters of the first year, and again when second-year students rotate to a new site. We believe this will help us continue to improve in this area.
2. Did your program work with a Deputy Sector Navigator and if so, how did this lead to program changes or improvements?
No
3. Is your discipline/department/program currently participating in any grants specific to the program? Please discuss your progress
in meeting the stated goals in the grant.
No
For Counseling:
1. What has the counseling department done to improve course completion and retention rates? What is planned for the future?
2. What has the counseling department done to improve SSSP counseling services? Please discuss your progress in improving SSSP
counseling services.
For Library Services:
1. Please describe any changes in the library collections, circulation transactions, or library programs.
2. What has the library done to improve course completion and retention rates?
For Student Services and/or Administrative Units:
1. Briefly describe the results of any student satisfaction surveys or college surveys that included evaluation and/or input about the
effectiveness of the services provided by your unit. How has this information informed unit planning and goal setting?
2. Briefly describe any changes that have impacted the work of your unit.
VII. New Resource Needs Not Covered by Current Budget
Purpose: In this section, programs will document new and repeat resource requests and support each request with data or evidence.
Human Resources: If you are requesting new or additional positions, in any job classification, please explain how new positions will contribute
to increased student success.
Human Resource Request(s) | Already Requested in Recent Program Review? (yes/no) | Program Goal (cut and paste from program review) | Connected to Assessment Results and Plans? (List the course and SLO or PLO and Academic Year) | Does other data support your resource requests? If so, explain the metric and trend or result. (1-3 sentences) | How will this resource contribute to student success? (1-3 sentences) | Alignment with College Goal (List Goal A-E) | Alignment with PCCD Goal (List Goal A-E)
*New faculty requests must be listed here.
Technology and Equipment: How will the new technology or equipment contribute to student success?
Technology and Equipment | Already Requested in Recent Program Review? (yes/no) | Program Goal (cut and paste from program review) | Connected to Assessment Results and Plans? (List the course and SLO or PLO and Academic Year) | Does other data support your resource requests? If so, explain the metric and trend or result. (1-3 sentences) | How will this resource contribute to student success? (1-3 sentences) | Alignment with College Goal (List Goal A-E) | Alignment with PCCD Goal (List Goal A-E)
Footstools with high handles | No | No | No | Footstools are a safety need. The stools we now have are old and broken and do not have a high handle. Students practice positioning by positioning each other; we need to get them safely on and off the table.
QA Equipment | No | No | No | QA test tool equipment will be in the next curriculum update for RADSC 6, Quality Assurance and Fluoroscopy, and will be used in the laboratory portion of the course.
CR Cassettes | No | No | No | CR cassettes will replace obsolete technology. Students need to practice with the equipment they will use in the hospital.
Alignment with College Goals: A and E. Alignment with PCCD Goals: A and E.
Facilities: Has facilities maintenance and repair affected your program in the past year? How will this facilities request contribute to student
success? None
Facilities | Already Requested in Recent Program Review? (yes/no) | Program Goal (cut and paste from program review) | Connected to Assessment Results and Plans? (List the course and SLO or PLO and Academic Year) | Does other data support your resource requests? If so, explain the metric and trend or result. (1-3 sentences) | How will this resource contribute to student success? (1-3 sentences) | Alignment with College Goal (List Goal A-E) | Alignment with PCCD Goal (List Goal A-E)
Professional Development or Other Requests: How will the professional development activity contribute to student success? What contributions will the professional development opportunities make to the college in the future? None
Professional Development | Already Requested in Recent Program Review? (yes/no) | Program Goal (cut and paste from program review) | Connected to Assessment Results and Plans? (List the course and SLO or PLO and Academic Year) | Does other data support your resource requests? If so, explain the metric and trend or result. (1-3 sentences) | How will this resource contribute to student success? (1-3 sentences) | Alignment with College Goal (List Goal A-E) | Alignment with PCCD Goal (List Goal A-E)
Endorsed by the District Academic Senate May 17, 2016
Glossary: Definitions
The following are only some common terms and definitions. If you have additional questions about data, terms or definitions found in this APU, please contact
Samantha Kessler, Research and Planning Officer skessler@peralta.edu.
Term | Definition
ACCJC | Accrediting Commission for Community and Junior Colleges
Annual Program Update (APU) | A report reflecting continuous quality improvement, containing progress on goals, assessment results, and program changes and improvements, as well as requests for new resources.
Assessment | An ongoing process aimed at understanding and improving student learning. At Merritt, assessment data is housed in Taskstream.
CCCO | California Community College Chancellor's Office
Certificate Completion (PCCD definition) | Number of students earning a certificate
Completion Rate (CCCO Scorecard definition) | Cohort measure of the percentage of first-time students who achieved an outcome of degree, certificate, transfer, or "transfer-prepared" within six years of entry.
Completion Rate (Course-level) (PCCD and state definition) | The measure of students earning a grade of C or better in a course. Also called success rate, or successful course completion.
CTE Rate (CCCO Scorecard definition) | Cohort measure of the percentage of students who attempted a CTE course for the first time, completed more than 8 units in a single discipline in the subsequent 3 years, and achieved a degree, certificate, transfer, or "transfer-prepared" within 6 years of entry.
CurricUNET | Software for curriculum information, changes, and updates.
Degree Completion (PCCD definition) | Number of students earning a degree
Enrollment | A student enrolled in a class is counted once. Enrollment for a department, division, or college is "duplicated" in the sense that all class enrollments are counted, including students taking multiple courses.
FTEF | Full-Time Equivalent Faculty. 1 FTEF = 1 instructor teaching 15 equated hours per week for 1 semester.
FTES | Full-Time Equivalent Students, the major student workload measure. One FTES is the equivalent of 525 hours of student instruction: one student enrolled in courses for 3 hours a day, 5 days a week, for an academic year of 35 weeks.
Goals | Broad learning outcomes and concepts serving as a vision for the program, expressed in general terms.
Headcount | Unduplicated count of students. Students are counted once per academic year. If the headcount is by term, the student is counted once per term.
Institutional Effectiveness Indicators | System of indicators and goals intended to encourage improvement in institutional effectiveness at California Community Colleges.
Institution-set Standards | Measures for evaluating the student achievement performance of an institution and/or program, required by ACCJC.
Learning Outcomes | The skills and/or knowledge that a student can expect to have upon completion of a specific educational task (course, program, degree, etc.)
Mission Statement | A brief statement of the general values and principles which guide the program curriculum and/or department goals.
Productivity | FTES/FTEF. A measure of the productivity of a class or group of classes: the number of full-time students per full-time faculty member.
Program Review (PR) | Comprehensive reporting documents completed every three years, containing progress on goals, assessment results, and program changes and improvements, as well as requests for new resources.
Remedial Rate (CCCO Scorecard definition) | Cohort measure of the percentage of credit students who attempted for the first time a course designated as "levels below transfer" and then successfully completed a college-level course within 6 years.
Retention (Course-level) (PCCD definition) | The measure of students retained in a class, i.e., earning a grade other than W.
Retention (Institution-level) | A measure tracking students who enroll in consecutive terms at the college. Sometimes interchanged with persistence. Can be tracked Fall to Spring, or Fall to Fall.
Student Success Scorecard | California Community College Chancellor's Office performance measurement system that tracks student success at the college.
Taskstream | Merritt's assessment tool and assessment data tracking system.
Transfer (as a metric) | Number of students enrolling in a 4-year college or university after attending Merritt College
SLO Three-year Cycle Report
Fall 2013-Summer 2016
Tuesday, September 06, 2016
5:53:03 PM
Disc | # | Course Title | # of SLOs | SLO 1 | SLO 2 | SLO 3 | SLO 4 | SLO 5 | SLO 6 | SLO 7 | SLO 8 | SLO 9 | SLO Notes
RADSC 001A RADSC 001A SURVEY OF
RADIOLOGIC SCIENCE
3 15-16 15-16
RADSC 001B RADSC 001B INTRODUCTION
TO MEDICAL IMAGING
5 15-16 15-16 15-16 15-16
RADSC 001C RADSC 001C
INTRODUCTION TO MEDICAL
IMAGING CLINIC
4 15-16 15-16 15-16 15-16
RADSC 002A RADSC 002A
RADIOGRAPHIC PHYSICS I
2 15-16 15-16
RADSC 002B RADSC 002B RADIOGRAPHIC
PHYSICS II
2 13-14 13-14
RADSC 002C RADSC 002C DIGITAL
APPLICATIONS IN MEDICAL
IMAGING
4 14-15 14-15
RADSC 003A RADSC 003A POSITIONING I 3 14-15 14-15 14-15
RADSC 003B RADSC 003B POSITIONING II 4
RADSC 004A RADSC 004A RADIATION PROTECTION 3 15-16
RADSC 004B RADSC 004B RADIOBIOLOGY 4
RADSC 005A RADSC 005A PATIENT CARE I 6 13-14 13-14 15-16
RADSC 005B RADSC 005B PATIENT CARE II 6 15-16 15-16 15-16 15-16 15-16 15-16
Disc | # | Course Title | # of SLOs | SLO 1 | SLO 2 | SLO 3 | SLO 4 | SLO 5 | SLO 6 | SLO 7 | SLO 8 | SLO 9 | SLO Notes
RADSC 006 RADSC 006 QUALITY MANAGEMENT/FLUOROSCOPY 5 15-16
RADSC 007 RADSC 007 ADVANCED
IMAGING PROCEDURES
2 14-15 14-15
RADSC 008 RADSC 008 SECTIONAL
ANATOMY AND
RADIOGRAPHIC PATHOLOGY
4 15-16
RADSC 009A RADSC 009A CLINICAL
EXPERIENCE I
6
RADSC 009B RADSC 009B CLINICAL
EXPERIENCE II
6
RADSC 009C RADSC 009C CLINICAL
EXPERIENCE III
6 15-16 15-16 15-16 15-16 15-16 15-16
RADSC 009D RADSC 009D CLINICAL
EXPERIENCE IV
4
RADSC 009E RADSC 009E CLINICAL
EXPERIENCE V
4
RADSC 010A RADSC 010A SEMINAR 5 15-16
RADSC 010B RADSC 010B SEMINAR 3 15-16
Last updated date should be on or after: CTE: 9/1/2014; Non-CTE: 9/1/2013
Curriculum Update Report
Course Title Last Updated Notes
COPED 470C Occupational Work Experience in
Radiologic Science
2005‐2008 Not updated since original CNET implemented; no approved SLOs
in CNET. Note: Missing state control number. Cannot approve
until we receive clarity as to whether COPED courses can contain
RADSC 001A Survey of Radiologic Science 2005‐2008 Not updated since original CNET implemented; no approved SLOs
in CNET.
RADSC 001B Introduction to Medical Imaging 2005‐2008 Not updated since original CNET implemented. Update proposed
in 2013 but required SLO revision.
RADSC 001C Introduction to Medical Imaging
Clinic
2005‐2008 Not updated since original CNET implemented. It looks like
update was started in 2013 but got stuck in Originator Approver
step. Please contact Arja and Clifton if you wish to pursue. Given
RADSC 002A Radiographic Physics I 2005‐2008 Not updated since original CNET implemented. It looks like
update was started in 2013 but got stuck in Originator Approver
step. Please contact Arja and Clifton if you wish to pursue. Given
RADSC 002B Radiographic Physics II 10/14/2010
RADSC 002C Digital Applications in Medical
Imaging
10/24/2013
RADSC 003A Positioning I 11/21/2013
RADSC 003B Positioning II 2005‐2008 Not updated since original CNET implemented; no approved SLOs
in CNET. It looks like update was started in 2013 but got stuck in
Originator Approver step. Please contact Arja and Clifton if you wish to pursue.
RADSC 004A Radiation Protection 2005‐2008 Not updated since original CNET implemented; no approved SLOs
in CNET.
RADSC 004B Radiobiology 10/29/2013
RADSC 005A Patient Care I 2005‐2008 Not updated since original CNET implemented; no approved SLOs
in CNET. It looks like update was started in 2013 but got stuck in
Originator Approver step. Please contact Arja and Clifton if you wish to pursue.
RADSC 005B Patient Care II 10/29/2013
RADSC 006 Quality Management/Fluoroscopy 2005‐2008 Not updated since original CNET implemented; no approved SLOs
in CNET.
RADSC 007 Advanced Imaging Procedures 11/21/2013
RADSC 008 Sectional Anatomy and
Radiographic Pathology
2005‐2008 Not updated since original CNET implemented; no approved SLOs
in CNET. It looks like update was started in 2013 but got stuck in
Originator Approver step. Please contact Arja and Clifton if you wish to pursue.
RADSC 009A Clinical Experience I 11/21/2013
RADSC 009B Clinical Experience II 2005‐2008 Not updated since original CNET implemented; no approved SLOs
in CNET. It looks like update was started in 2013 but got stuck in
Originator Approver step. Please contact Arja and Clifton if you wish to pursue.
RADSC 009C Clinical Experience III 11/21/2013
Last Updated: 9/1/2016
RADSC 009D Clinical Experience IV 2005‐2008 Not updated since original CNET implemented; no approved SLOs
in CNET. It looks like update was started in 2013 but got stuck in
Originator Approver step. Please contact Arja and Clifton if you wish to pursue.
RADSC 009E Clinical Experience V 2/28/2008 Not updated since original CNET implemented; no approved SLOs
in CNET. It looks like update was started in 2013 but got stuck in
Originator Approver step. Please contact Arja and Clifton if you wish to pursue.
RADSC 010A Seminar 10/24/2013
RADSC 010B Seminar 10/24/2013
RADSC 251 Clinical Experience for the
Returning Student (First Year)
2005‐2008 Not updated since original CNET implemented; no approved SLOs
in CNET.
RADSC 252 Clinical Experience for the
Returning Student (Second Year)
10/29/2013
Q1: Program Name RADSC
Q2: Reviewer Name: Rosemary Delia
Q3: Are the program name and type present? Yes
Q4: Is the program mission statement clear and well-defined?
Satisfactory.
Comments: The mission statement is very good; it also includes a set of program goals which could be incorporated into the mission statement.
Q5: Dates of last program review and validation are listed. Yes
Q6: Select the category of goal: Assessment Goal
Q7: Is the goal clear and measurable? Clear, measurable and well-defined.
Q8: Is the goal aligned to PCCD and Merritt goals? yes
Q9: Does the detail explain the completion or revision of the goal, or does the detail explain why the new goal was chosen?
Detail is clear and comprehensive.
Q10: Select the category of goal: Curriculum Goal
Q11: Is the goal clear and measurable? Clear, measurable and well-defined.
Q12: Is the goal aligned to PCCD and Merritt goals? yes
Q13: Does the detail explain the completion or revision of the goal, or does the detail explain why the new goal was chosen?
Detail is clear and comprehensive.
Comments: How many course outlines need updating, and what is the timeline for completing them?
Q14: Select the category of goal: Instruction Goal
COMPLETE. Collector: Web Link 1 (Web Link). Started: Monday, October 10, 2016 12:26:02 PM. Last Modified: Saturday, October 22, 2016 7:33:18 AM. Time Spent: Over a week.
PAGE 1
PAGE 2
#25
APU 2016-2017 Validation
Q15: Is the goal clear and measurable? Clear, measurable and well-defined.
Q16: Is the goal aligned to PCCD and Merritt goals? yes
Q17: Does the detail explain the completion or revision of the goal, or does the detail explain why the new goal was chosen?
Detail is clear and comprehensive.
Q18: Select the category of goal: Professional Development, Professional Engagement and Partnerships Goal
Q19: Is the goal clear and measurable? Clear, measurable and well-defined.
Q20: Is the goal aligned to PCCD and Merritt goals? yes
Q21: Does the detail explain the completion or revision of the goal, or does the detail explain why the new goal was chosen?
Detail is clear and comprehensive.
Q22: Select the category of goal: Other Goals
Q23: Is the goal clear and measurable? Clear, measurable and well-defined.
Q24: Is the goal aligned to PCCD and Merritt goals? yes
Q25: Does the detail explain the completion or revision of the goal, or does the detail explain why the new goal was chosen?
Detail is clear and comprehensive.
Comments: Both goals (instructional aides and equipment purchase) are explained in the detail.
Q26: Additional comments regarding Program Goals: Respondent skipped this question
Q27: Is enrollment data present? Yes.
Comments: Enrollment data provided by Samantha is supplemented by the department's own enrollment data, which details information on the individual cohorts that enter annually. Can our college's "official" data collection be reflective of each cohort?
Q28: Is the narrative about enrollment clearly linked to the data?
Narrative is clear with analysis and reflection of demographic and enrollment changes.
Q29: Is course sections and productivity data present? Yes
Q30: Is the narrative about course sections and productivity linked to the data?
Narrative is clear with analysis and reflection of demographic and enrollment changes.
Q31: Is student retention and success data present? Yes
PAGE 3
Q32: Is the narrative about student retention and success linked to the data?
Narrative is clear with analysis and reflection of demographic and enrollment changes.
Q33: Is distance ed and hybrid course data present? Respondent skipped this question
Q34: Is the narrative about distance ed and hybrid courses linked to the data?
Comments: n/a
Q35: Additional comments about data trend analysis:
The department conducts its own data collection; can our IR office assist?
Q36: Are the program's goals present (the goals for thecurrent year, Section II)?
Yes
Q37: Are the goals mapped to the Educational Master Plan Goals?
Yes
Q38: Does the detail listed support alignment with the Educational Master Plan?
Detail supports clear and logical mapping to the Educational Master Plan.
Q39: Is the goal mapped to at least one Institution-set Standard or Institutional Effectiveness Goal?
Yes
Q40: Is at least one problem, achievement gap or observation listed?
yes
Q41: Is the activity or intervention clear? Activities are clear and detailed.
Q42: Is there a target population identified? Yes
Q43: Is the outcome or intended outcome clear and measurable?
Detail about the outcome is clear and measurable.
Q44: Is the activity aligned with one or more of the plans: SSSP, Equity or Basic Skills?
Yes
Q45: Did the department or program receive funding from any of these grants in 2015-2016? If so, did the department discuss the use and impact of these funds?
Clear and detailed discussion of the use and impact of these funds.
Q46: Did the program discuss any changes or plans for improvement?
Yes
Q47: Are the changes/plans discussed based on data or other evidence?
Evidence listed and clearly explained and linked to changes/plans.
Q48: Is a status listed for the changes or plans? Yes
PAGE 4
PAGE 5
Q49: Is the Assessment Completion report attached? yes
Q50: Is the Fall schedule assessment planning template attached?
No
Q51: Does the department conduct meaningful dialogue to shape course and program level outcomes? Did the department note where to find evidence of the dialogue?
Question answered thoroughly.
Q52: Additional comments about the Curriculum and Assessment status section:
Note: check the attached fall schedule assessment planning template.
Q53: If applicable, did the program answer the additional questions?
Yes
Q54: What category of resource request are you commenting on?
Technology and Equipment
Q55: Is there a cost listed for the resource? No
Q56: Is the resource connected to a program goal (listed in Section II) and aligned to PCCD and Merritt College goals?
yes
Q57: Is the resource linked to evidence (assessment data or other data)?
Evidence and link to assessment or data is clear.
Q58: Is detail provided about the impact on student success?
Link to student success is clear and detailed.
Q59: What category of resource request are you commenting on? Respondent skipped this question
Q60: Is there a cost listed for the resource? Respondent skipped this question
Q61: Is the resource connected to a program goal (listed in Section II) and aligned to PCCD and Merritt College goals? Respondent skipped this question
Q62: Is the resource linked to evidence (assessment data or other data)? Respondent skipped this question
Q63: Is detail provided about the impact on student success? Respondent skipped this question
Q64: What category of resource request are you commenting on? Respondent skipped this question
Q65: Is there a cost listed for the resource? Respondent skipped this question
PAGE 6
Q66: Is the resource connected to a program goal (listed in Section II) and aligned to PCCD and Merritt College goals? Respondent skipped this question
Q67: Is the resource linked to evidence (assessment data or other data)? Respondent skipped this question
Q68: Is detail provided about the impact on student success? Respondent skipped this question
Q69: What category of resource request are you commenting on? Respondent skipped this question
Q70: Is there a cost listed for the resource? Respondent skipped this question
Q71: Is the resource connected to a program goal (listed in Section II) and aligned to PCCD and Merritt College goals? Respondent skipped this question
Q72: Is the resource linked to evidence (assessment data or other data)? Respondent skipped this question
Q73: Is detail provided about the impact on student success? Respondent skipped this question
Q74: Additional comments about resource requests.
Only 3 Technology/Equipment requests given: footstools, QA equipment, and CR cassettes.
I recommend re-assessing resource requests in all other areas. Professional Development needs? Other large equipment needs?
Q75: Please mark the APU as "submitted" or "needs revisions."
needs revisions
PAGE 7
Recommended