CAEP Evidence Map Initial Programs
CEHS Office of Planning and Research_JPWK
PEU Evidence Map
Standard 1: Content and Pedagogical Knowledge - The provider ensures that candidates develop a deep understanding of the critical concepts and
principles of their discipline and, by completion, are able to use discipline-specific practices flexibly to advance the learning of all students toward
attainment of college- and career-readiness standards.
Substandard CMU Evidence Administered Required CAEP Annual Measure
CAEP Suggested Evidence
1.1 Candidates demonstrate an understanding of the 10 InTASC standards at the appropriate progression level(s) in the following categories: the learner and learning; content; instructional practice; and professional responsibility.
Common Rubric for a Lesson Plan (not as well aligned with InTASC)
Mid & End of Program
● Clinical experience observation instrument
● Lesson and unit plans
● Portfolios
● Teacher work samples
● GPA (for courses specific to the learner such as
developmental psychology, motor development, etc.)
● Dispositional and professional responsibility data
● Comparisons of education and other IHE attendees on
provider end-of-major projects or demonstrations (if
applicable to provider);
● End-of-key-course tests
● Pre-service measures of candidate impact on P-12
student learning such as during methods courses, clinical
experiences, and/or at exit
● Capstone assessments (such as those including measures
of pre-service impact on P-12 student learning and
development as well as lesson plans, teaching artifacts,
examples of student work and observations or videos
judged through rubric-based reviews by trained
reviewers) that sample multiple aspects of teaching
including pre-and post-instruction P-12 student data
● Pedagogical content knowledge licensure test such as
Praxis PLT
● Proprietary assessments that may or may not be required
by the state (such as edTPA and PPAT)
● GRE field tests (limited fields: biochemistry, cell and
molecular biology, biology, chemistry, computer science,
literature in English, mathematics, physics, psychology);
ETS major field tests
Final Pre-Student Teaching Evaluation
Mid Program
Final Student Teaching Evaluation
End of Program
MTTC
End of Program
Need Direct Evidence
1.2 Providers ensure that candidates use research and evidence to develop an understanding of the teaching profession and use both to measure their P-12 students’ progress and their own professional practice.
Final Student Teaching Evaluation (it is embedded in Instructional Practice and is not a direct measure of use of research)
End of Program
YES
● Work sample
● Provider-created or proprietary assessments
● Pre- and post-data and reflections on the interpretation
and use of data
● Portfolio (including assessment of assignments made to
students and artifacts produced)
Direct Evidence Needed
YES
1.3 Providers ensure that candidates apply content and pedagogical knowledge as reflected in outcome assessments in response to standards of Specialized Professional Associations (SPA), the National Board for Professional Teaching Standards (NBPTS), states, or other accrediting bodies (e.g., National Association of Schools of Music – NASM).
MTTC trends as compared to the state
CSS - OPR
YES
● SPA reports
● Other specialty area accreditor reports
● Specialty area-specific state standards achieved OR
evidence of alignment of assessments to other
state/national standards
● Number of completers who have been awarded National
Board Certified Teacher (NBCT) status by the National
Board for Professional Teaching Standards (NBPTS)
● Providers should include trends and comparisons within
and across specialty licensure area data.
Alignment of assessments to state standards
CSS - OPR
1.4 Providers ensure that candidates demonstrate skills and commitment that afford all P-12 students access to rigorous college- and career-ready standards (e.g., Next Generation Science Standards, National Career Readiness Certificate, Common Core State Standards).
Common Rubric for a Lesson Plan
Mid & End of Program
● Observational instruments
● Lesson or unit plans
● Work samples
● Portfolios (such as edTPA or PPAT)
Final Student Teaching Evaluation
End of Program
Direct Evidence Needed
1.5 Providers ensure that candidates model and apply technology standards as they design, implement, and assess learning experiences to engage students and improve learning; and enrich professional practice.
Common Rubric for a Lesson Plan (one section embedded w/in Instructional Strategies)
Mid & End of Program
● Clinical experience observation instrument
● Lesson or unit plan assessments
● Portfolios
● Work sample with exhibition of applications
and use of technology in instruction
● Technology course signature
project/assignment.
Final Pre-Student Teaching Evaluation (Technology)
Mid Program
Final Student Teaching Evaluation (Technology)
End of Program
Direct Evidence Needed
Standard 2: Clinical Partnerships and Practice - The provider ensures that effective partnerships and high-quality clinical practice are central to
preparation so that candidates develop the knowledge, skills, and professional dispositions necessary to demonstrate positive impact on all P-12
students’ learning and development.
Substandard CMU Evidence Administered Required CAEP Annual Measure
CAEP Suggested Evidence
2.1 Partners co-construct mutually beneficial P-12 school and community arrangements for clinical preparation, including technology-based collaborations, and shared responsibility for continuous improvement of candidate preparation. Partnerships for clinical preparation can follow a range of forms, participants, and functions. They establish mutually agreeable expectations for candidate entry, preparation, and exit; ensure that theory and practice are linked; maintain coherence across clinical and academic components of preparation and share accountability for candidate outcomes.
Memorandum of Understanding (MOU)
Ongoing
● Description of partnerships (e.g., MOU) along
with documentation that the partnership is
being implemented as described
● Schedule of joint meetings between partners
and purpose/topics covered in meetings
● Field experience handbooks (section[s] specific
to component)
● Documentation of stakeholder involvement
● Documentation of a shared responsibility
model
● Documentation of technology-based
collaborations
● Evidence that placements, observational
instruments, and evaluations are co-
constructed by partners
● Criteria for candidate expectations during
clinical experiences are co-constructed and
identified on evaluation instruments
Clinical Partnership & Planning Committee (CPPC)
Ongoing
PEU Shared Governance Structure – P-12 representation on each committee
Ongoing
2.2 Partners co-select, prepare, evaluate, support, and retain high quality clinical educators, both EPP and school-based, who demonstrate a positive impact on candidates’ development and P-12 student learning and development. In collaboration with their partners, providers use multiple indicators and appropriate technology-based application to establish, maintain, and refine criteria for selection, professional development, performance evaluation, continuous improvement and retention of clinical educators in all clinical placement settings.
CCE Documentation & Procedures for hiring, selection of CTs, PD, performance evaluations, online Clinical Handbook
Ongoing
● The provider documents clinical educator
and clinical placement characteristics with
co-selection, based on shared criteria.
● The provider documents its criteria for
selection of clinical educators, including
recent field experience and currency in
relevant research.
● Resources are available online.
● Orientation of clinical educators is available
in person and online.
● The provider shares performance
evaluations of university supervisors,
clinical educators, and candidates.
● The provider conducts surveys of clinical
educators (P-12 and EPP based) and
candidates on the quality of and consistency
among clinical educators.
● The provider collects and uses data for
modifying clinical experiences.
● The provider makes and keeps records of
remediation and/or counseling out
available.
● Training and coaching of clinical educators
is available in person and online.
● Joint sharing of curriculum
development/design/redesign occurs
between the provider and site(s).
MDE Cooperating Teacher Survey
End of Program
MDE Supervisor Survey
End of Program
Need to document stakeholder involvement
2.3 The provider works with partners to design clinical experiences of sufficient depth, breadth, coherence, and duration to ensure that candidates demonstrate their developing effectiveness and positive impact on all students’ learning and development. Clinical experiences, including technology-enhanced learning opportunities, are structured to have multiple, performance-based assessments at key points within the program to demonstrate candidates’ development of the knowledge, skills, and professional dispositions, as delineated in Standard 1, that are associated with a positive impact on the learning and development of all P-12 students.
The provider documents the provider’s and partners’ probing of the relationship between outcomes and a particular facet of clinical preparation (depth, breadth, diversity, coherence, or duration):
● Selection of one of the facets of preparation,
based on analyses of data and individual fit, to
examine current placement and then test the
specific facet systematically (controlling for
other variables) to gather data on what works.
● To summarize outcomes, providers could cross-
reference their findings and conclusions from
component 1.1 evidence on exiting candidate
competencies, from component 3.4 evidence on
monitoring of candidate development during
preparation, and from component 4.1 evidence
about completer impact on P-12 student
learning.
● To examine clinical experiences, providers
should ensure that these experiences are
deliberate, purposeful, sequential, and assessed
using performance-based protocols.
● To examine clinical experiences, component 2.3
is asking the provider to consider the
relationship between the outcomes and the
attributes of the clinical experiences. The
question is as follows: What is it about the
experiences (that is, depth, breadth, diversity,
coherence, and duration) that can be associated
with the observed outcomes?
● Description of clinical experience goals and
operational design along with documentation
that clinical experiences are being implemented
as described; scope and sequence matrix that
charts depth, breadth, and diversity of clinical
experiences; chart of candidate experiences in
diverse settings; monitoring of candidate
progression and counseling actions; application
of technology to enhance instruction; and P-12
learning for all students.
Standard 3: Candidate Quality, Recruitment and Selectivity – The provider demonstrates that the quality of candidates is a continuing and
purposeful part of its responsibility from recruitment, at admission, through the progression of courses and clinical experiences, and to decisions that
completers are prepared to teach effectively and are recommended for certification. The provider demonstrates that development of candidate quality is
the goal of educator preparation in all phases of the program. This process is ultimately determined by a program’s meeting of Standard 4.
Substandard CMU Evidence Administered Required CAEP Annual Measure
CAEP Suggested Evidence
3.1 The provider presents plans and goals to recruit and support completion of high-quality candidates from a broad range of backgrounds and diverse populations to accomplish their mission. The admitted pool of candidates reflects the diversity of America’s P-12 students. The provider demonstrates efforts to know and address community, state, national, or local needs for hard-to-staff schools and shortage fields, currently STEM, English language learning, and students with disabilities.
Multicultural Recruitment Plan (Smith-Gaken & PESAR)
Ongoing
● Application, acceptance, and enrollment rates
should be disaggregated by demographic
variables such as socio-economic background,
gender, ethnicity, and other background
characteristics.
● Strategic recruitment plans, based on the
provider’s mission and employment
opportunities (including STEM, ELL, special
education, and hard-to-staff schools) for
completers and need to serve increasingly
diverse populations. Includes plans for
outreach, numerical goals and baseline data,
monitoring of progress, analyses and judgment
of adequacy of progress toward goals, and
making indicated changes.
Also, (1) evidence of resources moving toward identified targets and away from low-need areas; (2) evidence of marketing and recruitment at high schools and/or colleges that are racially and culturally diverse; and (3) evidence of collaboration with other providers, states, and school districts as an indicator of outreach and awareness of employment needs.
STEM Project – Julie Cunningham
Ongoing
Application/Acceptance/Enrollment Rates
Ongoing
3.2 REQUIRED COMPONENT- The provider sets admissions requirements, including CAEP minimum criteria or the state’s minimum criteria, whichever are higher, and gathers data to monitor applicants and the selected pool of candidates. The provider ensures that the average grade point average of its accepted cohort of candidates meets or exceeds the CAEP minimum of 3.0, and the group average performance on nationally normed ability/achievement assessments such as ACT, SAT, or GRE: is in the top 50 percent from 2016-2017; is in the top 40 percent of the distribution from 2018-2019; and is in the top 33 percent of the distribution by 2020.[i]
ACT Scores (cohort average)
Mid Program
● The provider presents admission criteria,
admitted candidate criteria, and enrollment
pool of candidates’ criteria for GPA, for
normed tests, and any alternatives. More
explicitly, the EPP provides GPA, normed
tests, and any alternate measures separately
for admission criteria, the admitted
candidates, and the enrolled pool of
candidates. In addition to the mean cohort
GPA, providers should report the
range/standard deviation, and percentage
of students below 3.0.
● For admissions at the undergraduate level, as
freshmen, the CAEP “minimum criteria”
should be interpreted as referring to high school
GPA and “normed tests” such as ACT or SAT
(or IB, or AP, or other normed measures) or
state or other assessments linked to normed
data.
● For admissions at the graduate level, the CAEP
“minimum criteria” should be interpreted as
referring to college GPA; the normed test might
include GRE, MAT, or other college level
indicators of academic achievement ability.
Admissions GPA (individual requirement)
Mid Program
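The descriptive statistics component 3.2 asks providers to report alongside the mean cohort GPA (range, standard deviation, and percent of candidates below 3.0) are straightforward to compute. A minimal sketch, with a hypothetical admitted cohort and an assumed helper name `cohort_gpa_summary`:

```python
# Minimal sketch for CAEP 3.2 reporting: mean cohort GPA plus the range,
# standard deviation, and percent of candidates below the 3.0 minimum.
# The function name and GPA values are hypothetical, not an existing tool.
from statistics import mean, stdev

def cohort_gpa_summary(gpas):
    below = sum(1 for g in gpas if g < 3.0)
    return {
        "mean": mean(gpas),
        "sd": stdev(gpas),               # sample standard deviation
        "range": (min(gpas), max(gpas)),
        "pct_below_min": 100 * below / len(gpas),
    }

# Hypothetical admitted cohort:
summary = cohort_gpa_summary([3.4, 3.1, 2.9, 3.8, 3.0, 3.6, 2.8, 3.5])
```

Reported together, these figures let reviewers see not only whether the cohort average clears the 3.0 minimum but how candidates are distributed around it.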
3.3 Educator preparation providers establish and monitor attributes and dispositions beyond academic ability that candidates must demonstrate at admissions & during the program. The provider selects criteria, describes the measures used and evidence of the reliability & validity of those measures, and reports data that show how the academic and non-academic factors predict candidate performance in the program and effective teaching.
Disposition Interview Protocol (DAP) (admissions)
Mid Program
● The provider indicates non-academic factors
actually used during candidate admissions and
monitored during preparation. A description of
how these non-academic factors are assessed
and applied to admissions decisions should be
included.
● The provider demonstrates knowledge and use
of relevant literature supporting the factors it
has selected and/or investigated. The provider
bases selection criteria on relevant research
literature and/or investigations it has
conducted, including both quantitative and
qualitative approaches.
● Measures may be related to specific specialty
license areas or generally applied to all provider
candidates.
Need Evidence: Disposition System professionalism standards for teacher candidates (reviewed with students in EDU 107, admissions, & pre-student teaching)
Early, Mid & End of Program
MTTC – subarea 4 Working in the Professional Environment
End of Program
YES
Final Pre-Student Teaching Evaluation (Professional Responsibility)
Mid Program
Final Student Teaching Evaluation (Professional Responsibility)
End of Program
3.4 The provider creates criteria for program progression and monitors candidates’ advancement from admissions through completion. All candidates demonstrate ability to teach to college- and career-ready standards. Providers present multiple forms of evidence to indicate candidates’ developing content knowledge, pedagogical content knowledge, pedagogical skills, and the integration of technology in all of these domains.[ii]
Common Rubric for a Lesson Plan
Mid and End of Program
● The provider documents evidence that it
measures candidate progress at two or more
points during preparation (including decision
points on candidate retention, assessments,
provider interventions, the results, and provider
explanations for actions taken) for candidates’
development of the following knowledge/skills:
o Ability to teach to college- and career-
ready standards
o Content knowledge
o Pedagogical content knowledge
o Pedagogical skills
o Integration of technology with
instruction
Provider documents use of assessments that
monitor candidate proficiencies, including impact
on P-12 student learning, at various points during
their developmental preparation experiences.
Final Pre-Student Teaching Evaluation
Mid Program
Final Student Teaching Evaluation
End of Program
Need Evidence for pre-service impact on P-12 Learning
YES
3.5 Before the provider recommends any completing candidate for licensure or certification, it documents that the candidate has reached a high standard for content knowledge in the fields where certification is sought and can teach effectively with positive impacts on P-12 student learning and development.
Need Evidence for pre-service impact on P-12 Learning
YES
● Provider evidence documents pre-service
positive candidate impacts on P-12 student
learning and development such as the following:
o Pre-service measures of candidate impact
on P-12 student learning such as during
methods courses, clinical experiences,
and/or at exit.
o Capstone assessments (such as those
including measures of pre-service impact
on P-12 student learning and development
as well as lesson plans, teaching artifacts,
examples of student work and
observations or videos judged through
rubric-based reviews by trained reviewers)
that sample multiple aspects of teaching
including pre- and post-instruction P-12
student data.
Cross-reference to relevant evidence provided for component 1.1 on candidate competence and 1.3 on alignment with specialty area standards.
3.6 Before the provider recommends any completing candidate for licensure or certification, it documents that the candidate understands the expectations of the profession, including codes of ethics, professional standards of practice, and relevant laws and policies. CAEP monitors the development of measures that assess candidates’ success and revises standards in light of new results.
Final Pre-Student Teaching Evaluation
Mid Program
● Provider measure of topic knowledge of
codes of ethics, professional standards of
practice and relevant laws and policies,
based on course materials/assessments
● Results of national, state, or provider-
created instrument(s) to assess candidates’
understanding of special education laws
(section 504 disability) code of ethics,
professional standards, and similar content.
● Evidence of specialized training (e.g.,
bullying, state law, etc.)
Final Student Teaching Evaluation
End of Program
Direct Evidence Needed
STANDARD 4: The provider demonstrates the impact of its completers on P-12 student learning and development, classroom
instruction, and schools, and the satisfaction of its completers with the relevance and effectiveness of their preparation.
NOTE 1: All components must be met for Standard 4
Substandard CMU Evidence Administered Required CAEP Annual Measure
CAEP Suggested Evidence
4.1 Required Component: The provider documents, using multiple measures, that program completers contribute to an expected level of student-learning growth. Multiple measures shall include all available growth measures (including value-added measures, student-growth percentiles, and student learning and development objectives) required by the state for its teachers and available to educator preparation providers, other state-supported P-12 impact measures, and any other measures employed by the provider.
P-12 Program Impact Case Study – NWEA Measures of Academic Progress (MAP) – P-12 student growth data
Completers 1-3 Years Out
YES
● Value-added modeling (VAM)
● Student-growth percentiles tied to teacher
(completers or provider)
● Student learning and development objectives
● State supported measures that address P-12
student learning and development that can be
linked with teacher data
● Providers’ documentation of analysis and
evaluation of evidence presented on completers’
impact on P-12 student learning
For providers that do not have access to state P-12 student learning data and providers that are supplementing state or district data with data on subjects or grades not covered, the following guidance applies:
● This type of provider study could be phased in.
For example, initially the provider would create
an appropriate design; then conduct a pilot data
collection and analysis; and then make
refinements and further data collection.
P-12 Program Impact Case Study – Principal Focus Group
Completers 1-3 Years Out
YES
P-12 Program Impact Case Study – Completer Focus Group
Completers 1-3 Years Out
YES
● The provider could maintain a continuing cycle
of such studies, examining completer
performance in different grades and/or subjects
over time.
● The provider could develop case studies of
completers that demonstrate the impacts of
preparation on P-12 student learning and
development and can be linked with teacher
data; some examples follow:
o Provider-conducted case studies of
completers
o Completer-conducted action research
o Descriptions of partnerships with
individual schools or districts
o Description of methods and
development of any assessment used
o Use of focus groups, blogs, electronic
journals, interviews, and other evidence
Documentation of Partnerships with ISD/RESDs
Ongoing
4.2 Required Component: The provider demonstrates, through structured and validated observation instruments and student surveys, that completers effectively apply the professional knowledge, skills, and dispositions that the preparation experiences were designed to achieve.
Evidence Needed
● student surveys and/or
● classroom observations of completers using
measures correlated with P-12 student
learning, such as those used in the MET
study and/or
● Provider-created classroom observations
● Providers analyze student survey and
completer observation evidence, including
(1) comparison of trends over time and
benchmarking with district, state, national,
or other relevant data, if available; (2)
assessments and scoring guides; (3)
interpretations of results; and (4)
information on the representativeness of
data.
4.3 Required Component: The provider demonstrates, using measures that result in valid and reliable data and including employment milestones such as promotion and retention, that employers are satisfied with the completers’ preparation for their assigned responsibilities in working with P-12 students.
Principal Survey
Completers 1-3 Years Out
YES
● Employer satisfaction surveys (include
instrument, sampling, response rates, timing)
● Employer satisfaction interviews (include
population represented, response rates,
instrument content, timing)
● Employer satisfaction focus groups (include
population represented, response rates,
instrument content, timing)
● Employer satisfaction case studies (include
description of methodology).
Providers submit at least three cycles of data on employment milestones such as the following:
● Promotion
● Employment trajectory
● Employment in high needs schools
Retention in (1) education position for which
initially hired or (2) other education role by the
same or a different employer
P-12 Program Impact Case Study – Principal Focus Group
Completers 1-3 Years Out
YES
CMU First Destination Survey – (Employment trajectory, education position for which prepared)
Completers 1-3 Years Out
YES
4.4 Required Component: The provider demonstrates, using measures that result in valid and reliable data, that program completers perceive their preparation as relevant to the responsibilities they confront on the job, and that the preparation was effective.
CARRS Alumni Survey (3 years out)
Completers 1-3 Years Out
● Completer satisfaction surveys (include
instrument, sampling, response rates, timing)
● Completer satisfaction interviews (include
population represented, response rates,
instrument content, timing)
● Provider focus groups of completers (include
population represented, response rates,
instrument content, timing)
● Completer satisfaction case studies (include
methodology)
MDE Year Out Survey
Completers 1-3 Years Out
P-12 Program Impact Case Study – Completer Focus Groups
Completers 1-3 Years Out
CMU First Destination Survey
Completers 1-3 Years Out
STANDARD 5: The provider maintains a quality assurance system comprised of valid data from multiple measures, including evidence of
candidates’ and completers’ positive impact on P-12 student learning and development. The provider supports continuous improvement that is
sustained and evidence-based, and that evaluates the effectiveness of its completers. The provider uses the results of inquiry and data collection to
establish priorities, enhance program elements and capacity, and test innovations to improve completers’ impact on P-12 student learning and
development.
Substandard CMU Evidence Administered Required CAEP Annual Measure
CAEP Suggested Evidence
5.1 The provider’s quality assurance system is comprised of multiple measures that can monitor candidate progress, completer achievements, and provider operational effectiveness. Evidence demonstrates that the provider satisfies all CAEP standards.
Taskstream™
Ongoing
● A description of how the evidence submitted in
Standards 1-4 and other provider data are
collected, analyzed, monitored, and reported.
● Evidence of system capabilities including
support for data-driven change (e.g., data can be
disaggregated by specialty license area and/or
candidate level as appropriate), application
across and within specialty license areas, and
ability to disaggregate data by relevant aspects
of EPP management and policy (e.g.,
usefulness).
● Schedule and process for continuous review,
together with roles and responsibilities of
system users.
Data Packs
Annually
5.2 The provider’s quality assurance system relies on relevant, verifiable, representative, cumulative and actionable measures, and produces empirical evidence that interpretations of data are valid and consistent.
Need to establish technical adequacy of all instruments
For assessment instruments as evidence for Stds 1-4:
● Description of developmental steps in
constructing instruments
● Empirical/analytical data supporting the use of
the instrument for its intended purposes
● Formal study of the alignment of instruments
with their intended goals
● Implementation procedures and context
● Empirical evidence that interpretations of data
are consistent and valid
The above are intended to document that measures are relevant, verifiable, representative, cumulative, and actionable:
● Instruments align with construct being
measured.
● Scoring of assessment (items) clearly defined.
● Interpretation of assessment (items) results
unambiguous.
● Data files complete and accurate.
● Data results align with demonstrated quality.
● Convergence (e.g., correlation across multiple
measures of the same construct)/consistency
(e.g., interrater reliability) analysis conducted
appropriately and accurately.
● Convergence/consistency is of sufficient magnitude.
Need to establish Inter-rater reliability
Need to determine that all data are relevant, verifiable, representative, cumulative & actionable
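For the inter-rater reliability need above, one common consistency statistic is Cohen’s kappa, which corrects raw percent agreement for the agreement expected by chance. A minimal sketch, using hypothetical rubric scores from two reviewers (the function name is an assumption, not an existing campus tool):

```python
# Minimal sketch for a CAEP 5.2 consistency check: Cohen's kappa for two
# raters scoring the same artifacts on a common rubric. Scores below are
# hypothetical illustration data.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters on the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal score distribution.
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical rubric scores (1-4) from two trained reviewers:
kappa = cohens_kappa([3, 4, 2, 3, 3, 4, 1, 2], [3, 4, 2, 3, 2, 4, 1, 3])
```

Values near 1 indicate strong agreement beyond chance; a kappa near 0 would signal that the raters need recalibration before their scores can be treated as consistent evidence.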
5.3 Required Component: The provider regularly and systematically assesses performance against its goals and relevant standards, tracks results over time, tests innovations and the effects of selection criteria on subsequent progress and completion, and uses results to improve program elements and processes.
Need systematic method for monitoring individual programs’ use of the 8 annual measures to make program improvements
Providers document regular and systematic data-driven changes, drawing on research and evidence from the field, on data analyses from the provider’s own quality assurance system, and on the 2013 CAEP Standards, as well as changes tied to the provider’s goals and relevant standards.
● Well-planned tests of selection criteria and each
data-driven change to determine whether or not
the results of the changes are improvements
should include:
o baseline(s),
o intervention,
o tracking over time,
o rationale for conclusions
o comparison(s) of results, and
o next steps taken and/or planned.
If applicable, providers document use of results of optional Early Instrument Evaluation review; base next steps on test from component 2.3.
5.4 Required Component: Measures of completer impact, including available outcome data on P-12 student growth, are summarized, externally benchmarked, analyzed, shared widely, and acted upon in decision making related to programs, resource allocation, and future direction.
Impact Measure P-12 Program Impact Case Study
Completers 1-3 Years Out
YES
Impact measures:
1. P-12 student learning/development
2. Observations of teaching effectiveness
3. Employer satisfaction and completer persistence
4. Completer satisfaction
Outcome measures:
5. Completer or graduation rate
6. Licensure rate
7. Employment rate
8. Consumer information*
For the above evidence, include
● analysis of trends,
● comparisons with benchmarks,
● indication of changes made in EPP preparation
curricula and experiences,
● how/where/with whom results are shared
● resource allocations, and future directions.
Impact Measure: Teacher Effectiveness Labels (weak measures)
Completers 1-3 Years Out
YES
Outcome Measure: Completer Rates
End of Program
YES
Outcome Measure: MTTC Pass Rates
End of Program
YES
Outcome Measure: CMU First Destination Survey (employment rates, consumer information)
Completers 1-3 Years Out
YES
5.5 The provider assures that appropriate stakeholders, including alumni, employers, practitioners, school and community partners, and others defined by the provider, are involved in program evaluation, improvement, and identification of models of excellence.
PEU Shared Governance Structure
Ongoing
Providers document that stakeholders are involved. Describe stakeholders and roles as relevant to specific examples of shared
● decision making and results, and
● evaluation, and selection and
implementation of changes for
improvement
Clinical Partnership & Planning Committee (CPPC)
Ongoing
Need systematic method for documenting stakeholder involvement