
Dabney S. Lancaster Community College Institutional Effectiveness Manual Second Edition

Prepared by Chris Orem, Ph.D.
Director of Institutional Effectiveness
Dabney S. Lancaster Community College
November 1, 2013
2nd Ed.: July 31, 2014


A Note About This Manual

This manual explains the strategic planning and institutional effectiveness processes at Dabney S. Lancaster Community College. It was adapted from the Virginia Highlands Community College Institutional Effectiveness Handbook with permission from Jennifer Addison at VHCC. I offer many thanks to Jennifer and others at VHCC who laid the groundwork for the DSLCC manual. It is the hope of the Office of Institutional Effectiveness that this manual be used as a resource, a guide, and a roadmap for making evidence-based decisions that are intended to strategically, effectively, and efficiently improve the processes, programs, and services at the College. A sound institutional effectiveness system does not ensure a high caliber of learning and institutional quality, but it sets the stage for decisions to be made, based on meaningful data, that are in the best interest of the College's constituents. Thank you for using the processes described within to strengthen the quality of your instruction, leadership, programs, and services. Your hard work makes DSLCC a great place to gain a quality education.

Dr. Chris Orem
Director of Institutional Effectiveness
[email protected]


TABLE OF CONTENTS

Introduction
Mission Statement
Institutional Effectiveness
Strategic Planning
I.E. & Strategic Planning Continuous Cycle
I.E. and Strategic Planning at DSLCC
Outcomes Assessment & Evaluation Process
    Educational Programs Student Learning Outcomes Assessment
    Administrative and Educational Support Units Assessment Process
General Education Assessment
Institutional Effectiveness Support
References
APPENDIX A: ACHIEVE 2015 Initiative
APPENDIX B: DSLCC Institutional Goals 2014-2015
APPENDIX C: Excerpt from 2013-2014 President's Goals Template
APPENDIX D: DSLCC Annual Institutional Effectiveness Timeline
APPENDIX E: Schedule of Assessment Activities
APPENDIX F: Transfer Program Curriculum Map
APPENDIX G: Academic Assessment Evaluation Rubric
APPENDIX H: SLO Assessment Reporting Template for Educational Programs
APPENDIX I: Academic Program Assessment Template
APPENDIX J: Administrative Unit Assessment Rubric
APPENDIX K: Administrative Unit Assessment Report Outline
APPENDIX L: Administrative Unit Assessment Plan Template


Introduction

Institutional Effectiveness Defined

Institutional effectiveness is the accomplishment of the College's mission through the achievement of stated outcomes, goals, and objectives. DSLCC uses a variety of evaluation and assessment processes to determine the degree to which it accomplishes its mission. Through these cyclical processes of identifying outcomes and goals, measuring the extent to which these outcomes and goals are achieved, and using results to identify ways of improving programs and services, College faculty, staff, and administrators can help ensure that the entire campus is actively involved in helping the College achieve its mission.

Assessment and Evaluation

Assessment and evaluation are two distinct, yet interconnected processes. Evaluation can be thought of as the umbrella process of achieving institutional effectiveness. The evaluation process involves such practices as needs assessments, surveys, and the institutional review of outputs and inputs. The overall process of evaluation also includes assessment, which can be considered one of the most difficult, yet important aspects of the evaluation process. Assessment is the systematic basis for making inferences about student learning and development. Though this definition highlights student learning as a core component of assessment, the term is often used to refer to practices not directly involving student learning. An effective evaluation process incorporates the assessment of learning outcomes as well as the review of non-learning outcomes to facilitate institutional improvement, which in turn results in institutional effectiveness. As the institutional effectiveness process at DSLCC matures and evolves, the distinction between evaluation and assessment will be made clearer. During this phase of IE development, the term assessment is used more commonly throughout reports involving both learning and non-learning outcomes. However, as will be described throughout this manual, the institutional effectiveness process at DSLCC incorporates a comprehensive evaluation of outcomes, outputs, and programs, all of which focus on improving the degree to which the College accomplishes its mission.

A Process for Continuous Improvement

An institutional effectiveness program cannot succeed without commitment from faculty, staff, and administration. Institutional effectiveness must involve the desire to improve processes, the ability to communicate strengths, and the humility to recognize and share weaknesses. For an institutional effectiveness process to be successful, faculty and staff must be able to intentionally reflect upon trustworthy results and communicate weaknesses to the appropriate stakeholders in a forum that encourages improvement. When evaluation and assessment processes are viewed as the means by which changes are identified, resources are provided, and improvement is rewarded, institutional effectiveness is, excuse the pun, most effective. When faculty and staff are hesitant to share weaknesses for fear of "punishment," perhaps in the form of poor performance evaluations, loss of resources, or public scrutiny, decision-makers will never be able to trust that the results they receive illustrate the true state of affairs. Thus, a formative mindset must be in place for a culture of assessment and evaluation to develop.

Accountability

The College must also demonstrate evidence-based decision making in order to be held accountable to its various stakeholders. Of the College's external stakeholders, perhaps none is more interested in the presence of an integrated, institution-wide process of institutional effectiveness than the College's accrediting body, the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC). Through the process of accreditation reaffirmation, which occurs every ten years, the College is responsible for complying with multiple SACSCOC principles, notably Core Requirement 2.5 and Comprehensive Standard 3.3.1, which are specifically devoted to institutional effectiveness.

2.5 The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that (1) incorporate a systematic review of institutional mission, goals, and outcomes; (2) result in continuing improvement in institutional quality; and (3) demonstrate the institution is effectively accomplishing its mission. (Institutional Effectiveness)

3.3.1 The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas: (Institutional Effectiveness)

3.3.1.1 educational programs, to include student learning outcomes
3.3.1.2 administrative support services
3.3.1.3 educational support services
3.3.1.4 research within its educational mission, if appropriate
3.3.1.5 community/public service within its educational mission, if appropriate

These principles of accreditation are in place so that the College uses broad-based planning and evaluation processes to ensure that students receive the highest quality of education possible and that the College is intentionally documenting and evaluating the degree to which it accomplishes its mission.


Mission Statement

The mission statement of the College is the guiding document for the development of institutional goals and outcomes assessment for programs and administrative and educational support units. It defines the core functions and goals of the College and thereby the most fundamental criteria for assessing institutional effectiveness. The current mission statement was last reviewed by the College's President's Council in June 2013 and was reaffirmed by the DSLCC Local Board in September 2013.

VCCS Mission Statement

We give everyone the opportunity to learn and develop the right skills so lives and communities are strengthened.

from ACHIEVE 2015, VCCS Master Plan, adopted February 2010

Dabney S. Lancaster Community College Mission Statement

Dabney S. Lancaster Community College (DSLCC) provides an opportunity for the extension of knowledge, skills, and personal enrichment in a forum that maintains high academic standards, is financially and geographically accessible, and respects each student's rights and responsibilities.

The College offers programs at the associate degree level as well as a full complement of credit and non-credit courses and resources tailored to the life-long learning needs of local residents. On-site and long-distance options incorporate up-to-date technological support. Comprehensive guidance, counseling, and tutoring services assist students in achieving their potential and in making sound educational, occupational, and personal choices.

The College serves the diverse needs of the community, sponsoring a wide array of cultural events, offering training to meet changing workforce demands, and providing facilities for research and recreation. In cooperation with local business, industry, government, professions, and other educational institutions, DSLCC prepares students for a full range of academic choices and careers.

Reviewed by the President's Council – June 17, 2013
Reaffirmed by the DSLCC Local Board – September 16, 2013


Institutional Effectiveness

The Institutional Effectiveness process at DSLCC encompasses an ongoing, integrated, and institution-wide approach to ensuring that the College fulfills its mission. The College has adopted the Nichols Institutional Effectiveness Model (see illustration below). This model describes the institutional effectiveness paradigm, which contains the following critical elements: 1) establishing an institutional mission, 2) identifying intended educational and administrative outcomes and goals, 3) assessing the degree to which these outcomes and goals are accomplished, and 4) making changes to the institution's mission, goals, or programs based on assessment findings. This process occurs at both the institutional and program levels. Improvements are driven by available resources and the strength of assessment data.

[Figure: Nichols Institutional Effectiveness Model — Mission; Specify Intended Goals and Outcomes; Implementation of Departmental/Program Assessment Plans; Assessment Activities; Feedback of Assessment Results; Use of Results; Program Adjustments/Improvements; Institutional Adjustments; Resource Availability Decisions]

Adapted from: Nichols, J. O. (1995). A practitioner’s handbook for institutional effectiveness and student outcomes assessment implementation (3rd ed.). Agathon Press: New York


Strategic Planning

What is the difference between Strategic Planning and Institutional Effectiveness (IE) Planning? Strategic planning outlines the actions taken by the College to implement its mission. IE planning identifies the methods and processes that help determine how well the College is fulfilling its mission. Strategic planning is means/process oriented. It focuses on the actions that improve processes, programs, and services. Institutional effectiveness planning is outcomes oriented. When learning outcomes are involved, IE planning focuses on determining the extent to which students are acquiring the knowledge, skills, and abilities expected of them by their academic programs. When the outcomes do not focus on learning, IE planning focuses on measuring the extent to which administrative and student support units are meeting specified targets and operating effectively in a manner that helps the College accomplish its mission.

Adapted from Addison, J. (2013). Virginia Highlands Community College Institutional Effectiveness Handbook (5th ed.) location: http://www.vhcc.edu/index.aspx?page=958. (Original Source: The Department Head’s Guide to Assessment Implementation in Administrative and Education Support Units by James O. Nichols and Karen W. Nichols, Agathon Press, New York, 2000.)

[Figure: The College Mission and Strategic Goals drive two complementary processes. Strategic Planning is means/process-oriented and asks, "What actions should we take to implement our mission and goals?" Institutional Effectiveness Planning is ends/outcomes-oriented and asks, "How well are our students learning and our administrative and educational support units functioning?"]


The following chart demonstrates the organization of institutional and unit goals at DSLCC. The VCCS mission drives both the establishment of VCCS strategic initiatives and the DSLCC mission and strategic goals. Using these sources, the President establishes annual goals that are then disseminated to the Vice Presidents, who in turn disseminate goals to the unit heads. Further, the student learning outcomes in the educational programs are developed to help ensure the College is accomplishing its mission. These SLOs are often developed in tandem with program goals. The cycle of establishing these goals and determining the degree to which the mission of the College is being accomplished is described in the next section.

[Figure: Organization of Institutional/Unit Goals — VCCS Mission; VCCS Achieve Goals; VCCS Chancellor's Goals; DSLCC Mission; DSLCC Strategic Goals; President's Goals; Division Goals (Vice Presidents); Unit/Program Goals (Directors and Program Heads); Student Learning Outcomes (Program Heads)]


I.E. & Strategic Planning Continuous Cycle

At DSLCC, our mission, specific to the College yet connected to the VCCS mission, drives the continuous cycle of strategic planning and institutional effectiveness. The graphic below illustrates the cyclical nature of the College's long-term strategic planning and annual institutional effectiveness planning, demonstrating the connectedness between the two processes and their connection to the mission. A description of each stage in the process is provided.

[Figure: I.E. & Strategic Planning Continuous Cycle. The VCCS Mission, DSLCC Mission, VCCS Chancellor's Initiatives, and VCCS ACHIEVE 2015 feed both cycles. Long Term Strategic Planning stages: Strategic Goals established/updated; measures and implementation timeline identified; Local Board approves Strategic Plan; Strategic Goals evaluated on key performance indicators; improvements to college programs and services identified based on progress. Annual Institutional Effectiveness Planning stages: divisions and units identify annual goals to align with strategic goals; division/unit heads develop/update Assessment Plans; division/unit goals assessed according to the specified plan; division/unit results gathered and disseminated to unit staffs.]

Adapted from Addison, J. (2013). Virginia Highlands Community College Institutional Effectiveness Handbook (5th ed.) location: http://www.vhcc.edu/index.aspx?page=958.


I.E. and Strategic Planning at DSLCC

As the Institutional Effectiveness and Strategic Planning Continuous Cycle graphic illustrates, the College's strategic plan is driven by the College's mission statement. Using the mission statement as a guiding force, institutional and strategic goals, measures, and timelines for completion are developed. The strategic plan describes the major initiatives and direction for the College, laying the foundation for annual goal planning at each level of the College. In addition to the mission, the initiatives of the VCCS Chancellor and the VCCS Strategic Plan influence the strategic plan of the College.

Long Term Strategic Planning

At least once every five years, the President reviews the need for long-range, fully-integrated, institutional planning, with input solicited from the College community through its established committees and boards, through in-service college-wide meetings, and through other ad hoc meetings and venues. Input may also be sought from the general community through various mechanisms deemed appropriate by the President. The VCCS strategic plan, currently ACHIEVE 2015, is also used to guide the College's strategic plan. This process can take several months and results in a set of strategic goals to guide the College's strategic planning initiatives over the next five years. Once the strategic goals have been established, specific measures, objectives, and timelines are developed to ensure the successful completion of each goal. This information, along with the strategic goals themselves, is compiled into the College's strategic plan. The strategic plan, once completed, is approved by the DSLCC Local Board. Yearly updates are provided to the Local Board to illustrate the progress being made.

Every year in July, a progress report on the strategic goals is developed and disseminated to the President's Council. This report provides results of key performance indicators that are used to evaluate the achievement of the strategic goals and annual President's Goals. These results are then used as the foundation for a discussion of the success/failure of the various strategies implemented to meet the college's strategic goals. Using results stemming from key performance indicators, institutional improvements are identified and implemented to better meet the strategic goals. These improvements are incorporated into budget requests by various units, but also take the form of college-wide initiatives that span multiple divisions and units.



Annual Institutional Effectiveness Planning

The five-year strategic goals also require annual goals and objectives to help meet long-term planning and initiatives. These annual goals stem from institutional improvements identified through the strategic planning and effectiveness process, but also from other sources, such as the VCCS Chancellor's annual goals, established each spring. From these sources, the DSLCC President creates annual goals that align with the strategic goals of the College and with the VCCS Chancellor's goals. The President shares these goals with the President's Council in May. The President's Goals for the next academic year are shared with the Local Board during its June/July meeting, where the goals receive board approval.

After receiving the President’s annual goals, Vice Presidents establish divisional outcomes, which are shared with individual administrative and educational support units. These units establish annual outcomes that align with the college’s strategic goals, the President’s annual goals, divisional outcomes, and departmental goals.

The unit heads and vice presidents, with the help of their staffs, are responsible for evaluating the extent to which their areas are meeting departmental goals. This is accomplished by assessing annual outcomes and identifying and implementing areas of improvement. After goals and outcomes have been established, each division and unit develops an Assessment Plan for the upcoming year. This plan specifies the unit outcomes, alignment with President's Goals and Strategic Goals, assessment tools, and targets for success. These plans are due to supervisors by late September. In March of each year, each administrative unit uses preliminary assessment data to request new resources during the College's annual budget allocation process. A final report is due by May 31st that specifies the results of the various assessments and provides updates on previous plans for improvement as well as new action plans that result from the year's assessment activities. The Assessment Reports are shared with the unit staffs and with supervisors as a way to determine the extent to which the Strategic Goals and President's Goals have been met. From these conversations, improvements can be identified.

Using the Assessment Reports, unit staffs and divisions identify unit improvements to help better meet goals and outcomes in the future. These improvements also form the basis for determining the extent to which the College’s Strategic Goals are being met and help identify areas where institutional improvements are most needed.



This process is cyclical because within each long-term and annual reporting cycle, the institution updates the strategic plan based on assessment results. Units continuously use assessment data not only to make changes to their programs and services, but to establish new goals and outcomes to better address their needs. A timeline of the annual Institutional Effectiveness Planning Process is provided in Appendix D. The VCCS Chancellor and the College President meet annually to review the College's progress towards meeting the President's goals. This process provides an additional measure of accountability to ensure the College is adequately fulfilling its mission and working towards meeting the initiatives of the VCCS.

Budget Allocation Process

The budget projection and allocation process typically begins shortly after the beginning of the spring semester. A budget request is presented from each department to its respective Vice President for review. This budget request is accompanied by a Budget Priority Report, which includes the unit's top non-essential priorities for the upcoming year, supported with evidence to justify the request. Once approved, each budget manager makes a presentation to the senior leadership team (the President, the Vice President for Instruction, Student Services, and Research, the Vice President for Continuing Education and Workforce Services, and the Vice President for Financial and Administrative Services) outlining the unit's budget needs for the next fiscal year and explaining any new or on-going initiatives requiring additional funding. Once all budget presentations are complete, the Vice President for Financial and Administrative Services compiles all requests and includes them in one comprehensive budget document. This document is reviewed and adjusted by the senior leadership team to meet the validated budget allocation and projections for the upcoming fiscal year. The validated budget allocation is provided by the Virginia Community College System office. Once the process is completed, the final budget document is approved and signed by all members of the senior leadership team, and budget managers are then notified of the amount of the budgets for their respective areas for the upcoming fiscal year. This process typically occurs in late May or early June, prior to the beginning of the fiscal year on July 1. Once the new fiscal year begins, a monthly budget report reflecting total expenditures and remaining balances is sent to the budget managers at the beginning of each month.


Outcomes Assessment & Evaluation Process

The College's educational programs and administrative and educational support units complete annual outcomes assessment and evaluation processes (referred to from here on as the outcomes assessment process). Though the basic assessment cycle for educational programs and administrative and educational support units is essentially the same, differences exist. These differences will be highlighted in the following descriptions of the assessment processes for educational programs and for administrative and educational support units.

Educational Programs Student Learning Outcomes Assessment

Assessment Schedule and Responsibilities

Student Learning Outcomes Assessment for educational programs occurs every academic year. Program heads conduct learning outcomes assessment throughout the year and compile a summary of the process into an assessment report, which is due May 15 each year (the last day of faculty contracts). Faculty may choose to report their assessment results over an academic year or a calendar year. Program heads are expected to submit assessment reports to their division dean and to the Director of Institutional Effectiveness. The division dean ensures that the program heads submit reports by the deadline and that the reports are complete. The Director of Institutional Effectiveness reviews the reports, and works throughout the year with faculty to improve the assessment process, clarify information within the report, and strengthen the overall quality of the assessment process and ensuing report. The learning outcomes assessment cycle is pictured below.

[Figure: The Assessment Cycle — Establish SLOs; Identify Measures; Set Targets for Success; Collect Information; Analyze Information; Use Information to Make Improvements]


Establishing Student Learning Outcomes

Each year faculty are expected to establish student learning outcomes. This step is the foundation of the assessment process, as it provides direction to faculty about what they believe is important for their students to know, think, or do. Establishing measurable outcomes also identifies the types of assessment tools that are most appropriate for adequately measuring achievement on each outcome. Faculty are encouraged to set as many program-level learning outcomes as they see necessary, though a common benchmark is between three and five outcomes for certificate programs and six to ten learning outcomes for degree programs.

Learning Outcomes are concise, clear statements describing the knowledge, skills, and abilities expected of students as a result of their educational experience. All educational programs must include learning outcomes in their assessment reports. It is recommended that other program outcomes be included as well. Program outcomes measure non-learning-related objectives or outputs, such as graduation rates, student satisfaction, transfer rates, job placement rates, or retention rates.

• Refer to the College’s strategic goals when setting program outcomes to ensure they reflect the College mission and purposes. In addition, this will aid in the compilation of an institution-wide plan (see Strategic Planning section).

• All outcomes (learning and program) need to be assessed, though not all outcomes must be assessed every year. Once all outcomes have been assessed at least once, consider establishing a cycle in which 3-5 learning outcomes are assessed each year, with all outcomes being assessed at least once every three years.

• For each outcome/objective, define one or more measures. The more measures you define, the more data (evidence) you will gather.

Learning Outcomes identify the knowledge, skills, and abilities expected of students as a result of your program (e.g., "Students will identify three literary genres"). Program Outcomes identify measures of success not directly linked to student learning (e.g., "The program will graduate five students annually").

Identifying Measures

An assessment measure is the tool that is used to collect information about the degree to which students are meeting expected outcomes. In essence, the score a student receives on a measure is the quantifiable representation of his or her "actual" achievement of the learning outcome. Faculty must infer from the tool's score whether or not a student has achieved all or part of a learning outcome. Therefore, it is inherently important that faculty trust the scores as an accurate, "true" representation of student learning, and likewise the inferences they then make from the scores. This process of making meaning from test scores is known as the validity process. One may hear that a test "is valid." In reality, a test cannot be valid in itself. Rather, validity is a matter of degree, and one can always look to strengthen the degree to which the inferences made from test scores can be trusted.

There are many ways to help improve the validity of inferences made from assessment measures. Using more than one measure to assess a single learning outcome can be a useful way to complement the limitations of certain measures in order to gain a comprehensive evaluation of student learning. Additionally, when using measures such as tests, it is important that several items be used to determine achievement on an outcome. Therefore, avoid using single items to measure success on a program-level learning outcome. When determining whether or not a measure may be sufficient for determining the achievement of a learning outcome, ask the following questions:

• Are the measures I am using appropriate for the wording of the outcomes?
• Are the measures comprehensive enough to assess the breadth and depth of the learning specified in the outcome?

There are two general types of measures one can use to assess student learning: direct and indirect measures. Typically, an assessment program that uses both direct and indirect measures provides the most comprehensive review of student learning. Direct measures are objective assessments of student learning. At a basic level, all student learning outcomes must be assessed with at least one direct measure. Common examples of direct measures include comprehensive tests, standardized exams, student portfolios, rubrics, and internship or employer evaluations. Indirect measures are subjective measures of beliefs, attitudes, and perceptions about student learning. Indirect measures are best used to supplement direct measures of student learning and can provide excellent information that strengthens the inferences one can make from direct measures. Indirect measures typically include student satisfaction surveys, questionnaires, course evaluations, national surveys such as the Community College Survey of Student Engagement (CCSSE), graduate exit surveys, alumni surveys, and focus groups.


Measures can be qualitative or quantitative. Qualitative measures contain non-numerical data such as verbal or written feedback from students/staff/faculty. These measures can sometimes provide a more comprehensive perspective of an aspect of student learning, but the results often take a much longer time to analyze. Quantitative measures collect numerical data that can be analyzed statistically. These measures are more common, can be easier to analyze, and are useful for trend analyses and longitudinal comparisons. It is recommended that the vast majority of measures be quantitative, if only because of the time needed to properly analyze and evaluate qualitative measures.

Often, the results of qualitative measures can be quantified. Consider an instructor who wants to assess written communication. A student may submit a written essay that will be evaluated by the instructor to determine student success on the outcome. By using a rubric, in which various criteria of the essay are assigned point values based on the degree to which expectations are met, the instructor can easily quantify the results of the assessment: computing score averages, setting benchmarks for proficiency, and examining trends over time to assess student learning.
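To make the arithmetic concrete, the short sketch below shows one way such rubric results might be tallied. It is a minimal illustration only, not part of the College's reporting templates; the criteria, scores, and the 3.0 proficiency benchmark are hypothetical.

# Hypothetical rubric scores for five student essays, each rated 1-4
# on three criteria.
scores = {
    "organization": [3, 4, 2, 3, 4],
    "evidence":     [2, 3, 3, 4, 3],
    "mechanics":    [4, 4, 3, 3, 2],
}
PROFICIENCY = 3.0   # assumed benchmark for an individual student

# Average score on each criterion.
for criterion, values in scores.items():
    print(f"{criterion}: average = {sum(values) / len(values):.2f}")

# Overall score per student (mean across criteria) and share at or above the benchmark.
num_students = len(scores["organization"])
student_means = [sum(scores[c][i] for c in scores) / len(scores) for i in range(num_students)]
overall_average = sum(student_means) / num_students
share_proficient = sum(m >= PROFICIENCY for m in student_means) / num_students

print(f"Overall average: {overall_average:.2f}")
print(f"Students at or above {PROFICIENCY}: {share_proficient:.0%}")

Computing the same tallies each year keeps trend comparisons straightforward, since the averages are calculated the same way every time.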

Identifying Targets for Success

Once measures have been identified, it is important that targets are established. A target allows you to establish a specific level of success that is either expected or desired from students. A target then specifies the level of student success needed to determine whether or not the outcome has been met. Targets should be set at a level that is reasonable, yet challenges student performance. Targets that are too low may boost your ego, but the results won't tell you anything about student learning. Targets that are set too high might be unrealistic and, likewise, won't provide much information about ways you can improve programs, courses, or resources to strengthen student learning. If you haven't set a target before, consider taking the first year to establish a baseline, after which you can use this baseline as the target, increasing it or decreasing it based on changes made to the program. Targets should always relate to both the measure and the outcome. Though it is easy to set an arbitrary target, use as much information as you have available to answer the question: "What level of success will allow me to say with some confidence that the outcome has been met?"

Targets for success can be many things. Examples of the most common targets are framed below:

• The average score on the test will be 75% (student learning)
• 100% of students will score above a 3.0 on the rubric (student learning)
• The average score on the test will increase by five points over the average score from last year (student learning)
• 90% of students will state that they are satisfied with the level of instruction in the program (program outcome)
• 90% of graduates will have a job by the time of commencement (program outcome)

Data Collection

Once measures have been developed and targets have been established, it is important for results to be collected with integrity and in a manner that, again, allows faculty to make inferences about the degree to which students are meeting the outcome. Two major criteria must be met, however, to help ensure that the results are collected using a meaningful, appropriate process. When assessing program-level learning outcomes:

1. It is often most appropriate to assess students at the end of their program. Remember, you are interested in knowing what students know, think, or do as a result of their educational experience, which typically means after they have experienced as much of the program as possible. Therefore, assessing students on a program-level learning outcome after their first semester may not be appropriate, especially if students are then exposed to additional resources or instruction in later courses that relate directly to the outcome.

2. Common assessments should be used when multiple sections of courses are offered. This point should not be taken as an argument against academic freedom. In courses where multiple sections are offered (recognizing not very many of these exist at DSLCC), instructors may teach the material in any way they see fit. However, students should be expected to learn similar concepts, regardless of the section they take. Therefore, if program-level learning outcomes are being assessed in courses where multiple sections are offered, all students must be assessed using the same criteria and with the same items. Otherwise, one may never be able to compare student performance with any real certainty. This point is of major concern when course-embedded assessments are the major method used to assess program-level learning outcomes.

Analyzing Information

After assessing each outcome using the appropriate measure(s), results should be provided that a) relate back to the measure and outcome, b) allow for interpretation of student learning, and c) demonstrate multiple years of data. Sharing results should not be complicated, and if the preceding components of the assessment process have been followed, then listing results should take care of itself. The purpose of this section is to determine if your outcomes were met. Though listing the results is important, even more important is that you can interpret the results within the context of the target and outcome. Explain whether the results meet the outcome based on the target and the degree to which you can trust these results. Ask yourself the following questions:

• To what degree do these results support the conclusion that students are or are not meeting the outcome?
• What was done this year that might explain the results, especially if they differ significantly from past iterations?

Results that lead you to the conclusion that students did not meet the outcome should not be seen as negative. Certainly, we all hope that our students are learning and that they meet the standards we set for them. However, results that are weaker than you would like should be seen as an opportunity to review and evaluate the way parts of your program are offered. Why were the results lower than you would like? Should changes be made to courses or experiences? Should students follow the program of study more closely? Was the test flawed? Were extenuating circumstances to blame? Poor results are not an immediate indication of poor instruction, but they should lead faculty to critical introspection and reflection about what needs to be done differently to improve results in the future. Assuming you can trust the inferences you hope to make from your assessment results, your findings should lead you to ask two fundamental questions:

• What do these results tell us about student learning?
• How can we use these results to improve student learning?

If your findings indicate that students did not meet the target, the next step is to determine what changes need to be made in order to improve results for next year. Remember, the purpose of assessment is to help us determine our effectiveness and document our continuous improvement towards meeting program and student learning outcomes.

Making Improvements Based on Results

The Action Plan describes the steps you plan to take in the upcoming year to improve instruction, program offerings, resources, and any other aspect of the program that you think might lead to improved findings relative to student learning outcomes. Once the action plan has been implemented, you are able to show how you "closed the loop." If you did not meet your outcomes, you MUST have an Action Plan. Even if you do meet your outcomes, you may still identify ways to improve based on your interpretation of the findings. Remember that you may not always meet your outcomes, but the purpose of assessment is improvement. Your findings may help you to create new outcomes for the upcoming year, revise targets, make important changes to program offerings, or simply reveal the need for better assessment. We can always get better, and assessment is the vehicle by which improvements can best be identified, justified, and implemented.

Administrative and Educational Support Units Assessment Process

Assessment Schedule and Responsibilities

Administrative and educational support units complete assessment annually. Directors and Vice Presidents who oversee offices, departments, and divisions are responsible for completing an assessment report every year. Unit directors and division vice presidents develop yearly measurable outcomes that align with departmental goals and incorporate appropriate President's Goals. Together, these goals and outcomes form the foundation for determining institutional effectiveness. These goals and outcomes are used to establish an assessment plan, which is due by late September. This Assessment Plan contains only outcomes, goals, the alignment of unit/division goals to strategic/President's Goals, performance indicators, and targets for success. Initial findings from the plans should be identified by March 15th so that budget requests can be informed by assessment data. Because other data are not collected until the end of the academic year, a final Assessment Report is due by May 31st each year; it includes outcome results, the status of previous action plans, and new plans for improvement based on the year's findings.

The Assessment Report is submitted to the respective unit director's Vice President or, in the case of the division Vice Presidents, to the President. Reports are also submitted to the Director of Institutional Effectiveness, who reviews the final report to ensure the proper cycle of assessment is followed and to offer suggestions on how to better disseminate the information in a way that ties the use of results to outcomes. Before the next Assessment Plan is submitted, the Director of Institutional Effectiveness meets with each unit director/Vice President to offer suggestions for improvement and to help ensure that any relevant President's Goal has been included in the unit's plan. Unit directors and supervisors then meet to discuss the previous year's reports and plans for the next year by late September.

Assessment Timeline

• July: President's Goals finalized and shared with administrators
• September: Assessment Plans finalized
• March/April: Initial findings from assessments used to justify budget requests for the following year
• May: Final Assessment Reports submitted to supervisors and Director of IE


The following sections provide additional information about each step in the Administrative Assessment Cycle.

Establishing Goals and Outcomes

Each year unit heads are expected to establish goals and outcomes. This step is the foundation of the assessment process, as it provides direction to unit heads about what they believe to be the core functions of their units. Further, well-written, measurable outcomes also provide information about ways that units support both college-wide initiatives and the overall institutional mission. Although individual outcomes may describe relatively specific functions of an office, some connection to the larger mission and strategic goals of the College must be established. Ensuring that outcomes are measurable also helps identify the appropriate performance indicators needed to measure the unit's effectiveness in meeting each goal/outcome. By identifying the indicators needed to measure each goal or outcome, targets for success can be established within each unit/division, leading to a fruitful and productive assessment process. All of this assessment starts, however, with the establishment of well-written goals and measurable outcomes.

[Figure: The Administrative Assessment Cycle — Establish Outcomes; Identify Performance Indicators; Set Targets for Success; Collect Information; Analyze Information; Use Information to Implement Action Plans for Improvement]


What’s the difference between a goal and an outcome?

Depending upon whom you ask, not much. These two terms can be used interchangeably, as you will see a little later in this document. However, they are used separately within the context of this manual to illustrate the differences between the core functions of individual units (Goals) and the ways in which each unit is expected to support the President's annual goals (Outcomes). Confused yet?

Goals versus Outcomes

Goal: A Goal is a broad statement that identifies the core functions and services of a department. It also identifies the benefits to a group of stakeholders (e.g., students, faculty, staff) that result from them using the programs and services. Goals tend not to be very measurable as written, and they remain relatively stable year to year. Example: "Students will receive accurate and timely academic advice."

Outcome: Outcomes are used to specify the measurable actions/expectations that, if met, will demonstrate the achievement of a goal. They are written using measurable terms and specific language, and typically contain a target for success (or make it easy for a target to be identified). Depending on the focus of the department, outcomes may change every year. Example: "90% of students will be satisfied with the quality of advice they receive from an academic advisor."

Goals tend to describe the unit's stakeholders and the specific benefit that these stakeholders are expected to gain as a result of the unit's programs and services. For example, a common stakeholder of Student Services is the student. A benefit that may result from students using Student Services is that they will receive accurate advice about their academics (a core function of Student Services). Thus, a departmental goal for Student Services might be: Students will receive accurate advice about their academics after visiting Student Services. By itself, this goal isn't incredibly measurable, but it accurately describes a core function of the department. In the context of administrative unit assessment, outcomes tend to be process-oriented, describing various levels of activity, compliance with various standards, or process efficiencies.

Think about it this way: Goals describe the broad functions, services, and benefits to a population that are provided by the department. Goals are often, and should be, tied to the departmental mission (which is why they don't change year to year). However, in order to determine the degree to which these goals are met, outcome statements are created that are measurable, specific, and directly aligned with the broader goals. Because these outcomes are more closely tied to specific outcomes determined each year by the President and/or Vice Presidents, unit outcomes may relate more to specific levels of accomplishment. For instance, if the President's Goal (I know this is confusing; if it's easier, within our context, President's Goals should be called President's Outcomes) is to increase resources to orient students to campus, a Student Services outcome might be to achieve 100% first-time student attendance at New Student Orientation. This outcome may support a broader departmental goal such as "students will be well informed about college policies, procedures, and opportunities to become connected to the community." In other cases, a President's Goal might align perfectly with a department's outcome and can be used verbatim. For example, a President's Goal (outcome) might be to add two new academic programs during the year. The Vice President of Academic Affairs may use that outcome specifically, perhaps aligning it with a divisional goal like "Students will have access to relevant, high quality academic programs that meet the needs of today's workforce." Whereas departmental goals tend to be stable year to year (a unit likely always wants to provide accurate information and satisfy the customer), outcomes may change based on the specific initiatives set forth each year by the College President. However, there is no requirement that outcomes change year to year (it may always be an outcome of Student Services to achieve 100% first-time student attendance at New Student Orientation).

Tips:

• Refer to the College's institutional goals when establishing departmental goals to ensure they reflect the College mission and purposes. In addition, this will aid in the compilation of an institution-wide plan (see Strategic Planning section).

• Develop outcomes to provide measurable statements you will use to meet your broader goals. All outcomes need to be assessed, though not all outcomes must be assessed every year. Once all outcomes have been assessed at least once, consider establishing a cycle in which 3-5 outcomes are assessed each year, with all outcomes being assessed at least once every three years. Remember that departmental goals aren't likely to change much year to year, whereas outcomes likely change annually to align with specific initiatives at the department and college level.

Identifying Performance Indicators and Measures

A Performance Indicator is simply the piece of information that is used to determine a unit's effectiveness for a particular goal or outcome. Performance Indicators are typically pieces of data such as retention numbers, enrollment figures, or graduation rates. Assessment measures, on the other hand, typically require administration and analysis, such as with a satisfaction survey or focus group results. Assessment measures are tools used to collect information about the degree to which students are meeting expected outcomes or goals.

Performance Indicator: A P.I. is a statistic that can be used to measure effectiveness on a specific outcome (graduation rate, retention rate, attendance figures, usage data).

Assessment Measure: An assessment measure typically requires actual data collection and may need to be developed and analyzed by personnel (e.g., satisfaction surveys).

There are many ways to help improve the validity of inferences made from assessment measures. Using more than one measure to assess a single outcome can be a useful way to complement the limitations of certain measures in order to gain a comprehensive evaluation. Additionally, when using measures such as tests, it is important that several items be used to determine achievement on an outcome. Therefore, avoid using single items to measure success on an outcome. When determining whether or not a measure may be sufficient for determining the achievement of a goal or outcome, ask the following questions:

• Are the measures I am using appropriate for the wording of the outcomes?
• Are the measures comprehensive enough to assess the breadth and depth of the action specified in the outcome?

Measures can be qualitative or quantitative. Qualitative measures contain non-numerical data such as verbal or written feedback from students/staff/faculty. These measures can sometimes provide a more comprehensive perspective of an aspect of a unit's effectiveness, but the results often take a much longer time to collect and analyze. Quantitative measures collect numerical data that can be analyzed statistically. These measures are more common, can be easier to analyze, and are useful for trend analyses and longitudinal comparisons. It is recommended that the vast majority of measures be quantitative, if only because of the time needed to properly analyze and evaluate qualitative measures.

Identifying Targets for Success

Once measures have been identified, it is important that targets are established. A target allows you to establish a specific level of success that is either expected or desired from stakeholders. A target then specifies the level of effectiveness/success needed to determine whether or not the outcome or goal has been met. Targets should be set at a level that is reasonable, yet challenges unit performance. Targets that are too low may boost your ego, but the results won't tell you anything about how well your unit is doing. Targets that are set too high might be unrealistic and, likewise, won't provide much information about ways you can improve programs, courses, or resources to strengthen your unit. If you haven't set a target before, consider taking the first year to establish a baseline, after which you can use this baseline as the target, increasing it or decreasing it based on changes made to the program. Targets should always relate to both the measure and the outcome/goal. Though it is easy to set an arbitrary target, use as much information as you have available to answer the question: "What level of success will allow me to say with some confidence that the outcome has been met?"


Targets for success can be many things. Examples of the most common targets are framed below:

• Retention of at-risk students will increase by 5% over two years
• Dual enrollment headcount will increase by 25 over last year
• 90% of students will state that they are satisfied with the level of service provided by the unit

Data Collection

Once measures have been developed and targets have been established, it is important for results to be collected with integrity and in a manner that, again, allows staff to make inferences about the degree to which the unit is meeting the outcome. In particular, when surveys are used, it is imperative that appropriate steps be taken to ensure that the data collected are representative and adequately sized. Surveys provide valuable information to units that can help identify areas where services or functions can be improved. However, it is important to understand how the way the survey is administered can affect the degree to which results can or should be interpreted. A graduate exit survey can provide valuable information about the services graduates found most useful, but weaknesses in those same services could also lead some students to stop attending the College, and the unit would never learn this if only graduates are surveyed. Offering incentives for completing surveys might entice some students or staff to respond, but for those potential participants who have no interest in the incentive, there exists little motivation to provide valuable insights. Further, if a survey is only filled out by ten people, what larger inferences can be drawn from the results? Is it fair to draw conclusions about all stakeholders based on the opinions of ten people?
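As a rough illustration of why ten responses limit what can be concluded, the sketch below estimates the uncertainty around a satisfaction rate from so small a sample. It is a simplified calculation that assumes a simple random sample and uses the normal approximation for a proportion (itself shaky at n = 10); the counts are invented.

import math

def margin_of_error(p, n, z=1.96):
    # Approximate 95% margin of error for a sample proportion.
    return z * math.sqrt(p * (1 - p) / n)

satisfied, respondents = 9, 10        # hypothetical survey result
p_hat = satisfied / respondents
moe = margin_of_error(p_hat, respondents)

print(f"Observed satisfaction: {p_hat:.0%}")
print(f"Margin of error: +/- {moe:.0%}")          # roughly +/- 19 percentage points
print(f"Plausible range: {max(0.0, p_hat - moe):.0%} to {min(1.0, p_hat + moe):.0%}")

In other words, a reported 90% satisfaction rate from ten respondents is consistent with anything from roughly 71% to 100%, which is why larger and more representative samples are worth the extra effort.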

Analyzing Information

After assessing each outcome using the appropriate measure(s), results should be provided that a) relate back to the measure/performance indicator and outcome/goal, b) allow for interpretation of effectiveness, and c) demonstrate multiple years of data. Sharing results should not be complicated, and if the preceding components of the assessment process have been followed, then listing results should take care of itself. The purpose of this section is to determine if your outcomes and goals were met. Though listing the results is important, even more important is that you can interpret the results within the context of the target and outcome. Explain whether the results meet the outcome based on the target and the degree to which you can trust these results. Ask yourself the following questions:


• To what degree do these results support the conclusion that stakeholders are or are not meeting the outcome/goal?

• What was done this year that might explain the results, especially if they differ significantly from past iterations?

Results that lead you to the conclusion that the unit did not meet the outcome should not be seen as negative. Certainly, we all hope that our units are operating effectively in support of the College's mission and that they meet the standards we set for them. However, results that are weaker than you would like should be seen as an opportunity to review and evaluate the way parts of your programs and services are offered. Why were the results lower than you would like? Should changes be made to programs and services? Are additional resources needed? Was the assessment measure flawed? Were extenuating circumstances to blame? Poor results are not an immediate indication of poor performance, but they should lead unit staff to critical introspection and reflection about what needs to be done differently to improve results in the future. Assuming you can trust the inferences you hope to make from your assessment results, your findings should lead you to ask two fundamental questions:

• What do these results tell us about the effectiveness of our unit?
• How can we use these results to improve our unit's effectiveness?

If your findings indicate that the target was not met, the next step is to determine what changes need to be made in order to improve results for next year. Remember, the purpose of assessment is to help us determine our effectiveness and document our continuous improvement towards meeting unit goals and outcomes.
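As a purely illustrative sketch (the target, years, and percentages below are hypothetical, not DSLCC data), the following shows the kind of comparison described above: several years of results lined up against a fixed target so the interpretation is explicit.

    # Hypothetical example: percentage of stakeholders reporting satisfaction
    # with a service, compared against a target of 85% across three years.
    target = 85.0
    results = {"2011-2012": 78.2, "2012-2013": 83.5, "2013-2014": 88.1}

    for year, pct in sorted(results.items()):
        status = "target met" if pct >= target else "target not met"
        print(f"{year}: {pct:.1f}% ({status})")

    # 2011-2012: 78.2% (target not met)
    # 2012-2013: 83.5% (target not met)
    # 2013-2014: 88.1% (target met)

Presenting multiple years side by side in this way makes it easier to see whether changes implemented through a prior action plan are moving results toward the target.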

Making Improvements Based on Results

The Action Plan describes the steps you plan to take in the upcoming year to improve instruction, program offerings, resources, and any other aspect of the program that you think might lead to improved findings relative to goals and outcomes. Once the action plan has been implemented, you are able to show how you "closed the loop." If you did not meet your outcomes, you MUST have an Action Plan. Even if you do meet your outcomes, you may still identify ways to improve based on your interpretation of the findings. Remember that you may not always meet your outcomes or goals, but the purpose of assessment is improvement. Your findings may help you create new outcomes for the upcoming year, revise targets, make important changes to services and programs, or simply reveal the need for better assessment. We can always get better, and assessment is the vehicle by which improvements can best be identified, justified, and implemented.


General Education Assessment

[Graph: the general education program at DSLCC comprises General Education Courses (General Studies Major), Occupational/Technical Program Components, and Transfer Programs. Graph adapted from Addison, J. (2013), Virginia Highlands Community College Institutional Effectiveness Handbook (5th ed.), http://www.vhcc.edu/index.aspx?page=958]

General Education is at the core of every degree curriculum at DSLCC and is tied to the College's mission, values, and strategic goals. In February 2000, the Governor's Blue Ribbon Commission on Higher Education in the Commonwealth of Virginia called for the creation of a Quality Assurance Plan. The resulting plan identified several core competencies of general education that were to be assessed by every college in the VCCS. The core competencies address the areas of communication, critical thinking, information literacy, quantitative reasoning, and scientific reasoning. The VCCS establishes an assessment cycle to measure one competency annually, such that all five competency areas are assessed at least every five years. Two additional competency areas were identified by the Quality Assurance Plan (Personal Development and Social and Cultural Understanding), but the assessment of student achievement in these two competencies has not been made a priority by the VCCS, and the VCCS assessment schedule followed by DSLCC does not include them. Every spring, degree graduates are tested on at least one core competency as specified by the VCCS reporting schedule. The VCCS institutional effectiveness office analyzes the results of the core competency tests, which are then forwarded to each college's assessment officers, typically by the following fall. The Director of Institutional Effectiveness conducts additional analyses on these results and shares the reports with the Vice President for Instruction as well as the entire President's Council. The Vice President for Instruction, along with the Director of Institutional Effectiveness, communicates the results to faculty.

In addition to the core competency assessments, some Transfer faculty have developed course-embedded assessments that are used to measure student learning on individual program objectives. These assessments are typically used to identify changes to common Transfer courses, whereas the intent of the core competency results is to invoke change at both the course and program level for common courses taken by both O/T and Transfer students. As the graph above shows, a general education program is required of all degree-seeking students at DSLCC. For Occupational/Technical degree-seeking students, this program consists of at least 15 credits, divided among the Humanities, Social Sciences, and Natural Sciences/Mathematics, with the rest of the degree credits occurring in program-specific courses. Transfer programs, on the other hand, consist almost entirely of general education courses designed to achieve student learning on the core competency learning outcomes, which the Transfer program faculty have adopted as their student learning outcomes. Finally, within each Transfer program there exist slight differences in program requirements, meant to differentiate the programs from one another. In Education, for example, students take SDV 101, a course meant to help prepare students for transfer to Education programs at four-year institutions. Business students must take one additional business course, but otherwise their program of study is essentially identical to that of a General Studies major. Given the minor differences between the four Transfer program majors, the faculty of each Transfer major assess their programs as one, labeled General Studies. Although as of the fall 2013 semester these programs are still separate majors, faculty have begun the process of changing their status to that of specializations, and as such, only the General Studies program is currently assessed on program-level student learning outcomes. Transfer faculty in each separate major still assess students on course outcomes within the major's program-specific course. A Curriculum Map was developed that indicates the degree to which each general education outcome is addressed within a majority of Transfer courses (see Appendix F). This map helps guide Transfer program faculty decision-making regarding course-embedded assessment and helps ensure that all outcomes are addressed somewhere within the General Studies curriculum.


General Education Core Competencies and Related Objectives

DSLCC degree graduates will demonstrate competency in the following general education areas:

1. Communication
A competent communicator can interact with others using all forms of communication, resulting in understanding and being understood. Degree graduates will demonstrate the ability to
1.1 understand and interpret complex materials;
1.2 assimilate, organize, develop, and present an idea formally and informally;
1.3 use standard English;
1.4 use appropriate verbal and non-verbal responses in interpersonal relations and group discussions;
1.5 use listening skills;
1.6 recognize the role of culture in communication.

2. Critical Thinking
A competent critical thinker evaluates evidence carefully and applies reasoning to decide what to believe and how to act. Degree graduates will demonstrate the ability to
2.1 discriminate among degrees of credibility, accuracy, and reliability of inferences drawn from given data;
2.2 recognize parallels, assumptions, or presuppositions in any given source of information;
2.3 evaluate the strengths and relevance of arguments on a particular question or issue;
2.4 weigh evidence and decide if generalizations or conclusions based on the given data are warranted;
2.5 determine whether certain conclusions or consequences are supported by the information provided;
2.6 use problem solving skills.

3. Cultural and Social Understanding*
A culturally and socially competent person possesses an awareness, understanding, and appreciation of the interconnectedness of the social and cultural dimensions within and across local, regional, state, national, and global communities. Degree graduates will demonstrate the ability to
3.1 assess the impact that social institutions have on individuals and culture, past, present, and future;
3.2 describe their own as well as others' personal ethical systems and values within social institutions;
3.3 recognize the impact that arts and humanities have upon individuals and cultures;
3.4 recognize the role of language in social and cultural contexts;
3.5 recognize the interdependence of distinctive world-wide social, economic, geo-political, and cultural systems.


4. Information Literacy
A person who is competent in information literacy recognizes when information is needed and has the ability to locate, evaluate, and use it effectively (adapted from the American Library Association definition). Degree graduates will demonstrate the ability to
4.1 determine the nature and extent of the information needed;
4.2 access needed information effectively and efficiently;
4.3 evaluate information and its sources critically and incorporate selected information into his or her knowledge base;
4.4 use information effectively, individually or as a member of a group, to accomplish a specific purpose;
4.5 understand many of the economic, legal, and social issues surrounding the use of information and access and use information ethically and legally.

5. Personal Development*
An individual engaged in personal development strives for physical well-being and emotional maturity. Degree graduates will demonstrate the ability to
5.1 develop and/or refine personal wellness goals;
5.2 develop and/or enhance the knowledge, skills, and understanding to make informed academic, social, personal, career, and interpersonal decisions.

6. Quantitative Reasoning
A person who is competent in quantitative reasoning possesses the skills and knowledge necessary to apply the use of logic, numbers, and mathematics to deal effectively with common problems and issues. A person who is quantitatively literate can use numerical, geometric, and measurement data and concepts, mathematical skills, and principles of mathematical reasoning to draw logical conclusions and to make well-reasoned decisions. Degree graduates will demonstrate the ability to
6.1 use logical and mathematical reasoning within the context of various disciplines;
6.2 interpret and use mathematical formulas;
6.3 interpret mathematical models such as graphs, tables, and schematics and draw inferences from them;
6.4 use graphical, symbolic, and numerical methods to analyze, organize, and interpret data;
6.5 estimate and consider answers to mathematical problems in order to determine reasonableness;
6.6 represent mathematical information numerically, symbolically, and visually, using graphs and charts.

7. Scientific Reasoning
A person who is competent in scientific reasoning adheres to a self-correcting system of inquiry (the scientific method) and relies on empirical evidence to describe, understand, predict, and control natural phenomena. Degree graduates will demonstrate the ability to
7.1 generate an empirically evidenced and logical argument;
7.2 distinguish a scientific argument from a non-scientific argument;
7.3 reason by deduction, induction, and analogy;
7.4 distinguish between causal and correlational relationships;
7.5 recognize methods of inquiry that lead to scientific knowledge.

*Though identified by the VCCS as core competencies, these two areas are not required to be assessed by the VCCS and thus have not been made a priority of the College. The assessment of general education core competency achievement has therefore focused on the five areas of communication, critical thinking, information literacy, quantitative reasoning, and scientific reasoning.


Institutional Effectiveness Support

While the Institutional Effectiveness Process is a college-wide effort, it is supported primarily by the Director of Institutional Effectiveness and a Research Analyst. Specific duties and responsibilities are outlined below.

DIRECTOR OF INSTITUTIONAL EFFECTIVENESS

Institutional effectiveness responsibilities include the following:

• Manages the College's institutional effectiveness and assessment programs. This includes assisting faculty, staff, and administrators in developing strategic goals and annual goals and outcomes that help College leadership ensure that institutional effectiveness and strategic planning practices are carried out at all levels so the College can meet its mission. The Director also develops and maintains the Institutional Effectiveness Handbook, Assessment Plan Report Templates, and other documents located on the IE website.

• Provides consultation, workshops, and other resources to improve the quality of assessment practices at all levels of the College.

• Serves on various College committees where assessment and evaluation are critical components.

• Oversees institutional research, providing data to faculty, staff, and administrators on performance indicators that help units, programs, and divisions determine effectiveness.

• Serves as college liaison to the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC).

RESEARCH ANALYST

The Research Analyst provides support in the following areas: a) development of institutional research reports, b) database management, and c) college-wide survey data collection. Duties and responsibilities include:

• Maintain assigned research and assessment databases.

• Assist in submitting data reports to the VCCS, SCHEV, and DOE (IPEDS).

• Assist in collecting responses on graduate exit survey and core competency assessments.

• Assist in providing data to faculty and staff for unit planning, research, and assessment reporting.


References

Addison, J. (2013). Virginia Highlands Community College Institutional Effectiveness Handbook (5th ed.). Retrieved from http://www.vhcc.edu/index.aspx?page=958

Nichols, J. O. (1995). A Practitioner's Handbook for Institutional Effectiveness and Student Outcomes Assessment Implementation (3rd ed.). New York: Agathon Press.

Nichols, J. O., & Nichols, K. W. (2000). The Department Head's Guide to Assessment Implementation in Administrative and Educational Support Units. New York: Agathon Press.


APPENDIX A: ACHIEVE 2015 Initiative

The VCCS has established five goals that comprise the ACHIEVE 2015 Initiative:

Access: Increase the number of individuals who are educated and trained by Virginia's community colleges by 50,000 to a total of 423,000, with emphasis on increasing the number from underserved populations by at least 25,000 individuals.

Affordability: Maintain tuition and fees at less than half of the comparable cost of attending Virginia's public four-year institutions, and increase the number of students who receive financial assistance and scholarships by 36,000.

Student Success: Increase the number of students graduating, transferring, or completing a workforce credential by 50 percent, including increasing the success of students from underserved populations by 75 percent.

Workforce: Double the number of employers provided training and services to 10,000, with a particular focus on high-demand occupational fields.

Resources: Raise at least $550 million in gifts and grants to support the mission of Virginia's community colleges.


APPENDIX B: DSLCC Institutional Goals 2014-2015

DSLCC will increase overall enrollments by offering educational programs and services that are relevant to the service area population, taught using innovative methods, and designed around student needs.

DSLCC will fulfill its mission by providing employees with appropriate services, training, support, means of communication, and positive engagement with customers, both internally throughout the college and externally throughout the community.

DSLCC will maximize student success by providing services to support all learners in an environment that promotes the achievement of educational and occupational goals.

DSLCC will be a viable, affordable choice for quality post-secondary education.

DSLCC will participate in, and launch ideas for, economic development for the region as they pertain to educational and training needs for advancing educational attainment of all residents.

DSLCC will support new job creation and business retention by providing workforce training, retraining, and services to meet current and emerging needs of service area business and industry.

DSLCC will manage its administrative and fiscal affairs through a systematic institutional effectiveness process that complements the goals of ACHIEVE 2015 and respects the recommendations of the Reengineering Task Force.

DSLCC will provide adequate funding for quality programs, services, and facilities by using its resources in cost-effective ways.


APPENDIX C: Excerpt from 2013-2014 President's Goals Template

Dabney S. Lancaster Community College
Presidential Goals in Alignment with VCCS Achieve 2015
Dr. John J. Rainone, President
DRAFT 1 approved by DSLCC Local Board on 04/14/14; FINAL DRAFT 07/09/14

Preamble for VCCS Achieve 2015: To contribute to the economic and civic vitality of the commonwealth and its international competitiveness, Virginia's Community Colleges commit to increasing access to affordable education and training for more individuals so they acquire the knowledge and skills to be successful in an ever-changing global economy. To support achievement of Achieve 2015 goals, DSLCC will accomplish the following in 2014-2015 under each of the VCCS goals.

Achieve 2015: Access. Increase the number of individuals who are educated and trained by Virginia's Community Colleges by 50,000 to a total of 423,000, with emphasis on increasing the number from underserved populations by at least 85,000 individuals.

Goal 1: DSLCC will increase overall enrollments by offering educational programs and services that are relevant to the service area population, taught using innovative methods, and designed around student needs.

DSLCC will:
1. Increase annualized FTE enrollment by at least 1 percent over 2013-14.
   a) Create at least two new academic awards that respond to emerging, critical workforce and STEM-H needs.*
   b) Increase the total number of profiles created in the Virginia Education Wizard by at least 3 percent.*
   c) Evaluate the success of DSLCC's participation in the shared services distance learning model (SSDL) to determine plans for continuing in future years.*
   d) Upon final approval from the Virginia Board of Nursing, implement the PN to RN transition pathway during spring 2015.
   e) Evaluate and create a plan to expand offerings in the College's entire service area.
   f) Evaluate an improved course scheduling system for day classes that improves student enrollment and completion rates. Implement recommendations for fall 2015.
   g) Centralize college-wide class scheduling responsibilities to one person in order to streamline the course schedule and improve students' abilities to follow recommended course sequences beginning fall 2015.
   h) Make courses more accessible to working students and free up time for faculty to complete administrative work by evaluating the potential to offer two day/week courses, three-hour block courses on Fridays, and more evening and weekend courses.
   i) Complete a three-year Academic Master Plan that complements the College's Mission.


APPENDIX D: DSLCC Annual Institutional Effectiveness Timeline

[The original appendix presents the annual institutional effectiveness timeline as a twelve-month cycle graphic (January through December). The activities placed on that cycle are listed below.]

• Budget Managers Submit Budget Priority Reports and Budget Requests to VP FAS

• New President's Goals Established for Following Year

• Educational Program Assessment Reports Due

• College Mission Reviewed and Reaffirmed by President’s Council

• College Mission Reviewed and Reaffirmed by Local Board

• President’s Goals Shared with Chancellor

• Unit Directors and Vice Presidents Submit Final Assessment

• Budget Established for Next Fiscal Year

• Annual Status Report on Strategic Plan Performance Indicators Disseminated

• Updates to the Strategic Plan Discussed and Approved by President’s Council

• Updates to Strategic Plan and President’s Goals Shared with College Faculty/Staff

• President’s Goals Approved by Local Board

• Chancellor establishes annual goals for VCCS

• President’s Council Reviews Progress Towards Meeting Annual President’s Goals

• President updates college on status of goals

• Administrative Units Establish New Goals and Strategies for Following Year

• Assessment Plans are reviewed by supervisors and discussed with directors

APPENDIX E: Schedule of Assessment Activities

Columns in the original table cover 2011-2012, 2012-2013, 2013-2014, 2014-2015, and 2015-2016; an X indicates a year in which the assessment activity is conducted.

General Surveys & Data Reports
• Graduate Exit Survey: X X X X X
• Center for Community College Student Engagement (CCCSE): X X
• Graduate Follow-up Survey: X X X

Educational
• Occupational/Technical (AAS): X X X X X
• Occupational/Technical (Certificate): X X X X X
• Transfer Programs (A&S): X X X X X
• VCCS Core Competencies: X X X X X
• Program Health Reports: X X X
• Course Evaluations: X X X X X

Educational and Administrative Support Divisions and Units
• VP for Academic Affairs: X X X X X
• Library: X X X X X
• VP for CEWS: X X X X X
• VP for FAS: X X X X X
• Business Office: X X X X X
• Technical Services: X X X X X
• Buildings and Grounds: X X X X
• Human Resources: X X X X
• President's Office: X X X X X
• Student Services: X X X X X
• Institutional Effectiveness: X X X X X
• Educational Foundation: X X X X X

Basic Skills
• Placement Testing: X X X X X
• Developmental Course Success Rates: X X X X X

Non-Completers
• Questionnaires: X X X
• Telephone/Personal Interviews: X X X

Other
• Assessment of Strategic Goals: X X
• Assessment of SLO Assessment: X X X
• Official Written Performance Evaluations: X X X X X

APPENDIX F: Transfer Program Curriculum Map

[Curriculum map table: the general education objectives (Communication 1.A-1.F; Critical Thinking 2.A-2.F; Cultural & Social Understanding 3.A-3.E; Information Literacy 4.A-4.E; Personal Development 5.A-5.B; Quantitative Reasoning 6.A-6.F; Scientific Reasoning 7.A-7.E) are mapped against the Transfer program courses, arranged by recommended semester (1st through 4th). Courses included in the map: BIO 101, BIO 102, BIO 141, BIO 142, BIO 150; CHM 111, CHM 112; PHY 201, PHY 202; ENG 111, ENG 112, ENG 241, ENG 242, ENG 243, ENG 244, ENG 253, ENG 254, ENG 276, ENG 278; HIS 111, HIS 112, HIS 121, HIS 122; MTH 151, MTH 152, MTH 157, MTH 163, MTH 164, MTH 173, MTH 174, MTH 240; ECO 201, ECO 202; ACC 211, ACC 212; CST 110; ITE 115; SDV 100, SDV 101; PSY 116, PSY 200, PSY 215, PSY 230, PSY 235; PLS 135, PLS 136; SOC 200, SOC 268; ART 201, ART 202, ART 211, ART 212.]

APPENDIX G: Academic Assessment Evaluation Rubric

Academic Program Assessment Evaluation Rubric v1.0
Dabney S. Lancaster Community College

Highlighted Elements Not Expected in Reports Until 2014-2015 Assessment Cycle

Each element below is rated on a four-point scale, and a score is recorded for each: Beginning (1), Developing (2), Good (3), Exemplary (4).

I. Student Learning Outcomes

A. Clarity and Specificity
Beginning (1): No outcomes stated.
Developing (2): Outcomes present, but with imprecise verbs (e.g., know, understand), vague description of the content/skill/attitudinal domain, and non-specificity of whom should be assessed (e.g., "students").
Good (3): Outcomes generally contain precise verbs, rich description of the content/skill/attitudinal domain, and specification of whom should be assessed (e.g., "Students graduating from the Nursing A.D.N. program").
Exemplary (4): All outcomes stated with clarity and specificity, including precise verbs, rich description of the content/skill/attitudinal domain, and specification of whom should be assessed (e.g., "Students graduating from the Nursing A.D.N. program").

B. Orientation
Beginning (1): No outcomes stated in student-centered terms.
Developing (2): Some outcomes stated in student-centered terms.
Good (3): Most outcomes stated in student-centered terms.
Exemplary (4): All outcomes stated in student-centered terms (i.e., what a student should know, think, or do).

II. Learning Experiences Mapped to SLOs
Beginning (1): No activities/courses listed.
Developing (2): Activities/courses listed, but the link to outcomes is absent.
Good (3): Most outcomes have classes and/or activities linked to them.
Exemplary (4): All outcomes have classes and/or activities linked to them.


III. Systematic Method for Evaluating Progress on SLOs

A. Relationship between measures and outcomes
Beginning (1): Seemingly no relationship between outcomes and measures, or no measures indicated for a majority of outcomes.
Developing (2): Measures are linked to stated outcomes, but no explanation of how the measures were created to explicitly assess the outcome is provided. The measures do not seem to be an appropriate format for the stated outcome. All outcomes may or may not be linked to specific measures. Not enough information is provided about the measure to determine the appropriateness of its relationship to the outcome.
Good (3): General detail about how outcomes relate to measures is provided. For example, the test was developed for a given course in which the outcome is taught, or the program uses a licensure exam to determine student achievement.
Exemplary (4): Detail is provided regarding the outcome-to-measure match. When appropriate, the measure uses multiple items to assess a specific outcome. For performance assessments, rubrics or checklists were developed to assess specific outcomes and the specific criteria of the performance assessments are linked. For licensure exams, subsections of the test are mapped to competencies/outcomes.

B. Types of Measures
Beginning (1): No measures indicated for a majority of outcomes.
Developing (2): Most outcomes assessed primarily via indirect measures (e.g., surveys, course grades).
Good (3): Most outcomes assessed primarily via direct measures.
Exemplary (4): All outcomes assessed using at least one direct measure (e.g., tests, essays, performance tasks, writing rubrics, checklists).

C. Multiple Measures Used
Beginning (1): No measures indicated for a majority of outcomes.
Developing (2): All outcomes assessed using only one measure or indicator.
Good (3): Some outcomes assessed using multiple measures or indicators. Primary measures constitute performance assessments, essays/research papers, or multiple test items. Secondary measures may or may not include indirect measures (e.g., surveys).
Exemplary (4): The majority of outcomes assessed using multiple measures or indicators. Primary measures constitute performance assessments, essays/research papers, or multiple test items. Secondary measures may or may not include indirect measures (e.g., surveys).

D. Specification of Targets for Success
Beginning (1): No benchmarks/targets for success identified for outcomes.
Developing (2): Statement of desired result (e.g., student growth, comparison to previous year's data, comparison to faculty standards, performance vs. a criterion), but no specificity (e.g., students will grow; faculty will be satisfied with the results).
Good (3): Desired result specified (e.g., our students will meet or exceed the national average score; 90% of students will demonstrate competency). "Gathering baseline data" is acceptable for this rating.
Exemplary (4): Desired result specified and justified (e.g., last year the typical student scored 20 points on measure x; the current cohort underwent more extensive coursework in the area, so we hope that the average student scores 22 points or better).

E. Data Collection & Research Design Integrity
Beginning (1): No information is provided about the data collection process, or data not collected.
Developing (2): Limited information is provided about data collection, such as who and how many took the assessment, but not enough to judge the veracity of the process (e.g., thirty-five students took the test). Or, information is provided, but the process presents several major issues that jeopardize the validity of the findings (e.g., first-year students are assessed on program outcomes, use of course grades).
Good (3): The data collection process is well documented with information such as a description of the sample, testing protocol, testing conditions, and student motivation. Nevertheless, there are some methodological issues that could potentially weaken the validity of the findings (e.g., low student motivation, unstandardized testing conditions, poor or no reliability information).
Exemplary (4): The data collection process is clearly explained and is appropriate to the specification of desired results (e.g., representative sampling, adequate motivation, standardized testing conditions, students assessed at the appropriate time in the program).

IV. Presence of Results

A. Presentation of Results/Findings
Beginning (1): No results presented.
Developing (2): Results are present, but it is unclear how they relate to the outcomes or the desired results for the outcomes. Or, the only results presented are general statements about student performance (e.g., students performed well).
Good (3): Results are present, and they directly relate to the outcomes and the desired results for outcomes, but the presentation is sloppy or difficult to follow. Statistical analysis may or may not be present. If applicable, results are disaggregated by distance learning method (compressed video, online, independent study).
Exemplary (4): Results are present, disaggregated by distance learning method (if applicable), and they directly relate to the outcomes and the desired results for outcomes; they are clearly presented (e.g., tables or graphs), and any statistical analyses seem appropriate (t-tests, means, or percentages provided).

B. History of Results
Beginning (1): No results presented.
Developing (2): Only the current year's results provided.
Good (3): Past iteration(s) of results (e.g., last year's) provided for some assessments in addition to the current year's.
Exemplary (4): Past iteration(s) of results (e.g., last year's) provided for the majority of assessments in addition to the current year's.

C. Interpretation of Results
Beginning (1): No interpretation attempted.
Developing (2): Interpretation attempted, but the interpretation does not refer back to the outcomes or desired results of outcomes. Or, the interpretations are clearly not supported by the methodology and/or results.
Good (3): Interpretations of results seem to be reasonable inferences given the outcomes, desired results of outcomes, and methodology. A statement is made regarding whether or not the results indicate that students are meeting the outcome, but no reference is made to an action plan or to the effects of an action plan on student learning. If results are disaggregated by distance learning, little to no comparison of these results is made.
Exemplary (4): Interpretations of results seem to be reasonable given the outcomes, desired results of outcomes, and methodology, and the interpretation includes how classes/activities might have affected results. Results were shared with any other faculty involved with the program, such as adjuncts or other full-time faculty. If results are disaggregated by distance learning method, results are compared and inferences are made about any differences between methods.

V. Evidence of Improvements Based on Assessment Results

A. Improvement of programs regarding student learning and development
Beginning (1): No mention of any improvements to the program.
Developing (2): Examples of improvements or plans for improvement documented in an action plan, but the link between them, the assessment findings, and specific outcomes is absent or unclear.
Good (3): Examples of improvements (or plans to improve) are documented in an action plan and directly related to findings of assessment and outcomes. However, the improvements lack specificity.
Exemplary (4): Examples of improvements (or plans to improve) documented in an action plan and directly related to findings of assessment and outcomes. These improvements are very specific (e.g., approximate dates of implementation, specifics of the improvements, where in the curriculum they will (have) occur(red), budget or resources requested (if needed)).

B. Improvement of assessment process**
Beginning (1): No mention of how this iteration of assessment is improved from past administrations.
Developing (2): Some critical evaluation of past and current assessment, including acknowledgement of flaws, but no evidence of improving upon past assessment or making plans to improve assessment in future iterations.
Good (3): Critical evaluation of past and current assessment, including acknowledgement of flaws, plus evidence of some moderate revision or general plans for improvement of the assessment process.
Exemplary (4): Critical evaluation of past and current assessment, including acknowledgement of flaws; both present improvements and intended improvements are provided, with specific details for each. Either the present improvements or the intended improvements must encompass a major revision.

**If a program receives a score of exemplary in at least 8 elements, it will automatically receive a score of at least “Good” for element 5b.

Rubric adapted from: Fulcher, K. H., Sundre, D. L., & Russell, J. A. (2009). Assessment Progress Template Rubric. The Center for Assessment and Research Studies, James Madison University, Harrisonburg, VA.

APPENDIX H: SLO Assessment Reporting Template for Educational Programs

SLO Assessment Reporting Template Dabney S. Lancaster Community College

I. LEARNING OUTCOMES

Describe your student learning outcomes using measurable verbs and specific content. Specify when the student should be able to attain the outcome (e.g., graduating students will…; after their first year, students will…).

II. Learning Experiences Mapped to Outcomes

Provide information about where in the curriculum each program learning outcome is covered. Consider providing a grid with course prefixes along the top row and learning outcomes down the side. Mark an X in the columns where the outcome is taught in the corresponding course.

III. METHODS FOR GATHERING ASSESSMENT RESULTS

Describe the measure(s) you use to assess student learning for EACH learning outcome. Consider including the following information:

a. What type of measure is it (e.g., Multiple choice test, essay graded with a rubric, portfolio, checklist)?

b. Where in the curriculum is the measure administered? List specific course abbreviation or the place in the curriculum when students tend to take the course (e.g., spring semester of the second year).

c. If it isn’t obvious, briefly explain how the measure is appropriate for the stated outcome. How do you ensure that the test/rubric/checklist is explicitly linked to the learning outcome?

d. If possible, provide a copy of the rubric or a test outline that shows the link between the measure and the learning outcome.

Describe the process by which you collect assessment results:

e. How is the assessment administered? Is it given at the beginning of the semester, the end of the semester, or outside of class?

f. Is the assessment high stakes (i.e., is it given for a grade)?

g. If the assessment is administered during class, are there multiple course sections taught by multiple faculty members? How do you ensure consistency in data collection between sections?

IV. SPECIFYING A TARGET FOR SUCCESS

Specify the target score or level of achievement that indicates student success on the stated outcome. Consider the following:


a. How do you determine if students are meeting the learning outcome (i.e., What score on the measure would make you satisfied that students are meeting the learning outcome?)

b. How did you decide on this target for success?

V. PRESENCE AND INTERPRETATION OF RESULTS

Clearly provide the results from the measure(s). Please include the following information:

a. The number of students who completed the assessment. Are all of these students enrolled in the program or are you just using class attendance (Best practice is to only include results from students in the program of interest)?

b. The scores from the assessment that directly relate to the learning outcome.

c. Interpret the findings relative to the target score. Did students meet the target? If not, why do you think students did not meet the outcome? If yes, be sure to state that students are meeting the learning outcome.

d. If you have implemented an action plan to improve performance on this outcome, do the results illustrate an improvement in student performance? If yes, interpret the growth with respect to the program improvement. To what degree do you think the program changes were responsible for the improvement? If results did not improve, will you stick with the program improvements specified in the action plan or will you create a new action plan?

VI. IDENTIFYING AND IMPLEMENTING IMPROVEMENTS BASED ON ASSESSMENT RESULTS

If students are not meeting the learning outcome, provide an action plan that you will implement next academic year to improve performance. Consider the following information:

a. A description of the proposed program modifications. Provide as much detail as possible.

b. A general timeline for implementation. Which semester will you implement this plan? When will the action plan be fully implemented?

c. Any resources required for implementation.

d. An update on any prior action plans. Were they fully implemented? Are they still being implemented? Did they have to be terminated or placed on hold?

APPENDIX I: Academic Program Assessment Template

Assessment Reporting Template: Educational Programs

Program:

Reporting Year:

Calendar Year (Spring and Fall) ☐ Academic Year (Fall and Spring) ☐ Cohort Based ☐

Student Learning Outcomes Summary

List all learning outcomes for your program (Graduates will be able to…):

1.

2.

3.

Outcome 1:

List Outcome Here

List Applicable Specializations, Certificates or Career Studies Certificates:

Measure 1:

Target for Success:

Data Collection Process:

Results from the past 2+ years:

2013-2014

2012-2013

2011-2012

Interpretation of Results (Include impact of previous improvement plans on results here):

Target Met? Yes ☐ No ☐

Improvement Plans Based on Results (Include as much detail as possible):

If additional resources are needed to implement this improvement plan, please describe below:

APPENDIX J: Administrative Unit Assessment Rubric

Administrative Unit Assessment Evaluation Rubric v1.0
Dabney S. Lancaster Community College

Highlighted Elements Not Expected in Reports Until 2014-2015 Assessment Cycle

Each element below is rated on a four-point scale, and a score is recorded for each: Beginning (1), Developing (2), Good (3), Exemplary (4).

VI. Mission/Goals/Outcomes

C. Mission Statement Present
Beginning (1): No mission statement provided.
Developing (2): A statement is provided, but it does not include information about the intended role and services of the department/office.
Good (3): Mission statement includes information about the role and services of the department/office, but no connection can be made to the institutional mission.
Exemplary (4): Mission statement includes information about the role and services of the department/office, identifies stakeholders, and demonstrates clear alignment with and/or expansion of the institutional mission.

D. Clarity of Departmental Goals
Beginning (1): Department goals are absent.
Developing (2): Department goals present, but statements don't identify stakeholders (e.g., students, faculty, staff, community) or departmental functions (e.g., provide technical support).
Good (3): Department goals are present and clearly identify the stakeholder and departmental functions. Only some goals have specific outcomes linked to them.
Exemplary (4): All department goals identify the stakeholder and the function that serves them. All goals have measurable outcomes linked to them.

E. Departmental Goals Linked to Institutional Goals
Beginning (1): Department goals absent.
Developing (2): Department goals are listed, but there is no link to institutional goals.
Good (3): Some department goals linked to institutional goals.
Exemplary (4): All department goals are linked to at least one institutional goal.

F. Clarity and Specificity of Outcomes
Beginning (1): No outcomes stated.
Developing (2): Outcomes present, but with imprecise verbs (e.g., know, understand, provide) and vague description of content. The outcome lacks specificity regarding level of success and does not appear to be realistic, attainable, or timely.
Good (3): Outcomes generally contain precise verbs and rich description of the content, and the target of the outcome is specified and measurable. Generally specific, but it may be difficult to determine the extent to which the outcome is realistic, attainable, or timely.
Exemplary (4): All outcomes are clearly stated in measurable terms. Expectations for achievement are reasonable and targets appear to be attainable given the scope of the statement. Outcomes are specific enough to determine the extent to which they are realistic, attainable, and timely.


VII. Systematic Method for Evaluating Progress on Outcomes

F. Relationship between measures/performance indicators (PI) and outcomes
Beginning (1): Seemingly no relationship between outcomes and measures, or no measures/PI indicated for a majority of outcomes.
Developing (2): Measures/PI are linked to stated outcomes, but no explanation of how the measures were developed to explicitly assess the outcome is provided. The measures/PI do not seem to be an appropriate format for the stated outcome. All outcomes may or may not be linked to specific measures/PI. Not enough information is provided about the measure/PI to determine the appropriateness of its relationship to the outcome.
Good (3): General detail about how outcomes relate to measures is provided. For example, the survey items were written to measure satisfaction of graduating students. Any performance indicators appear to be reasonable given the stated outcome, though the degree to which the PI can be used to identify changes may be questionable (e.g., using overall retention rates to evaluate the effectiveness of a single program).
Exemplary (4): Information is provided aligning the measure with the outcome. The measure/PI is an appropriate method for assessing the stated outcome and provides information that can be useful in identifying changes (e.g., using the retention rates of a specific group to determine the effectiveness of a program that targets the group in question).

G. Types of Measures
Beginning (1): No measures/PI indicated for a majority of outcomes.
Developing (2): Most outcomes assessed primarily via indirect measures (e.g., self-reported data).
Good (3): Most outcomes assessed primarily via direct measures (e.g., retention rates, satisfaction surveys, usage data).
Exemplary (4): All outcomes assessed using at least one direct measure (e.g., PI, surveys (if appropriate for the outcome), rubrics).

H. Specification of Targets for Success
Beginning (1): No benchmarks/targets for success identified for outcomes.
Developing (2): Statement of desired result (e.g., student growth, comparison to previous year's data), but no specificity (e.g., students will grow; attendance will increase).
Good (3): Desired result specified (e.g., attendance will increase by 5%; 90% of students will state satisfaction with the program). "Gathering baseline data" is acceptable for this rating.
Exemplary (4): Desired result specified and justified (e.g., last year the fall-to-spring retention rate for UDP students was 55%; the current cohort underwent extensive one-on-one advising, so we hope that the fall-to-spring retention rate improves by 2%).

I. Data Collection & Research Design Integrity
Beginning (1): No information is provided about the data collection process, or data are not collected.
Developing (2): When measures are used, limited information is provided about data collection, such as who and how many took the assessment, but not enough to judge the veracity of the process (e.g., thirty-five students took the survey). Or, information is provided, but the process presents several major issues that jeopardize the validity of the findings (e.g., graduate survey data used to determine the success of all students).
Good (3): When measures are used, the data collection process is well documented with information such as a description of the sample, survey protocol and conditions, and motivation. Nevertheless, there are some methodological issues that could potentially weaken the validity of the findings (e.g., small sample, unstandardized testing conditions, poor or no reliability information). Or, all assessment tools are performance indicators that require no data collection procedures.
Exemplary (4): When a measure is used, the data collection process is clearly explained and is appropriate to the specification of desired results (e.g., representative sampling, adequate motivation, standardized testing conditions, population assessed at the appropriate time in the program/intervention).

VIII. Presence of Results

D. Presentation of Results/Findings
Beginning (1): No results presented.
Developing (2): Results are present, but it is unclear how they relate to the outcomes or the desired results for the outcomes. Or, the only results presented are general statements about performance (e.g., a survey was administered, a project was completed).
Good (3): Results are present, and they directly relate to the outcomes and the desired results for outcomes, but the presentation is difficult to follow. Statistical analysis may or may not be present. If applicable, results are not disaggregated by appropriate groups (men/women, first-year/returning students, etc.).
Exemplary (4): Results are present, disaggregated by groups (if applicable), and they directly relate to the outcomes and the desired results for outcomes; they are clearly presented (e.g., tables or graphs), and any statistical analyses seem appropriate (t-tests, means, or percentages provided).

E. History of Results
Beginning (1): No results presented.
Developing (2): Only the current year's results provided.
Good (3): Past iteration(s) of results (e.g., last year's) provided for some assessments in addition to the current year's.
Exemplary (4): Past iteration(s) of results (e.g., last year's) provided for the majority of assessments in addition to the current year's.

F. Interpretation of Results
Beginning (1): No interpretation attempted.
Developing (2): Interpretation attempted, but the interpretation does not refer back to the outcomes or desired results of outcomes. Or, the interpretations are clearly not supported by the methodology and/or results.
Good (3): Interpretations of results seem to be reasonable inferences given the outcomes, desired results of outcomes, and methodology. A statement is made regarding whether or not the results indicate that students are meeting the outcome, but no reference is made to an action plan or to the effects of an action plan on student learning. If results are disaggregated by groups, little to no comparison of these results is made.
Exemplary (4): Interpretations of results seem to be reasonable given the outcomes, desired results of outcomes, and methodology, and the interpretation includes how services/programs might have affected results. Results were shared with other staff in the department or with stakeholders. If results are disaggregated by groups, results are compared and inferences are made about any differences between groups.

IX. Evidence of Improvements Based on Assessment Results

C. Improvement of programs regarding student learning and development
Beginning (1): No mention of any improvements to the department or specific programs/services.
Developing (2): Examples of improvements or plans for improvement documented in an action plan, but the link between them, the assessment findings, and specific outcomes is absent or unclear.
Good (3): Examples of improvements (or plans to improve) are described and/or documented in an action plan and directly relate to findings of assessment and outcomes. However, the improvements lack specificity.
Exemplary (4): Examples of improvements (or plans to improve) described and/or documented in an action plan and directly related to findings of assessment and outcomes. These improvements are very specific (e.g., approximate dates of implementation, specifics of the improvements, budget or resources requested (if needed)).

D. Improvement of assessment process**
Beginning (1): No mention of how this iteration of assessment is improved from past administrations.
Developing (2): Some critical evaluation of past and current assessment, including acknowledgement of flaws (e.g., low survey response needs to be addressed), but no evidence of improving upon past assessment or making plans to improve assessment in future iterations.
Good (3): Critical evaluation of past and current assessment, including acknowledgement of flaws, plus evidence of some moderate revision or general plans for improvement of the assessment process (e.g., low survey response will be addressed by sending out the survey earlier).
Exemplary (4): Critical evaluation of past and current assessment, including acknowledgement of flaws; both present improvements and intended improvements are provided, with specific details for each. Either the present improvements or the intended improvements must encompass a major revision (e.g., staff will be asked to help revise survey items to more closely align with outcomes; the survey will be sent to all students within the first two weeks of the semester and a $50 gift card will be raffled off to students who complete it).

**If a program receives a score of exemplary in at least 8 elements, it will automatically receive a score of at least “Good” for element 4b

Rubric adapted from: Fulcher, K. H., Sundre, D. L., & Russell, J. A. (2009). Assessment Progress Template Rubric. The Center for Assessment and Research Studies, James Madison University, Harrisonburg, VA.

APPENDIX K: Administrative Unit Assessment Report Outline

Administrative Unit Assessment Reporting Template Dabney S. Lancaster Community College

I. Mission Statement

Does your department have a mission statement? If so, provide it on the cover page. Ideally, the mission provides a comprehensive statement about the core functions of your department, including the stakeholders you serve. It should also explicitly or implicitly refer to DSLCC’s mission statement, demonstrating alignment with and expansion of the College’s mission. For those units with several staff, it is encouraged to craft a mission statement as a team.

II. Departmental Goals Meant to align with your departmental mission statement and with institutional goals, describe the goals of your department. Different from outcomes (described next), your goals should be general statements about the core functions and services that you provide and should reference the stakeholders who benefit as a result of these services (for example, a goal for Student Services might be: “Students will receive accurate and timely academic advising”).

III. OUTCOMES Your outcomes are how you will determine your effectiveness of meeting your goals. Outcomes are specific, measurable statements about specific services, programs, and functions. These statements can be written from the perspective of what your department will do (conduct maintenance inspections once per month) or from the perspective of stakeholders (90% of students will be satisfied with front desk customer service). Describe your outcomes using measurable verbs and specific content.

IV. METHODS FOR GATHERING ASSESSMENT RESULTS Describe the measure(s) and/or performance indicators you use to evaluate effectiveness for EACH outcome. Consider including the following information:

h. What type of measure is it (e.g., survey, checklist, etc.)?

i. What performance indicator are you using (e.g., retention rates, budget amounts, revenue generated, usage data, etc.)? Performance indicators are set pieces of data, rather than information gathered from assessment measures such as surveys or checklists.

j. If it isn't obvious, briefly describe the measure in order to demonstrate its appropriateness for measuring the stated outcome.

k. For any measures you may have, briefly describe: 1) how the data will be collected, 2) when the data will be collected, and 3) the population from whom the data will be collected.


V. SPECIFYING A TARGET FOR SUCCESS Specify the target score or level of achievement that indicates success on the stated outcome. Consider the following:

c. How do you determine if your department is meeting the outcome (i.e., What score on the measure/performance indicator would make you satisfied that your department is meeting the outcome?) Consider percentages (90% of students will be satisfied with services), benchmarks (retention rates for at-risk students will be at or above the rates of other students), or improvement over time (the amount of revenue generated from non-credit classes will increase by 5% over last year).

VI. RESULTS Clearly provide the results from the measure(s). Please include the following information:

a. The number of students/stakeholders who completed the assessment.

b. The scores from the measure/performance indicator that directly relate to the outcome.

c. At least two years of data (if available). Having multiple years of data helps provide context and evidence of improvement over time.

VII. Interpretation of Results Interpret the results using a few brief statements. Consider the following:

a. Interpret the findings relative to the target score. Did you meet the target? If not, why do you think you did not meet the outcome? If yes, be sure to state that you met the outcome. The results, when considered against the target, should clearly support either assertion.

b. If you implemented an action plan over the past year to improve performance on this outcome, do the results illustrate an improvement in performance? If yes, interpret the growth with respect to the department improvement. To what degree do you think the changes were responsible for the improvement? If results did not improve, will you stick with the improvements specified in the action plan or will you create a new action plan?

VIII. DESCRIPTION OF CHANGES/IMPROVEMENTS IMPLEMENTED OVER MOST RECENT ACADEMIC/CALENDAR YEAR. There may be some overlap here with your interpretations, but provide a description of any changes/improvements you made over the past year in an effort to improve upon this year’s results. Consider the following:

a. A description of the changes you had planned to implement (use last year's action plan to inform this description). Ideally, you fully implemented your action plan from last year and can credit these changes for altering the results. Describe what you did, and highlight anything that you couldn't do, or that you did differently from the proposed plan.

b. Any additional changes you made that might not have been part of a previous action plan.

c. This section should be written in the past tense.


IX. IDENTIFYING IMPROVEMENTS TO BE IMPLEMENTED OVER NEXT ACADEMIC/CALENDAR YEAR If you are not meeting the outcome (or you met the outcome, but still see ways to improve), provide an action plan that you will implement next academic year to improve performance. Consider the following information:

a. A description of the proposed modifications. Provide as much detail as possible.

b. A general timeline for implementation. When will you implement this plan? When will the changes be implemented? What additional budget/resources will be needed?

c. Any resources required for implementation.

d. This section should be written in the future tense.

APPENDIX L: Administrative Unit Assessment Plan Template

Administrative Unit Assessment Plan 2014-2015

Department/Office:

Department/Office Mission Statement (if available):

Date Reviewed with Supervisor:

Please complete the following assessment plan and review with your supervisor prior to September 15, 2014. Please submit electronic copies of your final plan to your supervisor and to the Director of Institutional Effectiveness no later than September 22, 2014 (Feel free to submit it earlier!).

Departmental Goal: [Broad statement about core functions of office/department]

DSLCC Goal: [From the President’s Goals, list the college’s goal(s) that your departmental goal aligns with]

Outcome [Measurable statement about specific functions/services that will determine the extent to which you are meeting the departmental goal]

President’s Goal? ☐ Yes. Please specify: ___________ [ex. Goal 1.a.] ☐ No

Measure(s)/ Performance Indicators

[If you use multiple measures, consider discussing each measure on a separate page]

Target for Success

[Specify the level of performance on your measure or P.I. at which you will consider your outcome met]

Results (please provide at least two years of data if possible)

2012-2013 2013-2014 2014-2015

Interpretations of results

You do not need to provide this information in the plan, only in the end-of-year report.

[Briefly explain any reasons or rationale for this year's results. Feel free to explain them within the context of past years' results as well. Be sure to interpret them in the context of your specified target.]

Target Met? ☐ Yes ☐ No ☐ Partially

Current Changes Planned for 2014-2015 Based on 2013-2014 Results

[Describe any changes/ideas you plan to implement this year based on last year's results.]

Changes Planned for 2015-2016

[Leave blank until end of year report]

[Based on results for 2014-2015, describe changes you intend to make for 2015-2016 here. Be as descriptive as possible, including timelines and budgets if known]