
Campus Profiles: Midlands Technical College


assessment reports were perhaps too vague to guide policy-making. Findings were never digested into a form that busy policymakers could use. Instead, they were displayed as obscure charts compiled in huge notebooks with insufficient description of what they meant. However, as our knowledge of LUSLLL was combined with enhanced skills in word processing, data analysis, and graphics production, our reports became more focused, shorter, and more useful.

A second primary product of LUSLLL assessment is timely responses to assessment findings. Initial studies called for improvement in faculty advising and library services in particular. LUSLLL more than doubled the number of faculty interacting by telephone with students, while virtually eliminating facilitators (nonfaculty personnel) who had helped students with problems. Library Services increased library holdings, installed a toll-free line for LUSLLL students, and further improved procedures for serving students by setting up a computer-based faculty-student communication system. These and other timely responses to assessment findings paved the way to increased quality of service.

Improved education outcomes followed as a third primary product of assessment in our institution, but a clearer picture of education outcomes can only come with further assessment. Nonetheless, results of our studies indicate that we have made progress in selected areas, library and computer usage in particular.

Throughout the short history of LUSLLL, the exponential growth of the program has placed enormous pressure on the assessment process. Procedures have been developed to collect vast quantities of data and summarize them in reports that clearly present strengths and weaknesses in ways that elicit responses properly aimed toward improved institutional programs and education outcomes.

David Towles is coordinator of research and assessment in the School of Lifelong Learning and Ellen Black is associate vice president of planning, research, and assessment at Liberty University, Lynchburg, Virginia.

Midlands Technical College

Midlands Technical College (MTC), which serves the Columbia, South Carolina, metropolitan area with three campuses, has seen a 61% enrollment growth since 1978. It currently has over 11,000 students, enrolling nearly one-quarter of the area's 1990 high school graduating seniors who went on to college. MTC also serves approximately 18,000 area residents through its Continuing Education Division.

In 1986, the faculty, staff, and governing board of MTC initiated a strategic planning process, which resulted in the document Vision of Excellence. This document provided the foundation for a comprehensive institutional effectiveness initiative begun in 1988.

At this same time, South Carolina enacted a law known as the "cutting edge" legislation. This law mandated that effective systems of quality assessment and accountability be established and maintained by South Carolina public colleges. Systems must be designed to determine institutional effectiveness, disseminate the results of outcomes to constituents within the state, and initiate changes in curriculum, programs, and policy based on data related to institutional effectiveness. The MTC initiative goes beyond South Carolina's mandated requirements, and, in 1989-1990, MTC received the designation of lead institutional effectiveness college among South Carolina's two-year technical and community colleges.

In defining institutional effectiveness, MTC also goes beyond the traditional definitions of program evaluation and assessment of student outcomes. MTC considers it to be an institutional perspective that focuses on accurate planning, assessment of accomplishments (both of students and the institution's overall effectiveness), and the use of assessment results to plan and make decisions. Institutional effectiveness has two major components: planning, which is defined as a process of documenting the intended purpose, direction, and expected outcomes of the college and providing foresight in the formulation of policies, programs, and services; and evaluation, which is defined as a process of measuring the college against its stated purpose and indicators of effectiveness in terms of outcomes accomplished.

The planning process involves both strategic planning and operational planning. Strategic planning is used to identify the major direction and priority initiatives of the college. This begins with clarification of the college's mission, its role and scope, and its values. In order to achieve realistic results, MTC is clear about the internal and external contexts within which the college operates.

Based on these considerations, a set of eight institutional goals was created for the period 1992-1997. These goals include enhancing and developing the curricula to meet multiple challenges; providing the highest quality instruction through excellence in teaching and comprehensive instructional support; and maintaining and refining support processes that enhance student success. In turn, each goal is associated with a set of priority initiatives, such as the implementation of new general education core requirements and implementation of the instructional component of effectiveness measures, including program review, needs analyses, and productivity measures.

While long-range goals and related initiatives are necessary to give an institution direction, they are not sufficient to provide a complete picture of ongoing programs and services and their effectiveness. For this purpose, MTC developed a set of assessment criteria to answer the question, "How effective is our institution in providing ongoing programs and services that encourage student success and support our mission?" These criteria are called "critical success factors" and include accessible and comprehensive programs of high quality, student satisfaction and retention, posteducation satisfaction and success, economic development and community involvement, sound and effective resource management, and dynamic organizational involvement and development.

According to MTC documents, critical success factors are the key things that must succeed for the organization to flourish and achieve its goals. They are defined in a way that guides the development of indicators of effectiveness and sets of measurable criteria in response to two questions: "What do we want the results of our college's effectiveness to be?" and "What specific evidence are we willing to accept as an indication that the results have actually been achieved?" For example, the following are two critical success factors and their related indicators of effectiveness: (1) student satisfaction and retention: accurate entry testing and course placement, retention to achievement of students' goals, satisfaction with instruction and personal growth, and assessment of student services; and (2) posteducation satisfaction and success: graduate employment and continuing education, employer satisfaction with graduates, and alumni satisfaction with education and training.

The indicators and their supporting measurement criteria are the observed, quantified, and qualified results of performance on the critical success factors. For example, two of the standards set in relation to the student satisfaction and retention factor and one of its indicators, retention to achievement of students' goals, are as follows: retention of subpopulation groups will be within 5% of the collegewide average, and the freshman-to-sophomore retention rate will be at or above the national retention rate for two-year public colleges. The factors, indicators, and standards (together with their results) are reviewed each year to provide an update on the overall institutional effectiveness of MTC.
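To make the mechanics of such standards concrete, here is a minimal sketch in Python using entirely hypothetical figures (the rates, subgroup labels, and national benchmark are invented, not MTC data, and "within 5%" is read here as within 5 percentage points) of how the two retention standards above might be checked:

```python
# Illustrative check of two retention standards (all figures hypothetical).

collegewide_retention = 0.62        # assumed collegewide retention rate
national_two_year_rate = 0.55       # assumed national rate, two-year public colleges
freshman_to_sophomore_rate = 0.58   # assumed freshman-to-sophomore retention rate

# Assumed retention rates for subpopulation groups.
subgroup_retention = {
    "group_a": 0.60,
    "group_b": 0.66,
    "group_c": 0.59,
}

# Standard 1: each subpopulation's retention is within 5 percentage points
# (interpreted as +/- 0.05) of the collegewide average.
within_five_points = {
    group: abs(rate - collegewide_retention) <= 0.05
    for group, rate in subgroup_retention.items()
}

# Standard 2: freshman-to-sophomore retention meets or exceeds the national rate.
meets_national_rate = freshman_to_sophomore_rate >= national_two_year_rate

print(within_five_points)   # {'group_a': True, 'group_b': True, 'group_c': True}
print(meets_national_rate)  # True with these hypothetical figures
```

In practice, the data behind such checks would come from the operational-plan sources described in the next paragraph.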

Operational planning involves the year-to-year process of converting the strategic plan into action. The operational plan includes the collection of data related to each critical success factor. Sources of information include the student database on enrollment, retention, race, gender, grade point average, and developmental performance; program enrollment, full-time equivalents, section sizes, grade point averages, graduation rates, competence performance, placement rates, senior projects, licensure test results, transfer performance, advisory committee reviews, and instructor loads; surveys of current students, withdrawing students, graduates, employers, faculty, staff, and community members; support services involving student extracurricular activities, career and counseling services, usage reports, and financial aid reports; administrative data on cost per full-time equivalent, financial audits, college goals, facilities usage studies, and external funding; and comparison to normed data from standardized tests, licensure examinations, and national standards and guidelines.
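Purely as an illustration (the category names and the factor-to-source links below are my own assumptions, not an MTC schema), such sources could be cataloged in a simple structure so that the annual review can verify which data inform each critical success factor:

```python
# Hypothetical catalog of operational-plan data sources, grouped by category.
data_sources = {
    "student database": ["enrollment", "retention", "race", "gender",
                         "grade point average", "developmental performance"],
    "program data": ["full-time equivalents", "section sizes", "graduation rates",
                     "placement rates", "licensure test results", "transfer performance"],
    "surveys": ["current students", "withdrawing students", "graduates",
                "employers", "faculty", "staff", "community members"],
    "administrative data": ["cost per full-time equivalent", "financial audits",
                            "facilities usage studies", "external funding"],
}

# Example (assumed) link from one critical success factor to the categories
# that inform it; the annual review could confirm the needed data exist.
factor_inputs = {
    "student satisfaction and retention": ["student database", "surveys"],
}

for factor, categories in factor_inputs.items():
    missing = [c for c in categories if c not in data_sources]
    print(factor, "->", "complete" if not missing else f"missing: {missing}")
```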

The evaluation process is an ongoing part of implementing the operational plan. Of course, the preparation of budgets is influenced by the planning process, as well as by anticipated continuing costs.

As the result of a reorganization in 1989 and a merger in 1990, the Office of Institutional Effectiveness and the Office of Research and Analysis were combined, and their new mission was described. This merger helped to ensure that those making decisions at all levels would have the best information possible and that they would use the information in support of student success and the college's mission.

From their experience, MTC faculty and administrators have identified the following principles for implementing institutional effectiveness programs:

• Secure support from top leaders and keep them informed and involved.
• Connect assessment to the college purpose and make it part of a campuswide systematic planning process.
• Create internal motivation by deciding which questions about student learning and the college's programs and services are most important to answer, since not everything can be assessed.
• Assess where the institution is and what is currently being done first, and then build on that.
• Review and study what others have done so as to avoid reinventing the wheel.
• Develop or obtain technical expertise as needed and clearly assign responsibilities.
• Organize the effectiveness assessment so that it encourages active involvement at all levels in collaborative efforts and in data-gathering processes.
• Determine how effectiveness will be assessed using creditable research methods and multiple measures.
• Focus on outcomes, performance, effective communication of results, and their use for decision making.
• Follow up to ensure that results produce change.

For more information about the institutional effectiveness effort at MTC, contact Dorcas A. Kitchings, Director, Research and Analysis, Midlands Technical College, P.O. Box 2408, Columbia, SC 29202. Tel.: (803) 738-1400.

Peter J. Gray is director of evaluation and research at the Center for Instructional Development, Syracuse University, Syracuse, New York.
