Getting Started in Outcomes Assessment: Setting Objectives, Selecting Instruments, Utilizing Findings

Donald Lumsden, Michael E. Knight

Kean College of New Jersey initiated an outcomes assessment program in 1984 as part of its Governor’s Challenge Grant, which had as its theme Excellence and Equity. Learning outcomes place emphasis on the teaching-learning process with quality education as the goal. In this context, assessment is part of a cybernetic feedback system that provides rational bases for adjusting instruction (curriculum, teaching methods, student preparation) to ensure that educational goals are being achieved.

This article examines what we have learned through our experiences in outcomes assessment, focusing on the beginning steps in implementing such a program. Its purpose is to assist others in higher education who are embarking on assessment programs of their own. These strategies for outcomes assessment are divided into three categories: setting objectives, selecting instruments, and utilizing findings.

Setting Objectives

After examining programs and determining objectives, many have concluded that this step alone may be sufficient justification for the total outcomes assessment process. Academic programs tend to grow incrementally, and their goals frequently are stated in terms of what will be taught or what courses students will take. Assessment provides a refreshing point of view as the focus shifts from content alone to the results of the curriculum.

The following guidelines can help faculty prepare outcomes-oriented objectives.

Be Inclusive. To begin, seek out multiple points of view and establish draft lists of potential program goals. Faculty, alumni, enrolled students, and, where appropriate, outside professionals can bring different perspectives on what graduates of programs should achieve. Transfer programs in two-year colleges may be improved by close linkages with institutions to which their graduates apply for admission. Faculty at Kean College found surveys and meetings helpful in gaining a pluralistic point of view. For example, some programs found the Program Self-Assessment Survey (PSAS) materials from ETS useful at this developmental stage.

Be Comprehensive. At Kean, the charge is to examine a broad range of learning outcomes. The scope includes cognitive outcomes, what students know; performance outcomes, what they can do; affective outcomes, what they think about and how they value the subject matter; and satisfaction, how they feel about the curriculum and services in the program. To the extent possible, all potential outcomes of the college experience are considered.

Keep an Outcomes Perspective. Faculty are oriented to examine content and process. An outcomes perspective is a different look at the curriculum, and it usually takes a conscious effort to describe goals in terms of expected results, with the emphasis on student responses. A legalistic application of behavioral objectives, however, may interfere with the process at this point. John Harris’s presentation at the first AAHE Conference in Columbia, S.C. (October 1985), may be helpful to those who have had little experience in writing outcomes-oriented objectives.

Set Priorities. Not all objectives are equal in importance. Goal statements need to be weighted or ordered in terms of their significance. Using 100 percent as the total, determine what proportion of the whole each objective is worth. Even if it appears that objectives either cannot be measured or may not reveal very extensive learning outcomes, these objectives still should remain on the list. If they are omitted, then the curriculum may be skewed to minimize some very significant outcomes. The program faculty, not the outcomes assessment process, should continue to determine the appropriateness of the goals.
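The weighting step described above can be sketched as a small computation. This is only an illustration of normalizing relative importance ratings so they total 100 percent; the objective names and ratings are invented for the example and are not part of the Kean procedure.

```python
# Hypothetical example: faculty rate each objective's relative importance,
# then the ratings are normalized so the weights sum to 100 percent.
raw_ratings = {
    "communicate findings in writing": 5,
    "apply core research methods": 4,
    "value ethical practice": 3,  # hard to measure, but kept on the list
}

total = sum(raw_ratings.values())
weights = {obj: round(100 * r / total, 1) for obj, r in raw_ratings.items()}

# Print objectives in order of priority, highest weight first.
for obj, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{w:5.1f}%  {obj}")
```

Note that even the lowest-weighted objective stays on the list, in keeping with the guideline above.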

Gain Consensus. The outcomes objectives and the priorities need to be accepted by all the faculty in the program; a small committee or a single faculty member should not make these important decisions for the larger group. Various approaches for reaching consensus, such as the Delphi Technique or some modification of it, can be used to obtain agreement. As indicated in our first guideline, any approach should begin by including everyone in the program in the original solicitation of ideas.

Selecting Instruments

The assessment guidelines at Kean College limit approaches to selecting instruments. Faculty must develop their own assessment measures and set criteria for appropriate levels of performance. Outside consultants are used to evaluate objectives, instruments, and criteria. Faculty do not have the option of choosing nationally normed standardized tests. The following concerns apply, however, whether instruments are created or selected.

Getting Started. Ideally, objectives should be in place, so that the appropriate instruments can be identified to assess them. We have found that the interaction between objective-setting and instrument development is important. Creating or selecting an instrument operationalizes and defines the objective. Examining assessment techniques, therefore, helps clarify the statement of objectives.

Validity. The instrument(s) must provide indicators of achievement that reflect the objectives for the program. One reason Kean faculty rejected standardized tests is that they do not necessarily reflect the particular objectives of the local program. Also, for any instrument to be valid, it must first be reliable; that is, it must measure the objectives consistently.

Multiple Measures. The measure of a program’s effectiveness should not be reduced to one number resulting from a single instrument. One score does not provide the basis for assessing all objectives of a program. Multiple indicators should be used to gain a broader, more accurate view of a program’s success.

Time Constraints. Assessment must be practical. Critical concerns about any assessment experience include who will take it, when it will be administered, how long it will take, who will administer it, and who will score or evaluate it. Kean’s faculty members have found that some very creative approaches were not very practical, and they have had to make adaptations to use them effectively. It may be helpful to develop a schedule for a comprehensive assessment program. For example, some objectives may be assessed yearly, others according to a longer time schedule.

Useful Information. Most issues affecting instrument selection are summarized in this question: how can one use the information from this instrument? If information is to improve the teaching-learning process, the instrument must provide results that help make instructional decisions. This is best accomplished if the assessment approaches are aligned with the objectives and the priorities given to each.

Utilizing Findings

The bases for utilizing findings are identified in the previous section: the intended uses for the information should influence the selection of instruments. This section provides more general guidelines for using findings. The way results are used will determine the overall impact and value of outcomes assessment.

The concepts of outcomes assessment are simple: determine objectives for what you are doing, select and use instruments to get information about how well the objectives are being achieved, and use the information to improve attainment of the objectives. It is the implementation of the concepts that is not simple.

Be Positive. If the approach to assessment is designed to improve programs, there can be no bad news. Any finding that suggests that objectives are not being achieved is a positive result if it leads to constructive changes. An attitude that findings are negative will create a chilling effect that may result in the elimination of measures that could be used adversely. If not managed wisely, this aspect of outcomes assessment can create conflicts among programs, institutions, and governing boards. Herein lies the power to make or break the usefulness of outcomes assessment to improve higher education.

Identify Data Owners. Those who gather data or have data gathered for them by others should own it. They should determine how findings are to be used. This view coincides with the issue raised in the previous paragraph: if persons other than the creators of the data have ownership, they also control how it may be used. The program faculty should determine how to present the information to implement necessary changes in the teaching-learning process. It might be used within a department for program improvement or in periodic review reports to justify conclusions and recommendations.

Ask and Answer Questions. In most cases, findings will raise more questions than they answer. Those questions should be pursued to determine if findings are accurate, if more data can shed light on specific issues, and if information can be applied to evaluate goal achievement. Only simple questions will be answered easily using general data. Follow-up actions should spotlight special concerns raised by the more general findings.

Donald Lumsden is professor of Communications and Michael E. Knight is coordinator of Assessment of Student Learning and Development, Kean College of New Jersey.