
WASC Assessment Leadership Academy

Oakland, California

August 1, 2011

Presentation by

Trudy W. Banta
Professor of Higher Education
and
Senior Advisor to the Chancellor for
Academic Planning and Evaluation

Indiana University-Purdue University Indianapolis
355 N. Lansing St., AO 140
Indianapolis, Indiana 46202-2896
tbanta@iupui.edu

http://www.planning.iupui.edu

Outline

• The Great Testing Debate

• Alternatives to standardized tests of generic skills

© TWBANTA-IUPUI

Group Assessment Has Failed to Demonstrate Institutional Accountability

• Focus on improvement at unit level

• Rare aggregation of data centrally

• Too few faculty involved

• HE scholars focused on K-12 assessment

© TWBANTA-IUPUI

2006
Commission on the Future of Higher Education

We need a simple way to compare institutions

The results of student learning assessment, including value added measurements (showing skill improvement over time) should be . . . reported in the aggregate publicly.

© TWBANTA-IUPUI

Now We Have

the

Press to Assess with a Test

My History

Educational psychology

Program evaluation & measurement

Performance funding in Tennessee

1990 USDOE effort to build a national test

1992 Initiated evidence-based culture at IUPUI

© TWBANTA-IUPUI

2007

Voluntary System of Accountability

~ Assessment of Learning ~

defined as

critical thinking, written communication, analytic reasoning

© TWBANTA-IUPUI

VSA Recommendations (over my objections)

Collegiate Assessment of Academic Proficiency (CAAP)

Measuring Academic Proficiency & Progress (MAPP)

Collegiate Learning Assessment (CLA)

(College BASE)

© TWBANTA-IUPUI

TN = Most Prescriptive (5.45% of Budget for Instruction)

1. Accredit all accreditable programs (25)

2. Test all seniors in general education (25)

3. Test seniors in 20% of majors (20)

4. Give an alumni survey (15)

5. Demonstrate use of data to improve (15)

___

100

© TWBANTA-IUPUI

At the University of Tennessee

CAAP

Academic Profile (now MAPP)

COMP (like CLA and withdrawn by 1990)

College BASE

© TWBANTA-IUPUI

In TN We Learned

1) No test measured 30% of gen ed skills

2) Tests of generic skills measure primarily prior learning

3) Reliability of value added = .1

4) Test scores give few clues to guide improvement actions

An Inconvenient Truth

.88 = correlation of VSA-recommended test scores with SAT/ACT scores

thus

77% of the variance in institutions’ scores is due to students’ prior learning

© TWBANTA-IUPUI
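The arithmetic behind this slide, reading shared variance as the squared correlation (the 23% on the next slide is its complement), is worth making explicit:

\[ r^2 = (0.88)^2 = 0.7744 \approx 77\% \qquad 1 - r^2 \approx 23\% \]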

How Much of the Variance in Senior Scores is Due to College Effects?

• Student motivation to attend that institution (mission differences)

• Student mix based on
 • age, gender
 • socioeconomic status
 • race/ethnicity
 • transfer status
 • college major

© TWBANTA-IUPUI

How Much of the Variance in Senior Scores is Due to College Effects?

(continued)
• Student motivation to do well
• Sampling error
• Measurement error
• Test anxiety
• College effects
______
23%

Threats to Conclusions Based on Test Scores

1. Measurement error

2. Sampling error

3. Different tests yield different results

4. Different ways of presenting results

5. Test bias

6. Pressure to raise scores

- Daniel Koretz
“Measuring Up”
Harvard U. Press, 2008

© TWBANTA-IUPUI

Using NSSE

95% of variance in responses is WITHIN institutions

- Pike, Kuh, McCormick, Ethington & Smart (2011)

© TWBANTA-IUPUI

Student Motivation

• Samples of students are being tested

• Extrinsic motivators (cash, prizes) are used

We have learned:

• Only a requirement and intrinsic motivation will bring seniors in to do their best

Recent University of Texas Experience

30-40% of seniors at flagships earn the highest CLA score (ceiling effect)

Flagship campuses have the lowest value-added scores

© TWBANTA-IUPUI

Concerns About Value Added

• Student attrition
• Proportion of transfer students
• Different methods of calculating
• Unreliability
• Confounding effects of maturation

© TWBANTA-IUPUI

Word from Measurement Experts

Given the complexity of educational settings, we may never be satisfied that value added models can be used to appropriately partition the causal effects of teacher, school, and student on measured changes in standardized test scores.

- Henry Braun & Howard Wainer

Handbook of Statistics, Vol. 26: Psychometrics

Elsevier 2007

© TWBANTA-IUPUI

Employing currently available standardized tests of generic skills to compare the quality of institutions is not a valid use of those tests.

© TWBANTA-IUPUI

Consequences of Scores Below Expectations?

 Public criticism
 Loss of students
 Reduced funding
 Corrective actions imposed
 Prep courses for students
 Profits for coaching consultants

2009 NILOA Survey
Campus-wide Approaches to Assessment

1. A national survey (76%)

2. Standardized test of general skills (39%)

3. Portfolios, specialized tests, external judges

© TBANTA-IUPUI

OECD’s AHELO
for HEIs from 15 countries

1. Generic skills (CLA)

2. Disciplines (Engineering and Economics)

3. Value added

4. Contextual information indicators

© TWBANTA-IUPUI

Designing Effective Assessment:

Principles & Profiles of Good Practice

Trudy W. Banta

Elizabeth A. Jones

Karen E. Black

Jossey-Bass (Wiley) 2009

© TWBANTA-IUPUI

Profiles

Invited over 1000

Received 146

Selected 49 for use in full

Categorized all 146 and published

Web sites

© TWBANTA-IUPUI

Outline for Profiles

Background and Purpose

Methods over ? Years

Resources Required

Findings

Use of Findings

Impact of Using Findings

Success Factors

Web sites

© TWBANTA-IUPUI

Standardized tests of generic skills
(e.g., writing, critical thinking)

used by just 8%

always supplemented

© TWBANTA-IUPUI

Standardized tests

CAN

initiate conversation

Advantages of standardized tests of generic skills

promise of increased reliability & validity

norms for comparison

© TWBANTA-IUPUI

Limitations of standardized tests of generic skills

cannot cover all a student knows

narrow coverage, need to supplement

difficult to motivate students to take them!

© TWBANTA-IUPUI

Better Ways to Demonstrate Accountability

Performance Indicators

1. Access (to promote social mobility)

2. Engaging student experience

3. Workforce development

4. Economic development

5. Civic contribution of students, faculty, staff, graduates

© TWBANTA-IUPUI

Effects of Education on Social Mobility

Relate data on

high school courses and completion

college courses and completion

to

Career placements & earnings

private and public employment

military enlistments

incarcerations

- Florida DOE

© TWBANTA-IUPUI

Australian DOE

1. Increase participation of individuals from low SES backgrounds

as undergraduates

as graduate students

2. Improve engagement & satisfaction

survey of student engagement

first year retention

satisfaction of completers

© TWBANTA-IUPUI

Workforce Development

• % graduates placed in field
• % graduates placed locally
• Degree, certificate, CE programs aligned with regional priorities
• Internships/co-op programs
• Increased earnings of graduates

- APLU

© TWBANTA-IUPUI

Economic Development

• Creation of intellectual property
• Patent and license awards
• # start-up companies
• Sponsored research $
 % NSF, NIH
• Use of academic facilities by industry
• Alignment of assets to support regional economic clusters

- APLU

© TWBANTA-IUPUI

Civic Contribution

 Internships supported by institution
 Volunteer hours for students, faculty, staff
 Service learning placements
 $ contributed to United Way et al.
 Pro bono legal & health services
 No-cost presentations to community groups

© TWBANTA-IUPUI

Land-Grant Extension

1. Useful information for

agricultural producers

small business owners

rural families

2. After-school STEM programs

3. Driver-ed for teen traffic offenders

4. Green jobs in energy fields

5. Sustainable agriculture practices

© TWBANTA-IUPUI

National Governors Association
Center for Best Practices

A simple graduation rate

 penalizes colleges serving low SES students

 may discourage open enrollment

 may lead to lower graduation standards

© TWBANTA-IUPUI

National Governors Association
Center for Best Practices

Track intermediate student milestones

 Successful completion of remedial and core courses

 Advancement from remedial to credit courses

 Transfer from 2- to 4-year college

 Credential attainment

© TWBANTA-IUPUI

Oregon Community Colleges track:

 % needing remedial courses
 % completing remedial or ESL courses
 # credits earned each year toward degree or certificate
 Semester-to-semester and fall-to-fall persistence

- Inside HE 10/12/09

© TWBANTA-IUPUI

Peter Ewell:

“ . . . shift state funding formulas so that colleges receive money based on how many students are still enrolled by the end of the academic term rather than at the beginning.”

- Inside HE 11/18/09

© TWBANTA-IUPUI

If We Must Measure Learning, Let’s Use:

1. Standardized tests in major fields
 licensure and certification tests
 ETS Major Field Tests

2. Internship performance

3. Senior projects

4. Study abroad performance

5. Electronic portfolios

6. External examiners

© TWBANTA-IUPUI

Western Governors University
offering
Competence-based on-line degrees

1) Define the domain of knowledge/skill
2) Develop objective test items
3) Design performance tasks
4) Evaluate performance
5) Use findings to improve curriculum, instruction, student services
6) Continuously improve assessment quality and validity

2009 NILOA Survey
Program-Level Approaches

1. Portfolios (80% in at least 1 area)

2. Performance assessments

3. Rubrics

4. External judges

5. Student interviews

6. Employer surveys

© TWBANTA-IUPUI

Student Electronic Portfolio

• Students take responsibility for demonstrating core skills
• Unique individual skills and achievements can be emphasized
• Multi-media opportunities extend possibilities
• Metacognitive thinking is enhanced through reflection on contents

- Sharon J. Hamilton
IUPUI

More use of RUBRICS

locally developed

VALUE from AAC&U

© TWBANTA-IUPUI

VALUE Rubrics

• Critical thinking
• Written communication
• Oral communication
• Information literacy
• Teamwork
• Intercultural knowledge
• Ethical reasoning

© TWBANTA-IUPUI

Accountability Report

85% achieve Outstanding ratings in writing as defined . . .

78% are Outstanding in applying knowledge and skills in internships

75% are Outstanding in delivering an oral presentation

© TWBANTA-IUPUI

For External Credibility

Collaborate on rubrics

Use employers as examiners

Conduct process audits

© TWBANTA-IUPUI

E-Port Challenges

• Reliability of rubrics
• Student motivation if used for assessment

(Barrett, 2009)

• Differences in topics for products to be evaluated

(Sekolsky & Wentland, 2010)

© TWBANTA-IUPUI

Obstacles to Using Performance-Based Measures

• Defining domains and constructs
• Obtaining agreement on what to measure and definitions
• Defining reliability and validity
• Creating good measures

- Tom Zane
WGU

© TWBANTA-IUPUI

Will it take 80 years . . . ?

3 Promising Alternatives

E-portfolios

Rubrics

Assessment communities

- Banta, Griffin, Flateby, Kahn

NILOA Paper #2 (2009)

© TWBANTA-IUPUI

PART 2

Building An Evidence-Based Culture

Evidence at several levels

An institutional example

© TWBANTA-IUPUI

© TWBANTA-IUPUI

Organizational Levels for Assessment

National

Regional

State

Campus

College

Discipline

Classroom

Student

Evidence at the Classroom Level

• Background Knowledge Probe

• Minute paper in class

• Just-in-time teaching online

© TWBANTA-IUPUI

Background Knowledge Probe
(Pre-Test – Indirect Measure)

1. ARCHAEOLOGY
A. Have never heard of this
B. Have heard of it, but don’t really know what it means
C. Have some idea what it means, but not too clear
D. Have a clear idea what this means and can explain it

- Classroom Assessment, Angelo and Cross

Primary Trait Scoring

Assigns scores to attributes (traits) of a task

STEPS

 Identify traits necessary for success in assignment
 Compose scale or rubric giving clear definition to each point
 Grade using the rubric

Assessment of Group Interaction

The Student Participant:
• Listened to others
• Actively contributed to discussion
• Challenged others effectively
• Was willing to alter own opinion
• Effectively explained concepts/insights
• Summarized/proposed solutions

5=Consistently excellent

3=Generally satisfactory

1=Inconsistent and/or inappropriate
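A minimal Python sketch of primary trait scoring applied to the group-interaction rubric above; the mean as the overall score, the function name, and the example ratings are illustrative assumptions, not part of the original method.

# Primary-trait-scoring sketch (illustrative). Traits and scale anchors
# come from the slides: 5 = consistently excellent, 3 = generally
# satisfactory, 1 = inconsistent and/or inappropriate.

TRAITS = [
    "listened to others",
    "actively contributed to discussion",
    "challenged others effectively",
    "was willing to alter own opinion",
    "effectively explained concepts/insights",
    "summarized/proposed solutions",
]

def score_participant(ratings: dict[str, int]) -> float:
    """Average one rater's 1-5 trait scores into an overall score."""
    if set(ratings) != set(TRAITS):
        raise ValueError("every trait must be rated exactly once")
    if not all(1 <= r <= 5 for r in ratings.values()):
        raise ValueError("ratings must fall on the 1-5 scale")
    return sum(ratings.values()) / len(ratings)

# One observer's ratings for one student:
ratings = dict.fromkeys(TRAITS, 3)   # generally satisfactory overall
ratings["listened to others"] = 5    # consistently excellent listener
print(score_participant(ratings))    # -> 3.33...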

Evidence of Generic Skills

• ePort with rubrics

• Standardized tests

© TWBANTA-IUPUI

Evidence at the Program Level

• Individual and team projects
• Research papers
• Internships
• Electronic portfolios
• Peer review

© TWBANTA-IUPUI

Assessment in Sociology and Anthropology

 Focus groups of graduating students
 Given a scenario appropriate to the discipline, a faculty facilitator asks questions related to outcomes faculty have identified in 3 areas: concepts, theory, methods.
 2 faculty observers use 0-3 scale to rate each student on each question
 GROUP scores are discussed by all faculty

- Murphy & Goreham
North Dakota State University

© TWBANTA-IUPUI

Internships

Evaluated against specific criteria by

• Students

• Faculty

• Field-based supervisors

Elements of Program Review

Self Study

Review by Respected Peers

Recommendations

Follow-up

© TWBANTA-IUPUI

Evidence at the Institutional Level

• Learning outcomes
• Questionnaires, interviews, focus groups
• Productivity measures
• Cost analyses
• Management ratios
• Program evaluation
• Peer review
• Accreditation

© TWBANTA-IUPUI

Outcomes Assessment Requires Collaboration

In setting expected program outcomes

In developing sequence of learning experiences (curriculum)

In choosing measures

In interpreting assessment findings

In making responsive improvements

Faculty and Staff Development

Focus faculty and student affairs professionals on improving learning in and outside class

 Attend conferences together
 Study literature on student learning
 Provide workshops on teaching and learning
 Provide resources (e.g., grants, summer salary, release time)

© TWBANTA-IUPUI

Some Evaluative Questions

If we undertake a new approach:

 Is instruction more effective?
 Are students learning more?
 Are students more satisfied?
 Are faculty more satisfied?
 Do outcomes justify costs?

© TWBANTA-IUPUI

Campus Interest in Assessment

WHAT WORKS in….

 increasing student retention?
 general education?
 use of technology in instruction?
 curriculum in the major?

© TWBANTA-IUPUI

Good assessment is good research . . .

 An important question
 An approach to answer the question
 Data collection
 Analysis
 Report

-Gary R. Pike (2000)

© TWBANTA-IUPUI

Involve Students

1. Set learning expectations in recruiting

2. Communicate learning outcomes in orientation

3. Involve student leaders in promoting learning

4. Involve students in evaluating courses/curricula

5. Let students know their recommendations are used.

Student Advisory Council at Montevallo

A way to provide continuous student assessment

Student Recommendations

1. Develop a statement of expected ethical behaviors for students
2. Add a second research course with lab
3. Increase comparative psychology
4. Add terminals for statistics lab
5. Increase opportunities for research, writing, and speaking

Engage Graduates

• Alverno

• Penn State

© TWBANTA-IUPUI

Involve Employers

• Developing curriculum

• Assessing student learning

© TWBANTA-IUPUI

Involving Employers

Combination of survey and focus groups for employers of business graduates

 Identified skills, knowledge, personality attributes sought by employers
 Encouraged faculty to make curriculum changes
 Motivated students to develop needed skills
 Strengthened ties among faculty, students, employers

- Kretovics & McCambridge

Colorado State University

Colorado State University
College of Business

Curriculum changes based on employer suggestions:

1 credit added to Business Communications for team training and more presentations

Ethics & social responsibility now discussed in intro courses

New Intro to Business course emphasizing career decision-making

More teamwork, oral & written communication, problem-solving in Management survey courses

- Kretovics & McCambridge

Plan

Implement

Evaluate

Improve

Culture of Evidence

© TWBANTA-IUPUI

PLANNING

1. Campus mission, goals

2. Unit goals aligned

3. Programs based on assessable goals with PIs

4. Annual reports on the Web

© TWBANTA-IUPUI

Outline for Annual Reports

IUPUI Theme

Unit Goal

Objective

Actions Taken

Actions Planned

Evidence of Progress

© TWBANTA-IUPUI

Evaluation Services

1. Assessment of learning

2. Surveys

3. Program reviews

4. Performance indicators

5. Program cost analysis

6. Web-based evaluation tools

7. Program evaluation/action research

8. Accreditation

© TWBANTA-IUPUI

Surveys

1. Enrolled students
 Our own
 NSSE
2. Graduates
3. Employers
4. Stop-outs
5. Faculty
6. Staff

© TWBANTA-IUPUI

Information Gateway

http://reports.iupui.edu/gateway/

Information about

Students

Faculty

Staff

Alumni

Finances

© TWBANTA-IUPUI

Since 1993
Campus-wide surveys have stimulated changes in

 Curricula
 Advising
 Increased writing practice
 Increased attention to first-year experiences
 Placement of graduates

© TWBANTA-IUPUI

Goal and Objectives for Student Learning

Enhance undergraduate student learning and success

1. Strengthen generic skills

2. Provide honors programming

3. Offer learning communities

4. Strengthen advising

5. Provide tutoring and mentoring

© TWBANTA-IUPUI

Employ Multiple Methods

1) Direct

Projects, papers, tests, observations

2) Indirect

Questionnaires, interviews, focus groups

Unobtrusive measures

Syllabi, transcripts

© TWBANTA-IUPUI

Student Learning Oriented Course Evaluation

1. Learners held high expectations for one another

2. Learners interacted frequently with others

3. Learners participated in learning teams

4. Learners respected diverse talents and ways of learning

- Cournoyer
Advances in Social Work, Fall 2001

Since 1994
Assessment of Learning has stimulated changes in

 Student support programs
 Curriculum
 Methods of instruction
 Internships
 Methods of assessment

© TWBANTA-IUPUI

What is ABC?

ABC is a costing methodology based upon the fact that different activities and products consume different proportions of resources.

[Diagram: Resources flow into Activities, which flow into Products A, B, and C]

© TWBANTA-IUPUI

Some tasks within instruction

 curriculum planning
 course design
 class preparation
 class instruction
 assessment
 course evaluation

© TWBANTA-IUPUI

What Is ABC? Traditional vs. ABC

Traditional Accounting Perspective
 Salary & wages        1,350,000
 Benefits                495,000
 Travel                   45,000
 Facilities              220,000
 Supplies                 90,000
 Total                $2,200,000

Activity-Based Perspective
 Teach courses           940,000
 Perform research        430,000
 Provide service         250,000
 Administer programs     350,000
 Provide tech support    230,000
 Total                $2,200,000

© TWBANTA-IUPUI
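A minimal Python sketch of the point the table above makes: ABC reallocates the same total rather than changing it. The line items and activity costs are the slide's own; the reconciliation check and the share calculation are added for illustration.

# Two views of the same $2,200,000 instructional budget (figures from
# the slide). ABC reallocates the total to the activities that consume
# it; it does not change the total.

traditional = {
    "Salary & wages": 1_350_000,
    "Benefits": 495_000,
    "Travel": 45_000,
    "Facilities": 220_000,
    "Supplies": 90_000,
}

activity_based = {
    "Teach courses": 940_000,
    "Perform research": 430_000,
    "Provide service": 250_000,
    "Administer programs": 350_000,
    "Provide tech support": 230_000,
}

total = sum(traditional.values())
assert total == sum(activity_based.values()) == 2_200_000

# Share of the budget each activity consumes:
for activity, cost in activity_based.items():
    print(f"{activity:22s} {cost / total:6.1%}")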

Some Applications of Economic Model

1. Estimate costs of administrative services as compared to cost of outsourcing
2. Determine fees for various programs
3. Restructure processes to expedite work flow and minimize costs

© TWBANTA-IUPUI

Since 1992
Activity-based Costing has stimulated changes in

 Planning
 Budgeting
 Assessment

© TWBANTA-IUPUI

Elements of Program Review

Self Study

Review by Respected Peers

Recommendations

Follow-up

© TWBANTA-IUPUI

Goals of Program Review at IUPUI

 To improve student learning
 To assess and improve program quality
 To increase cross-disciplinary collaboration
 To enhance community connections
 To reinforce importance of aligning unit and campus planning

© TWBANTA-IUPUI

Following a Program Review

1. Program receives reviewers’ report

2. Faculty meet to consider findings

3. Faculty respond in writing

4. Program chair, dean, provost meet to consider written response

5. Improvements are implemented

© TWBANTA-IUPUI

Since 1995 Program Reviews have stimulated changes in

• Planning for the future
• Research emphases
• Faculty hiring priorities
• Advisory councils
• Cross-disciplinary collaboration

© TWBANTA-IUPUI

Program Review at IUPUI

www.planning.iupui.edu/assessment/

© TWBANTA-IUPUI

THE TEAM

1. Chancellor, Provost

2. IMIR

3. Program Review & Assessment Committee (PRAC)

4. Faculty Development

© TWBANTA-IUPUI

Open sharing of information and evidence-based decision-making

• Financial and satisfaction data for units

• Annual planning/budgeting hearings

• Performance indicators derived from unit reports over time

• Campus performance report for community

Program Review & Assessment Committee

 2 reps from each school
 2 librarians
 Other units
  Student Life
  Faculty Development
  Internship coordinator

© TWBANTA-IUPUI

Program Review and Assessment Committee

• Provides a forum for exchange of information about assessment

• Oversees program review

• Suggests/provides faculty development

• Develops annual reports

© TWBANTA-IUPUI

Characterizing the Culture

▪ Appointment of Assessment Specialists
 • Faculty Development
 • Library
 • Student Life
 • Service Learning
 • Enrollment Services
 • University College

▪ Appointment of Associate Deans for Assessment

© TWBANTA-IUPUI

Characterizing the Culture

New initiatives require assessment

University College

student support programs

Distance learning

New academic programs

© TWBANTA-IUPUI

Characterizing the Culture

Promotion & Tenure Guidelines

Faculty/Staff Development Grants

Awards

© TWBANTA-IUPUI

Build Assessment into Valued Processes

1. Assessment of learning

2. Curriculum review and revision

3. Survey research

4. Program review

5. Scholarship of Teaching & Learning

6. Evaluation of initiatives

7. Faculty development

8. Promotion & tenure

9. Rewards and recognition

© TWBANTA-IUPUI

Establishing a Culture of Evidence

takes

Strong leadership

Support

Time

PART 3

Examples of Effective Assessment Practice

© TWBANTA-IUPUI

Profiles

Invited over 1000

Received 146

Selected 49 for use in full

Categorized all 146 and published

Web sites

© TBANTA-IUPUI

Outline for Profiles

Background and Purpose

Methods over ? Years

Resources Required

Findings

Use of Findings

Impact of Using Findings

Success Factors

Web sites

© TBANTA-IUPUI

Plan

Implement

Evaluate

Improve

Culture of Evidence

© TBANTA-IUPUI

~ Organization ~
of
Principles & Profiles

Planning

Implementing

Improving & Sustaining

- Building a Scholarship of Assessment

Banta & Associates

Jossey-Bass 2002

© TBANTA-IUPUI

Planning Principles

1. Engaging stakeholders

2. Connecting assessment to valued goals & processes

3. Creating a written plan

4. Timing assessment

5. Building a culture based on evidence

© TWBANTA-IUPUI

Planning Profiles

Brigham Young University

 Campus Wiki for degree learning outcomes

USMA at West Point

 Interdisciplinary teams assess 10 mission-related goals for learners

Kennesaw State University

 2008 CHEA Award for linking assessment with planning, program review, faculty development

© TWBANTA-IUPUI

USMA @ West Point
6 Developmental Domains

1. Intellectual

2. Physical

3. Military

4. Social

5. Moral-ethical

6. Human spirit

© TBANTA-IUPUI

USMA @ West Point
Intellectual Domain

10 Goals (write, speak, think; engineering, math, info tech)

A. Stated learner outcomes

1. Standards

a. Rubrics developed by faculty

© TBANTA-IUPUI

USMA @ West Point
Interdisciplinary Goal Teams

use

• Curriculum-embedded direct measures of learning

• Student surveys (fr., sr.)
• Graduate survey (3 years after)
• Employer surveys
• Employer focus groups

© TBANTA-IUPUI

USMA @ West Point
Use of Assessment Findings

Review of core curriculum

Changes in warranted areas:

History

English

Engineering

Information Technology

© TBANTA-IUPUI

Implementation Principles

1. Providing leadership

2. Creating faculty/staff development

3. Placing responsibility with unit

4. Using multiple methods

5. Communicating findings

© TWBANTA-IUPUI

Implementation Profiles

• California State University, Sacramento
 Strong leadership, multiple methods

• Texas Christian University
 Faculty learning communities

• Tompkins Cortland Community College
 Capstone rubrics

© KBLACK-IUPUI

Cal State-Sacramento (1)

Sources of Motivation for Assessment

1. New VP for Student Affairs

2. Reaccreditation looming

3. Enrollment & budget challenges

4. Pledge to become more data-driven and focused on student learning

© TWBANTA-IUPUI

Cal State-Sacramento (2)

1. Align department & division missions

2. Develop SMART goals, 1 for student learning

Specific

Measurable

Aggressive, yet attainable

Results-oriented

Timely

© TWBANTA-IUPUI

Cal State-Sacramento (3)

Measures

 Pre-post MC tests on policies, resources

 Essays with rubrics (reinstatement)

 Portfolios

 Observation of skills (Leadership, RA reports on scenarios, role-playing)

© TWBANTA-IUPUI

Cal State-Sacramento (4)

Findings

1. Some SLOs met

2. Some SLOs not met

3. Some measures not effective

4. Too few participants to assess

5. Too many participants to assess effectively

© TWBANTA-IUPUI

Cal State-Sacramento (5)

Use of Findings

1. Better training for RAs in reporting

2. Better training for peer mentors in orientation (emphasizing policies)

3. More time to discuss films

4. Better PowerPoint presentations

5. Increase participation in counseling

6. Redesign vague test items

© TWBANTA-IUPUI

Implementation Profiles (Continued)

Pennsylvania State University
 PULSE Survey

Moravian College
 Using technology for curriculum maps

Alverno College
 Portfolios in Teacher Education

Northeastern Illinois University
 Multiple methods including national standardized test

© KBLACK-IUPUI

A Look At The Profiles

• Leadership 18%
• Faculty and Staff Development 18%
• Responsibility at Unit Level 33%
• Methods
 • Rubrics 37%
 • Surveys 33%
 • Electronic/Technology 20%
 • Portfolios 14%
 • National Standardized Tests 8%

© KBLACK-IUPUI

Improving/Sustaining Principles

1. Providing credible evidence of learning to multiple stakeholders

2. Reviewing assessment reports

3. Ensuring use of results

4. Evaluating the assessment process

© EJONES-WVU

Improving/Sustaining Profiles

San Jose State University

 Specialists in each college, awards, learning outcomes in 5-year plans

Hocking Technical College

 Annual assessment work day

Colorado State University

 Integration of learning outcomes in on-line template for program reviews

© TWBANTA-IUPUI

Sustaining Professional Development: Faculty Learning Communities

Texas Christian University

--Seven areas in general education

1. religious traditions

2. historical traditions

3. literary traditions

4. global awareness

5. cultural awareness

6. social values

7. citizenship

© EJONES-WVU

Sustaining Professional Development: Faculty Learning Communities

Texas Christian University

--Created faculty learning communities to address the following:

a. identify and create assessment strategies

b. share results of assessment processes

c. discuss results to enhance teaching and learning experiences

© EJONES-WVU

Required Resources To Implement and Sustain Assessment

1. Faculty release time

2. Stipends for faculty leaders

3. Assessment committee

4. New full-time assessment position created

5. External consultants

© EJONES-WVU

Required Resources To Implement and Sustain Assessment

6. Financial resources to pay for tests and purchase surveys

7. Administrative support

8. Professional development

9. Technology

© EJONES-WVU

Some Big Ideas

• Influence of accreditation is strong
• Engaging faculty may require extra pay
• Standardized tests of generic skills are not used alone
• Linking assessment with planning and program review works
• Impact is not measured in learning gains

© TWBANTA-IUPUI

Group Assessment Has Failed to Demonstrate Institutional Accountability

• Focus on improvement at unit level

• Rare aggregation of data centrally

• Too few faculty involved

• Involved faculty return to discipline

• HE scholars focused on K-12 assessment

Impact of Using Findings

More attention to:

improving assessment tools

need to do assessment

participating in faculty development

using assessment findings

© TWBANTA-IUPUI

Where Learning Has Improved

Alverno College – Milwaukee, WI

Truman State University – Kirksville, MO

© TWBANTA-IUPUI

Computer-Based Testing
at James Madison University

• Information Literacy

• Scientific Reasoning

• Quantitative Reasoning

University of South Florida

Cognitive Level & Quality of Writing Assessment

(CLAQWA)

Rubric of 16 traits × 5 (Bloom’s) levels

Used by peers and teachers

Improves writing and thinking

© TWBANTA-IUPUI

San Diego State University

In portfolios, master’s & doctoral students reflect on

curricular & co-curricular learning

program learning outcomes

Oral presentations of synthesized learning

Evaluated by faculty, external professionals

Synthesized learning has improved

© TWBANTA-IUPUI

North Carolina State University

DEAL Model for Critical Reflection(Description, Examination, Articulation of Learning)

Rubric levels based on Bloom’s Taxonomy

Improves higher order reasoning

and critical thinking skills

© TWBANTA-IUPUI

Plan

Implement

Evaluate

Improve

Culture of Evidence

© TBANTA-IUPUI

Building A Scholarship of Assessment

• National Institute for Learning Outcomes Assessment (NILOA)
• AAC&U’s VALUE Project
• Teagle’s Wabash Study and Assessment Scholars
• Lumina’s Big Goal and Degree Qualifications Framework
• New Leadership Alliance for Student Learning & Accountability

© TWBANTA-IUPUI

ASSESSMENT UPDATE

Bi-monthly

Published by Jossey-Bass

Since 1989

Articles up to 2000 words

4 Columns

Book Reviews

© TWBANTA-IUPUI

Scholarship Reconsidered

• Four kinds of scholarship:
 Discovery
 Integration
 Application
 Teaching

- Boyer (1990)

© TWBANTA-IUPUI

SoTL
differs from

the scholarship of discovery

in its focus on the classroom

- L. Shulman (2004)

© TWBANTA-IUPUI

SoTL Approach to Classroom Research

1. Articulate learning goals.
2. Formulate a question about the learning situation based on the goals.
3. Design a way to collect data.
4. Teach to the goals.
5. Assess the student learning toward the goal.
6. Analyze the feedback.
7. Reflect on the results for future teaching decisions.
8. Share the results.

© TWBANTA-IUPUI

SoTL involves

1. Systematic investigation of a research question

2. Study of related literature
3. Going public with findings
4. Critical review by peers
5. Use of research as foundation for further work

H. Timberg (2007)

© TWBANTA-IUPUI

Types of SoTL Questions

The context of teaching: institutional factors, physical facilities, organizational support

Example: Is it more effective to teach this class in one-hour or two-hour sessions?

Example: How does sitting around tables rather than sitting in rows affect learning?

© TWBANTA-IUPUI

Sound Familiar?

The Scholarship of Assessment and the Scholarship of Teaching and Learning are integrally related:

[Venn diagram: SoA and SoTL overlap]
 SoA only: can address topics besides learning (civic engagement)
 Both: study of learning issues in actual settings, based on evidence, resulting in public sharing
 SoTL only: can include conceptual, non-empirical questions or issues

© TWBANTA-IUPUI

Building a Scholarship of Assessment

- Banta & Associates

Jossey-Bass Publishers

April 2002

© TWBANTA-IUPUI

Scholarly Assessment Involves

o selecting/creating assessment methods
o trying the methods
o reflecting on strengths/weaknesses
o modifying the methods or trying new ones
o improving assessment continuously

© TWBANTA-IUPUI

Scholarly Assessment

Conduct syllabus analysis

- Is critical thinking emphasized?

Develop student guide to assessment

- Do students understand why and how they are assessed?

© TWBANTA-IUPUI

The Scholarship of Assessment Involves

basing assessment studies on relevant theory/practice

gathering evidence

developing a summary of findings

sharing findings with the assessment community

© TWBANTA-IUPUI

Scholarship of Assessment

Compare two teaching methods

- Is technology-enhanced instruction more effective?

Validate a measure of student civility

- Do interventions increase civility?

Barriers to Scholarship in Assessment

• Campus coordinators are trained in other disciplines

• Scholars in relevant fields don’t do outcomes assessment

• Assessment scholarship is not rewarded

• Campus coordinators return to their own disciplines

• Few graduate programs prepare assessors

© TWBANTA-IUPUI

Some Research Traditions Underlying Assessment

 Program evaluation
 Organizational change and development
 Cognitive psychology
 Student development
 Measurement
 Informatics

© TWBANTA-IUPUI

Assessment Methods

Improve instruments to measure

 content knowledge at more complex levels

affective development

effects of educational interventions

changes in learning over time

© TWBANTA-IUPUI

Assessment Methods

• How can we use technology in assessment more effectively?

• How can we demonstrate the validity of locally developed instruments?

• How can faculty make consensual judgments about the quality of student performance?

• How can student feedback be designed to help faculty improve their teaching?

© TWBANTA-IUPUI

Organizational Behavior & Development

How can assessment be combined with other systemic changes to improve teaching & learning?

What patterns of organizational behavior promote and sustain assessment?

What methods of providing and managing assessment information are most effective?

Which public policy initiatives are most effective in promoting improvement on campuses?

© TWBANTA-IUPUI

Shared Reflective Practice

Conduct meta-evaluations of approaches to assessment

Determine what works best within disciplines

Develop consortia of institutions to provide forums for reflection

© TWBANTA-IUPUI

Engaging Faculty

Introduce assessment as research

Connect assessment with the scholarship of teaching

Support learning about assessment through faculty development

© TWBANTA-IUPUI

Targets for Research on Engaging Faculty

How can we determine the interests and commitments of stakeholders?

How should we educate stakeholders for choosing methods?

How can we reduce costs and maximize assessment’s benefits?

What ethical principles should guide our work?

Derived from Michael Quinn Patton’s
Utilization-Focused Evaluation (1997)

© TWBANTA-IUPUI