Assessing Information Literacy Skills: A Look at Project SAILS
Joseph A. Salem, Jr., Kent State University
ARL New Measures Initiatives, CREPUQ
February 11, 2005
Context
Explosion of interest in information literacy
Accountability
Assessment
  Formative: for planning/improvement
  Summative: evidence/documenting
What Is an Information Literate Person?
Information Power (American Association of School Librarians): 9 standards
Big6
Information Competency Standards for Higher Education (ACRL): 5 standards, 22 performance indicators, 87 outcomes, 138 objectives
Our Questions
Does information literacy make a difference to student success?
Does the library contribute to information literacy?
How do we know if a student is information literate?
The Idea of SAILS
Perceived need: no tool available
Project goal: make a tool with these qualities:
  Programmatic evaluation
  Valid
  Reliable
  Cross-institutional comparison
  Easy to administer for wide delivery
  Acceptable to university administrators
The Project Structure
Kent State team: librarians, programmer, measurement expert
Association of Research Libraries partnership
Ohio Board of Regents collaborative grant with Bowling Green State University (part for SAILS)
IMLS National Leadership Grant: working with many institutions
www.projectsails.org
Project Parameters
Test based on ACRL document
Test development model: systems design approach
Measurement model: Item Response Theory
Tests cohorts of students (not individuals): programmatic assessment
A name: Standardized Assessment of Information Literacy Skills
Test Development
Systems design approach:
1. Determine instructional goal
2. Analyze instructional goal
3. Analyze learners and contexts
4. Write performance objectives
5. Develop assessment instrument
Walter Dick, Lou Carey, and James O. Carey. The Systematic Design of Instruction. 6th ed. Boston: Pearson/Allyn and Bacon, 2005.
ACRL Standards – Significant Challenges
Breadth and depth
Objectives are multi-part and multi-level
Habits/behaviors versus knowledge
Consider Skill Sets
Regrouping the ACRL objectives (and some outcomes)
12 sets of skills organized around activities/concepts
More closely mirrors instructional efforts?
Item Development Process
Review competencies and draft some items:
"How can I know that a student has achieved this competency?"
Formulate a question and answers. This may take several iterations.
Develop additional responses that are incorrect, yet plausible. Aim for five answers total.
Testing the Test Items
Conduct one-on-one trials: meet with individual students and talk through test items
Conduct small-group trials: administer a set of items to a group, then discuss
Conduct field trials: administer a set of items to 500+ students and analyze the data
Measurement Model
Item Response Theory, also called Latent Trait Theory
Measures ability levels
Looks at patterns of responses, both for test-takers and for items
Rasch measurement using the software program Winsteps (www.rasch.org)
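The Rasch model behind this approach can be sketched in a few lines: the probability of a correct answer depends only on the difference between a test-taker's ability and an item's difficulty, both expressed on a logit scale. The values below are illustrative, not SAILS data.

```python
import math

def rasch_probability(ability: float, difficulty: float) -> float:
    """One-parameter (Rasch) logistic model: the chance of a correct
    answer depends only on ability minus difficulty, in logits."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A test-taker whose ability matches the item's difficulty
# has an even chance of answering correctly.
print(rasch_probability(0.0, 0.0))               # 0.5

# As ability exceeds difficulty, the probability rises.
print(round(rasch_probability(2.0, 0.0), 3))     # 0.881
```

Fitting this model to cohort data (as Winsteps does) yields the ability and difficulty estimates that the reports below are built on.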
Response Pattern Example
Easier questions Harder questions
Q1 Q2 Q3 Q4 Q5 Q6 Q7 Q8
Person A C C C C C
Person B C C C C C
Person C C C C C C
Person D C C C C C C C C
C = gave correct answer
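The point of the response-pattern table can be illustrated with a short sketch: two hypothetical test-takers earn the same raw score, but a Rasch analysis distinguishes them because one pattern (easy items right, hard items wrong) is far more probable than the other. The difficulties and patterns below are invented for illustration.

```python
import math

def p_correct(theta: float, b: float) -> float:
    """Rasch probability of a correct response at ability theta
    for an item of difficulty b (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def log_likelihood(theta, responses, difficulties):
    """Log-likelihood of a response pattern (1 = correct) at a
    given ability level."""
    ll = 0.0
    for x, b in zip(responses, difficulties):
        p = p_correct(theta, b)
        ll += math.log(p if x else 1.0 - p)
    return ll

# Eight items ordered easiest to hardest (illustrative values).
difficulties = [-2.0, -1.5, -1.0, -0.5, 0.5, 1.0, 1.5, 2.0]

expected   = [1, 1, 1, 1, 1, 0, 0, 0]  # easy right, hard wrong
surprising = [0, 0, 0, 1, 1, 1, 1, 1]  # easy wrong, hard right

# Same raw score...
assert sum(expected) == sum(surprising) == 5
# ...but the expected pattern is far more probable at the same ability
# level, which is why IRT examines patterns, not just totals.
print(log_likelihood(0.0, expected, difficulties))
print(log_likelihood(0.0, surprising, difficulties))
```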
Data Reports
Based on standards and skill sets
Looking at cohorts, not individuals
Show areas of strength and areas of weakness
The Person-Item Map
Plots items according to difficulty level
Plots test-takers according to their patterns of responses
Can mark average scores for cohorts: the cross-institutional average and a specific institution's average
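A person-item map of this kind can be sketched as a simple text plot: item difficulties and cohort means share one logit scale, so you can see which items sit above or below a group's average ability. All values here are made up for illustration; a real map would come from Rasch software output such as Winsteps.

```python
# Hypothetical logit values for illustration only.
item_difficulty = {"Q1": -2.0, "Q2": -1.0, "Q3": 0.0, "Q4": 1.0, "Q5": 2.0}
cohort_mean = {"all institutions": -0.5, "our institution": 0.5}

# Walk the shared logit scale from hardest (top) to easiest (bottom),
# printing items on the left and cohort markers on the right.
points = sorted(set(item_difficulty.values()) | set(cohort_mean.values()),
                reverse=True)
for logit in points:
    items = " ".join(q for q, d in item_difficulty.items() if d == logit)
    marks = " ".join("<- mean: " + c for c, m in cohort_mean.items() if m == logit)
    print(f"{logit:+5.1f} | {items:<3} {marks}")
```

Items printed above a cohort's mean are ones that cohort is, on average, less than 50% likely to answer correctly.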
The Bar Chart
Another representation of the information
Group averages by major, class standing, etc.
Which groups are important to measure?
How do you know which differences in means are important?
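One common, generic way to judge which differences in group means matter is to report a standardized effect size alongside the raw difference. The sketch below uses Cohen's d with invented score lists; it is a standard statistical technique, not a method taken from SAILS.

```python
import math
import statistics

def cohens_d(group_a, group_b):
    """Standardized difference between two group means, using the
    pooled sample standard deviation."""
    n1, n2 = len(group_a), len(group_b)
    v1, v2 = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

# Invented cohort scores (e.g., seniors vs. first-years).
seniors = [60, 65, 70, 75, 80]
first_years = [55, 60, 65, 70, 75]

# A 5-point gap on this spread is a moderate effect.
print(round(cohens_d(seniors, first_years), 2))  # 0.63
```

Effect sizes are comparable across groups of different sizes, which raw mean differences are not.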
Current Instrument Status
158 items developed, tested, and in use
Most ACRL learning outcomes covered, except Standard 4 (uses information effectively to accomplish a specific purpose)
12 skill sets developed based on the ACRL document
IMLS Grant Status
Phase I complete: 6 institutions participated; feedback gathered from institutions
Phase II underway: 36 institutions participated
Phase III started June 2004: about 70 institutions participating; wrap-up summer 2005
Project Highlights
Discipline-specific modules
Canadian version of the instrument
Automated survey generation
Automated report generation
Next Steps for SAILS
IMLS grant period ends on September 30, 2005
Stop administering SAILS to allow analysis of the instrument:
  Does the instrument measure what we want it to?
  Are institutions getting what they need?
Next Steps for SAILS
Analyze data and input from institutions
Validate the instrument:
  Factor analysis and skill sets
  Outside criterion testing through performance testing
Test-taker characteristics: sex, ethnicity, class standing, GPA
Test administration methods
Next Steps for SAILS
Re-think how results can be presented or used: scoring for the individual, pre- and post-testing, cut scores
Administrative challenges: automate data analysis, re-engineer administrative tools, create a customer interface
Test development: develop new items
Summary
Vision: a standardized, cross-institutional instrument that measures what we think it does
To answer the question: Does information literacy make a difference to student success?
For More Information
www.projectsails.org
[email protected]
Joseph Salem: [email protected]
Mary Thompson, project [email protected]; 330-672-1658
Questions?