
Project SAILS: Facing the Challenges of Information Literacy Assessment

Julie Gedeon, Carolyn Radcliff, Rick Wiggins
Kent State University

EDUCAUSE 2004 Conference, Denver, Colorado

2

What is information literacy?

• Ability to locate, access, use, and evaluate information efficiently and effectively.

• Guiding document: “Information Literacy Competency Standards for Higher Education” – Association of College & Research Libraries
  (http://www.ala.org/ala/acrl/acrlstandards/informationliteracycompetency.htm)

3

Our questions

Does information literacy make a difference to student success?

Does the library contribute to information literacy?

How do we know if a student is information literate?

4

The Idea of SAILS

• Perceived need – no tool available
• Project goal – make a tool:

  • Program evaluation
  • Valid
  • Reliable
  • Enables cross-institutional comparison
  • Easy to administer for wide delivery
  • Acceptable to university administrators

5

Project parameters

• Test
• Systems design approach
• Measurement model – Item Response Theory
• Tests cohorts of students (not individuals)
• A name: Standardized Assessment of Information Literacy Skills

6

The project structure

• Kent State team
• Ohio Board of Regents collaborative grant with Bowling Green State University (part for SAILS)
• IMLS National Leadership Grant
• Association of Research Libraries partnership
• www.projectsails.org

7

Technical components

• Environment
• Item builder
• Survey builder
• Survey generator
• Report generation
• Challenges

8

Environment

• Linux (Red Hat)
• Apache
• MySQL
• PHP
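The slides name the stack but show no code. As a minimal sketch of the pattern they imply (PHP running under Apache and querying MySQL), with a hypothetical database, credentials, and table that are not taken from the actual SAILS system:

    <?php
    // Hypothetical LAMP example: connect to MySQL from PHP and list
    // active survey items. All names here are illustrative only.
    $db = new PDO('mysql:host=localhost;dbname=sails', 'sails_user', 'secret');
    $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    foreach ($db->query('SELECT item_id, item_text FROM items WHERE active = 1') as $row) {
        echo htmlspecialchars($row['item_text']), "\n";
    }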

9

Survey process

• Create survey questions (items)
• Create survey for this phase
• Add schools for this phase
• Schools create web front-end
• Collect data
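This workflow implies a few related kinds of records: items, phase surveys, participating schools, and collected responses. A rough sketch of such a schema, with hypothetical table and column names (the real SAILS tables are not shown in the slides):

    <?php
    // Hypothetical tables behind the survey workflow.
    $db = new PDO('mysql:host=localhost;dbname=sails', 'sails_user', 'secret');
    $db->exec('CREATE TABLE IF NOT EXISTS items     (item_id INT PRIMARY KEY, item_text TEXT)');
    $db->exec('CREATE TABLE IF NOT EXISTS surveys   (survey_id INT PRIMARY KEY, phase VARCHAR(20))');
    $db->exec('CREATE TABLE IF NOT EXISTS schools   (school_code VARCHAR(10) PRIMARY KEY, name VARCHAR(100))');
    $db->exec('CREATE TABLE IF NOT EXISTS responses (student_id VARCHAR(32), school_code VARCHAR(10), item_id INT, correct TINYINT(1))');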

10

Item Builder

11

Item maintenance

12

Survey Builder

13

Survey items

14

Random selection of items
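The slides only show a screenshot of this step. One simple way to draw a random subset of items in MySQL is sketched below; the query, table, and item count are assumptions, not necessarily how SAILS implements it.

    <?php
    // Hypothetical random draw of items for one test form.
    $db    = new PDO('mysql:host=localhost;dbname=sails', 'sails_user', 'secret');
    $items = $db->query('SELECT item_id, item_text FROM items ORDER BY RAND() LIMIT 45')
                ->fetchAll(PDO::FETCH_ASSOC);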

15

School information

16

SAILS front-end

17

Redirection to SAILS web site

Parameters passed:
• Unique student identifier
• School code
• Authorization code
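A minimal sketch of this hand-off from a school's local front end: only the three values themselves come from the slide, so the parameter names, sample values, and target path below are hypothetical.

    <?php
    // Hypothetical redirect from a school's front end to the SAILS site.
    $params = http_build_query(array(
        'student' => 'A1B2C3',         // unique, non-identifying student identifier
        'school'  => 'KSU',            // school code
        'auth'    => 'example-token',  // authorization code
    ));
    header('Location: http://www.projectsails.org/survey.php?' . $params);
    exit;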

18

Link test

19

Demographic data

20

Survey questions

21

Report process

• Send schools unique identifiers
• Upload demographics
• Scan & upload paper surveys
• Generate entire dataset file (sketch below)
• Offline IRT analysis
• Upload IRT results
• Generate reports
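A sketch of the "generate entire dataset file" step: dump the collected responses to a flat file that the offline IRT software can read. The table, columns, and file layout are hypothetical.

    <?php
    // Hypothetical export of responses to CSV for offline IRT analysis.
    $db  = new PDO('mysql:host=localhost;dbname=sails', 'sails_user', 'secret');
    $out = fopen('responses.csv', 'w');
    fputcsv($out, array('student_id', 'school_code', 'item_id', 'correct'));
    foreach ($db->query('SELECT student_id, school_code, item_id, correct FROM responses') as $row) {
        fputcsv($out, array($row['student_id'], $row['school_code'], $row['item_id'], $row['correct']));
    }
    fclose($out);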

22

Sample report text

23

Sample report graph

24

Technical challenges

• Creation of the front-end
• Customizations for schools
• Automating the data analysis
• Supporting different languages

25

Data analysis

• Item Response Theory
  • Measures ability levels
  • Looks at patterns of responses – for test-takers and for items (questions)
• Based on standards and skill sets
• Shows areas of strength and areas of weakness
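The slides do not name a specific IRT model. As one common formulation, the one-parameter logistic (Rasch) model gives the probability that a test-taker of ability theta answers an item of difficulty b correctly:

    P(\text{correct} \mid \theta, b) = \frac{1}{1 + e^{-(\theta - b)}}

Fitting a model of this kind to the response data yields ability estimates for cohorts of test-takers and difficulty estimates for individual items, which is what allows the reports to point to areas of strength and weakness.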

26

Status

• Instrument
  • 126 items developed, tested, and in use
  • Web-based and paper-based administration

• Grant project (IMLS)
  • Phase I complete – 6 institutions
  • Phase II complete – 34 institutions
  • Phase III began June 2004 – 77 institutions

27

Next steps for SAILS

• Analyze data and other input
• Administrative challenges
  • Self-reported demographic data
  • Testing environment
  • Report generation
• Does the instrument measure what we want it to?
• Are institutions getting what they need?

28

Summary

• Vision: a standardized, cross-institutional instrument that measures what we think it does

• To answer the questions:
  • Do students gain information literacy skills?
  • Does information literacy make a difference to student success?

29

For more information

[email protected]

Julie Gedeon, [email protected]
Carolyn Radcliff, [email protected]
Rick Wiggins, [email protected]

Mary Thompson, project [email protected]; 330-672-1658