




Information Literacy Learning Outcomes and Student Success
by Sue Samson
Available online 30 March 2010

Information literacy learning outcomes of randomly selected first-year and capstone students were analyzed using an assessment instrument based on the ACRL competency standards. Statistically significant differences between student populations in the selective and relative use of information inform the library instruction program and apply to research and teaching libraries.

Sue Samson is Professor and Head, Information and Research Services, Mansfield Library, The University of Montana, Missoula, MT, USA <[email protected]>.

202 The Journal of Academic Librarianship, Volume 36, Number 3, pages 202–210

INTRODUCTION

The critical need for students to be knowledgeable about finding, retrieving, analyzing, and effectively using information has become a campus-wide issue fostered by librarians and underscored by regional accreditation standards. Accreditation standards not only define information literacy but also include learning outcomes, emphasize the importance of collaboration between librarians and faculty, and are based on the competency standards of the Association of College and Research Libraries (ACRL).1 As information literacy has become an important aspect of higher education, the need exists for authentic assessment models to identify learning outcomes.

The iterative process of information literacy requires evaluation of information, critical thinking, revision, and integration and is not well addressed by standardized testing.2 Among alternative assessment methodologies is the process of evaluating student work directly. This methodology focuses on student-constructed use of information within the context of the curriculum and has the potential to examine changes in learning outcomes across disciplines and throughout the college experience.

The objectives of this study were to: 1) quantify learning outcomes based on the five ACRL standards using student portfolios; 2) compare the selective use of information resources between first-year and capstone students; 3) compare the relative use of information resources between first-year and capstone students; and 4) identify significant patterns of learning outcomes to inform the library's information literacy curriculum.

LITERATURE REVIEW

The term information literacy became part of the American Library Association (ALA) lexicon in 1989 as part of the Final Report of the ALA's Presidential Committee on Information Literacy. Learning outcomes in relation to information literacy appear in the published library literature as early as 1998,3 and more regularly shortly after 2000 with the official ratification of information literacy competencies by the ACRL. Interest in learning outcomes has further expanded as regional academic accreditation standards have begun to focus on learning outcomes. By 2004, the Middle States Association for regional academic accreditation “refers explicitly to the ACRL standards of information literacy, and incorporates them into their own expectations for learning outcomes, confirming the trend toward a more explicit endorsement of information literacy in general, and the ACRL standards in particular.”4

Information literacy-related behaviors have also been included on an experimental basis in the 2006 National Survey of Student Engagement (NSSE), providing interesting results of correlations between information literacy scales and other NSSE measures.5

In addition to the NSSE experiment, other national and regional assessment initiatives that survey large groups of students across



academic institutions provide indicators of student learning outcomes. The Educational Testing Service (ETS) reports that many college students lack the information and communication technology (ICT) literacy skills necessary to navigate, evaluate, and use information, based on the administration of its iSkills assessment, an Internet-based assessment that gauges cognitive skills within a digital environment.6 Another assessment instrument developed for large-scale assessment and based on the ACRL Information Literacy Competency Standards for Higher Education is Project SAILS. While the ETS ICT Literacy Assessment was based on the Standards, Project SAILS has mapped its skill sets to outcomes and objectives of the Standards.7 Though promising, analyses of SAILS have the limitation of providing only cohort-level information, and applications of results are not yet appearing in the literature.8

Kuh and Gonyea9 report on the College Student Experiences Questionnaire, which provides data from 300,000 students from 1984 to 2002. The results of this exploratory study indicate that library experiences of undergraduates positively relate to select educationally purposeful activities and that the library appears to be a positive learning environment for all students, especially members of historically underrepresented groups. An example of a system-wide assessment instrument is the South Dakota Information Literacy Exam, a home-grown, dual-measure instrument developed for a system of small and medium-sized universities and required to be completed by all students to document and assess their information literacy skills.10 As they have validated and vetted the instrument, the authors have also developed a generic version available to other similar systems. An example of an institution-based assessment is the Information Seeking Skills Test, a computer-administered competence test that measures the ability to find and evaluate information and is required of all freshmen at James Madison University.11

Student learning outcomes are described by Gratch-Lindauer12 as one of three arenas of information literacy assessment. The other two arenas are the learning environment and information literacy program components. Although these arenas are complementary and not isolated, this review of learning outcomes literature identified multiple methodologies for making a connection between student academic success and level of information literacy knowledge. Tancheva et al.13 emphasize the importance of using a combination of attitudinal, outcomes-based, and gap-measure assessments to address the shortcomings of any one of these. Other authors report a lack of success in their methodology. One study gauged the effectiveness of an information literacy program by comparing average course grades in course sections that received information literacy instruction to grades in sections that did not and found no clear benefit to the instruction.14 McMillen and Dietering15 found no single assessment tool or method proved adequate to effectively measure student learning happening both inside and outside of their library.

Working from the broad perspective of assessing information literacy in general education, Mackey and Jacobson16 found course-specific strategies enhance institutional assessment efforts by providing a range of instruments to measure information literacy within unique educational contexts. The use of electronic portfolios along with rubrics enabled librarians at Washington State University Vancouver to evaluate their information literacy program based on ACRL best practices guidelines, authentic assessment techniques, and the tenets of phenomenography, a methodology that is very adaptable to the assessment of information literacy.17

A recent example of pre- and post-testing with 60-minute sessions establishes this methodology as helpful in identifying areas in which instruction could be improved and where positive results were being achieved.18 Rockman19 reports on the critical issue of integrating information literacy into the learning outcomes of academic disciplines. An example is documented in a 3-year study formulated to

test the efficacy of a curriculum designed to foster information literacy skills of graduate students in a chemistry bibliography course.20 An assessment tool was given to students at the beginning and at the end of the semester, and results indicated marked improvements in the average student scores from all 3 years.

Relevant to the focus of this research project, Knight21 reports in a study using rubrics to assess information literacy that “students' academic work is a useful gauge of their achievement of information literacy-based learning outcomes” and that a “rubric is a valuable assessment tool that provides a reliable and objective method for analysis and comparison.” Three additional studies document the validity of this assessment methodology. Findings of an assessment by portfolio validated this methodology in a study of an information literacy module.22 Scharf et al.23 further support the validity of using a rubric and portfolio for program analysis in their careful and rigorous study of the effectiveness of information literacy instruction for undergraduates at a technological university. Finally, citation analysis of final-year project reports provides further documentation of the value of this methodology in identifying information literacy learning outcomes.24

INFORMATION LITERACY CURRICULUM

With an undergraduate student population of 11,799, a graduate student population of 2,059, and 581 full-time faculty, the University of Montana (UM) is a mid-size, coeducational, doctoral university. The information literacy curriculum at the Maureen and Mike Mansfield Library is based on a rubric that identifies learning initiatives and outcomes for first-year through graduate-level coursework. The strategic integration of information literacy into the curriculum begins with first-year initiatives that serve as the basis for information literacy instruction in the disciplines at the junior and senior levels. During fiscal year 2009, 420 curriculum-integrated classes were taught to 9,513 students; and, in addition, 3 credit classes were offered.

First-year curriculum is based on a model of teaching the teachers, and integration decisions have been made on the basis of several factors: courses that are a part of the standard university curriculum; courses with a research component, usually smaller-enrollment classes; and courses with a large enrollment through participation in the Freshman Interest Group program, which offers the opportunity to provide cross-disciplinary information literacy instruction. Specific standards and teaching strategies have been identified for targeted courses to establish quality learning opportunities for first-year students. At every opportunity, librarians seek to serve as research consultants and pedagogical guides and to facilitate the successful delivery of information literacy content by teaching faculty in the disciplines. Targeted first-year courses include: English Composition, Introduction to College Writing, Introduction to Public Speaking, Critical Writing, Freshman Interest Groups, Freshman Seminar, and Honors College Seminar.

Based on the delivery of lower-division information literacy instruction, liaison librarians work collaboratively with faculty in all departments, schools, and colleges to tailor advanced information literacy instruction to upper-division students in their major studies. Liaison librarians target research and writing courses in all majors and facilitate the successful delivery of information literacy content through collaboration with faculty that includes: integration of information literacy standards into the curriculum and learning outcomes of individual academic units; provision of consultative services to teaching faculty to develop curriculum-integrated library research assignments; promotion of instruction in the use of library resources to students and faculty, integrating the tiered Library Information Literacy Curriculum; creation of web-based subject resources for faculty, students, and staff; and provision of regular, advertised office hours, scheduled reference assistance, and small-group instruction sessions as part of the Learning Commons.

May 2010 203


Table 1 [ACRL Standard One: Know] Comparison of the Selective Use of Information Sources and Formats by Student Populations

Sources and Formats       First-year English Composition N=16   Senior Capstone N=16   Total   χ²       α
Books                     5                                     8                      13      1.373    < .05
Films                     2                                     0                      2       2.667    < .05
Images                    2                                     4                      6       1.030    < .05
Interviews                3                                     4                      7       .366     < .05
Newspapers                8                                     1                      9       7.729    > .01
Peer-reviewed articles    8                                     8                      16      .124     < .05
Primary sources           0                                     10                     10      14.69    > .01
Popular magazines         0                                     0                      0       .032     < .05
Reference tools           1                                     1                      2       .534     < .05
Recordings                0                                     0
Trade publications        0                                     0
Web sites                 13                                    7                      20      4.934    > .05
Wikipedia                 3                                     0                      3       3.679    < .05
Total sources             45                                    43                     88

METHODOLOGY

The potential survey population consisted of students 18 years old and older enrolled in English Composition during fall semester 2007 and in capstone courses across the curriculum during spring semester 2008. Four sections of English Composition and four capstone classes were randomly selected for analysis. All sections of English Composition were selected from the ENEX 101 class number sequence. Capstone classes were identified from the university catalog and were required to use the term capstone as part of either the class title or class description. The randomly selected capstone courses included Capstone in Rural and Environmental Change (SOC 460), Senior Thesis (ECON 489), Legal Research and Writing (LEG 287T), and Photojournalism Senior Seminar (JOUR 481).

Within each class section, four students were randomly selected for a representative sample of 16 students in the first-year composition classes and 16 students in the capstone classes to serve as the study group. Student identification numbers were captured to establish quantitative points of comparison between information literacy and student demographics and to facilitate a longitudinal assessment upon graduation of current first-year students during phase 2 of the study. The Institutional Review Board reviewed and approved the project, and all participants provided informed, signed consent forms.

“For each student [surveyed], class portfolios and/or final research projects were collected [and] class grades were recorded to measure student success within the curriculum ….”

For each student, class portfolios and/or final research projects were collected for analysis using an assessment instrument. Based on the Information Literacy Competency Standards for Higher Education,25 the instrument assesses the performance indicators of each of five standards with quantifiable measures. In addition, class grades were recorded to measure student success within the curriculum of the class as a point of comparison to information literacy standards. Data were collected using a web-based form and stored in a Sequel database.

Two analyses of the data were conducted. The relationship between student status and the use of information resources was tested using the chi-square formula with Yates's correction for small sample size.26

A chi-square value greater than the critical value at the .05 level was considered evidence of a significant difference between the student populations. The rank-sum test was used to compare the relative use of each resource.27 A one-sided p-value of < .05 was used to provide evidence that a comparative difference existed between the student groups in the use of specific resources.
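The chi-square test with Yates's continuity correction for a 2×2 table can be sketched in a few lines; the article does not specify its software, so the following is a minimal pure-Python illustration of the cited formula, not a reproduction of the study's computation.

```python
from math import nan

def yates_chi_square(a, b, c, d):
    """Chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]] with Yates's continuity correction."""
    n = a + b + c + d
    denom = (a + b) * (c + d) * (a + c) * (b + d)
    if denom == 0:
        return nan  # a degenerate table has no defined statistic
    adj = max(abs(a * d - b * c) - n / 2, 0)  # continuity correction
    return n * adj * adj / denom

# Hypothetical counts in the spirit of Table 1: 8 of 16 first-year
# students cited newspapers versus 1 of 16 capstone students.
chi2 = yates_chi_square(8, 8, 1, 15)
print(round(chi2, 3))  # 5.565, which exceeds the .05 critical value of 3.841
```

The correction subtracts n/2 from |ad − bc| before squaring, which makes the test more conservative for the small cell counts (N = 16 per group) used in this study.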

RESULTS

Standard One: The Information Literate Student Determines the Nature and Extent of the Information Needed

The quantifiable measures used to assess the performance of students relative to their knowledge of information resources were the types of sources and formats used by students in the bibliographies of their research papers.

Chi-square analysis of individual use of information sources and formats suggests a statistically significant difference in the use of newspapers, primary sources, and web sources between the first-year and capstone student populations (Table 1). First-year students cited newspapers and web sources significantly more than capstone students, and capstone students cited primary sources significantly more than first-year students. Differences in the use of Wikipedia approached significance, with first-year students citing this source


more than capstone students. The numbers of first-year and capstone students who cited books, images, interviews, peer-reviewed articles, and reference tools in their bibliographies are not significantly different between the populations. Neither student population used popular magazines, recordings, or trade publications.

Using the rank-sum test to analyze the relative use of information sources and formats based on the total number of sources cited by each student population provides another viewpoint of the data (Table 2). Small p-values provide convincing evidence that a significant difference exists between the student populations in the total number of citations and total number of primary sources. Nearing significance is the total use of books and images. First-year students used significantly fewer total citations and primary sources than did capstone students, while capstone students used books and images more frequently than first-year students. Use of total number of unique formats, films, interviews, newspapers, peer-reviewed articles, reference tools, and web sources was not significantly different.
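The rank-sum comparison reported in Table 2 can be sketched with the normal-approximation Z statistic for the Wilcoxon rank-sum test. This is a minimal sketch under stated assumptions (average ranks for ties, no tie-variance correction), since the article does not detail its implementation; the sample data below are hypothetical, not the study's.

```python
from math import sqrt

def rank_sum_z(x, y):
    """Normal-approximation Z for the Wilcoxon rank-sum test,
    comparing sample x against sample y (average ranks for ties)."""
    combined = sorted(x + y)
    # map each value to the average of the ranks it occupies
    rank = {}
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        rank[combined[i]] = (i + 1 + j) / 2  # average of ranks i+1 .. j
        i = j
    w = sum(rank[v] for v in x)              # rank sum of the first group
    n1, n2 = len(x), len(y)
    mu = n1 * (n1 + n2 + 1) / 2              # expected rank sum under H0
    sigma = sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    return (w - mu) / sigma

# Hypothetical citation counts for two small groups:
print(round(rank_sum_z([3, 5, 6], [10, 12]), 3))  # -1.732
```

A negative Z indicates the first group's values rank lower overall, which matches the direction of the Total citations row in Table 2 (first-year students below capstone students).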

Standard Two: The Information Literate Student Accesses Needed Information Effectively and Efficiently

Access to information was quantified by the following questions (Table 3):

○ Is there evidence that a database has been used?

○ Which database(s)?

○ Is there evidence that Google Scholar has been used?

○ Is there evidence that interlibrary loan may have been used?

○ Is there evidence that the student has collected his/her own research for the project?

○ Describe the nature and extent of that research.


Table 2 [ACRL Standard One: Know] Comparison of the Relative Use of Information Sources and Formats by Student Populations

Sources and Formats       First-year English Composition N=16   Senior Capstone N=16   Z-statistic   One-sided p-value
                          Mean (Range)                          Mean (Range)
Total citations           6.4 (3–10)                            22.9 (2–49)            −2.57         .0052
Total unique formats      2.8 (1–5)                             2.9 (2–4)              −.62          .2676
Books                     0.7 (0–4)                             2.1 (0–10)             −1.39         .0823
Films                     3                                     0
Images                    0.2 (0–12)                            8.6 (0–41)             −1.54         .0630
Interviews                0.3 (0–3)                             1.0 (0–7)              −0.97         .1660
Newspapers                0.8 (0–3)                             0.1 (0–1)              2.75          .9970
Peer-reviewed articles    1 (0–5)                               6.0 (0–25)             −1.11         .1335
Popular magazines         0                                     0
Primary sources           0                                     3.4 (0–9)              −3.67         .0002
Reference tools           1                                     1
Recordings                0                                     0
Trade publications        0                                     0
Web sites                 3.1 (0–10)                            1.8 (0–12)             1.92          .9726
Wikipedia                 3                                     0
Total sources             103                                   366

Table 3 [Standard Two: Access] Comparison of Selective Access of Information Sources by Student Populations

Evidence of source    First-year English Composition N=16   Senior Capstone N=16   χ²       α
Library databases     10                                    12                     .83      < .05
Google Scholar        0                                     0                      .032     < .05
Interlibrary loan     0                                     4                      1.269    < .05
Personal research     0                                     9                      12.677   > .01
  Interviews          0                                     16 (5 students)
  Surveys             0                                     197 (1 student)
  Field research      0                                     0
  Focus groups        0                                     0
  Others              0                                     3

Databases used:
  First-year: LexisNexis (4), Quick Search (4), Academic Search Premier (3), Catalog (2), CQ Researcher, Infotrac: Expanded Academic ASAP, OmniFile, PubMed, Science Direct
  Capstone: Academic Search Premier (6), Catalog (6), JSTOR (6), Montana Rules of Civil Procedure (4), Science Direct (3), Business Search Premier (2), Infotrac: Expanded Academic ASAP (2), OmniFile (2), Educator's Reference Complete, Emerald Full Text, Environment Complete, Quick Search, Taylor and Francis Sociology



Data analysis identifies a statistically significant difference in the use of personal research by capstone students versus that of first-year students. Examples of personal research by capstone students included a survey, interviews, the application of modeling to raw data, and the final products of semester-long photojournalism projects.

Use of library databases and interlibrary loan indicates no significant differences between the two student populations. However, identification of the specific databases used by each group provides additional indications of differences between the student populations. First-year students primarily retrieved their sources from general interdisciplinary databases, including LexisNexis Academic, Quick Search (a federated search feature), Academic Search Premier, and the Library catalog, among others. Capstone students also used Academic Search Premier and the Library catalog, but then explored subject-specific databases including JSTOR, Montana Rules of Civil Procedure, Science Direct, and Business Search Premier, among others. No evidence of use of Google Scholar was identified in either student population.

Standard Three: The Information Literate Student Evaluates Information and Its Sources Critically and Incorporates Selected Information into His or Her Knowledge Base and Value System

The ability to evaluate and incorporate information was identified by the contextual application of sources into the research based on the following questions:

○ How many short direct quotes (three lines or less) are included in the text?

○ How many long direct quotes (four lines or more) are included in the text?

○ How many total in-text citations are included in the essay?

○ Are the quotes used as filler?

○ Does the author acknowledge, question, or combat possible author or publication bias in the essay?

○ How does the student accomplish this?

○ Does the author connect sources to their cultural context?

○ How does the student accomplish this?

○ Does the author create an original thesis statement using the supporting evidence s/he presents?

○ Does the author offer a new hypothesis in his/her paper that may need to be tested in further studies?

○ Does the author investigate different viewpoints offered in the literature?

Table 4 [Standard Three: Evaluate] Comparison of Selective Evaluation of Information Resources within the Context of Student Projects

Contextual Application            First-year English Composition N=16   Senior Capstone N=16   χ²      α
Short quotes                      14                                    16                     2.667   < .05
Long quotes                       9                                     10                     .259    < .05
In-text citations                 14                                    12                     .818    < .05
Relevance                         15                                    16                     2.065   < .05
Use of quotes as filler           4                                     0                      6.065   > .05
Recognize publication bias        6                                     15                     7.729   > .01
Include cultural context          16                                    15                     2.067   < .05
Original thesis statement         15 (93.8%)                            15 (93.8%)             .534    < .05
Offer new hypothesis              1 (6.3%)                              6 (37.5%)              3.314   < .05
Investigate differing viewpoints  6 (37.5%)                             3 (18.8%)              .14     < .05

Examples of publication bias — First-year: identified the differing points of view provided by authors on fire and management; digital photography; AIDS orphans in Africa; reality TV; alcohol advertising to teens; and juvenile incarceration. Capstone: uses primary source data to structure debate; provided alternate readings of Montana law; included interview responses on both sides of issue.

Examples of context — First-year: current topics used in context of everyday experiences. Capstone: use of specific examples; provides history, context, and examples; use of primary sources in legal context; photographic images in context of subject content.

Using chi-square analysis, statistical differences were found between the student populations in their use of quotes as filler and in their recognition of publication bias (Table 4). First-year students were more likely to use quotes as filler, and capstone students were more likely to identify publication bias within the context of their research papers. Publication bias was documented within the text by


Table 5 [Standard Three: Evaluate] Comparison of Relative Evaluation of Information Resources within the Context of Student Projects

Contextual Application   First-year English Composition N=16   Senior Capstone N=16   Z-statistic   One-sided p-value
                         Mean (Range)                          Mean (Range)
Short quotes             5.4 (0–19)                            4.4 (1–12)             .1712         .5675
Long quotes              1.7 (0–6)                             1.5 (0–5)              0             .5000
In-text citations        12.3 (0–26)                           29.1 (0–128)           .5285         .6985

an awareness of an angle that a particular author might present; clarity on the various viewpoints involved with an issue; interview responses on both sides of an issue; and/or the use of primary source materials as a way to structure the debate. Nearing statistical significance was the ability of capstone students to offer a new hypothesis based on their research.

In all other aspects of evaluating and incorporating information into their research, no statistical differences between the student populations were identified. The student populations were similar in their use of short quotes, long quotes, in-text citations, relevance of quotes, providing cultural context, using an original thesis statement, and investigating differing viewpoints. Cultural context was further extrapolated by analysis. First-year students used current topics in the context of everyday experiences; capstone students used specific examples, historical context, primary sources in a legal context, and photographic images in the context of their subjects.

The rank-sum test did not further identify differences in the use of short quotes, long quotes, or in-text citations by the two student populations (Table 5).

Standard Four: The Information Literate Student, Individually or as a Member of a Group, Uses Information Effectively to Accomplish a Specific Purpose

Attempts to capture the effective use of information were quantified with three questions (Table 6):

○ Does the student organize the content in a manner that supports the purposes and format of the product or performance?

○ Does the student maintain a journal or log of activities related to the information seeking, evaluating, and communicating processes?

○ Does the student reflect on past successes, failures, andalternative strategies?

Table 6 [Standard Four: Use] Frequencies of How Students Organize and Proceed with the Research Process

Process                          First-year English Composition N=16   Senior Capstone N=16
Organization supports project    15 (93.8%)                            16 (100%)
Journal maintained               15 (93.8%)                            0 (0.0%)
Reflection on process success    16 (100%)                             3 (18.8%)

With the exception of one first-year research paper, all research projects were determined to be organized effectively in support of their project goals. All but one of the first-year students maintained a journal of activities related to the information seeking process but did not expand their content to include the processes of evaluation or communication. None of the capstone students included a journal or log as part of their final project. Finally, all first-year students included some form of written reflection on the writing process, and 18.8% of capstone students included similar reflections.

Standard Five: The Information Literate Student Understands Many of the Economic, Legal, and Social Issues Surrounding the Use of Information and Accesses and Uses Information Ethically and Legally

Two questions were used to quantify student understanding of using information ethically and legally (Table 7):

○ Does the student select an appropriate documentation style and use it consistently to cite sources?

○ Does the student post permission granted notices, as needed, for copyrighted material?

Both the first-year and capstone students used an appropriate documentation style 87.5% of the time. The use of permission granted notices was not applicable to the research papers of the first-year students, whose research topics and findings did not go beyond quotations and/or paraphrasing and subsequent documentation. However, 43.8% of the capstone students presented copyrighted material without posting permission granted notices.

DISCUSSION AND IMPLICATIONS FOR INSTRUCTION

Importantly, many variables exist within a population of students that might affect their information literacy learning outcomes. In this research project, student assignments varied both in the expectation

Table 7 [Standard Five: Ethical/Legal] Frequencies of the Use of Appropriate Documentation and the Appropriate Use of Copyrighted Materials

Documentation                    First-year English Composition N=16   Senior Capstone N=16
Appropriate documentation style  14 (87.5%)                            14 (87.5%)
Permissions not provided         NA                                    7 (43.8%)

May 2010 207

Page 7: Information Literacy Learning Outcomes and Student Success

of basic first-year versus advanced capstone research and in thenature of the capstone assignments among disciplines. There are alsomany variables in place relative to student experiences during theiracademic careers that may impact learning outcomes. However, evenwith these variables in place, the data document the elements of theinformation literacy rubric and serve as a guideline to determine areasof student success that match the rubric and areas where informationliteracy instruction can be strengthened.

Standard One: Know

Both the statistical differences and the similarities in the ability of these two student populations to determine the nature and extent of information needed reflect the targeted rubric of the Mansfield Library information literacy curriculum. Although first-year students cited statistically more newspapers and web sites, they also cited books, images, interviews, peer-reviewed articles, and reference tools to support their findings. Information literacy instruction is integrated into the first-year curriculum of English Composition through a model of teach-the-teachers.28 This instruction is focused on students as scholars and their ability to identify different types of sources, to evaluate sources for their appropriateness in academic research, to examine differing viewpoints, and to explore the use of library resources through the use of databases and the library catalog. Thus, the findings support that the information literacy curriculum is guiding the students to the use of library resources as a complement to web sites, an important first step in the role of students as scholars.

“the findings support that the information literacy curriculum is guiding the students to the use of library resources as a complement to web sites, an important first step in the role of students as scholars.”

Capstone students used significantly more total citations and more primary sources, books, and images to document their research. The Library's information literacy curriculum also targets upper-division writing and research classes with sessions that include choosing the appropriate types of sources, the value of original scholarship, citation tracking, critical evaluation of quotes, and tracing quotations to original sources. While capstone students cited statistically more primary sources, they also used a range of resources appropriate to their research topics. The data provide convincing evidence that information literacy equips capstone students to conduct in-depth research. While the use of images and interviews was influenced in terms of numbers by the photojournalism capstone class, these formats were not used more frequently by individual students in one group than the other.

The data document that both student populations were similar in the range of different formats used and in their use of interviews, peer-reviewed articles, reference tools, and web sources, and that neither population cited popular magazines, recordings, or trade publications. This aligns well with the library's information literacy curriculum.

Standard Two: Access

Students in both groups provided evidence of an ability to access needed information. While capstone students used statistically more personal research to complete their projects, there was no statistical difference between the groups in their use of library databases. This similarity again reflects the information literacy curriculum and its efforts to teach students to maximize their research through the use of appropriate library resources. Students in both populations located relevant databases most appropriate for their topics and the nature of


their research. First-year students used the federated quick search that incorporates four interdisciplinary databases and the library catalog. As a result, they discovered and accessed peer-reviewed articles, newspaper articles, and books. Access to this search feature is highlighted in their information literacy instruction. In contrast, capstone students focused their research on subject-specific databases and built sizeable bibliographies of quality sources. Subject resources are a primary focus of information literacy instruction to upper-division students. Subject databases are less likely to index newspapers but are more likely to index book chapters as part of their targeted literature review.

Although available to all students, the use of interlibrary loan is not targeted to first-year students in information literacy instruction, based on the premise that library collections support the level of research required to complete first-year research papers. First-year students in this study showed no evidence of using interlibrary loan. In contrast, interlibrary loan use is encouraged for in-depth research and included in the information literacy rubric for upper-division students. One-fourth of the capstone students showed evidence of using interlibrary loan, while many were well supported by e-journal subscriptions that provided full text of peer-reviewed articles.

Over half of the capstone students incorporated personal research into their projects. All of the photojournalism students were required to complete personal research that included interviews of their subjects, but the Sociology and Economics students also included surveys and the application of models to raw data to build their arguments. The Legal Research students interpreted and applied primary source law to specific legal cases. This statistically significant use of personal research in capstone classes across the disciplines can be used to augment information literacy instruction.

No evidence was provided for the use of Google Scholar by either student population. While this is a more challenging interpretation of the data, Google Scholar is an important research tool that indexes a wide range of scholarly sources and provides links to library resources. This provides another important indicator of how the information literacy curriculum can be strengthened by incorporating Google Scholar into the instruction matrix.

Standard Three: Evaluate

The critical evaluation of sources and their incorporation into research was evidenced in multiple ways. Although first-year students were statistically more likely to use quotes as filler, they were statistically similar to capstone students in their use of short quotes, long quotes, and in-text citations. These data are very reflective of the focus of information literacy instruction in first-year and capstone classes. The information literacy rubric for first-year students includes: recognize different information resources and explain the values and differences between them; construct in-text citations and a bibliography, inclusive of all source types and formats; and explain the importance of citing research sources and academic honesty. At the 400-level, the rubric includes: construct advanced searches using controlled vocabularies and Boolean operators; execute cited reference searches; and recognize and explain the value of tracking citations forward and backward.

Capstone students were statistically more likely to recognize publication bias. This is significant since it is reflective of upper-division information literacy instruction. For capstone students, whose research is more in-depth, information literacy instruction includes identifying important scholars in a field of study, reviewing the quality and availability of information, identifying gaps in research, and noting the value of original scholarship.

It is particularly interesting to note that both groups provided cultural context for their research—first-year students citing concrete examples for their current topics, and capstone students expanding their examples to include specific examples, historical context, and primary source materials. Within this cultural context, both student populations examined differing points of view. Information literacy instruction to first-year students encourages them to examine alternate points of view as a way of strengthening their arguments—explore the other side of an issue in order to present your ideas with awareness of alternative viewpoints. All students provided an original thesis statement, with capstone students more likely to offer a new hypothesis based on their research project. Thus, the extent to which these student populations provided cultural context, identified differing points of view, provided an original thesis, and offered a new hypothesis may well be linked to the level of research consistent with assignment expectations and to information literacy instruction.

Standard Four: Use

Of all five standards, evidence in support of using information effectively was the most difficult to quantify. In general, the organization of the research papers and projects of all students supported the purpose and format of the research product. A notable indicator for strengthening information literacy instruction was with the photojournalism projects, which could have been greatly enhanced with the addition of a bibliography. Although students were required to tell a story with their own photographs, an informed bibliography would have provided a degree of credibility to their knowledge base about the topics of the photographs. Information literacy instruction to this group of students should begin with a meeting with their professor to discuss the possibilities for adding this to the assignment.

The use of a journal or log of research activities was a portfolio requirement for first-year students as part of the integration of information literacy into their writing curriculum, and they completed this with a modest level of success. This element of information literacy could be improved significantly with a structured research log that solicits specific feedback for the processes of information seeking, evaluating, and communicating. A well-structured log would provide a wealth of information for the librarian working with English Composition classes and could be used for further refining the curriculum and identifying those areas most successful and those most in need of additional instruction. Capstone students were not required to complete a research log and did not do so.

First-year students were also required to reflect on past successes, failures, and alternative research strategies. They included these comments as part of their writing portfolios but, again, with modest success. Replacing these reflections with the well-structured log described in the previous paragraph could provide a successful method of further integrating information literacy into the writing curriculum and provide good information on how best to succeed with first-year students. Capstone students were not required to include reflections, but almost half did so as part of their research projects. This is an interesting aspect of in-depth research that inherently evolves as students seek and critically evaluate information.

Standard Five: Ethics

The ethical and legal use of information provided very interesting analysis. All students used an appropriate documentation style and provided sources for their research findings. These sources ranged from free web sites to peer-reviewed articles to primary source materials. The least well-documented sources were web sites, which did not always include complete citation information but did always include an identifiable uniform resource locator. However, permission granted notices for copyrighted materials were not used by any students in this study, an aspect of the legal use of information that is clearly not well understood by students or teaching faculty. The use of permission granted notices was not applicable to first-year

students whose research papers did not include content from copyrighted materials. The primary oversight occurred with the photojournalism student projects. A photograph is considered intellectual property and is automatically copyrighted to the photographer as soon as it is created.29 Therefore, it is important for both the photographer and the subject to have a photo copyright release form signed to document the release of rights on certain photographs for a certain amount of time or for a specific purpose. This would be an excellent way to incorporate information literacy into a capstone photojournalism project to teach students about the value of their creativity and the importance of legally protecting their products.

CONCLUSIONS

This paper validates the methodology that assesses information literacy learning outcomes using writing portfolios and research paper bibliographies. It also quantifies learning outcomes and establishes benchmarks for first-year and capstone student populations, offers conclusions based on comparative data between the student populations, informs the library's information literacy instruction program, and correlates identifiable learning outcomes within the established information literacy rubric.

“This paper … quantifies learning outcomes and establishes benchmarks for first-year and capstone student populations, offers conclusions based on comparative data between the student populations, informs the library's information literacy instruction program, and correlates identifiable learning outcomes within the established information literacy rubric.”

Similar to Scharf et al.,30 the study provided a quantitative baseline assessment of information literacy competencies of a representative sample of students. Importantly, findings provide a substantive link between information literacy learning outcomes and the information literacy curriculum rubric in place at the Mansfield Library. While many variables exist within a population of students that might affect their information literacy, the data reflect the elements of the information literacy rubric and serve as a guideline to determine areas of student success and areas where information literacy instruction can be strengthened. Similar to findings by Knight,31 the study indicates that student work product is an extremely useful gauge of the role of library instruction in higher education.

Benchmarks of learning outcomes for first-year and capstone student populations were compared to the Library's information literacy rubric that is based on the ACRL standards. The students in each population demonstrated the ability to locate and evaluate information in support of their research. There were statistically significant differences between the two populations relative to the depth and demands of their research but not in their abilities to locate and cite appropriate resources that met the assignment requirements. However, similar to other studies using this methodology, there were indications that certain areas of instruction need to be strengthened.32 Of particular note were the effective use of information and the ethical use of information.

While these findings can be extrapolated through the lens of information literacy instruction, they also reflect the inherent expectations of research results unique from first-year to capstone classes. Teaching faculty form the conduit for the integration of information literacy into the curriculum. Their expectations for acceptable research have a direct effect on the extent to which students will explore quality sources. While findings cannot be directly connected to information literacy instruction, they do make clear that student research reflects the primary focus of the information literacy curriculum, that first-year students will go well beyond Google, and that capstone students attain higher-level research skills.

Acknowledgments: The author extends sincere appreciation to Fred B. Samson, Ph.D., for assistance with statistical analysis; to Charlie Potter, Humanities Collections Librarian, The Libraries of the Claremont Colleges, CA, for assistance with the design of the project; to Wes Samson, Systems Engineer, Mansfield Library, The University of Montana, for data compilation; and to Patti McKenzie, Reference Technician, and Megan Stark, Undergraduate Services Librarian, Mansfield Library, The University of Montana, for assistance with portfolio collection.

NOTES AND REFERENCES

1. Association of College and Research Libraries, Information Literacy Competency Standards for Higher Education. Online. American Library Association. 2000. Available: http://www.ala.org/ala/mgrps/divs/acrl/standards/informationliteracycompetency.cfm (August 17, 2009).

2. Davida Scharf, et al., “Direct Assessment of Information Literacy Using Writing Portfolios,” Journal of Academic Librarianship 33 (July 2007): 462–478.

3. Ross Todd, “From Net Surfers to Net Seekers: WWW, Critical Literacies and Learning Outcomes,” Teacher Librarian 26 (2, 1998): 16–21.

4. Laura Saunders, “Perspectives on Accreditation and Information Literacy as Reflected in the Literature of Library and Information Science,” Journal of Academic Librarianship 34 (July 2008): 305–313.

5. Bonnie Gratch-Lindauer, “Information Literacy-related Student Behaviors,” College & Research Libraries News 68 (July/August 2007): 432–436, 441.

6. Ilene F. Rockman & Gordon W. Smith, “Information and Communication Technology Literacy: New Assessment for Higher Education,” College & Research Libraries News 66 (September 2005): 587–589;
Irvin R. Katz, “Testing Information Literacy in Digital Environments: ETS's iSkills Assessment,” Information Technology and Libraries (September 2007): 3–12.

7. Shelley Gullikson, “Faculty Perceptions of ACRL's Information Literacy Competency Standards for Higher Education,” Journal of Academic Librarianship 32 (November 2006): 584–592.

8. Paula McMillen & Anne-Marie Deitering, “Complex Questions, Evolving Answers: Creating a Multidimensional Assessment Strategy to Build Support for the ‘Teaching Library’,” Public Services Quarterly 3 (1/2, 2007): 57–82;
Carol A. Leibiger & William E. Schweinle, “The South Dakota Information Literacy Exam: A Tool for Small and Medium-sized Universities to Document and Assess Information Literacy,” presentation at the Association of College and Research Libraries Thirteenth Annual Conference, Seattle, WA, 2007.


9. George D. Kuh & Robert M. Gonyea, “The Role of the Academic Library in Promoting Student Engagement in Learning,” College & Research Libraries 64 (July 2003): 256–282.

10. Leibiger and Schweinle, “The South Dakota Information Literacy Exam.”

11. Christine E. DeMars, Lynn Cameron, & T. Dary Erwin, “Information Literacy as Foundational: Determining Competence,” The Journal of General Education 52 (4, 2003): 253–265.

12. Bonnie Gratch-Lindauer, “The Three Arenas of Information Literacy Assessment,” Reference & User Services Quarterly 44 (Winter 2004): 122–129.

13. Kornelia Tancheva, Camille Andrews, & Gail Steinhart, “Library Instruction Assessment in Academic Libraries,” Public Services Quarterly 3 (1/2, 2007): 29–56.

14. Priscilla Coulter, Susan Clarke, & Carol Scamman, “Course Grade as a Measure of the Effectiveness of One-shot Information Literacy Instruction,” Public Services Quarterly 3 (1/2, 2007): 147–163.

15. McMillen and Deitering, “Complex Questions.”

16. Thomas P. Mackey & Trudi E. Jacobson, “Developing an Integrated Strategy for Information Literacy Assessment in General Education,” Journal of General Education 56 (2, 2007): 93–104.

17. Karen R. Diller & Sue F. Phelps, “Learning Outcomes, Portfolios, and Rubrics, Oh My! Authentic Assessment of an Information Literacy Program,” portal: Libraries and the Academy 8 (1, 2008): 75–89.

18. Christine Furno & Daphne Flanagan, “Information Literacy: Getting the Most From Your 60 Minutes,” Reference Services Review 36 (3, 2008): 264–271.

19. Ilene Rockman, “Integrating Information Literacy in the Learning Outcomes of Academic Disciplines,” College & Research Libraries News 64 (October 2003): 612–615.

20. Ada Emmett & Judith Emde, “Assessing Information Literacy Skills Using the ACRL Standards as a Guide,” Reference Services Review 35 (2, 2007): 210–229.

21. Lorrie A. Knight, “Using Rubrics to Assess Information Literacy,” Reference Services Review 34 (1, 2005): 43–55.

22. Valerie Sonley, et al., “Information Literacy Assessment by Portfolio: A Case Study,” Reference Services Review 35 (1, 2007): 41–70.

23. Scharf, “Direct Assessment of Information Literacy.”

24. N.N. Edzan, “Analysing the References of Final Year Project Reports,” Journal of Educational Media & Library Sciences 46 (Winter 2008): 211–231.

25. Association of College and Research Libraries, “Standards.”

26. Jack Levin & James Alan Fox, Elementary Statistics in Social Research: The Essentials. Second ed. Pearson Education, Inc., Boston, MA, 2007.

27. Fred L. Ramsey & Daniel W. Schafer, The Statistical Sleuth: A Course in Methods of Data Analysis. Duxbury Press, Belmont, CA, 1997.

28. Sue Samson & Kim Granath, “The Integration of Reading, Writing, and Research: Added Value to University First-year Experience Programs,” Reference Services Review 32 (2, 2004): 149–156.

29. Michelle A. Bruck, “How to Write a Photo Copyright Release Form.” Online. eHow™. 2009. Available: http://www.ehow.com/how_5125783_write-photo-copyright-release-form.html (July 12, 2009).

30. Scharf, “Direct Assessment of Information Literacy.”

31. Knight, “Using Rubrics to Assess Information Literacy.”

32. Ibid.; Scharf, “Direct Assessment of Information Literacy.”