This article was downloaded by: [University of North Carolina] on 11 November 2014, at 15:53.
Publisher: Routledge. Informa Ltd, registered in England and Wales, registered number 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.

Journal of Political Science Education
Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/upse20

Information Literacy and the Undergraduate Research Methods Curriculum
B. Gregory Marfleet (Carleton College) and Brian J. Dille (Mesa Community College)
Published online: 01 Sep 2006.

To cite this article: B. Gregory Marfleet & Brian J. Dille (2005) Information Literacy and the Undergraduate Research Methods Curriculum, Journal of Political Science Education, 1:2, 175-190, DOI: 10.1080/15512160590961793

To link to this article: http://dx.doi.org/10.1080/15512160590961793

Taylor & Francis makes every effort to ensure the accuracy of all the information (the “Content”) contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to, or arising out of the use of the Content.

This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions




Information Literacy and the Undergraduate Research Methods Curriculum

B. GREGORY MARFLEET

Carleton College

BRIAN J. DILLE

Mesa Community College

The ‘‘Information Literacy Competency Standards for Higher Education’’ generated by the Association of College and Research Libraries (ACRL 2000) provides a plausible set of pedagogical goals to guide the construction and implementation of an undergraduate research methods course. We argue that the context of the undergraduate methods course, with its emphasis on developing analytical and data management skills as well as exposure to concepts such as validity, reliability, measurement, and bias, should be particularly fertile ground for the development of ACRL-targeted competencies. Our paper outlines the five information literacy standards (as well as the performance indicators and outcomes associated with each) described in the ACRL paper, noting how they relate to designing student assignments and research projects. It then describes a limited natural experiment conducted using two sections of a methods course and two sections of a non-methods political science course. For one course in each of these pairs, the assignments and some additional course content were varied to reflect the information literacy standards (other course elements, including text, instructor, and most lecture content, were constant). Student scores on a standardized information literacy test indicate that information literacy-oriented courses can improve student performance on standardized competency tests. However, this improvement was equally present in both methods and non-methods courses.

Keywords information literacy, research methods, core competencies

Most American political science departments and many international relations programs have moved to include a research methods component in the undergraduate curriculum (Thies and Hogan 2003; Ishiyama and Breuning 2004). This phenomenon has been due in part to a recognition of the increasing methodological sophistication of the field and the prominence of empirical methods content in graduate programs (Barth et al. 2003). However, since only a minority of undergraduates actually pursue advanced degrees in the policy or social sciences, the curricular goal of the methods course and the proper task orientation of the methods instructor have been somewhat vague. Are we trying to make students better consumers of academic research? Inspire critical thinking? Impart specific technical skills? Or evangelize on behalf of specific epistemologies?

Paper originally prepared for presentation at the annual meeting of the American Political Science Association, August 27–31, 2003, Philadelphia, PA.

The authors would like to thank Jon Carlson and John Linantud for their helpful comments and suggestions.

Address correspondence to B. Gregory Marfleet, Carleton College, One North College St., Northfield, MN 55057. E-mail: [email protected]

Journal of Political Science Education, 1:175–190, 2005
Copyright © 2005 Taylor & Francis Inc.
ISSN: 1551-2169
DOI: 10.1080/15512160590961793

One minimum expectation of a political science methods course is that students learn how to organize, evaluate, and analyze information. This suggests integrating information literacy (IL) objectives alongside instruction on statistical inference, research design, and research methodologies. Moreover, because of the attention paid to issues of hypothesis testing, standards of evidence, sampling, validity, and reliability, as well as to the technical aspects involved in gathering, organizing, and working with data (numerical or otherwise), we anticipate that the environment of the undergraduate research methods class should be particularly conducive to the introduction of central IL concepts.

This paper explores the concept of information literacy and lists the standards suggested by the Association of College and Research Libraries (ACRL 2001). We then apply those standards to the learning objectives of the undergraduate methods course and suggest pedagogical techniques that promote information literacy in a social science setting. Finally, we test, using a standardized student assessment device, whether these methods are more effective in improving student mastery of the ACRL standards in the context of a methods course than in a non-methods course of equivalent level. We conclude that the incorporation of information literacy components into student assignments can have a positive impact (at least in the short run) on information literacy competence. However, this impact appears to be independent of the general course content. Information literacy test scores of students in research methods courses were not significantly higher than for those in the non-methods comparison group courses with an equivalent information literacy orientation.

What is Information Literacy?

Information literacy is the set of skills needed to find, retrieve, analyze, and use information. Information-literate people are those who have learned how to learn. They know how information is organized, where to find it, and how to use it. They can also distinguish between sources of high- and low-quality information and therefore research more efficiently and with higher-quality results. IL skills are becoming increasingly necessary as our society shifts from an industrial economy based on production to an economy based on information and services. This shift in society has produced a corresponding shift in what is expected from an education, and information literacy has been identified as one of the new ‘‘core competencies’’ of the 21st century (Sellen 2002).

Unfortunately, these skills have proved elusive for many. We are awash with information in our society. Information is so readily available and comes at us with such speed that we are often unable to differentiate between good and bad information or know how to use information properly. David Shenk (1997) coined the term ‘‘data smog’’ to describe what happens when too much information is available to us. We become lost in the constant stream of inputs and data, creating a barrier to good decision making and leading to heightened anxiety. Psychologists are beginning to describe a new ailment, Information Fatigue Syndrome, with symptoms of paralysis of analytical ability, heightened anxiety and self-doubt, and an increasing tendency to blame others (Bundy 1998).


This problem is acute in the undergraduate curriculum. It used to be that an instructor could send her students to the library to gather information with some assurance that, if they found their way to the building, they would be able to complete the assignment. The resources housed within the boundaries of a library were filtered: librarians had previously evaluated and purposefully selected these resources as worthy and potentially useful. Now, with a whole world of online material of dubious merit available to any student, critical evaluation skills must be taught more deliberately (Julien 2000).

Ironically, student familiarity with the internet and with search engines such as Google is no panacea for the problem. As one study noted, self-described web-experienced students tend to overestimate their capacity to find what they need online and to feel confident in the quality of information they identify on the internet (OCLC 2002). This false confidence may lead students to underestimate their need for more formal training in web research or for librarian assistance, and to engage in inefficient searches and uncritical acceptance of search result rankings (Nowicki 2003). Though most accounts are anecdotal, the consequence of the availability of high volumes of uneven information is a reduction in the quality of student writing—and sometimes even an increase in plagiarism—as students attempt to string together snippets from a variety of brief and easily accessed sources, often without any critical evaluation of their quality or potential biases (Thompson 2003).

One solution to data smog may be information literacy. The information-literate person is able to cope with the challenge of excess information availability. Able to analyze and evaluate information, such a person has the confidence to use it to make a decision. Acquiring information literacy skills also saves time: we avoid analyzing useless data (Mendelsohn 2003). From an instructional point of view, information-literate students are able to generalize research concepts and skills taught in a course and apply them to new problems and issue areas (Buchanan 2002). Regardless of one’s discipline or the specific subject matter of a course, information literacy is clearly an important objective for effective instruction.

Information literacy learning objectives were first developed by the National Council of Teachers of Mathematics (NCTM 1989), which suggested that information literacy in the mathematics curriculum involved problem solving, the use of estimation, thinking strategies for basic facts, and formulating and investigating questions from problem situations. The focus of evaluation from instructors, according to this report, should be on using information in meaningful ways to demonstrate understanding. Political science methodology courses share these goals and concerns, highlighting the applicability of these objectives to such courses. Other organizations and associations expanded on this concept throughout the 1980s and 1990s (ERIC Digest 1994).1 Since then, the idea of information literacy has received widespread acceptance, particularly within the academic library community, and over 5000 books and articles have explored the subject (Rader 2002). With the standards articulated by the ACRL in 2000, information literacy has become a primary learning objective in higher education, although information literacy instruction is still foreign to most faculty (Buchanan 2002).

The Information Literacy Competencies

Before proceeding further, it is necessary to summarize briefly the information literacy standards put forward by the ACRL. There are five competency standards, which are outlined in Table 1. In addition to articulating standards, the ACRL provides performance indicators to assess whether students have mastered a particular standard. The full description of the standards and their corresponding performance indicators is available from the ACRL (2000).

Table 1. ACRL information literacy competency standards for higher education with corresponding performance indicators

Competency Standard 1: the information-literate student determines the extent of the information needed.

Performance Indicator 1: the information-literate student defines and articulates the need for information.

Performance Indicator 2: the information-literate student identifies a variety of types and formats of potential sources for information.

Performance Indicator 3: the information-literate student considers the costs and benefits of acquiring the needed information.

Performance Indicator 4: the information-literate student reevaluates the nature and extent of the information needed.

Competency Standard 2: the information-literate student accesses needed information effectively and efficiently.

Performance Indicator 1: the information-literate student selects the most appropriate investigative methods or information retrieval systems for accessing the needed information.

Performance Indicator 2: the information-literate student constructs and implements effectively designed search strategies.

Performance Indicator 3: the information-literate student retrieves information online or in person using a variety of methods.

Performance Indicator 4: the information-literate student redefines the search strategy if necessary.

Performance Indicator 5: the information-literate student extracts, records, and manages the information and its sources.

Competency Standard 3: the information-literate student evaluates information and its sources critically and incorporates selected information into one’s knowledge base and value system.

Performance Indicator 1: the information-literate student articulates and applies initial criteria for evaluating both information and its sources.

Performance Indicator 2: the information-literate student compares new knowledge with prior knowledge to determine the value added, contradictions, or other unique characteristics of the information.

Performance Indicator 3: the information-literate student determines whether the initial query should be revised.

Competency Standard 4: the information-literate student, individually or as a member of a group, uses information effectively to accomplish a specific purpose.

Objectives are not given for this standard because its performance indicators are best addressed by the course instructor.

Competency Standard 5: the information-literate student understands many of the economic, legal, and social issues surrounding the use of information and accesses and uses information ethically and legally.

Performance Indicator 1: the information-literate student understands many of the ethical, legal, and socioeconomic issues surrounding information and information technology.

Performance Indicator 2: the information-literate student acknowledges the use of information sources in communicating the product or performance.

Note: Adapted from ACRL, 2001.

The first standard states that the information-literate individual determines the nature and extent of the information needed. This standard deals with focusing and refining a topic, determining the amount and type of information needed, and planning the information search. Crafting a research design would be a key outcome in a methods course for this standard.

The second standard states that the information-literate student accesses needed information effectively and efficiently. This standard addresses selecting appropriate resources, brainstorming information strategies, and searching information resources, including electronic databases and internet sources. An example of a methods course outcome for this standard would be producing an annotated bibliography drawn from contemporary disciplinary-relevant databases such as J-Stor, Pro-Quest, or the Social Science Citation Index.

The third standard states that the information-literate student evaluates information and its sources critically and incorporates selected information into one’s knowledge base and value system. This standard deals with determining the origin of information, evaluating the credibility of the author or creator of that information, and selecting the most reliable information. Critically evaluating surveys reported in media outlets, using knowledge of the requirements of scientific survey construction, would be an example of an outcome for this standard found in a methods course.

The fourth standard states that the information-literate student, individually or as a member of a group, uses information effectively to accomplish a specific purpose. This standard is about organizing information for its intended audience, determining the best discipline-appropriate delivery format, and incorporating the information into a finished project. A sample outcome for a methods course might be a final group project in which students present data they have gathered to test a hypothesis, using the poster format typically seen at academic conferences.

The fifth, and final, standard states that the information-literate student understands many of the economic, legal, and social issues surrounding the use of information and accesses and uses information ethically and legally. This standard deals with citing information sources correctly, demonstrating awareness of copyright laws, and understanding that plagiarism is unlawful. The concepts in this standard are covered in many courses, but a sample outcome for a methods course might be to have students discuss the ethics surrounding the use of humans as experimental subjects, or the concerns expressed about policy makers using data gathered by academic researchers in ways that contravene currently accepted practices.

Pedagogical Strategies Suggested by Information Literacy

These five standards are concise, but Table 1 reveals that they are quite complex in their application. In order to integrate fully all five dimensions into a methods curriculum, it may be necessary to rethink and reevaluate the pedagogy of the entire course. This does not mean, however, that an instructor focusing on information literacy needs to revamp an entire course preparation. One of the most common reasons given by faculty for resisting pedagogical innovation is that it is difficult to weigh the trade-offs between course-specific content and time-intensive pedagogy. Committing class time to teaching students how to do an effective internet search, for example, leaves less class time to explore more interesting ‘‘substantive’’ areas. Fortunately, when implementing information literacy competencies, these trade-offs are minimal, since the pedagogical strategies suggested by the standards are often familiar to instructors who have been teaching for any time at all.

There are a few strategies that seem especially likely to promote the specific outcomes called for by the ACRL standards. These include: involvement of librarians in an instructional environment, transparency in assignment objectives with regard to IL competencies, and an emphasis on research-related writing with outcome and process components. Reflecting on this list should make it clear that we are not, in this article, offering or advocating a revolutionary pedagogy. However, we hope that by using recognized standards to assess student progress toward certain course objectives, students and faculty can be more confident that the teaching and learning experience has been worthwhile.

Librarians have written most of the extant literature on information literacy. They tend to be its most vocal advocates. This is to be expected, as librarians are information professionals and have experience helping students and faculty access and use information appropriately. However, information literacy is more than just the effective use of libraries, and librarians have recognized that it must be fully integrated into the curriculum to have the impact on learning that information literacy advocates hope for (Buchanan 2002; Fialkoff 2001). Collaboration between librarians, as information specialists, and faculty, as instructional and content specialists, is necessary for students to become information literate. In spite of this seemingly obvious statement, cooperation between librarians and faculty is rare (Rader 2002) and tends to be limited to ‘‘research projects.’’ This suggests that a key strategy for integrating information literacy standards into a course is to involve library personnel at multiple stages of instruction. In its most evolved form this might demand collaborative course development. However, short of this, in-class instruction by librarians, or feedback to the faculty on student behavior at the library, might be an achievable first step in recognizing the ‘‘need to treat information literacy as part of the curriculum, not simply part of the library’’ (Whitehead and Quinlan 2003).

One way to assist students is by expressing expectations in unambiguous terms. Outlining the learning objectives for each assignment in the syllabus helps students understand why they are being given assignments, and gives instructors more confidence that their assignments are addressing the information literacy standards. If students understand the purpose of assignments and how the assignments will benefit them, they are more likely to learn and retain the concepts the assignment seeks to support (UMUC 2003). If the learning outcomes are explicitly stated, even embedded into the assignments themselves, then students will accept their role in the learning process (Lebow 1993) and be better able to learn the material in incremental steps (Corbett and Kearns 2004).

In addition to being clear about the information literacy objectives for each assignment, it is also important to evaluate the course as a whole to ensure that non-graded projects and in-class activities are designed to reinforce information literacy. As a general guideline, the course should emphasize analysis over answers and process over outcome (Quarton 2003). This means that faculty cannot simply assign a research project at the beginning of the semester and determine a final grade for the project at its completion. Instead, a process-driven orientation requires that students submit work to be graded at multiple points in the process leading up to the final assignment. Not only does this encourage students to follow the steps of the process thoughtfully, rather than rush through it in one sleep-deprived binge at the end of the semester, it also provides opportunities for feedback so the instructor can guide that process. Likewise, asking students to provide answers to questions on exams without also giving them opportunities to explain why the answers are what they are makes it difficult to know at what level students have mastered the material. As students complete assignments that are explicitly tied to the outcomes associated with the information literacy standards, they are much more likely to achieve those standards.

Student writing assignments play a central role in fostering information literacy. Assignments that require students to identify, retrieve, evaluate, and effectively utilize information will clearly improve their information literacy (UMUC 2003). Buchanan (2002) suggests four specific strategies to promote information literacy: 1) engage students in group activities that require them to seek and evaluate information; 2) provide feedback on their work and reinforce information literacy concepts; 3) provide opportunities for students to apply their competence in information literacy through such assignments as compiling an annotated bibliography or assembling data for presentation; and 4) challenge students with a ‘‘disequilibrium’’ experience, followed by analysis and discussion. Another way to promote information literacy is to include both reflective and analytical writing in assignments (Hunt and Birks 2004). Students who are asked to write reflectively are required to think about how they conducted the research process as well as to describe the outcome. This process-relevant component enhances the analytic writing that focuses on the organization of their outcome-relevant work.2

The degree to which the costs of implementing these IL-fostering pedagogical strategies become burdensome for instructors depends largely on the particular competencies of the faculty member and the institutional support available. Transforming preexisting written assignments to incorporate IL dimensions, being clear about their learning objectives, or including a reflective component are simple and desirable improvements. However, the incorporation of library staff into instructional settings (for faculty who are less familiar with the contemporary online database systems that are available) may be less simple in the absence of institutional bridges across academic and organizational units. Likewise, the increased burden of faculty feedback on student writing at multiple stages of an assignment may place an extra load on instructors who do not have teaching assistants or tutorial mechanisms. For colleges interested in fostering IL competencies at the curricular level, an investment in faculty workshops, writing support, and forging ties between departments and library liaisons may be necessary.3 Fortunately for faculty, any expedition into the library should reveal a library staff who are likely familiar with the IL concept and eager to engage with students and faculty in shared instructional duties.

Like the costs, the benefits can vary depending on the faculty member’s skills and interest and the institutional setting. Moreover, they can be diffuse, and their observable impact delayed or difficult to measure. Anecdotal evidence suggests that individual faculty members have witnessed improvements in composition skills, or in overall performance, among students in IL-rich courses. End-of-term evaluation surveys have similarly revealed that students self-report improvement in their ability to use library resources and judge information more effectively (Ragins 2001). However, assessing improvement in research abilities objectively is far more difficult. In the long run, such assessment requires extensive institutional involvement (Dunn 2002).

It is important to note that no individual instructor can or should be expected to ‘‘do it all.’’ Whatever assignments an instructor decides are appropriate for a course, if the project requires students to develop and demonstrate critical thinking, problem-solving, synthesis, and application skills through some kind of research experience, the instructor may be confident that the students will improve their information literacy competence. Since inculcating the skills listed above is already a common goal among college instructors, information literacy (and the techniques we noted for achieving it) is ultimately a valuable contemporary supplement to what is already being done.

Assessing Information Literacy Objectives

One of the primary reasons we became interested in the present analysis is the paucity of studies that attempt to test objectively whether instruction infused with information literacy components makes a difference in short- or long-term learning outcomes (Bundy 1998). This is perhaps the main reason information literacy has made slow inroads into faculty practice. The present study investigates the short-term outcomes, leaving long-term evaluation to future studies.

To assess the benefit of the ACRL information literacy objectives in undergraduate methods and non-methods courses, we used the information literacy assessment instrument developed at Mesa Community College (MCC). MCC annually administers a nationally recognized, comprehensive program to assess student learning outcomes. The assessment program is overseen by the Faculty Senate Student Outcomes Committee, a standing committee of the Faculty Senate, in collaboration with the Dean of Instruction. The assessment measures and documents the degree to which students attain specific learning outcomes valued and defined by faculty. The assessment question bank for information literacy was produced in cooperation with the ACRL and is explicitly tied to the ACRL standards. The questionnaire measures student competency on all five ACRL standards.

We administered this questionnaire to students enrolled in two undergraduate methods and two non-methods political science courses at Carleton College. Our quasi-experimental design took advantage of a planned alteration in course content that occurred between winter and spring terms during the 2002–03 academic year.

The written assignments for the spring term methods course and a course in international relations were both adapted to reflect the ACRL competency standards. The courses also incorporated several of the pedagogical techniques described above, including in-class instruction from a social science research librarian, explicit discussion of the objectives of the assignments from an information literacy perspective, a sequential assignment design that led students through the steps of their larger project, and a reflective component that asked students to describe and analyze their research process as part of their assignment writing.

For example, one of the objectives of the IL-oriented courses was to familiarize students with the online research databases available through the college library and to provide them with the tools necessary to perform effective searches. To facilitate this learning, half of one class session for each of the two IL-oriented courses was given over to an in-class visit from a research librarian, who led a discussion of effective search techniques using J-Stor, Lexis-Nexis, and the Social Science Citation Index.


Appendices A and B provide examples of actual assignments that were directed at developing familiarity with these online databases. The objectives of the assignments differ somewhat with regard to the information literacy competencies. In the international relations class, students were asked to use the databases to find new information that would help them answer an empirical question about changing corporate norms. In the methods course, students were simply identifying potential project topics while exploring the existing literature to assess the state of current research. Nevertheless, in both instances, as part of their written paper students were asked to reflect on their search efficiency and evaluate the success of the search process while considering the quality and diversity of sources available to them.

Table 2 describes the size and experimental condition, with regard to information literacy and research methods environment, of the four courses that made up the study groups over the two academic terms.

With the exception of POSC 170, all courses were designated 200 level (sophomore). Analysis of the average number of college courses completed per student revealed no significant difference between students in the two non-methods political science courses (POSC 231 and 170, with an average of 17.8 and 17.5 courses completed, respectively). A significant difference (p < .001) was indicated between the non-methods students and those in the methods class. Students in methods averaged 21.5 and 22.3 completed college courses (winter and spring, respectively). This reflects the tendency for students to enroll in methods during their junior year. Importantly, the difference in the mean number of courses completed between students in the two methods courses (winter and spring) was not significant.
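The difference-of-means comparisons reported above can be reproduced with a standard two-sample (Welch's) t test. The sketch below is illustrative only: the rosters are invented to echo the reported group means (roughly 17.5 versus 21.8 courses completed), not the authors' actual data.

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and its approximate degrees of freedom."""
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)          # sample variances (n - 1 denominator)
    se2 = va / na + vb / nb                    # squared standard error of the mean difference
    t = (mean(a) - mean(b)) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical courses-completed counts mirroring the reported group means
non_methods = [17, 18, 17, 19, 18, 17, 16, 18]   # mean 17.5
methods     = [21, 22, 23, 21, 22, 23, 20, 22]   # mean 21.75

t, df = welch_t(methods, non_methods)
print(round(t, 2), round(df, 1))
```

With group means this far apart relative to the spread, the statistic is large, consistent with the p < .001 difference the authors report between methods and non-methods students.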

Since this quasi-experimental, posttest-only, nonequivalent-control-group design relies on existing groups (i.e., subjects were not randomized into treatments by the experimenters), questions of selection bias present the greatest threat to internal validity. The hazard of potential systematic differences between experimental and control groups is further exacerbated, in this instance, by our decision not to administer a pretest at the beginning of the term. While such a test would have allowed us to assess initial levels of information literacy competence and measure improvement more accurately, it would also have introduced the threat of a testing effect on our outcomes and likely required some alteration of the measurement instrument (a further source of potential validity problems).

Table 2. Experimental groups

                                                              Number of    Experimental conditions
Course                                                        students     Methods    IL-oriented
American Foreign Policy (POSC 231 Winter)                     24           No         No
Methods of Political Research (POSC 230 Winter)               16           Yes        No
International Relations and World Politics (POSC 170 Spring)  30           No         Yes
Methods of Political Research (POSC 230 Spring)               14           Yes        Yes


Additional concerns about design validity arise from the absence of complete control over one of the experimental conditions. As noted above, almost all college courses contain some content that should develop a student's information literacy. In light of increasing regard for the importance of these skills, it is a virtual certainty that students were encouraged to develop information literacy competencies by other professors prior to, or perhaps concurrently with, our experiment. Given this fact, any analysis must consider a priori differences in the levels of student experience with information management, manipulation, and presentation tools. For our purposes, and in keeping with the assumption that all classes offer at least something in terms of IL, we will consider the number of courses taken by the students as a rough proxy for this experience factor.

Having articulated these concerns about the design, we should also state our confidence that these groups are valid for the purpose of comparison, given the general similarity reflected by the mean number of courses completed as well as homogeneity in other areas, including gender distribution, the ratio of political science majors to non-majors, and fortuitous constancy in the time of day that the courses met (the methods classes met at 8:30 A.M. M, W, F both terms, while the international relations courses met at 12:40 P.M. M, W, F both terms). Although the courses differ in their substantive topic (American Foreign Policy versus Introduction to International Relations), given the apparent demographic similarity of the courses we can think of no systematic reason why the spring introduction to IR class should exhibit inherently higher or lower information literacy performance simply on the basis of its substantive topic.

Administration of the information literacy tests occurred at the end of the respective terms, when students were asked to complete the MCC-designed assessment. Compliance with our request was over 92%. The numbers in Table 2 reflect the number of students who completed the test, not actual course enrollments. Two students who were enrolled in POSC 230 in the spring had already completed the assessment (having been members of the winter term POSC 231 class). Their responses were excluded from the spring POSC 230 group. Our hypotheses with regard to score improvement are as follows:

[H1] On average, students who participate in a course with an information literacy orientation (regardless of the course's substantive content) should score higher on the standardized test than students in a non-IL-oriented course.

[H2] On average, students in a methods course, because of its emphasis on the collection, analysis, and presentation of information, should score higher on the standardized test than students in a non-methods course.

[H3] The combination of a methods course environment and an IL-oriented course design should yield the highest average scores.

[H4] Since virtually all courses incorporate some information-literacy-relevant components, more senior students, who have had more opportunity to develop information literacy competencies, should score higher on the test than less senior students.

Table 3 presents the results of a three-way Analysis of Variance (ANOVA). The dependent variable is standardized test scores. Regressors were entered for the two experimental conditions (to test H1 and H2). An interaction between experimental conditions (to test H3) was incorporated, as was a dichotomous indicator of student academic standing as determined by number of college courses completed (first year


and sophomore: 18 or fewer; junior and senior: more than 18) as a proxy for experience and as a control for potentially confounding group differences (H4).
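A factorial ANOVA of this kind is equivalent to a linear model on dummy-coded factors. As a sketch of that specification (the variable names are ours, not drawn from the authors' analysis files), each student contributes one design-matrix row holding an intercept, the two treatment dummies, their interaction, and the dichotomized standing indicator:

```python
# The four study groups from Table 2, coded as (methods, il_oriented)
courses = {
    "POSC 231 Winter": (0, 0),
    "POSC 230 Winter": (1, 0),
    "POSC 170 Spring": (0, 1),
    "POSC 230 Spring": (1, 1),
}

def design_row(methods, il, courses_completed):
    """One row of the ANOVA design matrix: intercept, the two main-effect
    dummies, their interaction, and the junior/senior dummy (>18 courses)."""
    standing = 1 if courses_completed > 18 else 0
    return [1, methods, il, methods * il, standing]

# Two hypothetical students: one in the spring methods section (IL-oriented)
# with 22 courses completed, one in winter American Foreign Policy with 15.
print(design_row(methods=1, il=1, courses_completed=22))   # [1, 1, 1, 1, 1]
print(design_row(methods=0, il=0, courses_completed=15))   # [1, 0, 0, 0, 0]
```

The F tests in Table 3 then ask, for each column after the intercept, whether dropping it significantly worsens the model's fit.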

The ANOVA results provide provisional support for H1 and H4, indicating that IL-oriented courses have a significant effect on student scores and that student experience significantly affects test results. However, our contention that the acquisition of information literacy competencies would be enhanced by participation in a research methods course (H2) was not supported. Neither was our hypothesis that an interaction of methods course and IL-orientation would have the most profound effect on scoring (H3).

Figure 1 further illustrates the magnitude as well as the direction of the three primary determinants of average information literacy test scores. The effect of the methods class environment, though not significant, was slightly positive. However, this impact was completely overwhelmed by the more substantial effects of the IL-oriented courses and the general experience of the students. Unfortunately, this figure cannot reflect the influence of each condition while controlling for the others.

Ordinary Least Squares (OLS) regression analysis allows for the examination of the direction and magnitude of effect for each of the independent variables while holding the others constant. Table 4 presents the results of a multivariate regression

Table 3. Analysis of variance for information literacy test scores

Factor                                               F        p <
Research Methods Class                               1.92     .170
IL-Oriented Class                                    10.99    .001
Methods × IL-Oriented                                0.94     .335
Student Academic Standing (fr/so = 0, jr/sr = 1)     22.5     .0001

Note: N = 84, R² = .30, Adjusted R² = .27, Model F = 8.55, p < .0001, d.f. = 4

Figure 1. Information Literacy Scores by Subgroup. Line length indicates magnitude of group difference.


analysis of student test scores. The model incorporated dummy variables indicating whether the student was in a research methods class and whether the student was in one with an information literacy orientation. An interval-level measure indicating the number of college courses completed by the student was also included (in contrast to the simple dichotomous measure used in the analysis presented in Table 3).

The results indicate that, on average, a student's presence in an IL-oriented course increased by two the number of correct responses on the standardized test, holding all other factors constant. For each college course completed, an average student increased the number of correct responses by roughly .2 questions (or one question's improvement for every five courses taken), holding all other factors constant. Presence in a research methods course was, again, not a significant factor in predicting test scores. Overall model performance is relatively low (Adjusted R² = .16), indicating substantial unexplained variance in IL performance. Unmeasured factors such as orientation toward computers, familiarity with libraries, and general intelligence may play an important role.
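For readers who want to see the mechanics behind Table 4, a multivariate OLS fit can be computed by solving the normal equations. The sketch below does this in pure Python on simulated data whose generating coefficients are set to echo the reported estimates (IL ≈ +2 questions, ≈ +0.19 per course completed); it illustrates the method and is not a re-analysis of the authors' data.

```python
import random

def ols(X, y):
    """Solve the normal equations (X'X)b = X'y by Gaussian elimination."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for col in range(k):                       # forward elimination with pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k                           # back substitution
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# Simulate 84 students under the Table 4 specification: intercept, methods
# dummy, IL dummy, number of courses completed (coefficients are invented
# to echo the reported directions and magnitudes).
random.seed(1)
rows, scores = [], []
for _ in range(84):
    m = random.randint(0, 1)                   # research methods class
    il = random.randint(0, 1)                  # IL-oriented class
    n = random.randint(10, 30)                 # courses completed
    rows.append([1, m, il, n])
    scores.append(27.75 - 0.45 * m + 2.0 * il + 0.19 * n + random.gauss(0, 2))

beta = ols(rows, scores)
print([round(v, 2) for v in beta])             # ≈ [intercept, methods, IL, per-course]
```

With 84 observations the recovered coefficients land close to the generating values, which is the sense in which each estimate in Table 4 holds the other predictors constant.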

Conclusion

At the outset of this paper we articulated the reasons why achieving higher levels of information literacy was a reasonable goal for research methods instructors who were understandably uncertain about the curricular aims of these courses. We also noted that there was a general correspondence between the ACRL competencies and many of the traditional elements of a research methods class, and thus surmised that the environment provided by a research methods course could be particularly fertile for growing information literacy skills.

The results of our experiment belie this expectation. Presence in a research methods class had no appreciable impact on student acquisition of information literacy. Several potential explanations for this might be advanced. As Braumoeller (2003) notes, methods classes are already packed with challenging content and, insofar as they employ mathematical notation and concepts, teaching them is in many ways analogous to introducing a new language while simultaneously teaching new substantive content in that language.

Furthermore, the kind of statistical manipulation, analysis, and presentation of data taught in a methods class, although it may be conceptually related to information manipulation and management, may be a distinct subset of this type of knowledge. It is perhaps too easy for those of us familiar with the techniques and underlying concepts to equate quantitative/statistical literacy with information

Table 4. Multivariate OLS regression of test scores on methods class, IL-oriented class, and number of classes taken

Independent variables      b        S.E. b    t        p <
Research Methods Class     −.448    .743      −0.60    0.549
IL-Oriented Class          1.977    .670      2.95     0.004
Number of Classes Taken    .1876    .060      3.12     0.003
Intercept                  27.75    1.180     23.52    0.000

Note: N = 84, R² = 0.19, Adjusted R² = .16.


literacy. At any rate, if we wish to assert that these types of knowledge are related, we must consciously assist students in recognizing the parallels between gathering, analyzing, and presenting numerical data and other types of information.

On the other hand, there was no evidence that accepting even more curricular responsibility in the methods class environment resulted in overloaded students. Overall performance in the methods class with IL content was commensurate with performance in the non-IL version (in fact, a comparison of grade point scores reveals that it was slightly, but not significantly, higher). Moreover, including IL components in the methods class appeared to be no less effective than similar inclusions in other courses. We cannot conclude, then, that information literacy might be more effectively incorporated elsewhere.

In addition to this important negative finding regarding methods courses, our analysis reflects several positive results. First, overt implementation of information-literacy-enhancing pedagogical techniques can improve student information literacy as reflected by performance on a standardized measurement instrument (at least in the short run). Second, it appears that the impact of these strategies is content neutral. This suggests that information literacy can be developed throughout the college curriculum and need not be construed as the exclusive province (or burden, for that matter) of a particular course and/or department. Third, the results of our analysis appear to show that information literacy skills accumulate as students progress through their collegiate studies. This pattern might be further accentuated if a significant proportion of the faculty at an institution were to heed the librarian's advice and treat information literacy as not "simply part of the library."

Appendix A

Example of an international relations case study assignment with an information literacy component

Case Study Response

Sweating the Swoosh

In a paper of not more than 5 pages (typed, double spaced, 12 pt font with reasonable margins), answer the question related to the case below. This paper will be due at the beginning of class on Friday, May 30th. You will need to search for resources to fulfill the external research component of the assignment. I would suggest using Lexis-Nexis: http://web.lexis-nexis.com/universe. Remember to employ the 'guided search' options to limit the time frame of the search and to allow for more sophisticated search parameters. Please refer to our earlier discussion about using online databases available through the Gould Library, or feel free to come and see me during office hours if you are having trouble with this aspect of the assignment. As part of your Works Cited page, include a brief note on the steps in your search for information about other companies' investment practices and employment norms, indicating the most successful search parameters and the specific sources (publications) you found most useful and why.

Q: One of the problems confronting developing states is their dependence on capital investment from firms in developed states. This generates competition as developing states vie to be the most attractive


investment target. Often, this means lowering taxes on export-oriented manufacturers, suppressing domestic wage demands, discouraging local consumption, ignoring issues of workplace safety and/or child labor standards, and overlooking environmental degradation. Although most governments in the developing world would prefer to have higher standards on these dimensions, they seem locked in a 'race to the bottom' with few other options. One potential solution to this conundrum may be the influence of nongovernmental organizations (NGOs). Evaluate the role of the NGO attempting to reform Nike's labor practices as presented in the case study. Did the governments of developing states play a role? Did IGOs? Identify the sources of Nike's reform as described in the case material. Using Lexis-Nexis, assess whether a 'race to the top' of corporate responsibility has emerged since the events of this case transpired. Has Nike's global influence led to the dissemination of new norms of labor practice among multinational corporations worldwide? Offer your conclusion as to why some multinational corporations (MNCs) have followed Nike's lead while others have not.

Appendix B

Example of a methods course workshop paper with an information literacy component

Workshop Assignment: Locating Literature & Building an Annotated Bibliography

Part 1: Locate the J-Stor database on the library web site: http://www.jstor.org/cgi-bin/jstor/gensearch

Conduct several searches on topics of interest to you by entering keywords in the full-text search. Make sure that (at least) the 'Political Science Journals' box is checked. If your topic involves certain geographical areas or economic issues, you may want to select more journal categories. Comb the resulting list of articles for three or four that seem particularly relevant. Read the abstracts, introductions, and conclusions of these articles. When you've found a topic that seems especially interesting and accessible to you, save or print copies of the articles you've found.

Recalling our discussion of searching Gould Library databases, you may need to adjust your search criteria to narrow or broaden your search depending on the number of articles found.

Part 2: Locate the Web of Science database on the library web site: http://isiknowledge.com/wos

Pick your most interesting topic search and perform a 'full' search of the Social Science Citation Index using either the author's name or the journal title for the articles you have selected from J-Stor. Record how many times each of the J-Stor articles you found has been cited since its publication. Find at least one of the citing articles. You may be able to find it on J-Stor (if it is older), in one of the other


full-text databases, or you may need to venture into the library if it is from a morerecent journal or is a book or part of an edited volume.

Write a one-page summary of how your search went. What did you search for? What keywords did you use? How confident are you that you have identified the key articles in the discipline related to your research topic? Are there specific journals that come up regularly when you search for articles about your topic? On a subsequent page (or pages), include full bibliographic references to the J-Stor and SSCI articles you found. Below each article, indicate how many times it has been cited (based on the SSCI) and append a one- or two-sentence summary of the content of the article.

Come prepared Friday to discuss what you’ve found and to submit your paper.

Notes

1. For an overview of the development of the concept of IL in the college academic environment, see Behrens (1994).

2. For more extensive discussion of pedagogical practices and IL, see Hunt and Birks (2004).

3. For a discussion regarding the costs and considerations surrounding the implementation and assessment of IL as a core competency at the general curricular level, see Ragins (2001) and Dunn (2002).

References

Association of College and Research Libraries (ACRL). 2000. Information Literacy Competency Standards for Higher Education. ACRL Standards Committee report approved at the Midwinter Meeting of the American Library Association in San Antonio, Texas, January 18, 2000. http://www.ala.org/Content/NavigationMenu/ACRL/Standards_and_Guidelines/Information_Literacy_Competency_Standards_for_Higher_Education.htm

Association of College and Research Libraries (ACRL). 2001. Objectives for Information Literacy Instruction: A Model Statement for Academic Librarians. http://www.ala.org/acrl/guides/objinfolit.html

Barth, A., A. Bennet, B. F. Braumoeller, J. D. Morrow, K. R. Rutherford, P. Schwartz-Shea, R. M. Smith, and D. Yanow. 2003. "Methodological Pluralism in Journals and Graduate Education? Commentaries on New Evidence." PS: Political Science and Politics 36:371–397.

Behrens, Shirley J. 1994. "A Conceptual, Analytical and Historical Overview of Information Literacy." College and Research Libraries 55(4):307–322.

Braumoeller, B. F. 2003. "Perspectives on Pluralism." PS: Political Science and Politics 36:387–390.

Buchanan, Lori E. 2002. "Integrating Information Literacy Into the Virtual University: A Course Model." Library Trends 51:144–167.

Bundy, Alan. 1998. "Information Literacy: The Key Competency for the 21st Century." Paper presented at the Annual Conference of the International Association of Technological University Libraries, Pretoria, South Africa.

Corbett, Rod and Julie Kearns. 2004. Implementing Activity-Based e-Learning. http://elearn.ucalgary.ca/paper/

Dunn, Kathleen. 2002. "Assessing Information Literacy Skills in the California State University: A Progress Report." Journal of Academic Librarianship 28(1/2):26–35.

ERIC Digest. 1994. "Information Literacy in an Information Society." ED 372756. Adapted from the ERIC monograph by Christina S. Doyle, "Information Literacy in an Information Society: A Concept for the Information Age." ED 372763.


Fialkoff, Francine. 2001. "Educate the Educators." Library Journal 126(11):2.

Hunt, Fiona and Jane Birks. 2004. "Best Practices in Information Literacy." Portal: Libraries and the Academy 4(1):27–40.

Ishiyama, John and Marijke Breuning. 2004. "A Survey of International Studies Programs at Liberal Arts Colleges and Universities in the Midwest: Characteristics and Correlates." International Studies Perspectives 5:134–146.

Julien, Heidi E. 2000. "Information Literacy Instruction in Canadian Academic Libraries: Longitudinal Trends and International Comparisons." College and Research Libraries 61(6):510–523.

Lebow, D. 1993. "Constructivist Values for Instructional Systems Design: Five Principles Toward a New Mindset." Educational Technology Research and Development 41(3):4–16.

Mendelsohn, Susan. 2003. "Lost in the Information Age." Information World Review 187:14–16.

National Council of Teachers of Mathematics (NCTM). 1989. "Curriculum and Evaluation Standards for School Mathematics." Commission on Standards for School Mathematics. Reston, VA: NCTM (ED 304 336).

Nowicki, Stacy. 2003. "Student vs. Search Engine: Undergraduates Rank Results for Relevance." Portal: Libraries and the Academy 3(3):503–515.

OCLC Online Computer Library Center, Inc. 2002. "OCLC White Paper on the Information Habits of College Students: How Librarians Can Influence Students' Web-Based Information Choices." June. http://www5.oclc.org/downloads/community/informationhabits.pdf

Quarton, Barbara. 2003. "Research Skills and the New Undergraduate." Journal of Instructional Psychology 30(2):120–125.

Rader, Hannelore B. 2002. "Information Literacy 1973–2002: A Selected Literature Review." Library Trends 51(2):242–260.

Ragins, Patrick. 2001. "Infusing Information Literacy into the Core Curriculum: A Pilot Project at the University of Nevada Reno." Portal: Libraries and the Academy 1(4):391–407.

Sellen, M. K. 2002. "Information Literacy in the General Education: A New Requirement for the 21st Century." Journal of General Education 51(2):115–126.

Shenk, David. 1997. Data Smog: Surviving the Information Glut. San Francisco, CA: HarperEdge.

Thies, Cameron and Robert E. Hogan. 2003. "The State of Undergraduate Research Methods Training in Political Science." Paper presented at the American Political Science Association Annual Meeting, Philadelphia, PA.

Thompson, Charles. 2003. "Information Illiterate or Lazy: How College Students Use the Web for Research." Portal: Libraries and the Academy 2(3):259–268.

University of Maryland University College (UMUC). 2003. "Information Literacy and Writing Assessment Project: Tutorial for Developing and Evaluating Assignments." UMUC Information and Library Services. http://www.umuc.edu/library/infolit/sect14.html

Whitehead, Martha J. and Catherine A. Quinlan. 2003. "Information Literacy in Higher Education." Feliciter 1:22–24.
