Outcomes of Information Literacy Instruction for Undergraduate Business Students

Authors

Heidi Julien

School of Library & Information Studies, University of Alberta

3-20 Rutherford South, Edmonton, Alberta, Canada T6G 2J4

Email: [email protected]

Brian Detlor

DeGroote School of Business, McMaster University

DSB-A201, 1280 Main St. W., Hamilton, Ontario, Canada L8S 4M4

Email: [email protected]

Alexander Serenko

Faculty of Business Administration, Lakehead University

RB1040, 955 Oliver Road Thunder Bay, Ontario, Canada P7B 5E1

Email: [email protected]

Rebekah Willson

Mount Royal College Library

4825 Mount Royal Gate SW Calgary, Alberta, Canada T3E 6K6

Email: [email protected]

Maegen Lavallee

Faculty of Business Administration, Lakehead University

955 Oliver Road Thunder Bay, Ontario, Canada P7B 5E1

Email: [email protected]

This paper reports on a national study of information literacy instruction at Canadian business

schools. The research question examined was: what is the interplay between factors of the

learning environment and information literacy program components on business student

learning outcomes? The question was examined in a four-phase study; data focusing on student learning outcomes drawn from the first phase are reported here. Library administrators, librarians, teaching faculty, and business students were interviewed about students' information literacy instructional experiences and the outcomes arising from those experiences, as observed by these four groups. Data indicate a wide range of positive outcomes,

including specific skill development and increased confidence; however, expected

transferability of those outcomes beyond the walls of the educational institution remains

doubtful.

Introduction

In the twenty-first century, business decision-making, as well as successful and productive

citizenship in general, requires skills in finding, retrieving, analyzing and using information

(ACRL, 2006a). These skills, which form the basis of information literacy, are critical to

success in both academic and work contexts, as well as to the general quality of life in the

information society. Such skills are vital for success in today's business world where

"information has become the leading business asset" (Kanter, 2003, p. 23). For example, in

his speech to the 1999 graduating class at the University of Toronto, Anthony Comper, then

President of the Bank of Montreal, stated that: "Whatever else you bring to the 21st century

workplace, however great your technical skills and however attractive your attitude and

however deep your commitment to excellence, the bottom line is that to be successful, you

need to acquire a high level of information literacy. What we need in the knowledge industries

are people who know how to absorb and analyze and integrate and create and effectively

convey information — and who know how to use information to bring real value to everything

they undertake" (ACRL, 2006b, italics added for emphasis). Thus, in business education in

particular, there is an explicit need to train students how to locate, assess, and interpret

information from a wide variety of information sources so graduates can properly utilize

information for knowledge-building and decision-making purposes when they work in

organizations upon graduation. Strategic advantage over competitor organizations,

productivity, and innovation are significantly enhanced when workers are information literate.

This view is substantiated by recent writings in the business librarianship literature that

describe the critical importance of teaching information literacy skills to business students

and the dire need for more instruction (Cooney & Hiris, 2003; Hawes, 1994). For example,

Cooney (2005), in her recent survey of nearly 400 libraries of colleges and universities

accredited by the Association to Advance Collegiate Schools of Business (AACSB), identifies

such instruction as still "evolving," where collaboration between librarians and business

faculty is described as "overwhelmingly moderate" and only a third of respondents report

incorporating the ACRL's Information Literacy Competency Standards for Higher Education

into their instruction efforts with business students.

Context

Canadian Academic Libraries

This research was undertaken in Canada, so examination of that context is critical.

The past 15 years have been marked by a serious decline in the resources available to academic libraries in Canada. A general state of under-funding and lack of resources, human and otherwise, is the practical context for this study, ironically at precisely the time when information literacy instruction (ILI) is beginning to be seen as a necessity (Auster & Taylor, 2004). Despite resource constraints, a primary role for

academic librarians has been the provision of ILI in higher education.

There is evidence for attention to quality assurance in the library sector of

Canadian higher education. Evaluation and assessment topics have increasingly

appeared on the conference program for the Canadian Library Association, and

international conferences are also focusing on these issues (e.g., the Library

Assessment Conference in Charlottesville, VA in late September 2006). A report

by the Canadian Association of Research Libraries concluded that user surveys

indicate that library users were generally satisfied with service quality but

dissatisfied with the adequacy of collections and technology infrastructure (CARL,

2003). However, the surveys analyzed did not address student learning outcomes.

Individual libraries also have participated in other assessment efforts. For

example, in 2008, seven Canadian libraries in colleges, universities, and

government participated in the LibQUAL+™ project, which surveys user

perceptions about service quality related to library collections and services.

Specifically with respect to ILI, up to Spring 2008, seven Canadian university

libraries participated in Project SAILS (Standardized Assessment of Information

Literacy Skills; https://www.projectsails.org/). Operated by Kent State University

Libraries and Media Services, this project was established because information

literacy has become a focal point of many libraries' missions. Assessment also is

increasingly recognized as a key initiative to improve programs and to meet the

obligation of accountability. Despite this involvement in Project SAILS, in Canada

and elsewhere, the evaluation of client instruction in general is greatly in need of

improvement (Ackerson & Young, 1994; Adams, 1993; Julien, 2006; Totten,

1990). Instruction is not viewed by the majority of administrations as a funding priority: only a minority of libraries have formal instructional objectives, library instruction is typically handled by a small dedicated staff, and instructional evaluation is informal and formative (Julien, 2006; LaGuardia, 1996; Shonrock, 1996).

Summative evaluations suggest that student grades and program completion

rates for undergraduates are improved by client instruction (Greer et al., 1991;

Hardesty et al., 1982; Selegan et al., 1983). These are indirect measures of

student learning (i.e., they are gross indicators, but not direct evaluations), and are

a valuable tool in assessing instructional outcomes (Lopez, 2002). In addition,

more direct assessment done in specific contexts has found, for instance, that

students who receive instruction increase their searching effectiveness and are

able to select marginally more relevant information sources (Emmons & Martin,

2002). Researchers report that systematic assessment is helping to shape library

services (Seamans, 2002). Further confirmation and extension of these outcomes

is necessary; for example, can other short- and longer-term benefits resulting from

effective instruction be determined? Such benefits might include specific search

skills, improved cognitive understanding, and attitudinal changes (e.g., increased

self-confidence) that promote more effective or efficient use of information

resources. There has been one recent study of outcomes undertaken in Canadian

academic libraries (Julien & Boon, 2004). Those outcomes included increased

confidence, improved searching skills, and changed attitudes towards libraries. In

a separate study conducted at San Jose State University, pre- and post-library

instruction surveys showed that there was a 16 percent decrease in the use of

non-library Web sites after library instruction; more importantly, students

indicated greater confidence and self-efficacy (Roldan & Wu, 2004).

The Business School Challenge

As accreditation by AACSB International (see www.aacsb.edu) becomes a de facto standard vital to a business school's viability, international reputation, and long-term success, business schools have moved in recent years to incorporate student learning outcome measurements as a means of demonstrating the achievement of learning goals. Since information literacy is recognized as an essential learning goal by many business schools, schools not only need to work with librarians to teach information literacy skills to business students, but also face a strong push to develop, together with librarians, standardized measures that can demonstrate the achievement of those skills in order to satisfy AACSB accreditation requirements.

The challenge facing Canadian business schools, as well as business schools

abroad, is how best to work with librarians to incorporate ILI in curriculum and

program designs. Past studies are limited in their contribution, as they primarily

concentrate on the testing of single information literacy program components in

single business school locales (see, for example: Cooney & Hiris, 2003; Feast,

2003; Fiegen et al., 2002; Orme, 2004; Wu & Kendall, 2006). Dewald (2005)

reports that business faculty at one institution tend to require or encourage

students to use the open Web for their assignments, rather than databases. More

encouragingly, a recent longitudinal study, also conducted at a single institution,

was able to demonstrate the value of sustained instructional efforts over a period

of time, which incorporated student feedback into successive iterations of

instruction (Long & Shrikhande, 2007).

Theoretical Framework and Research Questions

Information literacy instruction is complex; it is delivered in many formats by many methods,

some of which may be more successful than others. A theoretical framework for information

literacy assessment proposed by Lindauer (2004) suggests that both the learning

environment (e.g., the curriculum, co-curricular learning opportunities, independent learning

opportunities) and information literacy program components (e.g., courses, workshops,

reference desk encounters, online educational instruction) have a direct effect on student

learning outcomes (e.g., performance measures on tests and assignments, course grades).

However, little is known about these effects. Hence, this study was conceived to address the

question: what is the interplay between factors of the learning environment and information

literacy program components on business student learning outcomes? This question is further

refined into three sub-questions:

1. What are the business school information literacy learning outcomes from the

perspectives of students, librarians and teaching faculty? This question examines the

cognitive, psychological and behavioural aspects of information literacy learning

outcomes in business schools as described by educational assessment theory.

Importantly, the question considers potentially different learning outcomes across

different stakeholders.

2. What are the salient elements of the learning environment which affect business

student information literacy learning outcomes? Of importance to this question is

understanding the more relevant aspects of the learning environment that influence

and impact information literacy learning outcome success. Feast (2003) identifies a

preliminary but substantive list of contextual factors affecting learning outcomes. This

list, accompanied by other factors deemed worthwhile by the research team, includes: student issues (from both the students' and librarians' points of view); resources (e.g., time, money, staff, infrastructure such as active learning classrooms and technical infrastructure); staff/student ratios; librarian and teaching staff attitudes toward ILI (including librarians' attitudes towards their own role in teaching students); library staff training; strategic plans; teaching philosophies; AACSB accreditation and its effect on

emphasizing student learning outcomes in the school; and organizational culture (e.g.,

emphasis on teaching and learning, collaborative relationships between librarians and

faculty, attitudes towards change, reward systems, leadership).

3. What program components of ILI promote positive business student information literacy

learning outcomes? This question explores the viability and effectiveness of various

information literacy program components, such as face-to-face instruction, group vs.

individual instruction, and online training, and differences in their application.

The evaluation literature in higher education is vast. However, most authors agree that

evaluation in a general sense appropriately incorporates system inputs (resources),

throughputs (activities — something is done with the inputs) and outputs or results (including,

and especially, outcomes). The theoretical framework for this study is educational

assessment theory, highlighting the importance of evaluating outcomes (summative) with the

goal of instructional improvement (formative). Assessment theory, specifically student

learning outcomes assessment, is reflected in the span and logical sequence of the study.

Sims (1992) suggests a broad definition of instructional assessment, drawn from Boyer &

Ewell (1988), as the gathering of evidence relating to the impact of instruction. Sims notes

that: identifying key audiences and obtaining support for in-depth assessment is key to

successful project completion; valid data are obtained by assessing output, rather than input

measures; and outcomes appropriately assessed include cognitive (gains in knowledge),

psychological (changes in attitudes or values), behavioural (changes in actions), and longer-

term outcomes (program completion rates, changes in grades). These elements, except for

longer-term outcomes, are all incorporated into the design of this study; the research team

plans to address longer-term outcomes in its larger research program. According to Patton

(1997), two of the three primary uses of evaluation findings are to improve programs and to

generate knowledge (e.g., about what works). These uses of evaluation findings are core goals

of the study.

Lindauer's (2004) assessment framework specific to information literacy is consistent with

more general educational assessment theory. The "three arenas" of assessment should

include the learning environment (curriculum, co-curricular learning opportunities,

independent learning opportunities), information literacy program components (courses,

workshops, reference desk encounters, instructional learning opportunities by appointment,

independent learning opportunities), and student learning outcomes (performance measures

on tests, course-embedded assignments, program portfolios, course grades, self-assessment,

surveys of attitudes about the learning environment). Although student learning outcomes are

the focus of this study, baseline data were gathered to contextualize those outcomes within

the learning environment and information literacy program components specific to the

contexts under study.

Data Collection and Analysis

The study presented here examines the current state and success of ILI given by academic

librarians at three business schools in Canada (one small, one medium, and one large)

through: i) a series of interviews conducted with business school librarians, university library

administrators, course instructors and business students; ii) an analysis of strategy- and

policy-related documents; iii) application of a standardized information literacy testing

instrument called SAILS that measures student information literacy competency; and iv) a

Web-based survey of students. Through these well-triangulated methods, the contextual

factors affecting student information literacy outcomes are being explicated. Analysis was

guided by educational assessment theory (Boyer & Ewell, 1988; Sims, 1992) to articulate

student learning outcomes across three distinct areas: i) cognitive (gains in knowledge); ii)

psychological (changes in attitudes or values); and iii) behavioural (changes in actions).

Variations in student results are interpreted through an understanding of the differences in

learning environments and information literacy program components at the three business

schools. Typical control variables from the education literature, like age and gender, are also

used to analyse variation.
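To illustrate the kind of tabulation this analysis implies, the following minimal Python sketch is offered; it is purely hypothetical, not the study's actual analysis pipeline, and all field names, student identifiers, and values are invented for illustration. It shows how coded interview passages, once exported from qualitative analysis software, could be cross-tabulated by outcome category and by a control variable such as student cohort.

```python
# Hypothetical sketch: tallying coded interview outcomes by cohort.
# All data values and column names are illustrative assumptions, not study data.
import pandas as pd

# Each row represents one coded passage exported from qualitative analysis.
coded_passages = pd.DataFrame([
    {"student": "S01", "gender": "F", "year": 4, "outcome": "psychological"},
    {"student": "S01", "gender": "F", "year": 4, "outcome": "behavioural"},
    {"student": "S02", "gender": "M", "year": 1, "outcome": "cognitive"},
    {"student": "S03", "gender": "M", "year": 3, "outcome": "none"},
])

# Classify students as junior (years 1-2) or senior (years 3+), mirroring the
# senior/junior distinction used in the discussion of results.
coded_passages["cohort"] = coded_passages["year"].apply(
    lambda y: "senior" if y >= 3 else "junior")

# Count distinct students reporting each outcome type within each cohort.
summary = (coded_passages
           .groupby(["cohort", "outcome"])["student"]
           .nunique()
           .unstack(fill_value=0))
print(summary)
```

The same cross-tabulation could be repeated with gender or other control variables substituted for cohort.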

The research team is conducting in-depth investigations at three Canadian business schools

representing different geographical regions of the country, with different-sized

student populations, different histories with AACSB accreditation, different information

literacy program components, and different emphases on ILI. For example, School A received

AACSB accreditation in 2006 and has incorporated ILI in several of its undergraduate courses

(i.e., in its first-year required Business course, second-year required Marketing course, second-

year required Information Systems course, and fourth-year elective International Business

course) via close collaboration between the business librarians and course instructors; this

instruction is given through face-to-face group consultation, class presentations, lab tutorials,

and reference desk services. As a result of internal university funding, the business librarians

have developed an online information literacy tutorial. A senior library administrator hired last

year is charged with managing and coordinating information literacy instruction across

various academic faculties and departments at School A. In 2009, a business fluency

(literacy) librarian will be hired. Since 2006, SAILS testing has been administered to the entire

cohort of first- and second-year students at School A. In contrast, School B, which also has

AACSB accreditation, has not used SAILS prior to this research study, but does include

compulsory ILI in a required undergraduate course; that instruction is developed

collaboratively with the faculty instructor. In addition, more informal instruction occurs via

consultation and individual reference service interactions in the library, and through Web-

based course-specific research guides integrated into Blackboard (a Web-based learning

management system). School C, which is currently pursuing AACSB accreditation,

incorporates formal ILI in an orientation session for its business graduate students. The

Faculty also has two core undergraduate courses that embrace substantial information

literacy program components including tutorials, information problem-solving tasks, and

research reports. The same data gathering processes are being used at each study site. Ethics

approval for each phase of the study was provided by each university involved in the data

collection. In total, four different data collection methods are included in the study. The first

phase, a series of one-on-one, open-ended interviews with a variety of stakeholders involved in

business ILI, has been completed. At each study site, we sought to interview senior university library administrators, two to three business school librarians, a minimum of two business

faculty instructors who in their courses call upon librarians to give ILI, as well as 20 to 30

business students who have received information literacy training. The student interviews

lasted 10 to 30 minutes, while the others ran approximately one hour each. Interviews were

taped and transcribed, and later analysed by multiple coders using qualitative grounded

theory techniques (Strauss & Corbin, 1998), with the assistance of NVivo text analysis software. Following initial inductive analyses, the Lindauer (2004) framework was used as the primary theoretical lens.

The second phase was an analysis of strategy- and policy-related documents within the

business school and library that outline the goals and objectives that may pertain to the

delivery of information literacy skills. The third phase was the application of SAILS to business

students as a means of applying a standardized information literacy testing instrument and

measuring student information literacy competency across the three schools. Applying the

same measurement instrument to test business student learning outcomes allows the

research team to understand how differences in the learning environment and information

literacy program components at the three study sites affect student information literacy

skills. The fourth and final phase (to be conducted in 2009) is a Web-based survey of approximately 200 students at each school. The survey will collect basic student demographic information, a history of their ILI to date, and their perceptions of the effectiveness of this instruction and its influence on their information behaviours (including cognitive, psychological and behavioural changes). The questionnaire will also include closed-ended

items pertaining to the factors of student learning outcomes, the learning environment and

information literacy program components. The study results will contribute theoretically to the

information literacy and educational assessment literatures, and generate recommendations

to business school librarians on how to deliver improved ILI and on the contexts in which online ILI should be delivered.

Results and Discussion

This paper reports on the very rich data obtained from the first phase of the study, in which

library administrators, librarians, teaching faculty, and students were interviewed about the

student learning outcomes resulting from ILI.

General Findings

Students agree that ILI leads to a reduction in effort to find information; thus,

convenience is increased, information is easier to find, and time is saved,

especially when deciding which databases to use and how to use interfaces to

those databases. Students also report better grades following training, in large

part due to assignments mandating the use of library information resources. Not

surprisingly, students claim that they are more knowledgeable about what

information is available through library-provided databases, and are also more

aware that library-provided resources are relevant, authoritative, and high quality.

Most students indicated that they expect to use the skills gained in ILI in their

careers, although they were unlikely to hold expectations about application in daily life

contexts. For students, expected future use depended on whether they will have

access to the same resources they have been trained to use in their educational

experiences, the job they expect to obtain following graduation, and what they

believe their jobs will entail. Expected transferability of information literacy skills

to contexts other than work, or to use of new information sources, was low. A

minority of students report no behavioural changes following ILI; those who fell into this category also expressed a preference for convenient, easy-to-use sources such as Google and Wikipedia, which provide "good enough" information; disliked the interfaces to library-provided information sources, finding them complicated and cumbersome; and were likely to have had a negative experience with ILI or with the use of library resources.

Information literacy instruction reportedly had a positive impact on some of the

students in terms of their ability to find information (i.e., it developed their searching

skills, such as specific techniques). In addition, some students reported having

developed strategies for planning their searching, and for evaluating the reliability

and accuracy of the information they find. Interestingly, students at all three

institutions perceived their search skills as the weakest part of their IL skills.

While several students from all institutions mentioned their weak search skills,

only one student mentioned that it was often difficult to determine if a source was

credible. The desire to improve search skills was a theme that emerged at all

three institutions and was mentioned by 16 students. In terms of improving these

skills, many students hoped that improved skills would reduce the time it takes to conduct a search and produce more relevant and precise results.

While many students are concerned with developing their searching skills, fewer

are concerned with developing the skills to evaluate the quality of information

resources, consistent with the finding that very few students cited evaluation as a

problem. Although many students do not recognize the evaluation of resources as

a skill they need to work on, many faculty members expressed concern over the

general level of evaluation skills within the undergraduate business population.

Although it was mentioned less frequently than improving search skills, students

at all three institutions expressed a desire to improve their overall understanding

of the library and how it works. This is consistent with the fact that students tend

to rely heavily on online resources such as Google to complete their assignments.

At all three institutions, faculty, librarians and library administrators all indicated

that business students on the whole have good computer skills and are fairly

technologically savvy. As a result, students often rely on Internet

search engines like Google to complete assignments.

At all three institutions, some students, faculty and librarians mentioned that they

believed that students' skills improve over time with training and practice.

However, it seems that students have different perceptions about what skills they

are lacking when compared with faculty. Although students expressed concern for

improving their search skills and rarely mentioned a lack of evaluation skills, the

librarians and faculty seemed more concerned with students' difficulty in

evaluating resources. The difference in students' and faculty's perceptions of IL

skills was noted by a business librarian at School A, who observed that students

often think they have the skills and faculty think they do not. This was also noted

by a professor at School C who said that while half of his students say they know

how to use the library's resources, he does not believe that they do. Overall it

seems that faculty and librarians perceive students as lacking adequate IL skills.

Across all three institutions it seems apparent that students perceive their IL skills

to be better than professors and librarians generally perceive them to be.

In addition, students reported increased confidence in their search skills, as a

result of the training they received. That confidence translated into increased use

of library resources (and reduced use of less-authoritative sources), increased

library visits, and requests for help from librarians, who are viewed as more

approachable, more knowledgeable, and more willing to help following ILI. At

School A, the faculty member interviewed recognized that students' comfort levels

are increased through exposure to new resources, specifically people. When

students are exposed to new resources and have these explained, they become

more comfortable using their newly acquired information literacy skills. This

finding is also reflected in the interviews with two senior students at School B.

Both of these individuals noted that they felt more comfortable conducting

business research because they had been introduced to people who could help

them with any questions related to this topic. At School C, students mentioned a

different approach within ILI that eased their stress and created a greater level of

comfort when conducting business research. Through ILI sessions, these students

noted that they were able to practice using unfamiliar resources and new skills.

With practice, their comfort levels increased and they became more at ease with what they had learned during the ILI session. Librarians, too, reported more student engagement following ILI. They also

noted that questions asked by students were more sophisticated following

training, and observed that students used resources earlier in the assignment-

preparation process. Interestingly, students' apparent use of authoritative library

resources following training is especially noteworthy when assignments requiring

that use are worth a large percentage of their final grade. However, that use

appears not to transfer to other areas of life, where consultation of Google or

Wikipedia remains paramount. ILI that incorporates or acknowledges how

information sources like Google or Wikipedia can be used effectively in an

information search may better serve students in winning them over to the

instruction. Further, mandating students to use authoritative library information sources for their assignments may have only a temporary effect. This was manifested in the feedback from some senior students who ended up reverting to Google and other Internet search engines to find information for their assignments in lieu of library resources. The second iteration of the Technology Acceptance Model (TAM2)

offers some explanation as to the temporary effect of mandating information

systems use (Venkatesh & Davis, 2000) and how other social factors are more

effective in encouraging permanent adoption of information technology systems

than mandating use (Venkatesh & Morris, 2000). Thus, rather than mandating the use of authoritative library information resources in assignments, a more effective strategy may be to promote their adoption through persons whom students perceive to be important and/or highly regarded within their social structures or peer groups; if such people are seen to adopt the technology, then others in their social groups are more likely to adopt it permanently as well.

Overall, the psychological outcomes at the three universities are more often positive than neutral or negative. Eighteen students reported that ILI had some positive

psychological effect. This is much higher than the overall number of students who

reported no psychological effect.

Gender Differences

There is some evidence from the analysis of the data to suggest that females

place more importance on the benefit of saving time than on other benefits of ILI

(e.g., better grades, less effort). A possible rationale here is that females are

known to be more comprehensive information searchers than males (Hupfer &

Detlor, 2006). That is, females are more likely to conduct their information

searches in as much time as it takes to explore every link and avenue, while

males tend to adopt more selective search strategies when chasing down

information, limiting the time spent on their information searches and being choosy when deciding which leads to follow. Thus, it makes sense that any savings in time afforded through ILI would be better appreciated by females, who tend to conduct more exhaustive searches than males.

Senior/Junior Differences

Senior students (in their third year of university or higher) are more likely to report

no behavioural outcome changes as a result of the ILI training than junior

students (in their first or second year). Senior students are also more likely to

report a reduction in effort to find information as a result of the ILI training than

junior students. Most students who responded about the benefit of "better grades"

were senior students. They are more inclined to appreciate the benefits of the

library database resources than junior students, and are more inclined to

acknowledge improvements in their ability to evaluate information sources as a

result of the ILI than junior students. Some evidence suggests that senior students

are more inclined to acknowledge that the ILI taught them how to research

information better, and they are more inclined to report an improvement in their

search skills when using library resources because of the ILI. Senior students are

far more likely to report positive psychological outcomes from ILI training than

junior students. This suggests that positive psychological outcomes may come

over time with experience and practice.

Conclusions

Most librarians and library administrators have high expectations of student learning

outcomes as a result of ILI training. However, these may be unrealistic in terms of what is

happening on the ground. Student feedback suggests that although there are several positive

outcomes from ILI, the magnitude and extent of these outcomes do not match the

expectations of library staff.

Despite librarians' and teaching faculty members' beliefs that ILI teaches students how to

evaluate information better, few students acknowledged an improvement in their ability to

evaluate information as a result of their training. This benefit seems secondary to the other

benefits gained by students. As such, improved information evaluation may be a latent outcome of the ILI training not easily recognized by students, or one that becomes manifest only after a significant period of access to good information, rather than an immediate benefit of the ILI training (i.e., information evaluation

is a higher-order learning outcome). There is some evidence to suggest that teaching students

to evaluate information better may actually lead to a decrease in use of library databases in

the long term. When students gain confidence in their ability to evaluate information sources

more effectively, they are at risk of reverting to using more "natural" or convenient

information sources (such as Google or Wikipedia) since they feel confident they have the

ability to filter out low-quality information from those sources more effectively.

The data reported here show that ILI teaches students ways to find and use information from library information sources effectively. However, lower-level search skills or tips specific to assignments (e.g., specific keywords to use) tend to be taught, as opposed to broader search strategies. Despite librarians' and teaching faculty members' beliefs that students are gaining knowledge about how to use the library databases effectively through the ILI, the extent to which students themselves recognize or acknowledge this is much smaller.

Findings suggest that mandating the use of library information sources in course assignments

leads to increased usage of library resources and is a good way to encourage the

development of student IL skills. Faculty and business librarians are generally pleased with

this teaching strategy. However, there is some evidence to suggest that this may be a

temporary effect and that other more sustainable and lasting strategies may need to be

followed to bring about the lasting behavioural changes in students that faculty and library staff so desire: a shift away from heavy reliance on Internet sources like Wikipedia and Google and towards the adoption and use of library information resources as students' preferred and primary information sources.

Most students identify positive learning outcomes from the ILI training; however, lower-performing students are at greater risk of not adopting library information sources after the training and/or of continuing to rely on broad Internet searches to find the information they need.

Although both students and faculty tend to recognize that IL skills improve over students' time

in the program, it becomes apparent that in general, students perceive their IL skill level

differently than professors and librarians do. Generally speaking, professors and librarians have

higher expectations and believe that students are lacking the necessary IL skills. This stands

in contrast to the perceptions of many students, who tend to see their skills as well developed

or adequate for completing school assignments. In addition to this, students and professors

have different opinions regarding which IL skills students most need to work on. While

students often note that they would like to improve their search skills, professors believe that

students need to work on understanding what constitutes a credible source. This finding is not

surprising considering that professors are not present during the time that students spend

searching and are left to evaluate students' final projects and the sources that have been used. This may suggest one of two things: 1) that students become frustrated as they search and, under time constraints, will knowingly use a less credible source because it is better than nothing; or 2) that students are not aware of their inability to evaluate resources and that this skill needs to be addressed by their instructors.

Although instructors felt that students need to improve their IL skills, several mentioned that

business students are often technologically savvy and have a good understanding of how to

use Internet search engines such as Google. Because students have developed these skills

and are comfortable using these resources, they rely heavily on them for completing

assignments. This was cited as a problem by teaching faculty at all three institutions.

Acknowledgements

Thanks to our study participants — institutions, libraries, teaching faculty, librarians, and

students; to Research Assistant Kristen Holm; and to our funder, the Social Sciences and

Humanities Research Council of Canada for Standard Research Grant 410-07-2289.

References

Ackerson, L.G. & Young, V.E. (1994). Evaluating the impact of library instruction methods on

the quality of student research. Research Strategies 12, 132-144.

ACRL (2006a). Introduction to Information Literacy. Association of College and Research

Libraries. URL:

http://www.ala.org/ala/mgrps/divs/acrl/issues/infolit/infolitoverview/introtoinfolit/introinfolit.cfm.

Viewed December 9, 2008.

ACRL (2006b). Advocate for IL. Association of College & Research Libraries. URL:

http://www.ala.org/ala/mgrps/divs/acrl/issues/infolit/professactivity/advocate/advocateil.cfm.

Viewed December 9, 2008.

Adams, M.S. (1993). Evaluation. In Sourcebook for Bibliographic Instruction, K. Branch & C. Dusenbury, eds. Chicago, IL: Bibliographic Instruction Section, Association of College and Research Libraries, American Library Association, 45-57.

Auster, E. & Taylor, S. (2004). Downsizing in Academic Libraries: The Canadian Experience.

Toronto, ON: University of Toronto Press.

Boyer, C.M. & Ewell, P.T. (1988). State-Based Case Studies of Assessment Initiatives in

Undergraduate Education: Chronology of Critical Points. Denver, CO: Education Commission of

the States.

CARL (2003). The State of Canadian Research Libraries 2001-2002. Canadian Association of

Research Libraries. URL: http://www.carl-abrc.ca/projects/state/state_2001_2002-e.html.

Viewed August 17, 2006.

CCME (1999). A Report on Public Expectations of Postsecondary Education in Canada.

Canadian Council of Ministers of Education. URL:

http://www.cmec.ca/postsec/expectations.en.pdf. Viewed August 17, 2006.

Cooney, M. (2005). Business information literacy instruction: A survey and progress report.

Journal of Business & Finance Librarianship, 11(1), 3-25.

Cooney, M., & Hiris, L. (2003). Integrating information literacy and its assessment into a

graduate business course: A collaborative framework. Research Strategies, 19(3/4), 213-232.

Dewald, N.H. (2005). What do they tell their students? Business faculty acceptance of the web

and library databases for student research. The Journal of Academic Librarianship 31(3), 209-

215.

Emmons, M. & Martin, W. (2002). Engaging conversation: Evaluating the contribution of

library instruction to the quality of student research. College & Research Libraries 63, 545-

561.

Feast, V. (2003). Integration of information literacy skills into business courses. Reference

Services Review, 31(1), 81-95.

Fiegen, A. M., Cherry, B., & Watson, K. (2002). Reflections on

collaboration: Learning outcomes and information literacy assessment in the business

curriculum. Reference Services Review, 30(4), 307-318.

Greer, A., Weston, L. & Alm, M. (1991). Assessment of learning outcomes: A measure of

progress in library literacy. College & Research Libraries 52, 549-557.

Hardesty, L., Lovrich, N.P. & Mannon, J. (1982). Library-use instruction: Assessment of the

long-term effects. College & Research Libraries 43, 38-46.

Hawes, D. K. (1994). Information literacy and the business schools. Journal of Education for

Business, 70(1), 54-61.

Hupfer, M. E., & Detlor, B. (2006). Gender and web information seeking: A self-concept

orientation model. Journal of the American Society for Information Science & Technology, 57

(8), 1105-1115.

Julien, H. (2006). A longitudinal analysis of information literacy instruction in Canadian

academic libraries. Canadian Journal of Information and Library Science 29(3), 289-313.

Julien, H. & Boon, S. (2004). Assessing instructional outcomes in Canadian academic

libraries. Library & Information Science Research, 26(2), 121-139.

Kanter, J. (2003). Ten hot information technology (IT) issues and what makes them hot.

Information Strategy: The Executive's Journal 19(3), 23-36.

LaGuardia, C. (1996). Teaching the New Library: A How-to-Do-It Manual for Planning and

Designing Instructional Programs. New York, NY: Neal-Schuman.

Lindauer, B.G. (2004). The three arenas of information literacy assessment. Reference & User

Services Quarterly 44(2), 122-129.

Long, C.M. & Shrikhande, M.M. (2007). Improving

information-seeking behavior among business majors. Research Strategies 20, 357-369.

Lopez, C. (2002). Assessment of student learning: Challenges and strategies. The Journal of

Academic Librarianship 28, 356-367.

Nadeau, G.G. (1995). Criteria and Indicators of Quality

and Excellence in Colleges and Universities in Canada: Summary of the Three Phases of the

Project. Winnipeg, MB: Centre for Higher Education Research and Development, University of

Manitoba.

OUSA (1996). Improving Accountability at Canadian Universities: Presented to The Council of

Ministers of Education of Canada, Second National Consultation on Education, Edmonton,

Alberta, May. Toronto, ON: Ontario Undergraduate Student Alliance.

Orme, W. A. (2004). A study of the residual impact of the Texas information literacy tutorial on

the information-seeking ability of first year college students. College & Research Libraries, 65

(3), 205-215.

Patton, M.Q. (1997). Utilization-Focused Evaluation: The New Century Text, 3rd ed. Thousand

Oaks, CA: Sage Publications.

Rae, B. (2005). Ontario a Leader in Learning: Report and Recommendations. URL:

www.yorku.ca/president/whatsnew/finalReport.pdf. Viewed September 4, 2006.

Roldan, M., & Wu, Y. D. (2004). Building context-based library instruction. Journal of Education

for Business, 79(6), 323-327.

Seamans, N.H. (2002). Student perceptions of information literacy: Insights for librarians. Reference Services Review 30, 112-123.

Selegan, J.C., Thomas, M.L. & Richman, M.L. (1983). Long-range effectiveness of library use instruction. College & Research Libraries 44, 476-480.

Shonrock, D.D. (1996). Evaluating Library Instruction: Sample Questions, Forms and

Strategies for Practical Use. Chicago, IL: American Library Association.

Sims, S.J. (1992). Student Outcomes Assessment: A Historical Review and Guide to Program

Development. New York, NY: Greenwood Press.

Smith, S.L. (1991). Report: Commission of Inquiry on Canadian University Education. Ottawa,

ON: Association of Universities and Colleges of Canada.

Strauss, A., & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for

developing grounded theory (2nd ed.). Thousand Oaks, CA: Sage.

Totten, N.T. (1990). Teaching students to evaluate information. Research Quarterly, 29, 348-

354.

Venkatesh, V., & Davis, F. (2000). A theoretical extension of the technology acceptance

model: Four longitudinal field studies. Management Science, 46(2), 186-204.

Venkatesh, V., & Morris, M. G. (2000). Why don't men ever stop to ask for directions? Gender,

social influence, and their role in technology acceptance and usage behavior. MIS Quarterly,

24(1), 115-139.

Wu, Y. D., & Kendall, S. L. (2006). Teaching faculty's perspectives on business information literacy. Reference Services Review, 34(1), 86-96.