
Scholars’ Information Evaluation Strategies in the Digital Environment

Ying-Hsang Liu, School of Communication, Information and Library Studies, Rutgers, the State University of New Jersey, 4 Huntington Street, New Brunswick, NJ 08901. Email: [email protected]

Introduction While the behavior of information evaluation has recently received more attention in user studies (e.g., Fitzgerald, 2000; Rieh, 2002; Wang & Soergel, 1998; Wang & White, 1999; Wathen & Burkell, 2002), we still have little theoretical understanding of information evaluation within the information-seeking process. This study aims to provide a theoretical framework for understanding how scholars use information evaluation strategies for work tasks in their ordinary work environment, with particular reference to the digital information environment. An analysis of scholars’ reading behavior with electronic documents has shown that the online information environment has facilitated information sharing and reading activities among scholars (Chang, 2002). Two important research questions are:

By what criteria do academic researchers evaluate information objects in digital formats? Do academic researchers employ different information evaluation criteria when they engage in different information-seeking tasks? If so, what is the typical information evaluation strategy in the digital environment?

Methodology The present study uses the data corpus from a research project entitled “A Study on Reading Electronic Documents in the Digital Environment,” led by Dr. Shan-Ju L. Chang of the Department of Library & Information Science, National Taiwan University, Taiwan (Chang, 2002). Using in-depth interviews within a qualitative research approach, the study explores scholars’ reading experiences in the digital environment. Snowball sampling was used to recruit twenty-four scholars (ranging from lecturer to full professor) from six schools at five universities in Taiwan. An interview guide addressed the users’ electronic reading materials (including their types, sources, and contents), their information behavior in accessing and processing information, and the impact of reading digital materials on their everyday lives. The interview guide was pilot tested with three researchers before being applied to the remaining cases. Particular attention was paid to the sequence of questions, expected answers, and the preparation of probe questions in order to elicit rich accounts of reading experiences from these scholars. The author conducted fifteen of the twenty-four interviews. Interviews ranged from thirty to seventy minutes, with an average of sixty minutes, and all were fully transcribed.

ASIST 2003 Poster 524

A grounded theory approach, as developed by Strauss & Corbin (1998), was employed to derive a model of scholars’ information evaluation strategies. One important issue is the role of theoretical sensitivity within grounded theory, in which the effects of the researcher’s prior knowledge should be considered. This study did not suspend the researcher’s prior knowledge about the investigated concepts, because the approach involves interaction between the researcher’s prior knowledge and the concepts evolving from the interview data. All theoretical concepts were grounded in the interview transcripts.

The unit of analysis is the information-seeking episode, which encompasses the academic researchers’ attention to reading, selecting, organizing, and evaluating information-related behaviors in the digital environment. One information-seeking episode is considered a single event that describes the interviewee’s information-seeking activity. This distinction is important in the analysis because an interviewee may talk about several different information-seeking activities in one speech turn. Further, this practice also helps characterize these information-related behaviors and their relationships with the information sources interacted with and the types of evaluation criteria. Although Cool & Belkin (2002) have identified Evaluate as one facet of information behaviors that is mutually exclusive of other information-related behaviors, this study employed broad criteria for determining whether an information-seeking episode would be included, since some types of evaluation criteria could be applied to other information-related behaviors.

Preliminary Results As of May 2003, the author has identified three dimensions of information evaluation strategies emerging from four of the twenty-four cases (107 information-seeking episodes): information-seeking tasks, information sources, and types of evaluation criteria.

Information-seeking task refers to the information-related behaviors that users engage in when they interact with information resources in the digital environment, with specific reference to the information-related activity in one information-seeking episode. Categories include access, browse, classify, forward, keep, read, and search.

Information sources refer to the information objects that the academic researchers interact with in the digital environment. Six categories of information sources have been identified, including Web, email, forwarded messages, online databases, BBS (Bulletin Board System), and FTP (File Transfer Protocol).

The types of evaluation criteria refer to the user’s cognitive processes in judgment tasks, which are influenced by both personal characteristics and contextual factors in information seeking. More specifically, the types of evaluation criteria appear to represent the user’s decision-making behavior, which involves interactions among personal characteristics, information objects (in terms of their source, message, and receiver), and contextual factors. Categories include personal characteristics, context (task-related and urgency), and message (source credibility, surface characteristics, content, text credibility, potential use, and receiver).
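The three dimensions above amount to a coding scheme applied to each information-seeking episode. As an illustrative sketch only (not part of the poster), the scheme could be encoded as a small validated record; all identifiers below are hypothetical labels invented for this example:

```python
from dataclasses import dataclass, field

# Hypothetical encoding of the poster's three-dimensional coding scheme:
# task, source, and evaluation criteria codes for one episode.
TASKS = {"access", "browse", "classify", "forward", "keep", "read", "search"}
SOURCES = {"Web", "email", "forwarded messages", "online databases", "BBS", "FTP"}
CRITERIA = {
    "personal characteristics",
    "task-related", "urgency",                     # context dimension
    "source credibility", "surface characteristics",
    "content", "text credibility", "potential use",
    "receiver",                                    # message dimension
}

@dataclass
class Episode:
    """One coded information-seeking episode (illustrative only)."""
    task: str
    source: str
    criteria: set = field(default_factory=set)

    def __post_init__(self):
        # Validate codes against the scheme so coding errors surface early.
        if self.task not in TASKS:
            raise ValueError(f"unknown task: {self.task}")
        if self.source not in SOURCES:
            raise ValueError(f"unknown source: {self.source}")
        if not self.criteria <= CRITERIA:
            raise ValueError(f"unknown criteria: {self.criteria - CRITERIA}")

# Example: a scholar searching an online database, judging results by
# source credibility and potential use.
episode = Episode("search", "online databases",
                  {"source credibility", "potential use"})
```

Validating codes at construction time mirrors the analytic constraint that every episode must be describable along all three dimensions of the scheme.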

ACKNOWLEDGMENTS The research is funded by a grant from the National Science Council, Taiwan, ROC (Grant Number: NSC 90-2413-H-002-010). Discussions and comments from Dr. Shan-Ju L. Chang and Dr. Nicholas Belkin are also appreciated.

REFERENCES Chang, S.-J. L. (2002). A study on reading electronic documents in the digital environment. Unpublished manuscript. Retrieved May 20, 2003, from http://grbsearch.stic.gov.tw/pdf/90/902413h002010.pdf

Cool, C., & Belkin, N. J. (2002). A classification of interactions with information. In H. Bruce, R. Fidel, P. Ingwersen, & P. Vakkari (Eds.), Emerging frameworks and methods: Proceedings of the Fourth International Conference on Conceptions of Library and Information Science (CoLIS4) (pp. 1-15). Greenwood Village, CO: Libraries Unlimited.

Fitzgerald, M. A. (2000). The cognitive process of information evaluation in doctoral students: A collective case study. Journal of Education for Library and Information Science, 41, 170-186.

Nisbett, R., & Ross, L. (1980). Human inference: Strategies and shortcomings of social judgment. Englewood Cliffs, NJ: Prentice-Hall.

Rieh, S. Y. (2002). Judgment of information quality and cognitive authority in the Web. Journal of the American Society for Information Science and Technology, 53, 145-161.

Strauss, A., & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory (2nd ed.). Thousand Oaks, CA: Sage.

Wang, P., & Soergel, D. (1998). A cognitive model of document use during a research project. Study I. Document selection. Journal of the American Society for Information Science, 49, 115-133.

Wang, P., & White, M. D. (1999). A cognitive model of document use during a research project. Study II. Decisions at the reading and citing stages. Journal of the American Society for Information Science, 50, 98-114.

Wathen, C. N., & Burkell, J. (2002). Believe it or not: Factors influencing credibility on the Web. Journal of the American Society for Information Science and Technology, 53, 134-144.

Wilson, P. (1983). Second-hand knowledge: An inquiry into cognitive authority. Westport, CT: Greenwood Press.

Wilson, T. D. (1997). Information behaviour: An interdisciplinary perspective. Information Processing & Management, 33, 551-572.