
Book Reviews

Online Search Analyst, Version 1.0. Stephen P. Harter. Bloomington: Online Consultants of Indiana; 1987. Price: $40.00.

This piece of software, from the author of Online Information Retrieval: Concepts, Principles, and Techniques, offers much valuable advice in an easy-to-follow format. Distributed on a single disk, it is menu-driven and easy to use.

The software is designed to teach searchers the tactics, or heuristics, that can solve many common online searching problems. The user is presented with a menu of five common situations in which the retrieval has been less than optimal and is instructed to choose a number (or 99 to stop). The situations include an unexpectedly empty set, too many nonrelevant documents, not enough documents, etc.

When the user chooses one of these situations, a second menu appears with a list of possible situations that could have caused the problem. Again, the user chooses a number. Then the software begins an explanation of what alternative actions are possible, first in general terms, then in a window with a more specific discussion. Within the window, the important points are summarized and indicated with an arrow. At this point the user merely “strokes” any key to keep the discussion appearing on the screen.
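The interaction pattern described above can be sketched in a few lines of modern code. This is a hypothetical reconstruction for illustration only, not Harter's actual Turbo Pascal source; all situation, cause, and advice text below is invented, and the real program covers five situations.

```python
# Hypothetical sketch of the two-level menu flow: the user picks a
# problem situation (or 99 to stop), then a likely cause, and the
# program displays the corresponding advice. All entries are invented
# stand-ins for the text Harter's program presents in windows.

SITUATIONS = {
    1: ("Unexpectedly empty set", {
        1: ("Search term misspelled",
            "Check spelling and try truncation to catch variant forms."),
        2: ("Concept too new for the controlled vocabulary",
            "Search with natural-language synonyms as well."),
    }),
    2: ("Too many nonrelevant documents", {
        1: ("Search terms too broad",
            "Add a limiting concept or restrict terms to descriptor fields."),
    }),
    # ... the real program offers five situations in the first menu
}

def run() -> None:
    """Drive the two menus until the user enters 99."""
    while True:
        for number, (label, _) in sorted(SITUATIONS.items()):
            print(f"{number}. {label}")
        choice = int(input("Choose a number (or 99 to stop): "))
        if choice == 99:
            break
        label, causes = SITUATIONS[choice]
        print(f"Possible causes of '{label}':")
        for number, (cause, _) in sorted(causes.items()):
            print(f"{number}. {cause}")
        cause, advice = causes[int(input("Choose a cause: "))]
        print(f"{cause}: {advice}")
        input("Press any key to continue...")
```

Calling run() starts the loop; the nested dictionary stands in for the problem/cause/advice content that the reviewed program supplies.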

Each situation provides a clear explanation of the problem and suggests possible solutions. In providing these solutions, Harter discusses many of the principles of online information retrieval, such as the need to use many synonyms when searching with natural language or the failure of controlled vocabularies to incorporate terms for concepts on the “cutting edge.” With each tactic, Harter both points out the reasons why the technique might be successful and warns the user of the pitfalls associated with using it. Sometimes, however, this format results in a discussion which, while excellent, becomes buried. For example, a discussion of the values and possibilities of citation searching occurs under the problem of too many relevant documents, specifically under the suggestion that the searcher might want to change databases. Many might miss this discussion.

The discussions of the problems, causes, and solutions are excellent, but this software does not exploit the power of the computer to teach. In fact, it is little more than an electronic textbook. As such, it is a fine text. But, generally, the user is passive, often only pressing a key to continue the textual commentary on the screen. For those who are used to interaction with a computer, such a passive program may not be successful.

Though the version currently being sold is version 1.0, there seem to be no “bugs” or errors in the program. It was written in Turbo Pascal for the IBM PC or compatibles and requires MS-DOS or PC-DOS 2.0 or higher and 64K of internal memory. The documentation is a simple, three-page set of instructions and discussions printed in compressed type and copied on gray paper, but it is not necessary for successfully running the program. In fact, one-and-a-half pages are devoted to a description of the program, and another page to copying and then running the program.

While the format of the software may not be ideal, the contents are definitely worthwhile and the price is modest. This program would probably be valuable for those learning online searching. It might also be valuable to new searchers who are unsure of how to handle certain situations or who feel overwhelmed by searching when the patron is paying. Even experienced searchers could benefit from going through the program once or twice to remind themselves of the many ways to take advantage of the interactive nature of the retrieval.

Search Strategies in Mass Communication. Jean Ward and Kathleen A. Hansen. New York: Longman; 1987. 274 pp. (ISBN 0-582-99851-4).

Though not explicitly stated, this book applies the kind of search techniques used by librarians to the activities of those in mass communications: public relations officers, journalists, and advertisers. The authors, a professor of journalism and a librarian, recognize the fundamental similarities between the librarian and the mass communicator in gathering information. They therefore propose a model for mass communicators which resembles in many ways the processes librarians use. Because of the differences, however, they include sources not always acknowledged by librarians: observation and interviews. They also include a chapter on “Using Polls and Surveys” and conclude with two chapters on what to do after the information has been gathered: “Selecting and Synthesizing Information” and “Social Responsibility and the Search Strategy.”

The proposed model has several interdependent steps. First is question analysis, in which the mass communicator should do several things: identify concepts, define language, draw disciplinary boundaries, refine the scope of the question, and identify possible contributors. These activities sound much like those in the reference interview, especially the pre-search interview. The second step in the model for mass communicators is gathering information by consulting several kinds of sources. The authors label this part of the model “possible contributors” and list three kinds of contributors: informal sources, institutional sources, and library and database sources. Intertwined with all of these, and yet separate from them, are the traditional information-gathering techniques of journalists: interviews, news conferences, and surveys. The third step in the model is the selection and synthesis of the information. Here the authors discuss the standards which communicators should follow, such as credibility, legal and ethical factors, and evidence. The result of the model is the message. A diagram of the model appears inside the front and back covers as well as in the text. Throughout the discussion of the model are examples of how successful mass communicators have used the parts of the process.

Probably of greatest interest to those in the library and information science fields are the chapters on library sources and databases. Here the familiarity of the authors with the materials they are talking about is apparent. Unlike many books which discuss the electronic sources as a panacea for all problems, the authors discuss the strengths and weaknesses of online searching. They know what is available and how it can be used by those they are addressing. The chapter on traditional, printed sources is almost a summary of an introductory reference text (without all the specific titles) as they discuss the differences between such tools as dictionaries, encyclopedias, and directories. For each kind of tool, they point out the uses and problems.

JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE. 40(2):133-142, 1989 CCC 0002-8231/89/020133-10$04.00

One unfortunate gap, the result of the rapid emergence of new technology, is the failure to discuss the new laser retrieval systems, such as the many CD-ROM systems. Since these systems combine many of the strengths of the electronic sources with free access (often), they promise much for the mass communicator. However, the fact that currently they are not updated as quickly as the online sources poses problems for those in this field, who often need both background and the most current information. One would assume that in a second edition the authors would add such a discussion.

This book, which the back cover describes as “written by the developers of the generic information-gathering course. . . ,” would be of interest to two groups of people: those in mass communication and those who work with them. It has already been adopted as a text in an advanced reporting course at the School of Journalism, University of Missouri at Columbia. As information production continues to expand and the need to gather information becomes increasingly complex, books such as this one, which help those in a particular field, continue to be valuable. It seems likely that students who learn early that libraries and databases can be valuable resources will be more effective communicators than those who fail to use them. The authors are to be applauded for their attempt to educate mass communicators with this model and for their discussion of responsibility.

MaryEllen C. Sievert
School of Library and Informational Science
University of Missouri
Columbia, MO 65211

Online Bibliographic Databases: A Directory and Sourcebook. James L. Hall, Ed. Fourth Edition. London: ASLIB; 1986: 508 pp. Price: $105.00. (ISBN 0-8103-2080-0).

Considered as an initial purchase, this work makes sense as a tool for introducing the layman to the use of bibliographic databases as they were in 1985. While the limited number of database descriptions may be sufficient for the uninitiated, considerable changes have taken place in two years, and any work directed to informing the end user today would need to stress the many user interfaces now available, gateways, post-search processing, and the optical disc revolution. These topics are barely touched upon in the concluding comments section.

The informative material supplementing the directory entries has little value for the information professional either, since the material presented is well known and the treatment superficial. Only 250 of the 3000-odd databases available are included in the database descriptions. It is, of course, the case that the great majority of searching is done on a small number of databases, but the editor does not contend that the selection was made on any systematic and objective evaluation based on use. Certainly more comprehensive and up-to-date directories are available.

The information provided in each entry is: name, producer (with address and telephone number), subject scope, printed version, a typical record, size, update frequency, vendors, and an indication of price. This is supplemented by references to published papers on the database described, a useful feature.

Indeed, the bibliography, with the multiple access points provided (subject, author, database covered), is the best contribution in the directory. The information professional may also find the statistical profile of database growth and use a handy reference. The other material will be already well known.

If one already holds the third edition, should one purchase the fourth? I think not. Descriptions of the seventy additional databases can easily be found elsewhere. The greatly increased bibliography is certainly an improvement, but I would wait for the fifth edition, which presumably is in preparation. One might hope that the descriptive material therein will include the considerable changes of the last two years and that the bibliography will grow even larger. The increase in database coverage alone is not worth the cost.

Bert R. Boyce
School of Library and Information Science
Louisiana State University
Baton Rouge, LA 70803

Education for Professional Librarians. Herbert S. White, Ed. White Plains, NY: Knowledge Industry; 1986: 250 pp. Price: $36.50 hardbound; $28.50 paperbound. (ISBN: 0-86729-201-6; 0-86729-200-8).

The content of the educational experience for library and information science is a topic that inspires a great deal of professional controversy. The recent version of this controversy began with the publication of the “Conant Report” in 1980, and with the evaluation of professional competencies by the King Research group. These studies resulted in a flurry of literature, with practitioners and educators accusing each other of perceived weaknesses in the current educational system.

The book that Herbert S. White edits is not another grand plan for saving library and information science education. It is also not a book written by educators for the benefit of educators. It is a volume written by practitioners, students, and educators that asks all groups to examine the problems that go into the issue of professional education in our field. It does not pretend to find solutions to the problems, but rather to define the nature of the problems.

The content of the book is divided into two sections of roughly equal size. The first section is entitled “Practitioner Expectations and Needs,” and is a series of chapters on different types of libraries written by a practitioner in that speciality (e.g., “University Research Libraries” by Sheila D. Creth). The second section of the volume is “Educational Preparation Programs,” and includes chapters on graduate education, undergraduate education, continuing education, and the views of students.

The book succeeds in its purpose of examining the major issues surrounding adequate professional education. For example, the practitioners ably presented their ideas for educating students in their speciality, although each contributor’s curricular ideas were widely divergent (e.g., from the need to educate generalists for small libraries to the need for strong subject specialists in larger information organizations). This divergence illustrates the difficulty of building consensus on educational curriculum matters when one is presented with contradictory practitioner demands.

As might be expected of a work with a number of authors, this volume is of uneven quality. Some chapters are little more than personal remembrances, and are best left unread. Conversely, other chapters, such as Herbert White’s overview of “Graduate Education for the Library Profession,” are superbly crafted. White does an excellent job of describing the forces at work in determining the content of graduate educational programs. For instance, he describes the problems of low enrollment, recognition within the university setting, and pressure from the practitioner community. He also discusses the strategies that have been employed to meet those problems, ranging from expanding the master’s program to the creation of dual degree programs.

The book is logically presented and well edited, and contains both a bibliography and an index. At $28.50 for the paperback edition it is somewhat high-priced, but it is highly recommended reading both for educators and for those interested in the professional education process.

John N. Olsgaard
College of Library and Information Science
University of South Carolina
Columbia, SC 29208

Human Aspects of Library Automation: Helping Staff and Patrons Cope. Debora Shaw, Ed. Urbana: University of Illinois Graduate School of Library and Information Science; 1986: 129 pp. Price: $15.00. (ISBN 0-87845-072-6; ISSN 0069-4789).

This volume consists of nine individual papers and a panel discussion presented at the Twenty-Second Annual Clinic on Library Applications of Data Processing, held in 1985. The papers provide both theoretical and practical insight into some of the problems faced by librarians and patrons as a result of automation.

In the first paper Sarah Fine examines psychological strain. One of the greatest fears experienced by people as a result of new technology is a breakdown of interpersonal relationships. One way to alleviate much of this fear is by involving everyone concerned in the planning process. In reviewing some of the earlier studies of librarians’ resistance to technology, Fine reports that there is no librarian personality profile. Apparently librarians are people too. In fact, she indicates that resistance to technology is actually beneficial and probably necessary for survival. The title of her contribution, “Terminal Paralysis or Showdown at the Interface,” might have been more appropriate for the second paper in the volume.

The major part of the second paper deals with physical strain associated with the use of VDTs (video display terminals), in particular those resulting from poor seating and eye strain. Martin Dainoff reviews problems and offers recommendations for workstation design, proper illumination, and other ergonomic considerations. Although much of what Dainoff presents is available in the general literature, his main point is worth noting. His argument is that since librarians and office workers are both information handlers, they are equally susceptible to the problems posed by information technologies. This viewpoint might encourage librarians to explore solutions provided for the business sector before the information slowly finds its way into the library literature.

The next paper is titled “Personnel Considerations in Library Automation.” Margaret Myers reviews personnel issues from the viewpoints of the employer and the employee in the following areas: organizational/staffing patterns, job design, position classification, selection and training, performance evaluation, working conditions, staff welfare, management/labor relationships, and professional issues. In order to anticipate both psychological and physical problems, she recommends that management examine existing personnel policies prior to initiating an automation project.

The continuity of presentations is maintained with a paper by Jane Burke on library and vendor responsibilities in planning for automation. She applies some of the theories presented in Fine’s paper to libraries and library staff. Her comments are aimed specifically at librarians who are at the stage of writing RFPs (Requests for Proposals) or who have proceeded to the implementation stage. She makes the point that while automation may change many things, it doesn’t change people.

The longest contribution in the volume, consisting of 18 pages, is an edited transcript of a panel discussion on staff involvement in library automation. Panelists included Judith A. Drescher, Christopher Syed, Barbara Shaw, and Stella Bentley. Panelists provided brief descriptions of various automation projects in public and academic libraries as they related to staff and patrons. Discussion ranged from an analysis of the planning process through patron reaction to new automated systems.

Involvement of library users continues in the next contribution, by Anne Gilliland. In addition to reviewing basic physical considerations for online public access catalogs, she focuses on intellectual factors influencing a patron’s decision to use the online catalog. Gilliland presents some methods for conducting use evaluations of the online catalog and brief recommendations for heightening managerial awareness in the planning process.

Turning more specifically to the needs of library patrons, the next two papers discuss online catalogs and specialized clienteles. Susan Roman provides examples of public library services for children and youth. The needs of youngsters are often overlooked because young people cannot always articulate their felt or unfelt needs. Libraries will reap the benefits of specialized services as children become adult users in later years. The paper by Leslie Edmonds focuses on online services for the physically disabled and intellectually impaired, the elderly, and the non-English speaking. Edmonds carries through Roman’s argument that librarians need to become knowledgeable about the needs and adaptations of each specialized group and to adapt automated systems to meet their needs.

The following paper, by Mark W. Arends, on designing brochures for effective use of online catalogs lends itself to many of the points raised by the previous papers. He provides some useful guidelines for design that can be adapted to a variety of uses.

The final paper, on library privacy, touches on a concern raised in Fine’s paper, namely that a major fear some people have is that technology will gradually invade their privacy. Jonathan Pratter reviews the legal aspects of library privacy, what constitutes fair information practice, and some of the problems posed by the advent of the computer. The volume concludes with an index.

Despite the April 1985 date of the presentations, this volume still provides timely information and interesting reading for librarians in all types of libraries. A good mix of theory and practice presented in an understandable manner has become a trademark of the Annual Clinic. Many of the papers contain a wealth of references for additional reading. Greater use of illustrations might have been effective for some of the papers, such as Dainoff’s paper on ergonomics, but then one seldom finds them in proceedings volumes. The editor, Debora Shaw, should be congratulated for a well-organized volume.

Andrew G. Torok
Department of Library and Information Studies
Northern Illinois University
DeKalb, IL 60115

Scientific and Technical Libraries. Volume 1: Functions and Management. Nancy Jones Pruett. Orlando, FL: Academic Press; 1986: 353 pp. Price: $45.00. (ISBN 0-12-566041-3). Volume 2: Special Formats and Subject Areas. 215 pp. Price: $45.00. (ISBN 0-12-566042-1). $75.00 for both volumes.

The purpose of the work, stated in the preface, is to “provide an overview of the functions and management of (industrial, governmental and university) sci/tech libraries and a starting place for information about any particular function.” The text is directed toward “a professional librarian who is knowledgeable in one area of sci/tech libraries but needs to know about the others.” The preface indicates that the work should also be useful to the manager who wants to be sure things are done right, the librarian switching from another type of library, and faculty and students in schools of library and information science. Medical librarianship and non-library information resource management are excluded from coverage [1].

“The arrangement of the volumes is as follows: In Volume 1, Part I, we discuss what sci/tech libraries do, provide descriptions of three sci/tech libraries in different settings, and discuss the characteristics both of the sci/tech literature and of scientists and engineers as information users.

“In Volume 1, Part II, we discuss the details of the five primary functions: Information Retrieval, Current Awareness, Collection Development, Collection Control, and Document Delivery. In each case we discuss the function first, then the management aspects.

“In Volume 1, Part III, we discuss the secondary functions, those that every library must perform in order to do the primary ones. These include management, space planning, automation, and equipment selection and maintenance.

“Volume 2, Part I, includes separate chapters on the various special formats of importance in sci/tech libraries: conference literature, dissertations, government documents, in-house information, journals, maps, microforms, numeric data, patents, software, standards and specifications, technical reports, and translations.

“. . . In Volume 2, Part II, we have chapters focused on particular subject areas. Chapters on the basic sciences include biology, chemistry, mathematics, and physics. The applied sciences included are engineering, geoscience, and pharmaceuticals.” [2]

Pruett’s Scientific and Technical Libraries is the best work to date on sci/tech library operations. There is more useful detail on the topics covered for the target readership than in any other recent book. Several chapters have summaries of pertinent literature that are of ARIST or Library Trends quality. The bibliographies include a substantial number of 1986 items. It is appropriate reading for any librarian in another setting who wonders what is different about operations in science and technology libraries.

As one would expect of a work with such wide scope, library operational procedures are not laid out at the nuts-and-bolts level. There is some unevenness of coverage; an author must base writings on personal experience and the appropriate literature, and both almost always vary in scope and depth. That is manifested in various ways in this work. Information is occasionally provided at the level of readers who know nothing at all about libraries, or of students in a beginning course on acquisitions or cataloging. Some of the text is difficult to understand without a substantive understanding of library procedures or the guidance and explanations of someone with substantial training.

There are several interesting asides on history, discussions of selected library operation technologies, and appropriate digressions into theory and research that can be used in operational situations, e.g., collection evaluation and the information-seeking behavior of groups.

Pruett concentrates on the details of library operation and the kinds of documents that are more likely to be appropriate for scientists and researchers than for administrators or planners of overall policy. The procedures and policies that librarians need in order to serve managers (e.g., attention to the lost-opportunity cost of services) and the materials of importance to management (multiclient studies and business research reports) receive limited coverage.

The chapters on subject libraries are uneven. Some have value even for practicing librarians serving the specified disciplines, in terms of both content and leads to other useful and informative readings. Others provide scant information or few references. Most give an overview of the professions they serve, e.g., biology and chemistry, as well as a brief discussion of library activities and common reference tools. They are all excellent sources of information on possible employment choices for students in schools of library and information science; faculty members advising potential future librarians serving sci/tech professionals should refer them to Pruett for ideas.

The Table of Contents is as useful as the index for most purposes.

There are more typographic and editorial errors than one would normally expect from Academic Press. Approximately twenty were noted in the first volume when reading for content, not proofreading. At least twenty more were noted in the bibliography. Fewer factual errors were noted, and none were serious. Examples include: “Abstracting and indexing services such as Science Citation Index, Engineering Index, Applied Science and Technology Index, Mathematical Reviews, Bioresearch Index, and Chemical Abstracts include short but critical book reviews . . .” [3] (Some of these tools index reviews; only Mathematical Reviews includes them.); “Originally, all (catalog) card sets were typed.” [4] (Originally, all cards were written.); “The card catalog can only be approached one way, no matter what the experience level of the user is.” [5] (with no explanation of the ‘one way’); “. . . a one-sided 12-inch optical disk can store between 10,000 and 20,000 pages of text. . . . A CD-ROM (compact disc read-only memory) will store about 200,000 pages of text.” [6] (indicating that a 5¼-inch disc holds an order of magnitude more data than a 12-inch disk.)

The physical organization of the work is troubling. Elimination of the redundancy incurred because the text is in two volumes would reduce it to fewer than 600 pages, a suitable length for a less expensive single volume. The last half of Volume 2 is made up of seven short chapters by different authors on different types of libraries. Most readers with specific interests are unlikely to take time to read more than one or two of them, and the remaining chapters will have significantly less value. However, the first half of Volume 2 is the most valuable part of the work for the intended audience: it is most likely to provide information that will help with operational decisions. A thirty-five page chapter of the other volume is made up of separate sections on an academic, a business, and a governmental geological science library. If this work could not appear as a single volume, putting all the discipline-specific material in one volume and the generic chapters on sci/tech libraries in the other would have enabled more purchasers to get by with just one of the two.

Both review copy volumes are well bound. There was a little glue on the top edge of Volume 2 that made it necessary to cut or tear some pages to separate them. The cataloging-in-publication notes indicate that both volumes have “alk. paper.” Volume 2’s paper stock is bulkier, less glossy, and less white than Volume 1’s. One wonders if the difference in paper stock is a kind of deceptive packaging to make a $45 tome that really has only a fraction of the text of another $45 one appear to be the same size.

136 JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE-March 1989

Page 5: Education for professional librarians

Several books have been published on special librarianship in the last few years. The texts’ coverage of library operations, ordered from most useful to least useful for new sci/tech librarians, are Pruett, Mount’s University Science and Engineering Libraries, Ferguson and Mobley’s Special Libraries at Work, Mount’s Special Libraries and Information Centers, and White’s Managing the Special Library [7-10]. The texts with coverage of special library management, from most useful to least, are White, Mount’s University Science and Engineering Libraries, Ferguson and Mobley, Mount’s Special Libraries and Information Centers, and Pruett.

Scientific and Technical Libraries by Nancy Jones Pruett is strongly recommended for all libraries serving schools of library and information science. It is recommended for practicing librarians who want background on why work is done as it is in sci/tech libraries as well as how the work is most commonly done. Professors teaching students affluent enough to spend $75 on a text in special librarianship should consider it as a required text. The professors certainly should read it carefully, both for content and for the excellent leads to other good sources. I intend to put both volumes on reserve for a special libraries class, and assign readings in the chapters or sections on information-seeking behavior of scientists and engineers, numeric data, and translations.

Richard I. Blue
School of Library and Information Science
University of Wisconsin-Milwaukee
Milwaukee, WI 53201

1. Pruett, Nancy Jones. Scientific and Technical Libraries, Volume One: Functions and Management. Orlando, FL: Academic Press; 1986, pp. ix-x.
2. ibid; pp. x-xi.
3. ibid; p. 132.
4. ibid; p. 166.
5. ibid; p. 167.
6. ibid; p. 300.
7. Mount, Ellis. University Science and Engineering Libraries, 2nd ed. Westport, CT: Greenwood Press; 1985.
8. Ferguson, Elizabeth; Mobley, Emily R. Special Libraries at Work. Hamden, CT: Shoestring Press; 1984.
9. Mount, Ellis. Special Libraries and Information Centers: An Introductory Text. New York: Special Libraries Association; 1983.
10. White, Herbert S. Managing the Special Library: Strategies for Success within the Larger Organization. White Plains, NY: Knowledge Industry Publications, Inc.; 1984.

Text Processing and Document Manipulation: Proceedings of the International Conference, Nottingham, April 1986. J. C. Van Vliet, Ed. Cambridge: Cambridge University Press, 1986. 250 pp. Price: $39.50. (ISBN: 0-521-32592-7).

Speech is one-dimensional, and to many people language is still a linear stream of words and is best represented in that way. This book is the proceedings of the international conference on “Text Processing and Document Manipulation” held in Nottingham April 14-18, 1986, and most of the authors represented believe in a structured representation of documents. Unfortunately, the results presented here do not convince me that introducing elaborate structures is more beneficial than troublesome. Some wish to do this top down, some interactively and bottom up; but most want some kind of data structure.

The book contains twenty papers. Roughly, seven are on “authoring” systems; six on page layout; three on searching and retrieval; and two each on typefaces and special purpose documents. Thus, the typical paper is either about (a) arranging words into a document or (b) arranging words on a page. Sometimes the discussion is of traditional documents, printed on paper; and other times, of screen displays. In both cases, however, the recommendation is normally to have a structured system rather than simple linear streams.

Authoring. Starting with the “authoring” work (and what was ever wrong with the word “writing”?), most of these papers describe a program to maintain a large number of small chunks of text, with links between these chunks. Some of the links may be named, e.g. “glossary” links. The reader goes from one chunk to another with the mouse. Chunks may or may not be named; if they are, there are ways to display lists of chunk names and select from them; otherwise the links are stored in the text, which says “if you want X push the mouse here.” Hierarchical structures are often mentioned, but usually the system supports a full directed graph (i.e., it does not impose the restriction that all links between chunks form a correct tree). An editor is supported to make a reading program into a writing program.
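The chunk-and-link model these authoring papers share can be sketched as a small directed graph. This is a minimal illustration only, not a description of any system reviewed; the class name, link labels, and sample text are invented.

```python
# A minimal sketch of the chunk-and-link model: named links between small
# chunks of text, forming a full directed graph (cycles allowed), not a tree.
# The class name and sample chunks are invented; no system in the book is quoted.

class Chunk:
    def __init__(self, name, text):
        self.name = name      # chunks may or may not be named
        self.text = text
        self.links = {}       # link label -> target Chunk

    def link(self, label, target):
        """Attach a named link, e.g. a "glossary" link, to another chunk."""
        self.links[label] = target

intro = Chunk("intro", "These systems present text in small linked chunks.")
gloss = Chunk("glossary", "hypertext: text read by following links.")
intro.link("glossary", gloss)
gloss.link("back", intro)     # a cycle: legal in a directed graph, not in a tree

# The reader's "mouse push" is one lookup:
assert intro.links["glossary"].text.startswith("hypertext")
```

The point of the sketch is the last two link calls: nothing forces the links to form a tree, which is exactly the tree-versus-directed-graph distinction the papers debate.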

In the first such paper, P. J. Brown discusses the GUIDE system, whose output is intended to be browsed on screens. Writing systems to be browsed is difficult; you do not know what the reader has read before. The author of a novel assumes that the reader starts at the beginning and goes along in increasing page number order. Teletext systems can deliver their text in many different arrangements, so that one speaks of “navigating” through them rather than turning the pages, although I think that a drunk stumbling from lamppost to lamppost is a better description. The GUIDE system supports the display and creation of a structured set of pages. The pages are not necessarily named, most of the links being within the text. I liked this paper because (a) the software interface presented to the readers and the authors is the same, always a good sign; (b) the software is relatively simple and not tied to anybody’s elaborate mental model; (c) the author writes well and admits we don’t yet know how to produce good structured documents.

V. A. Burrill then described a similar but somewhat more complex system, the VORTEXT system, which is more tied to an imitation of a traditional book. Here the various chunks are named, and the titles are displayed in the margins of the screen. Again, this is a good paper: the system described really exists and it has interesting, if unevaluated, new facilities. My worry is that the VORTEXT features require new writing styles, and it may turn out that the system is too complex to learn to use. I would have hoped that a system displaying an imitation of traditional print would be able to use a traditional book as input. Compare, for example, the work of I. Benest, who has produced an almost exact imitation of traditional print, with recto and verso pages displayed.

The most ambitious structured document system seemed to be Quint and Vatton’s “Grif” project. Writing text in Grif looks as difficult as writing code in a conventional algorithmic language. The writer must define the hierarchical structure of the document and then understand how it is mapped to a screen display. Considering how much trouble it is to deal with typesetting languages which are mostly one-dimensional, understanding the two-dimensional consequences of the Grif language seems to me excessively difficult.

The next paper describes a partially implemented system, P. King’s “w” system. Again, a hierarchical structure for documents is proposed and the system permits editing of it. The primary goal is printed paper, not screen browsing. Thus, unlike the first two papers, the readers and the authors see different representations. On the other hand, the serious problems of equations and other complex matter are considered and an effort is made to provide a “what you see is what you get” editor for these. When finished, that will be a significant achievement and a great help to many writers.

The “Concept Browser” of Corda and Facchetti is another hypertext system resembling NoteCards. I felt the paper was rather



high level and contained insufficient examples of the system facilities to convince me that I would want to use it.

Finally, there were two proposals. R. Hamlet discusses another structured text environment, unfortunately rather vaguely. This is a proposal for top-down hierarchical writing, apparently not implemented at the time of the conference. He does hope to deduce the hierarchical structure from a traditional linear manuscript, but I would have liked the paper more if he had a program that actually did this. Next, Cowan and Smit describe a taxonomy of document preparation systems; again, they don’t seem to have actually built something, and the description is fairly vague.

The summary of this section is “use structure instead of just calling your text a linear stream.” I remain to be convinced. Real text often contains several intermingled streams (consider footnotes in a scholarly work, or the interplay of plot and subplots in many dramatic works); certainly a hierarchical structure is inappropriate, nor does even a directed graph necessarily convey the right feeling. But more research is needed to know what the best use of screen displays and computers for reading really is.

Formatting. There were six papers which dealt with formatting, and they were generally more precise than those on authoring. First, D. Harris described the ACE page description language. This was a straightforward and well-written paper that explained their design and how it works. What was remarkable about this paper is that it had an overall message, namely that a RISC approach to typesetter drivers had advantages. I agree, and wish I did not have to spend time figuring out how to make low-level languages do what I want.

B. Reid, in contrast, presented the PostScript language, which looks as if it may become a general standard for driving laser printers. This is a complex language with many operators, pushing a great deal of work onto the printer. At present, it suffers from an old problem: if you don’t define typesetting as a general graphics language you can’t do it right, and if you do define it as a general graphics language it takes too long. However, the rapidly increasing computer power of the microchips inside the printers, combined with careful optimization of fonts and the like, is eliminating the problem. But there will still be the problem that the high-level processor which does page layout must know the details of font kerning and the like to arrange the words on the line, so why ask the laser printer to redo all of that? On the other side are the great advantages of device independence for your output stream.

Moving on to higher-level descriptions, V. Joloboff compared SGML, Interscript, and ODA as logical document description languages. The description is not very deep, and there is no clear message to the readers (I don’t know whether I should be focusing my programming efforts on SGML, ODA, Interscript, or what). Also, this paper needs considerable correction and copyediting (and is printed in an unusually hard-to-read style).

Coray, Ingold and Vanoirbeek again recommend a tree structure for documents. They do have an interesting discussion of the comparison between the “wysiwyg” (what you see is what you get) devotees and those who want separated input and output representations. They propose that rapid response to change is good (as with those who edit the output representation) but that a device-independent input representation is also good. Question: can they process their document system, which is fairly abstract and interpretive, rapidly enough to provide interactive editing? I doubt it. What we need is a document description language carefully defined so as to be amenable to fast production on multiprocessor machines. I don’t see that here. This paper also contained a number of minor mistakes in spelling.

R. Beach described the formatting of tables. Unfortunately, he just described; there was no suggestion of how to do it. This was, however, a very well printed paper; somewhere at Xerox he must have good software for the purpose.

R. Furuta presented a plan for another integrated editor and formatter which manipulates block-structured documents. As I said above, I wonder how suitable block structure really is for writing text, and I also felt the paper was a bit high-level (I think this reflects the fact that the implementation is still underway).

Retrieval. M. Kay’s “Textmaster” is a very interesting document storage and retrieval system. It uses special searching hardware to avoid the need for file inversion, and it searches the original form of documents so that what is retrieved can be reprinted or re-used in another document. I only wish there had been some description of the use of the system in practice and what the experience had been. There have been many proposals for such systems and we need to know how real users react to Textmaster.

E. Yannakoudakis described a proposal for giving unique codes to journal articles to simplify bibliographic manipulation. Unusually, I wish this article had been a little higher-level, and had talked about the more general problem of imprecise matching. I was also surprised not to see any reference to the Copyright Clearance Center’s existing solution to this problem; it is not suitable for imprecise matching, but it is printed at the bottom of a great many first pages of articles. And it is a shame that the proposed code includes the journal page number, since this prevents its use to detect multiple publication of the same paper.

Goyal and Kotamarti discuss searching compressed files. They make an inverted index to sequences of code words (typically representing n-grams, i.e., letter sequences). This saves time because the total file is shorter and thus fewer bytes are pushed around. They demonstrate, as one might expect, that it is possible to express the search string in the compressed representation and do a search. What I do not understand is why the shorter strings (which have fewer matches) take much less time. The explanation is that they are searching for the whole search string; surely it would be simple enough to make a frequency table for each n-gram code and start the search process by looking for the least frequent n-grams that must be in any matching sequence.
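The suggested optimization, building an n-gram frequency table once and then scanning first for the rarest n-gram any match must contain, can be sketched as follows. The corpus and function names are invented for illustration, and the sketch searches plain text rather than the compressed code-word stream the paper actually uses.

```python
from collections import Counter

def ngrams(s, n=3):
    return [s[i:i + n] for i in range(len(s) - n + 1)]

# Toy corpus; the frequency table is built once per file.
text = "the quick brown fox jumps over the lazy dog. the fox sleeps."
freq = Counter(ngrams(text))

def search(query, n=3):
    """Return match positions, scanning only near the query's rarest n-gram.
    The query must be at least n characters long."""
    k, rarest = min(enumerate(ngrams(query, n)),
                    key=lambda kg: freq.get(kg[1], 0))
    hits, start = [], 0
    while (i := text.find(rarest, start)) != -1:
        cand = i - k                  # where the full query would start
        if cand >= 0 and text[cand:cand + len(query)] == query:
            hits.append(cand)
        start = i + 1
    return hits

assert search("lazy dog") == [text.index("lazy dog")]
```

A rare n-gram has few occurrences, so the verification step runs only a handful of times, which is the reviewer's point about why short, frequent strings should not be the starting anchor.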

Other. C. Bigelow begins the volume with an excellent paper on typeface design that both explains the basic principles of the area and also gives many examples of the Lucida fonts. The paper is also beautiful to look at in comparison with the rest of the book. By contrast, Suen and Komoda have a particularly illegible presentation of the tables in their paper on legibility. It is also amazing to me that in a comparison of the legibility of three fonts they do not present an example of each font so we could know what was being discussed. They do refer to a Figure 1 that might show this, but there is no Figure 1 in my copy. The message seems to be that DECfont is hard to read. The references to lack of ascenders and descenders suggest that this might be obvious to anyone looking at it, and that it might be unfair to compare a 5 × 7 dot font with a conventional font design.

There were two papers on special purpose documents. One was an editor for biological nucleotide and amino acid sequence data by Nanard et al., which described an ambitious system using AI techniques and clearly went beyond document preparation into research on sequences. The other, by Gowan, described a system to produce catalogs of car parts, and emphasized tools and post-editing, being a very practical approach to a real problem. In the Nanard paper I wonder at the word “pluridisciplinary”: how is this different from “multidisciplinary”?

There is a good bibliography at the end of the book by Van Vliet and Warner (it cites me three times; what more could I ask?).

Summary. I learned a lot from Bigelow’s paper in this book, and I was happy to see much of the other work being done (especially the papers by Brown, Burrill, Harris, Kay, and Reid). But I am skeptical of the stress on structured documents right now. For example, librarians have traditionally written highly structured documents as they catalog books; and the trend in that area is towards “short-title” catalogs which eliminate much of the information to improve efficiency and “user-friendliness.” Since the book contains no evaluated experiment on the use of any of these structured systems, I don’t yet have reason to believe in them.



My strongest reaction to the book, however, is not about the words in it at all. For a book made by reproducing the original copy from people working in the area of document production, it is appallingly ugly. Many of the papers are printed in eye-straining typefaces on printers that appear to need their ink supply replenished, and contain elementary errors in spelling and grammar. (This does not apply to the papers by Bigelow, Beach, Reid, and a few others.) Cambridge University Press does a much better job on any ordinary trade or scholarly book they print in the normal course of traditional composition. The people who produced the originals for this book are hardly promising candidates to produce typesetting software for the rest of us.

Michael Lesk
Bell Communications Research
Morristown, NJ 07960

MARC for Archives and Manuscripts: The AMC Format. Nancy Ann Sahli. Chicago, IL: Society of American Archivists, 1985. MARC for Archives and Manuscripts: A Compendium of Practice. Max J. Evans and Lisa Weber. Madison, WI: State Historical Society of Wisconsin, 1985.

Over the past decade the advent of integrated library systems has brought major changes to North American libraries and research institutions. Administrative and bibliographic control over research materials has greatly improved, as has access by users to information about these materials. But until very recently, administrative resources had been focused primarily on two types of library resources: books and serials. Although the Library of Congress has promulgated other MARC formats, research libraries are now just beginning to develop automated bibliographic access to the lesser used but no less rich scholarly resources of their collections.

By dint of sheer number and volume alone, manuscript and archival materials bulk large in the collections of research libraries. Quite often the number of individual manuscripts rivals the number of books held by the institution. Yet they have traditionally been second-class citizens in terms of accessibility and use, even though they form the bedrock of much humanistic research.

Nancy Sahli’s MARC for Archives and Manuscripts: The AMC Format and Max J. Evans and Lisa Weber’s MARC for Archives and Manuscripts: A Compendium of Practice have become almost indispensable guides to this largely uncharted territory for archivists, manuscript curators, and librarians. Archival professionals have found themselves having to learn about automated systems, new standards for description and inventory control of manuscripts, and library cataloging concepts all at once. Sahli’s work is the primer; Evans’s and Weber’s volume summarizes current practice and provides helpful examples. Each of these manuals helps archivists and manuscript curators make the transition from their in-house, often idiosyncratic cataloging systems to a national system of standards for archival information exchange.

The Sahli volume is designed as a modular users’ manual of basic information on the MARC Archives and Manuscripts format. A lengthy introduction provides basic information about the structure and use of the AMC format in a question-and-answer format. The author’s simple description of the format, “a structured container for information, similar in concept to a labeled pigeonhole file,” typifies the clarity and readability of this section of the manual. The reader quickly learns what the AMC format is and what it is not, how it can be used with automated systems, and how it relates to standards for information exchange in the library world. A glossary, selected bibliography, and well-designed sample AMC forms and AMC records round out the scholarly apparatus.

At the heart of the manual are the definitions of the format itself, the variable control fields and data fields in AMC, revised from the Library of Congress’ MARC Formats for Bibliographic Data, Update No. 10. Each field is listed in order of its associated MARC tag number, and includes a brief statement of information in the field, definitions of the indicators and subfields used with the field, and several sample field entries. Unfortunately, most people interested in automating information on archival and manuscript materials will find this core section bewildering. The dense, unillustrated text fails to lead archivists from the information traditionally found in manuscript inventories to how that information is recorded in the AMC format. All the novice sees are strange symbols (such as delimiters) and few examples of actual AMC records.
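The tag/indicator/subfield structure the manual documents can be illustrated with a small sketch. The tags shown (100, 245, 545) are standard MARC bibliographic tags, but the sample data, the rendering, and this dictionary representation are invented for illustration; the authoritative AMC field definitions are those in the manuals under review.

```python
# A simplified, hypothetical sketch of MARC-style fields: each field has a
# three-digit tag, two indicator positions, and delimited subfields.
# Sample data is invented; consult the AMC manuals for the real definitions.

record = [
    # (tag, indicators, {subfield code: value})
    ("100", "1 ", {"a": "Long, Russell B."}),
    ("245", "10", {"a": "Russell B. Long papers,"}),
    ("545", "  ", {"a": "United States Senator from Louisiana."}),
]

def show(rec):
    """Render fields roughly the way printed MARC examples often appear,
    with "$" standing in for the subfield delimiter."""
    lines = []
    for tag, ind, subs in rec:
        body = " ".join(f"${code}{value}" for code, value in subs.items())
        lines.append(f"{tag} {ind} {body}")
    return "\n".join(lines)

print(show(record))
```

Even this toy version shows why a novice needs worked examples: the meaning of each tag, indicator, and subfield code is carried entirely by external documentation, not by the record itself.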

Evans and Weber’s volume of samples and guidelines for using the AMC format helps demonstrate both the utility and the complexity of the new standard. The product of a 1984 conference sponsored by the National Historical Publications and Records Commission, MARC AMC: A Compendium of Practice documents the practices of eleven archival institutions which pioneered the use of the format under the guidelines set by the Library of Congress. Also included are summaries for each field of the standards set by the major bibliographic utilities, RLG and OCLC.

Beyond the design merits of the AMC format itself, the Evans/Weber compendium has been a major reason for the acceptance of the Archives and Manuscripts Format in the library and archival communities to a degree never reached by the old MARC Manuscripts Format of the 1970s. It suggests how particular data elements can be used and whether they should be used, and sets out the myriad options for describing data about manuscript and archival collections in the conference institutions. While almost identical to the Sahli volume in format, the examples and descriptive notes from archivists about each field make the compendium volume far more accessible and intelligible. Several appendices containing lists of suggested terms applicable to various fields complement the main text.

Used together, the Sahli and Weber/Evans volumes provide the introduction, standards, guidelines, examples, and descriptions that any information professional wanting to organize an archives or manuscript collection needs to know. In the larger scheme of automating the universe of documentation, these two tools bring manuscripts and archives, historic artifacts of man’s culture, a step closer to their rightful place in the hierarchy of information resources.

Lynn Roundtree
Curator, Sen. Russell Long Papers
Louisiana State University
Baton Rouge, LA 70803

Misunderstanding Media. Brian Winston. Cambridge, MA: Harvard University Press, 1986. 419 pp. Price: $29.95 hardcover. ISBN 0-674-57663-2.

This book will be of interest to anyone associated with the design, implementation, or evaluation of new information systems.



It will also be instructive for students training in the fields of business, computer and information science, electrical engineering, and communication. Winston’s overarching theme is that the enterprise of technological forecasting generally falls short of what it attempts to accomplish, namely, accurate prediction. He argues that forecasters tend to look at technical capabilities and extrapolate from there. Meanwhile, the social environment is viewed as a constant. The result is that we are presented with a one-sided picture of how media technology interacts with society.

Winston is opposed to the idea of an “information revolution,” preferring instead to view technology development through the lens of a model which depicts how media technologies evolve rather than erupt in the environment. Winston demonstrates that the history of a technology is hardly begun at the prototype stage, a time when any one of four turns might be taken. The prototype can be developed to serve the function for which it is intended; it can be rejected because, though it functions well, there is reason to believe there will not be a demand for it; it can become a parallel technology, one which is useful but which does not serve its initial purpose; or it can be a partial prototype, which simply does not work very well in any capacity. Up until this point, the technology has had limited interaction with the social environment. Market tests may have occurred, which may have sent a prototype back for redesign or reconsideration. This process may result ultimately in an invention which will enter into production. But the full story of the technology is rarely told at this stage. Winston notes that too often futurists look no further than what might occur, given a set of technical capabilities. Suddenly, the technology becomes the only force worth recognizing, and the future realization of its potential is viewed as an inevitable trajectory.

The greatest single contribution of this book is that it moves beyond a focus on the R&D and prototyping environments by way of an explanation and illustration of the concept Winston calls “the ‘law’ of the suppression of radical potential.” Winston’s law is simply a convenient expression to describe the inevitable phenomenon of social forces acting to influence the direction and pace of the diffusion of a new technology, generally in ways that cannot be anticipated. Once introduced commercially, a technology does not remain a static phenomenon. According to Winston, “general social constraints operate to limit the potential of the device” and they also serve to prevent the device from radically disrupting existing “social formations” (p. 23). These social formations include the family, home, church, political leaders and institutions, and the corporation. In five chapter-length analyses, Winston uses this concept to explain how the seven-year delay in the introduction of the television into the United States was due to the Second World War; how delays in the 1960s in the use of integrated circuits meant that microcomputers arrived at least a decade later than when implementation would have been possible; and how unanticipated events slowed the development and use of the early computer, the communications satellite, and telephony.

According to Winston, the “futurologists,” the Marshall McLuhans, Alvin Tofflers, and James Martins, would have us believe that technology is the driving force which supersedes all others in modern society. Winston categorizes such hype as a technological determinism which reflects a fundamental failure to even address, let alone understand, the pervasive influence of social institutions. No fantasies about the future information society are left unscrutinized by Winston, and no set of facts presented by the futurists is left undiminished in significance.

At the same time that he is critical of the undying optimism of some futurists, Winston is not a reactionary who opposes the advancement of technological innovation. The negative connotation of the book’s title does not extend into an all-out condemnation of the very idea of technological enterprise. Reflecting equal skepticism towards both technological nay-sayers and yea-sayers, Winston steers clear of ideological analysis. He argues that the negative popular literature about technology can be every bit as deterministic as the blue-sky fantasies of the propagandists.

Some might find it a weakness that the book does not go into detail about value-laden ethical and policy issues. Value judgements about the desirability of realizing the radical potential of the technologies discussed are, for the most part, conspicuously missing. However, what the author does is done well. The topic Winston addresses is a complex one and he is to be commended for not diluting his analysis by attempting to address too many questions. The relationship between technology and human values clearly deserves the primary attention other authors have given it, and Winston gives no reason for readers to think he believes otherwise.

Overall, this book illustrates in a clear and organized way the distinction between promise and performance which should be made, and is often ignored, in studies of new technology. His conclusions are a welcome and sobering contrast to the visions of future shock presented by the “propagandists” of the information society. Winston’s book seems premised on philosopher George Santayana’s simple maxim that those who ignore history are condemned forever to repeat it. At the same time, he manages to avoid historicism, the making of history a science for predicting the future. Rather than looking for a method for predicting the future, readers should expect to find a perspective on socio-technical design which offers a refreshing alternative to technologically deterministic viewpoints. It can aid the practitioner by illustrating some of the manifestations of clashing interaction between information technology and society. Whether Winston brings us any closer to understanding media is, of course, subject to disagreement among intelligent readers. But he provides an explicit and systematic analysis of how society causes technology as much as technology causes society. The book is a welcome advance beyond what most of the scholarly and popular literature has had to offer.

Andrew M. Calabrese Department of Communication Purdue University West Lafayette, IN 47907

Federal Information Policies in the 1980's: Conflicts and Issues. Peter Hernon and Charles R. McClure. Norwood, NJ: Ablex Publishing Corporation, 1987. 467 pp. Price: $42.50. ISBN 0-89391-382-0.

In this work we are witnessing nothing less than disciplinary imperialism. Several fields, including management information science, journalism, telecommunications, computer information science, and the social sciences, notably political science, are each claiming academic turf in information policy, an area in which we have an important stake.

By way of background, much of this work was originally prepared under a 1984 Office of Technology Assessment funded study (RFP 10/14) which McClure's Information Management Consultant Services firm won. Since the report's submission, the authors revised their work so that this published volume now includes an updated bibliography to January 1986, new appendices, and two additional chapters by Harold Relyea and Steven Ballard.

140 JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE-March 1989

Although nowhere so explicitly stated, the purpose of this volume is to serve as a potential Congressional blueprint for legislative action based on the knowledge synthesized in their report. Five of their explicit objectives, as listed on page x, are either descriptive or analytical. The last named is of a higher order: "to offer recommendations for developing more effective federal information policies." For the most part, the authors more than adequately describe the existing situation. Chapter 4 presents a host of "limitations," "contradictions," "gaps," as well as "ambiguities." The conflicts in this power struggle, specifically between the legislative branch (JCP in particular) and the executive (specifically, OMB), are presented quite well.

Chapter 8 as well as Appendix H, however, must be read closely to understand which one of their strategies we should buy into. This section is important because, either piecemeal or wholesale, we are likely to have federal legislation in this area during the 1990s. Hernon and McClure organize their discussion around five topics, presenting the issue, a brief description, the options, and the implications. Next, the authors settle upon three strategies: 1) "developmental decentralization," 2) "coordinated disseminator," and 3) "information contractor." Rather than tell us their preference for one of these three, they conclude by recommending additional research. Along the way, they miss some delicious, albeit minor, ironies; for instance, why not recommend copyrighting government publications? In that way, reprint publishers of the Statistical Abstract or the Zip Code Directory would not make such outrageous profits at the taxpayers' expense.

I have some more serious concerns, however. The first is their ahistorical approach. Much of the literature they cite deals with events since 1980, specifically the Paperwork Reduction Act. Thus, the authors risk missing the fact that some of the ideas or attitudes that they present actually occur in cycles. While they correctly note our profession's assumption that information is a resource, that assumption only holds true from 1860 or perhaps as late as 1895. The attitude that information might be a commodity is not new. Prior to 1860, our federal government clearly treated information as a commodity to be bought and sold. Miller (1987), Stathis (1980), and Zwirn (1983), whom they do cite, have all published on this aspect.

Secondly, the boundary lines demarcating the scope of our concern are very important. Indeed, the authors address the definitions of government information including documents, publications, data, public, private, personal, etc. However, they tend to dismiss the differences despite their recognition that other writers use all these preceding terms interchangeably (p. 2). Closely related to this weakness in their discussion is another theoretical problem: What is government information? Again, they skip over rather lightly the philosophical differences between/among data, facts, information, knowledge, and wisdom. Most of the time when they use the term "information" they really appear to mean data. Despite the authors' preference, and mine, for "government information," those subtle semantic differences should not be lost so easily because there are much richer distinctions than they suggest. Again, a historical approach would show how much intellectual baggage some of these terms actually carry.

In some other ways, I think our scope of concern should be much wider than what is presented here. The authors do not address any pivotal role for our national libraries except passing mention of the Library of Congress, a de facto national library, on pages 5, 40, 48, and 278, while the National Library of Medicine is covered on pages 153, 178, and 268. The National Agricultural Library is completely ignored. In other words, they do not seem to think of them as serious players in this arena. By contrast, our new Librarian of Congress appears to understand LC's function quite well, according to quotations taken from his speech at the swearing-in ceremony. Equally serious is the lack of discussion of telecommunications and the role of the Federal Communications Commission. If libraries intend to access and provide information electronically, the decisions they make will have widespread consequences for our field. Future work must include a discussion of their activities.

A final concern is the poor editing of the bibliographical apparatus. For instance, Lacy, as in the Lacy Report, is not spelled Lacey. And while the authors do make reference to important NCLIS work (1982), it does not appear anywhere in the bibliography. Furthermore, the divided bibliography, which separates articles from books and pamphlets as well as "U.S. Government Distributed Publications," means it takes longer for the reader to find a cited piece; a straight alphabetical listing would be more helpful.

These reservations aside (for this list contains no fatal flaws, in my opinion), this work is a landmark in staking our claim to a vitally important area. Society is asking questions about information resources which we are in a position to answer; we must assert ourselves and claim at least part of the turf. We did it quite successfully in the last quarter of the nineteenth century; we can and must do it again a century later.

In deciding upon one of their three strategies, however, we must have a clear understanding about the potential roles of the other players as well as our profession's role in the provision of information. What are we best equipped to assume? This delineation may be beyond the authors' scope, so we must do it. The alternative is for information science to become a quiet little backwater, talking only among ourselves.

John Richardson, Jr. Graduate School of Library and Information Science UCLA Los Angeles, CA 90024

Information Technology: A Luddite Analysis. Frank Webster and Kevin Robins. Norwood, NJ: Ablex Publishing Corporation, 1986. 387 pp. Price: $29.95 hard cover. ISBN 0-89391-343-X.

The message of this book is far from the happy one proposed by such futurists as Alvin Toffler. Differing not only in viewpoint, but also in style, this is a much more serious book. It is written by social scientists, and it is likely to be read mostly by the same. Ideas and evidence are clearly presented and well organized. The book contains a 27-page bibliography, good indexes, and useful subtitles. Not only is this book a "good read" by both academic and popular nonfiction standards, it is the most thoroughly substantiated critique of the "information society" available. Any information professional who wonders about the broader implications of his or her work will find satisfaction in reading this book, if not because they agree with the views expressed, then because many important social issues are identified.

Webster and Robins use the term "Luddism" to represent a necessary response to the many blue-sky visions of the future that portray the new information technologies as liberating and decentralizing forces. The Luddites were organized bands of English craftsmen who rioted in 1811 and destroyed textile machinery designed to displace them. Since that time, the viewpoint of the Luddites traditionally has represented ". . . all that is negative, hopeless, and deluded. It is unintelligent, probably violent; indiscriminate and futile; the action of ignorant, backward-looking workers; anachronistic, brutal and destructive; lacking in imagination; opposed to progress" (p. 2). However, the authors offer quite a different interpretation of the actions of the Luddites, which represent to them ". . . above all else an attempt by working people to exert some control over changes that were felt to be fundamentally against their interests" (p. 3). As the book's title suggests, the Luddism defined by the authors is a critical and analytical perspective, not a call for violence.

©1989 by John Wiley & Sons, Inc.

One of the chief intellectual targets of this book is the work of Harvard sociologist Daniel Bell. Sharing a view expressed by many prominent European scholars, including German philosopher Jurgen Habermas and British sociologist Anthony Giddens, the authors consider Bell to be an apologist for a capitalist information age agenda. Familiar to many for his contention that the chief resource in "post-industrial" society is not muscle power or energy, but information, Bell favorably views information technology as reflective of the realization of Max Weber's ideal of the rationalization of society. Critical of this perspective, Webster and Robins note that the rationalization of production in manufacturing, based on the principles advocated by the father of management science, F. W. Taylor, finds its equivalent in the information society. The core objective of scientific management is "the systematic separation of manual and mental labor in order to monopolize the latter" (p. 309). Citing management theorist Peter Drucker, who has called Taylorism "the most effective idea of this century" (p. 309), the authors note his fundamental argument that the key to producing more is to "work smarter" (p. 310). Information technologies remarkably increase the potential to do so. Thus, Drucker's post-industrial Taylorism becomes the de facto though undesirable outcome of Bell's optimistic speculation, according to Webster and Robins.

The critique presented in this book is unabashedly Marxist. These authors, both of whom are at English universities (Webster at Oxford Polytechnic and Robins at Sunderland Polytechnic) and were trained in European intellectual traditions, do not reflect the reticence found in the United States. The authors are aware of the kinds of criticisms that will be leveled at them by American social scientists not rooted in a Marxist tradition. For those who do not share such a heritage, this book will be difficult but worth the effort. Any observer of the information technology scene needs a clear understanding of the polarized perspectives of Daniel Bell on the one hand, and the response to Bell offered by these authors on the other.

From Webster and Robins’ perspective, Bell’s approach to scholarship is one which, in its fervent quest for moderation, at- tempts to remove salient moral issues from the domain of social science explanation and debate. Bell believes that ends do not justify means and, if the means don’t show promise of providing us with a morally justifiable future, then we’ll just have to live

with it. The question they raise in response, however, is: Are the means Bell wishes to protect any more justifiable on a moral ba- sis? Their answer is that they are not.

To defend this position, they offer a great deal of evidence, some of the most interesting coming from a lengthy chapter (ninety-eight pages) on the relationship between information technology and work. The authors devote the greatest attention to the effect information technology is having on work predominantly performed by women. Opposing the philosophy that "high technology = high skill = higher technology = higher skill" (p. 129), they see the following scenario in motion: low-level clerical work continually is being deskilled, which is having an unfortunate impact on women. In noting that they are discussing "the effect of technology on a group of people who, on the whole, share homes and incomes with working men" (p. 178), the authors maintain that there are socially induced reasons why women are reluctant to protest. Webster and Robins are particularly critical of the concept of the "electronic cottage," which they see as an oppressive atmosphere for women. They note that women who need to work, but who also need to be at home because of dependents, will be forced to do piece-work rather than work for a weekly rate. Most importantly, these workers will be isolated from other workers and will be in a weak position for negotiation with employers.

In agreement with Bell, Webster and Robins acknowledge a society-wide transformation towards an information base, but they see one in which the power and knowledge divisions of smokestack industrialism are reinforced in subtle and ubiquitous ways. In their view, information technology is indeed symptomatic of Weber's rationalization of society, although the authors see information technology and the information society as social constructions, not inert forces. Technology, in their view, is designed by people. It is also people who wield power and sustain conditions of dominance and subordination, and people can modify those conditions. They believe it is a society-wide responsibility to develop and act on a consciousness of the potential to change and use information technologies in enlightened ways. However, they "see no serious dissent to what is happening," and "many signs of resignation to the inevitable" (p. 347).

Among those who read this book, some will raise a very important and valid question: Isn't it the responsibility of the critic to propose an alternative future? If it is, then Webster and Robins fail on this count. In defense of the absence of a proposed alternative, the authors would maintain that they have done no disservice to private and public policy makers, who need to know how the tacit and explicit information policies of "post-industrial" society are sadly lacking. This book is a valuable resource for those who are serious about doing something with that knowledge.

Andrew M. Calabrese Department of Communication Purdue University West Lafayette, IN 47907