
DOCUMENT RESUME

ED 348 579 CE 061 853

AUTHOR Mikulecky, Larry; Lloyd, Paul
TITLE Evaluating the Impact of Workplace Literacy Programs. Results and Instruments from the NCAL Workplace Literacy Impact Project.

INSTITUTION Indiana Univ., Bloomington. School of Education.; National Center on Adult Literacy, Philadelphia, PA.

SPONS AGENCY National Center for Adult Literacy, Philadelphia, PA.; Office of Educational Research and Improvement (ED), Washington, DC.

PUB DATE Jul 92
NOTE 178p.

AVAILABLE FROM Language Education Dept., School of Education, Indiana University, Bloomington, IN 47406 ($10).

PUB TYPE Reports - Research/Technical (143) -- Information Analyses (070)

EDRS PRICE MF01/PC08 Plus Postage.
DESCRIPTORS Adult Basic Education; *Adult Literacy; Demonstration Programs; *Education Work Relationship; Employer Attitudes; Employer Employee Relationship; *Evaluation Methods; Literacy Education; Models; *Outcomes of Education; Program Effectiveness; *Program Evaluation; Unions

IDENTIFIERS *Workplace Literacy

ABSTRACT

This two-part document reports the results of a literature review of workplace literacy program evaluation and then details a study of two current workplace literacy programs. The five chapters of the first part summarize findings on what is known about workplace literacy programs, methods for evaluating workplace literacy programs, assessing workplace literacy program results, assessing the impact on family literacy, and assessing the impact on productivity. The second part describes the study, the purposes of which were to develop an impact assessment model for workplace literacy programs and to produce data on the impact of programs at two sites, as well as to refine the model for use at other sites. The two sites chosen were very different, but both operate established programs involving technical and communications training, high school equivalency programs, and English-as-a-Second-Language classes. Pre- and post-program data were gathered on learners' job productivity, literacy attributes, and literacy practices with their families through interviews, tests, questionnaires, and supervisor ratings. Results showed that learners made gains in their literacy self-image, their ability to articulate plans, the amount and range of literacy activity both at work and away from work, and their reading strategies and comprehension. The study concluded that the evaluation process developed can be used as a model for evaluation of workplace literacy projects. Eight appendices contain project forms, and a bibliography lists 79 references. (KC)

EVALUATING THE IMPACT OF WORKPLACE LITERACY PROGRAMS

RESULTS AND INSTRUMENTS FROM THE NCAL WORKPLACE LITERACY IMPACT PROJECT

Larry Mikulecky and Paul Lloyd
SCHOOL OF EDUCATION

INDIANA UNIVERSITY BLOOMINGTON

U.S. DEPARTMENT OF EDUCATION
Office of Educational Research and Improvement

EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)

This document has been reproduced as received from the person or organization originating it.

Minor changes have been made to improve reproduction quality.

Points of view or opinions stated in this document do not necessarily represent official OERI position or policy.


Evaluating the Impact of Workplace Literacy Programs

Results and Instruments from the NCAL Workplace Literacy Impact Project

Larry Mikulecky & Paul Lloyd

School of Education
Indiana University-Bloomington

A research project funded by the National Center on Adult Literacy

University of Pennsylvania
Philadelphia, Pennsylvania

July, 1992

ACKNOWLEDGEMENTS

We would like to thank the following people for their assistance in connection with this project in all its aspects: developing instruments, conducting interviews, arranging site visits, collecting data, analysing data, and gathering research material.

Cumberland Hardwoods:

Janet Davis, Yvonne Fournier, Keith Girdley, John Keisling, Mary Stevens, Mary Ruth Winford.

Delco Chassis:

Russell Ater, Karin Lotz.

Fin

David Mathes, Daniel O'Connell.

Indiana University:

Kathy Bussert, Julie Chen, Mei-Li Chen, Parsa Choudhury,Ming-Fen Li, Michele Peers, Sharon Sperry, Zhang Hong.

University of Rochester:

Robert Ferrell

CONTENTS

Acknowledgements

Contents

PART I RESEARCH BACKGROUND ON WORKPLACE LITERACY


Chapter 1 What we know about workplace literacy programs 3

Chapter 2 Methods for evaluating workplace literacy programs 15

Chapter 3 Assessing workplace literacy program results 25

Chapter 4 Assessing impact on family literacy 39

Chapter 5 Assessing impact on productivity 47

PART II THE CURRENT STUDY

Chapter 6 Structure of current study 61

Chapter 7 Results of current study 75

Chapter 8 Discussion and implications 93

Bibliography 109

Appendix A Interview form and instructions 117

Appendix B Questionnaire form and instructions 133

Appendix C Cloze test samples and instructions 143

Appendix D Family literacy focus group interview 151

Appendix E Classroom observation form 157

Appendix F ESL checklist 161

Appendix G Supervisor rating scale examples and instructions 167

Appendix H Tabular data 175


Part I

RESEARCH BACKGROUND ONWORKPLACE LITERACY


CHAPTER 1

What We Know About Workplace Literacy Programs

OVERVIEW

Though a growing body of research has identified principles and elements associated with effective workplace literacy programs, few programs are able to incorporate all elements. Evaluation of workplace literacy programs is further complicated by the fact that there appear to be a variety of workplace literacy problems, each calling for a different sort of instruction. Still, over the last two decades, we have learned a good deal about what to look for in effective workplace literacy programs.

For example, we have learned that:

there are several different workplace literacy problems which call for a multi-stranded approach to instruction;

improvement takes a significant amount of learner practice time;

transfer from learning one application to new applications is very limited;

significant learning loss occurs within a few weeks if skills are not practiced.

We have also learned that effective workplace literacy programs are characterized by active involvement of project partners (including employees) in systematically determining local literacy needs and developing programs.

MULTIPLE STRANDS FOR MULTIPLE PROBLEMS

It is important to realize that we face several literacy problems in the workplace and not just one. The person who can't read at all requires different support than does the high school graduate who can't meet the new reading demands of his job. People educated in a foreign language and not speaking much English require another sort of support. Providing the same services and programs to such different clients makes no sense, and yet it sometimes occurs.

Increasingly, programs in business and industry are becoming multi-strand programs. In such programs, one instructional strand might be available to English as a Second Language learners while other strands are available for learners wishing to pursue GED certificates in preparation for further education or for high school graduates preparing for technical training. Even the format for instruction may vary, from structured classes to small-group instruction to computer-guided learning to individual tutoring.

Bussert (1991) surveyed 107 workplace literacy programs described in the research literature. Of the descriptions of workplace literacy programs providing sufficient information for judgments to be made, the vast majority (74%) offered a multiple strand curriculum (i.e. two or more of the following: ABE, GED, ESL, a selection of basic skills/technical courses) while 13% reported self-pacing of learning (i.e. home study, PLATO computerized learning, learning modules).

IMPROVEMENT TAKES SIGNIFICANT LEARNER PRACTICE TIME

Training material and technical reading material in the workplace tends to range in difficulty from upper high school to beginning college levels (Sticht, 1975; Mikulecky, 1982; Rush, Moe & Storlie, 1986). Some learners, like high school graduates who need to brush up reading skills, can learn to comprehend technical materials with a minimum of instruction time (about 30-50 hours). Other learners who have extreme difficulty with even simple reading, such as signs or simple sentences, may require several hundred hours of instruction or, indeed, may never be able to comprehend some technical material. Gains do not come quickly. The average program takes approximately 100-120 hours of practice time for learners to make the equivalent of a year gain in reading ability. Auspos et al. (1989) report that several hundred learners in a pre-work literacy program in 13 diverse sites across the country averaged 132 hours of basic education. Participants tested for reading gains using the Test of Adult Basic Education averaged a gain of .7 of a year in reading ability after approximately 100 hours of instruction.

Targeted programs using materials which learners encounter during everyday activities appear to produce more rapid gains, but still take from 50-60 hours per grade level of gain (Mikulecky, 1989). Sticht (1982) reports that military enlisted men receiving 120 hours of general reading instruction averaged an improvement of .7 grade levels in reading ability. Enlisted men being trained with workplace materials improved 2.1 grade levels when reading work-related materials during the same amount of time.

Computer learning programs may also cut learning time slightly, probably since there is more reading practice and less discussion. Haigler (1990) indicates that learners gained an average of 1.26 years of reading ability in an average of 78 hours of practice using computerized lessons in the JSEP job-related basic skills program. This is equivalent to about 63 hours of practice for a year of gain.

Yet, linking learning gain to practice time can be misleading. A sense of perspective is needed. A gain of one year of reading growth in one hundred twenty hours of practice is a bargain compared to the experience of the average school child, who spends over a thousand hours for a reading gain of one year. Furthermore, the more effective workplace literacy programs report reducing learning time to 50-70 hours of practice for a year of gain. No program, however, has been able to consistently improve reading ability from low-level to high school or college standards in 20, 30 or even 50 hours. This is important to note because in many industries the standard training class is less than thirty hours.

The fact that literacy gains usually take more time than is typically allocated in workplace training programs presents a problem.


For gains to occur, more practice time must be found. Effective programs demonstrate at least three possibilities for increasing practice time. Some programs immerse employees in integrated technical/basic skills classes full-time for several weeks (see Delco description in Chapter 6). Other programs provide sequences of courses allowing learners to move from one course to another and eventually to continue learning at technical schools and community colleges. A third program type uses workplace materials in training classes, and thus reaps the bonus of additional practice time as learners read these same materials on the job.

TRANSFER TO NEW APPLICATIONS IS SEVERELY LIMITED

Research indicates that there is a severe limitation on how much literacy will transfer from one type of task to other types of tasks, if the new tasks are not part of the training. Reading the Bible is considerably different from reading the newspaper which, in turn, differs significantly from the sort of thinking one does while reading a manual. Perkins and Salomon (1989:19), after reviewing the cognitive research from the late '70's through the '80's, concluded that:

"To the extent that transfer does take place, it is highly specific and must be cued, primed, and guided; it seldom occurs spontaneously. The case for generalizable, context-independent skills and strategies that can be trained in one context and transferred to other domains has proven to be more a matter of wishful thinking than hard empirical evidence."

Consistently during the past decade, literacy researchers have reminded us that literacy is not something you either do or do not have. It is not even a continuum. What we mean by "literacy" is more accurately described as "literacies." There is some degree of overlap between being able to read one sort of material and being able to read other sorts. The degree of overlap between reading a short story, a poem, a lab manual, an equation, a computer screen, a census report, and government documents may be severely limited, however. We know very little about the degree of overlap and the degree of difference among these various literacy formats and tasks.

Evidence suggesting the limitations of transferring literacy skills is found in the results from the National Assessment of Educational Progress, which surveyed the literacy skills of young adults (Kirsch & Jungeblut, 1986). This survey measured the literacy abilities of young adults in three different areas: their use of prose, document, and quantitative forms of literacy. Correlations among subject performances in these three areas revealed limited overlap in literacy abilities (i.e. about 25% shared variance, corresponding to correlations of roughly .5). Among those surveyed, being able to read a newspaper was only partially related to being able to make sense of a document like a chart, table, or form. Some literacy ability apparently will transfer. Document reading and prose reading, for example, did not seem to be totally separated skills. For most learners, however, this degree of shared literacy ability appears to be insufficient for transfer to occur easily. The idea of teaching someone to read a poem and expecting that s/he is going to improve also in ability to read a computer screen is probably a misplaced hope. What we want people to be able to do, we need to teach them. Some people are able to make great transfers from one situation to others. Such people, unfortunately, do not appear to be the norm.

The limitations of literacy transfer have serious implications for workplace literacy programs. This is especially true if programs attempt to use traditional, school-type materials. Sticht (1982) found that general literacy training did not transfer to job applications. He now recommends a "functional context" approach which teaches literacy by using the materials with which one is likely to function on a daily basis.

SIGNIFICANT LEARNING LOSS OCCURS WITHOUT REGULAR PRACTICE

The problem of lack of transfer is related to the problem of learning loss. When a person cannot use what they have learned in real-world situations, they tend to lose the new skill because they lose the chance to practice it. This is important, because new knowledge must be used or it is lost. Sticht's (1982) report of military studies indicates that enlisted men improved in literacy abilities while they were in general literacy classes, but that within eight weeks, 80% of those gains were lost. The only exception to this finding occurred when job-related materials were used to teach literacy abilities. In the latter case, learning gains held up, probably because learners were able to see transfer and continued to practice the abilities they had mastered.

This finding is very important. It means that efforts and resources can be squandered if learners are taught with general materials which have no relationship to the materials they see daily. It also suggests that the timing of workplace literacy training is important. Preparing learners for the basic skills which are demanded by new jobs may be wasted if learners must wait several months before they are able to apply and practice their new learning.

Some programs (Mikulecky & Philippi, 1990) have analyzed specific job tasks and developed instructional materials using both work and everyday materials. For example, in banking, the careful reading of withdrawal and deposit slips involves reading, computation, and judgment. Similar skills are required at home in reading and filling out forms for mail-order catalogs and in paying some bills. Instruction that alternates applying the same strategies to workplace and home materials offers an increased possibility for practice. Data is not yet available on the effectiveness of this strategy in stemming learning loss.

EFFECTIVE WORKPLACE LITERACY PROGRAMS

To be effective, therefore, workplace literacy programs must have well designed instruction and they must be flexible enough to meet the needs both of differing learners and of changing situations on the job. The discussion so far has highlighted the importance of designing programs which integrate workplace basic skills instruction with several other types of instruction, for example, technical training, E.S.L. training, G.E.D. instruction, and low-level literacy training. It has emphasized the importance of countering lack of transfer and learning loss by providing long-term practice with materials and activities directly related to the learners' everyday demands. Additional elements of effective programs are apparent as we examine more workplace literacy training across the country.

Salient Features of Effective Programs

A recent study of 37 workplace literacy programs funded by the U.S. federal government (Kutner et al, 1991) identified four key components of effective programs. One of these elements related to directly linking instructional materials to literacy tasks identified during job analyses. This connection was discussed at some length earlier in this chapter. In addition to this clear link between instruction and job tasks, effective programs were characterized by:

1) active involvement by project partners;

2) active involvement by employees in determining literacy needs;

3) systematic analysis of on-the-job literacy requirements.

Bussert (1991) reported that most workplace literacy programs involve partnerships of some sort. Bussert analyzed descriptions of 107 U.S. workplace literacy programs and found 92% to involve 2 or more partners. Sometimes the partners were multiple unions or multiple businesses; or a school and business; or a government agency, a business, and a union. The most common types of partnerships among the programs she surveyed were the following:

Employers working with others 88%

Schools (public school, community college, and university) working in partnership with others 51%

Unions working with others 34%.

Recruitment and retention were reported to be effective when each partner played an active role during the early stages of program development and a continuing role in supporting program goals.


Involvement went beyond leaders, however, to include learners themselves, who helped gather materials and made suggestions for expanding the collection of custom-designed materials. It was usually those closest to the job who knew what strategies would be most effective in gathering information and solving job problems. Active participation of partners sometimes meant supervisors and top job performers helping to analyze job tasks and suggesting materials and approaches which they found effective in preparing new workers.

A few specific examples can help illustrate these elements of effective programs. Earlier references to the military programs described by Sticht (1982) and to the computerized JSEP program described by Haigler (1990) touched on these elements. Now we will examine examples provided by Hargroves (1989) and Mikulecky & Strange (1986).

The Federal Reserve Bank's Skills Development Program

Hargroves (1989) describes a well-established workplace basic skills program in the banking industry. She presents the results of a 15-year comparison of the Federal Reserve Bank's basic skills trainees to a peer group of entry-level workers in terms of: 1) the effectiveness of training in helping under-educated youth catch up, 2) retention, 3) job performance, and 4) earning power. The Bank's skills development program integrated basic skills with clerical training, supervised work experience, and counseling. Trainees came into the program because they lacked basic skills which were needed in most clerical jobs. Though 50% of the trainees had graduated from high school, half read at or below the eighth grade level. Two out of three Skills trainees attended long enough to complete an extensive class and on-the-job training program leading to job placement at the Bank.

Hargroves (1989) gathered information on 207 Skills Center trainees from 1973 to 1988 and compared employment data to that of 301 Bank employees hired for entry-level positions from 1974 to 1986. Her results indicated that several months of formal training combined with on-the-job experience and counseling enabled under-educated youth to catch up to typical entry-level workers. Two thirds of the trainees (who would not otherwise have been eligible for employment) were placed in jobs. The trainees, on the average, stayed longer than their entry-level peers, despite the fact that in the late 1980's there was a low unemployment rate and ample job opportunities outside the bank. The majority of Skills Center graduates earned as much as their entry-level peers who were both more educated and more experienced. "In summary, the program produced a supply of employees who were trained as well or better than other new entry-level employees and understood the Bank's employment practices; it also provided trainees to departments on short notice for extra clerical help" (Hargroves, 1989: 67).

Several elements key to program success were highlighted by the Hargroves study. These include: 1) integrating basic skills, clerical skills, work experience, and intensive counseling, 2) self-paced and often one-on-one instruction focusing on competence, 3) connections to community agencies for recruitment, and 4) good communications with Bank supervisors in order to develop job placements.

Two Long-Term Integrated Skill Programs

Mikulecky and Strange (1986) reported on a program to train word processor operators and a second program to train wastewater treatment workers. Each program involved extensive training time. The word processor operators were paid to attend between 14 and 20 weeks of training, 40 hours per week. The number of weeks was determined by their ability to function at levels comparable to those of average word processor operators who were currently employed. The training program for the wastewater treatment plant involved 20 full weeks of voluntary training which alternated classroom training with on-the-job training. The word processor training program screened applicants and accepted no applicants who read more than 3 grade levels below the difficulty level of the business materials read, typed, and edited by existing word processor operators. These materials ranged from high school to college level in difficulty. The wastewater treatment training program, on the other hand, provided approximately 100 hours of special literacy support for the least academically able of its workers. This support focused on preparing workers to use job and training materials which averaged from 11th grade to college level in difficulty. Employers and top-performing workers helped to analyze job tasks and provide benchmarks for acceptable performance.

The average learner in the word processor training program reached job-level competence in 20 weeks. Some of the trainees were able to find employment in 14 weeks, while a few took nearly 28 weeks. The program concluded in the middle of a recession, during which 1/3 of the cooperating companies stopped all hiring. In spite of these economic difficulties, 70 percent of the program participants found employment as word processors within two months of completing the program.

The wastewater treatment program focused on the least literate 20 percent of its workers. Nearly 1/2 passed their technical training post-tests. The consensus of technical instructors was that less than 5 percent would have passed without the support of the literacy program. Of students attending special training sessions, nearly 70 percent were able to summarize job materials in their own words by the end of training. Only about 10 percent of the learners demonstrated gains in general reading abilities, and those were students who invested 5 or more hours weekly outside of class on general reading materials. Retention of students receiving special basic skills training was higher than that of more able students who attended technical training only.

CONCLUSION

No single class or course seems able to meet the demands of the diverse populations within a workplace nor to provide a sufficient amount of instruction to move very low-level literates to the functional literacy levels called for in today's workplace. Multi-strand approaches, which involve several different types of courses and strings of educational experiences leading to long-term training goals, appear to offer the highest probability of success. Such programs need to encourage learners to practice and retain new skills by linking training materials to job and home literacy demands. The active involvement of workplace partners appears to be key to establishing those links and to systematically analyzing program effectiveness. Relatively few workplace literacy programs meet all these effectiveness criteria, but the degree to which these criteria can be accommodated appears directly related to program success.

CHAPTER 2

Methods for Evaluating Workplace Literacy Programs

OVERVIEW

The previous chapter presented examples of effective workplace literacy programs and identified key program parameters. To evaluate workplace literacy programs effectively, two types of evaluation are desirable -- formative evaluation and summative evaluation.

Formative evaluation of a workplace literacy program takes place during beginning and middle stages of program operation. The purpose is to identify problem areas which can be addressed and modified while change is still possible and productive. This type of evaluation usually involves the use of interviews, document analysis, and observations to determine:

the degree to which all involved with the program understand and share program goals;

whether the resources in terms of personnel, materials, learning environment, and learner time are sufficient, given current knowledge, to achieve the goals;

whether the learning processes and methods employed appear to be sufficient to accomplish the goals.

Summative evaluation of workplace literacy programs usually takes place at the end of program operation and is designed to assess how well the program has succeeded. This sort of evaluation requires gathering pre- and post-program data and then analyzing that data. This implies using and developing measures directly related to program goals. Typical goals for workplace literacy programs include improved learner literacy abilities, improved literacy practices at work and elsewhere, changed learner beliefs about literacy, self, and education, and improved learner productivity on the job. Assessment is often accomplished through use of formal standardized tests, informally constructed tests related to the workplace, questionnaires related to literacy practices, and interviews with learners and supervisors. In addition, company records and ratings on productivity, safety, attendance, and enrollment in subsequent classes can expand the evidence available for assessing program impact.

CURRENT WORKPLACE LITERACY PROGRAM EVALUATIONS

Only a few workplace literacy programs described in the research literature report rigorous program evaluations or careful documentation of learner gains, impacts on productivity, and detailed descriptions of effective program practices. Some of these programs have been cited in the previous chapter. See, for example, Sticht (1982), Mikulecky & Strange (1986), Hargroves (1989), Haigler (1990), and Philippi (1988, 1991).

The above examples are atypical. Mikulecky and d'Adamo-Weinstein (1991) observe that the majority of workplace literacy programs described in the available research literature tend to report no rigorous evaluation data. Many programs which do report evaluation data simply provide superficial information limited to surveys of learner satisfaction and anecdotal reports of effectiveness. Occasionally a pre- and post-administration of a standardized reading test (usually the Test of Adult Basic Education - TABE, or the Adult Basic Learning Examination - ABLE) provides an indication of learner gain in general reading ability. Only a few evaluations provide follow-up data on the impact of programs on learner job performance, retention, or earning power.

Kutner, Sherman, Webb & Fisher (1991) recently reviewed workplace literacy programs funded by the U.S. Department of Education to determine the elements of effective programs. The authors examined 29 of 37 projects funded by the National Workplace Literacy Program to determine which programs were effective and merited further examination in order to identify components of effective programs. The authors reported that:

"Due to the absence of quantitative data necessary to identify particularly effective projects (i.e., improved productivity, low participant attrition, or improved test scores), study sites were recommended to OVAE staff. These sites were reported by project directors to have a high retention rate." (1991:26)

Even in federally funded workplace literacy programs, for which program evaluation was an expectation for receiving funding, it was not possible to find six programs which had been rigorously evaluated for effectiveness. Selection of "effective" programs was based upon undocumented reports of retention from program directors.

FORMATIVE AND SUMMATIVE EVALUATIONS

It is possible to evaluate workplace literacy programs effectively using a combination of formative and summative evaluation. Formative evaluation of a workplace literacy program takes place at the beginning and during program operation. The purpose is to identify problem areas which can be addressed and modified while change is still possible and productive. Summative evaluation of workplace literacy programs takes place at the end of program operation and is designed to assess how well the program has succeeded. Assessing this sort of summative program impact requires gathering pre- and post-program data and then analyzing that data.

Mikulecky, Philippi and Kloosterman have performed several such formative/summative evaluations using a version of Stufflebeam's (1974) Context, Input, Process, Product evaluation model modified for use with workplace literacy programs. In brief, the evaluation model employs interviews, document analysis, observations, and test data to determine:

1) the degree to which all involved with the program understand and share program goals;

2) whether the resources in terms of personnel, materials, learning environment, and learner time are sufficient, given current knowledge, to achieve the goals;

3) whether the learning processes and methods employed are sufficient to accomplish the goals.

These three evaluation goals provide information about the program in its formative stages. Results can be reported to program providers while there is still time to make program adjustments. A fourth, summative evaluation goal of this technique addresses:

4) what evidence exists that program goals have actually been accomplished.

Formative Evaluation

A significant portion of the formative evaluation occurs early during program planning and operation (i.e., during formative stages of program development). Formative analyses usually employ interviews, the examination of program documents, and on-site observations to focus upon the degree to which program goals are shared, the adequacy of resources for achieving those goals, and the degree to which program execution appears to match stated program goals.


Program Goals

Interviews, analysis of memos and planning documents, and early program observations often reveal that significant differences about program goals exist among funders, supervisors, instructors, materials designers and learners. Evaluation feedback during early program stages often initiates necessary clarification among program planners and participants. In some cases goals are expanded, in some cases goals are refined, and in some cases new vendors are sought.

Examples of interview questions designed to reveal the various views of program goals among the participants and leaders follow below.

Shared Goals

1. What do you consider to be the main purposes, goals, and objectives of the basic skills training program(s)?

2. Given the situation you find yourself in, what do you think are the most important things for an instructor to be doing?

Additional information can be gathered from published program descriptions and from program planning documents.

Resources

Resources refers to the expertise of key personnel, the available instructional space and materials, as well as the time available for instruction. Early examination of resources sometimes reveals that resources are insufficient to accomplish goals espoused by program planners. Typical deficiencies are: 1) insufficient learner time to accomplish purported goals, 2) lack of appropriate learning materials or lack of resources to develop custom-designed materials which match workplace literacy program goals, and 3) difficulty in finding instructors with knowledge or expertise about workplace literacy requirements. Information about resources can be gathered by examining program facilities and from interviewing key program personnel.

Examples of interview questions designed to elicit information about program resources follow below.

Resources

1. List training and experience you've had related to this job.

2. What is your assessment of the following:
   Materials
   Facilities

3. Please describe the following parts of the training program:
   Materials for diagnosing and testing learner abilities.
   How a learner's class and out-of-class learning time should be divided.
   How records are kept and what use is made of records.


EVALUATING THE IMPACT OF WORKPLACE LITERACY PROGRAMS

Learning Processes and Methods

We know from research reported in the previous chapter that literacy improvement takes a significant amount of time and that general literacy instruction is not very effective for workplace applications. Observation of classroom instruction, materials, and schedules sometimes reveals potential problems with the learning processes and methods offered by the program. Examples are: 1) insufficient learner practice time with literacy or too much class time allocated to discussion, 2) teaching general reading instruction with school books, off-the-shelf materials, or sometimes materials and activities selected because the instructor has found them useful in other settings, and 3) little feedback from instructors about learner accomplishments (sometimes instructors do not or cannot comment upon what individual learners can and cannot do).

Effective programs typically instruct learners using workplace-related instructional activities and real or modified workplace materials. Teachers are familiar with job-literacy demands through direct observation or documented analyses of the jobs. When instruction using more general approaches or materials occurs, the teacher is usually able to relate the instruction to workplace needs. If instruction is not related to the workplace, it is because the program has simply elected to use a workplace classroom to address general literacy goals. In effective programs, no matter what the goal, sufficient learner practice time is available to allow reasonable expectation of success. Some effective programs even manage to expand practice time through homework.

Examining the processes used in a workplace literacy program can be accomplished through classroom observation, examination of learner records and assignments, and through interviews with learners and instructors. A great deal can be learned by asking instructors and learners to describe how they spent time during the previous class period. Information from such interviews can help determine if learning activities and time allocation match program goals or if learning time is insufficient to meet these goals.



Classroom observation also allows the observer to gather information on how much time both instructor and learner spend in various activities. This information can then be analyzed to determine whether instructors are allocating time in ways that reinforce stated goals and in ways which are likely to be productive for learners. A form for recording such observational information follows below.

Classroom Observation

Time    Student Activity    Teacher Activity    Comments

Make note of time spent by students actually reading or doing things. Also note time learners spend listening to the instructor. Mention when learners are in small groups or working individually. Special note should be made when the instructor or a student demonstrates how to do something.
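An observation log of this sort is easy to analyze by totaling the minutes spent on each activity and comparing the totals against stated program goals. The report itself contains no software; the following Python helper is a hypothetical sketch of that tally (the function name and the log format of `(minutes, activity)` pairs are assumptions for illustration):

```python
from collections import defaultdict

def summarize_observation(log):
    """Total minutes per activity from a timed classroom-observation log.

    `log` is a list of (minutes, activity) entries transcribed from the
    observation form above. Returns a dict mapping each activity to its
    total observed minutes, which can then be checked against program goals.
    """
    totals = defaultdict(int)
    for minutes, activity in log:
        totals[activity] += minutes
    return dict(totals)
```

For example, a log showing 20 minutes of student reading against 5 minutes of listening would suggest time allocation consistent with a practice-oriented literacy goal.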

Summative Evaluation

While the formative evaluation provides early information about the effectiveness of program operation, the summative evaluation provides information about whether the program achieved its goals.

Evidence of Goal Attainment

Well-evaluated workplace literacy programs gather baseline data before instruction begins. Typically data is gathered on the reading abilities, practices, and beliefs of learners. In addition, pre-program data is gathered on worker productivity or any other goal espoused by the program. Data-gathering is accomplished using formal tests, informally constructed tests related to workplace expectations, questionnaires, and interviews with learners and sometimes with supervisors. In addition, company records on productivity, safety, attendance, and enrollment in subsequent classes can expand the evidence available for assessing program impact.

This information establishes a base for later comparisons to end-of-program performance. At the end of the program, all learners are once again assessed using the same instruments. In some cases, it is possible to compare the performances of learners in a workplace literacy program to those of a control group of comparable employees who haven't yet been able to receive workplace literacy training. To do this, the control group takes pre- and post-assessments which parallel those of the instructional group.
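The pre/post comparison described above reduces, in its simplest form, to computing average gain scores for the instructed group and for the control group. This is not a procedure from the report, only an illustrative sketch; the function name and the assumption of matched numeric scores (e.g., Cloze or scenario scores) are hypothetical:

```python
def mean_gain(pre, post):
    """Average pre-to-post change for one group of learners.

    `pre` and `post` are matched lists of scores for the same people,
    in the same order. Returns the mean of the individual gains.
    """
    if len(pre) != len(post):
        raise ValueError("pre and post score lists must be matched")
    return sum(b - a for a, b in zip(pre, post)) / len(pre)

# Usage sketch: compare learners against an untrained control group.
# learner_gain = mean_gain(learner_pre, learner_post)
# control_gain = mean_gain(control_pre, control_post)
# A learner gain well above the control gain supports a program effect.
```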

Program goals determine the types of information gathered to assess program impact. For example, if the program is to improve the ability of learners to perform more effectively in quality assurance groups, evidence needs to be gathered on such performance before and after training. If training is supposed to have a positive impact on learner reading habits at home and at work, these, too, need to be assessed before and after the program.

Chapters 3, 4 and 5 will provide samples of methods and instruments for assessing the impact of workplace literacy programs on learner literacy abilities, practices, plans, and beliefs. In addition, methods for assessing the impact of workplace literacy programs upon productivity and upon the families of learners will be discussed and sample measures will be provided.

CONCLUSION

Only a few workplace literacy programs have been well evaluated, even though millions of dollars have been invested in their development and operation. To evaluate workplace literacy programs effectively it is desirable to perform both formative and summative evaluations. Formative evaluation takes place during beginning and middle stages of program operation and is designed to identify problem areas which can be addressed and modified while change is still possible and productive. This type of evaluation usually involves interviews, document analysis, and observations. Summative evaluation of workplace literacy programs usually takes place at the end of program operation and is designed to assess how well the program has succeeded. This sort of evaluation requires gathering pre- and post-program data and then analyzing that data. This implies using and developing measures directly related to program goals. Typical goals for workplace literacy programs include improved learner literacy abilities, improved literacy practices at work and elsewhere, changed learner beliefs about literacy, self, and education, and improved learner productivity on the job.



CHAPTER 3

Assessing Workplace Literacy Program Results

OVERVIEW

The summative evaluation of the impact of workplace literacy programs is best performed using a combination of standard assessment tools and custom-designed measures. The custom-designed measures usually reflect the types of reading done on the job and in training courses. In addition, they can focus upon special objectives central to the workplace literacy program (e.g., increased productivity, comprehending safety information). This chapter will discuss several standard and custom-designed measures and provide examples.

Among the topics discussed are:

the advantages and disadvantages of standardized tests;

the use of custom-designed measures, employing literacy task analyses to develop Cloze tests using reading passages from the workplace and job scenarios involving applications of literacy;

obtaining broader indications of adult literacy growth by monitoring changes in learners' literacy beliefs, practices, processes and plans.

Model custom-designed measures, including Cloze tests, interviews and questionnaires, are available in Appendices A-C.

STANDARDIZED TESTS

Standardized reading tests are sometimes used in workplace literacy programs as a means of identifying the general reading abilities of learners. These tests often employ multiple-choice questions and short reading passages (from a few sentences to a paragraph or two). Some are based on tests developed for use in elementary and secondary schools.

The most commonly used tests are the Test of Adult Basic Education (TABE) and the Adult Basic Learning Examination (ABLE). Occasionally a workplace literacy program operating in conjunction with a community college may use higher-level general reading and study skills tests provided by the community college.

Advantages

The advantages of using such tests are two-fold. First, the tests can provide information on the general reading abilities of potential learners. Many community colleges offering technical training courses, for example, will not enroll students with general reading or computational abilities below the eighth-grade level. The National Assessment of Educational Progress results (Kirsch & Jungeblut, 1986) indicated that approximately 20% of American adults read below the eighth-grade level -- including a significant number of adults who have graduated from high school. In some industries, more than half of the hourly employees score below an eighth-grade level. Such individuals are prime candidates for basic skills support before and during technical training. Sometimes, standardized tests can be used to help identify such individuals.

Second, standardized tests can be used as program pre- and post-assessments to measure gains in general reading abilities. Comparison of pre- and post-test scores can indicate the degree of effectiveness of a program. Also, post-test scores can indicate whether learners are ready to go on to textbooks and other general materials in technical training classes. These scores are generally indicative of how well someone can understand material with which he or she has little familiarity. For example, adults scoring at the 10th-grade level on a standardized test would be very likely to have some difficulty with a textbook on an unfamiliar topic which was written above the 10th-grade level. With some background knowledge on the topic, such people might be able to comprehend material a few grades above their standardized test scores. It is extremely rare to find an individual who can comprehend material more than a few grade levels above his or her standardized test scores (i.e., even extensive background knowledge will nearly always be insufficient to allow a 6th-grade reader to comprehend a manual written at the 11th-12th grade level).

Disadvantages

The disadvantages of using only standardized tests in workplace literacy programs have been mentioned in Chapter 1. These tests measure general reading abilities and not the special sorts of literacy skills required in the workplace. A learner in a general basic skills class may improve in general reading abilities. For example, a learner could move from a 5th-grade level (i.e., understanding the comics and very simple stories) to an 8th-grade level (i.e., understanding the sports page and USA Today news stories). Though the improved reading ability may be of some use, the learner is not likely to be able to transfer those skills easily to reading an SPC chart, a technical manual, specialized work orders, and industry-specific textbooks. The most efficient way to ensure improvement in these areas is to teach using these materials. Unfortunately, gains made in reading job-related materials may be only partially reflected in a standardized test which evaluates general reading skills.

More subtle criticisms have been leveled against standardized test use for evaluation of workplace literacy programs. First, the information revealed by these tests presents an incomplete picture of adult learning (as described above, an adult may read familiar materials somewhat better than general standardized test scores indicate). Second, the effects of such test scores on teaching are considered by some to be adverse (i.e., when teachers teach to the test and sometimes ignore materials that learners need for the job). Third, the way in which standardized test scores are reported can be humiliating to adults and counterproductive to learning. Some educators argue that when adults are informed that their performances are equivalent to low grade levels (i.e., 6th grade or lower), it becomes a reminder of their failure rather than an objective description of abilities.

Recommendation

Some workplace literacy programs find that the disadvantages of standardized tests outweigh the advantages and rely instead on interviews, questionnaires, and other indicators to assess program effectiveness. Such assessments are custom-designed for the program being evaluated. Other programs use standardized tests as part of a mix of assessments. If standardized tests are used, they should never be the sole measure of learner gain in a workplace literacy program.

CUSTOM-DESIGNED ASSESSMENTS

An alternative or a supplement to standardized tests is custom-designed instruments which are based on workplace materials and activities. To design such instruments and, indeed, to custom-design training programs, one first needs to determine how workers use literacy in a particular workplace. The first step is to perform a literacy task analysis.

Literacy Task Analysis

Literacy task analysis is a way of identifying those aspects of job tasks which require reading and problem solving. These analyses are performed using a combination of observations of workers, interviews with top performers, and gathering samples of print used in the workplace and training classes. The goal is to determine the mental processes used by top performers as they solve problems and complete tasks which involve literacy. This information can be used to construct both test scenarios and instructional materials. It is important that these two be developed together, so that tests can assess what learners are really taught and both can be linked directly to the workplace.

Observations and interviews with supervisors and workers are used to identify the areas in which performance needs to be improved.



Prime targets for literacy task analyses are tasks where basic skills deficiencies cost money or threaten health and safety. Other tasks can be identified by noting changes in the workplace (e.g., new technology, changed jobs or promotions) which confront some workers with new and sometimes troublesome literacy tasks.

A good deal has been written about how to perform literacy task analyses (see Mikulecky, 1985; U.S. Departments of Education and Labor, 1988; Drew & Mikulecky, 1988; Philippi, 1988 & 1991). Most techniques involve determining the elements of a task and the strategies (both visible and mental) employed to accomplish the task. For example, filling in forms in some quality assurance procedures involves the elements of reading two-column charts, computing using decimals, knowing special vocabulary and abbreviations, and being able to summarize sequences of events. Within each of these elements, top performers employ a variety of strategies (i.e., skimming, estimating, interpolating, etc.).

Philippi (1988) has identified a number of such elements and strategies, which are listed below.

Vocabulary

Recognize common words and meanings

Recognize task-related words with technical meanings

Identify word meanings from sentence context

Recognize meanings of common abbreviations and acronyms

Recognizing cause and effect, predicting outcomes

Use common knowledge to avoid hazard or injury

Apply preventive measures prior to task to minimize security or safety problems

Select appropriate course of action in an emergency

Inferential comprehension

Determine figurative, idiomatic, and technical meanings of terms, using context clues or reference sources

Make an inference from text that does not explicitly provide required information

Organize information from multiple sources into a sequenced series of events

Interpret codes and symbols

Literal comprehension

Identify factual details or specifications within text

Follow detailed, sequential directions to complete a task

Determine the essential message of a paragraph or selection



Locating information within a text

Use table of contents, index, appendices, glossary, subsystems to locate information

Locate page, title, paragraph, figure, or chart needed to answer questions or solve a problem

Use skimming or scanning to determine whether or not text contains relevant information

Cross-reference within and across source materials to select information to perform routine activity

Use a completed form to locate information needed to complete a task activity

Comparing and contrasting

Combine information from multiple sources

Select parts of a text or visual materials to complete a task

Identify similarities and differences in objects

Determine presence of a defect or extent of damage

Match objects by size, color, or significant marking

Classify objects by size, color, or significant marking

Distinguish between relevant and irrelevant information in texts or visuals

Using charts, diagrams and schematics

Obtain a fact or specification from a two-column chart to find information

Obtain a fact or specification from an intersection of row by column on a table or chart

Use a complex table or chart requiring cross-referencing within text material

Apply information from tables or graphs to locate malfunctions or to select a course of action

Use simple linear path of an organizational chart to list events in a sequential order

Use the linear path of a flow chart to provide visual and textual directions for a procedure, to arrive at a decision point or to provide alternative paths in problem solving

Isolate each major section presented in a schematic diagram

Isolate a problem component in a schematic and trace it to the cause of the problem

Interpret symbols to indicate direction of flow, test points, components, and diagrammatic decision points

Identify details, labels, numbers, and parts from an illustration or picture

Identify parts from a key or legend

Interpret drawing of cross-section for assembly or disassembly

Interpret a three-dimensional, or exploded, view of an object for assembly or disassembly

Follow sequenced illustrations or photographs as a guide

Materials and information gathered during literacy task analyses can be used to develop instructional materials as well as to develop custom-designed assessment instruments for workplace literacy programs. Examples of such instruments (i.e., job-related Cloze tests and literacy scenarios) are discussed below.

Job-related Cloze Tests

Standardized tests reveal an individual's general reading ability. A Cloze test is a custom-designed measure to assess how well a person can comprehend a particular type of reading material (e.g., job-related information). From the materials gathered during the task analysis, representative prose passages of about 150 words can be selected for the construction of Cloze tests. This is done by omitting every fifth word from a passage, usually leaving the first and last sentences intact. This results in a passage containing about 25 blank spaces which the test-taker is asked to fill in, using the surrounding context of sense and grammar.
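The construction rule above (blank every fifth word, leaving the first and last sentences intact) is mechanical enough to sketch in code. The report contains no software, so this is only an illustrative Python helper; the function name and the simplified sentence-splitting are assumptions:

```python
import re

def make_cloze(passage, nth=5):
    """Build a Cloze exercise from a prose passage.

    Keeps the first and last sentences intact and replaces every nth
    word in between with a blank. Returns (cloze_text, answer_key).
    Sentence splitting here is a naive period/!/? split, which is a
    simplification for illustration.
    """
    sentences = re.split(r'(?<=[.!?])\s+', passage.strip())
    if len(sentences) < 3:
        raise ValueError("passage too short for a Cloze test")
    first, middle, last = sentences[0], sentences[1:-1], sentences[-1]
    words = " ".join(middle).split()
    answers = []
    for i in range(nth - 1, len(words), nth):
        answers.append(words[i])   # record the omitted word
        words[i] = "_____"         # leave a blank for the test-taker
    cloze = " ".join([first, " ".join(words), last])
    return cloze, answers
```

Applied to a 150-word workplace passage, this yields roughly 25 blanks, matching the design described in the text.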

The ability of readers to replace missing words accurately correlates very highly with scores on traditional reading comprehension tests (Bormuth, 1969; Mikulecky & Diehl, 1980). A general rule of thumb is that being able to replace less than 35% of the missing words indicates that the passage is beyond the comprehension of the test-taker. In other words, if the reader can replace fewer than 9 out of 25 missing words, the reading is too difficult. Replacing 50% or more of the missing words indicates the ability to read and comprehend the material independently. So, in a passage with 25 blanks, a score of 13 shows that the reading is of a suitable standard for the reader. Scores between these values reflect the degree to which the reader needs some instructional help to comprehend fully what is being read. Nobody is ever expected to replace all the missing words correctly. A score of 50% is considered quite good, and making test-takers aware of this may defuse the frustration that they are likely to feel when unable to guess satisfactory words for a number of the blank spaces.
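The scoring rule of thumb above translates directly into a small classifier. This is a hypothetical sketch (the function and the level labels "frustration"/"instructional"/"independent" are common reading-assessment terms, not names used in the report):

```python
def cloze_level(correct, blanks=25):
    """Interpret a Cloze score using the rule of thumb in the text.

    Below 35% exact replacements: the passage is too difficult
    ("frustration" level). 50% or more: the reader can handle the
    material independently. In between: the reader needs some
    instructional help ("instructional" level).
    """
    pct = correct / blanks
    if pct < 0.35:
        return "frustration"
    elif pct >= 0.50:
        return "independent"
    return "instructional"
```

So a learner replacing 8 of 25 blanks is at frustration level, while 13 of 25 indicates independent reading of that material.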

A sample Cloze test (with answers included) is provided below. The instructions include a practice example, since many readers have never taken a Cloze test before and sometimes require guidance in getting started. Appendix C includes instructions for developing Cloze tests and some additional examples.

Name ____________________    Date ____________

Cloze Exercise

In a Cloze exercise, you try to guess which words are missing. For example, in the sentence below, a word is missing.

She looked before she ______ the street.

A good guess for the missing word is "crossed."

She looked before she crossed the street.

In the story below, try to guess and replace the missing words. Don't expect to get them all. Some are nearly impossible.

G.M. Designs Safety for All Ages

We all like to think about the old days. Life seemed simpler and, in some ways, better then. But when it comes to ______, the good old days ______ offer the same degree ______ safety as today's cars ______ trucks. Advancements in technology ______ the G.M. vehicle you ______ today among the safest ______ the world. Each G.M. ...
(continues to approximately 25 blanks)

Cloze Exercise key: automobiles, didn't, of, and, make, purchase, in

Job Problem-Solving Simulations and Scenarios

Simulations and scenarios can be constructed by using actual materials from the workplace to assess the job literacy abilities of workers. Information and materials gathered during the task analysis form the basis for constructing job-like scenarios in which the learner reads and makes decisions based on written materials. Scenarios are usually constructed to reflect a range of material types (i.e., prose, documents, graphic material), and sometimes involve both reading and computation. If the range of learner reading abilities is likely to be wide, it is useful to construct scenario questions which range from fairly easy to fairly complex, so that all test-takers can experience success at some level.

32 341

ASSESSING WORKPLACE LITERACY PROGRAM RESULTS

Appendix A contains samples of job scenarios and directions for constructing such scenarios. For full-range testing purposes, it is recommended that scenarios include process questions, factual questions, inference questions, and application questions. Process questions determine how the reader reads a passage: that is, the range and sophistication of reading strategies employed. Factual questions should have answers based directly on the reading material, answers to inference questions can involve deductions from several places in the reading, and application questions should relate the reading to the interviewee's background knowledge. (See the examples below.)

Process question

I am going to show you a newspaper article about your industry.

Explain to me how you would read this story in order to find out what the writer thinks.

Describe what you would look at. What would you be thinking about? How would you go about reading this story? What would you do first, then next, then next?

Factual question

How many employees does ASMO have in Statesville?

(Answer: 400. Listed in the article.)

Inference question

From the information provided about products, what do all four companies have in common?

(Answer: All of them make some sort of motor. Requires the interviewee to search for commonalities not readily apparent.)

Application question

What company makes products closest to your job at this facility? Why do you say so?

(Answer: Relate a product on the list to what the employee makes. Requires the employee to sort through the information and then to apply it to his/her background knowledge.)



In addition to their use in a pre-test to establish baseline data for assessment, job scenarios can be used at the beginning of a program to diagnose areas of learner difficulty. If the information in the scenarios is also part of a training curriculum, the scenarios can provide instructors with valuable information. For example, if a learner consistently has difficulty with inference questions across scenarios, the instructor can adjust instruction to provide more guidance and practice in this area. The instructor should not, however, provide detailed feedback to learners about their performance on the scenarios if the program intends to use those scenarios again as a post-test to assess learner gain and program effectiveness.

A test can be used a second time to indicate learner growth if the learner has not been taught or given feedback using the actual test. It is also important that sufficient time has passed between pre- and post-tests (six weeks is usually sufficient) for detailed memory to decay. If such time is not available, it is possible to develop two very similar tests and establish the comparability of the two scenarios by noting how a pilot group scores on them. This is a fairly lengthy procedure, but worthwhile if the tests will be used with many learners for several years. Once comparability has been established, the two forms of the scenario can be used as pre- and post-measures. However, using the same scenarios for both tests provides a more reliable basis for comparison.
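One simple way to check the comparability of two parallel scenario forms from pilot-group scores is to correlate performance on the two forms: if the forms rank learners similarly, the correlation will be high. The report does not prescribe a statistic; the Pearson correlation below is an illustrative choice, and the function is hypothetical:

```python
def pearson_r(xs, ys):
    """Pearson correlation between pilot-group scores on two test forms.

    `xs` and `ys` are scores for the same pilot participants on form A
    and form B. Values near +1 suggest the forms rank learners alike,
    which supports using them as parallel pre- and post-measures.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```

A high correlation alone does not guarantee equal difficulty, so mean scores on the two forms should also be compared before treating them as interchangeable.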

ASSESSING A BROADER CONCEPTION OF ADULT LITERACY LEARNING

Lytle (1990a & b) suggested that performance measures (tests and exercises) miss a good deal of important information about adult literacy learning. In addition to gains in literacy skills, adults may make changes in what they believe, how they behave, and in their aspirations. Lytle suggests several dimensions which constitute a fuller understanding of adult literacy and adult literacy growth. These dimensions are learner beliefs about literacy and themselves, learner literacy practices, the literacy processes employed by learners while reading, and the plans a learner has which may involve literacy use.

Lytle's conceptual framework has been adapted to the present workplace literacy project to test the importance of these aspects of adult learning (beliefs, practices, processes, and plans) and to seek out ways to enhance learning. Information about these dimensions of learner literacy was gathered using a combination of questionnaire items, interview questions, and requests that learners explain their literacy strategies or processes while doing job scenarios.

Beliefs

In the interview, learners were asked to describe themselves as readers and writers and to describe someone who seemed to be very good at reading and writing. They were also asked to provide reasons for their answers. Changes in these beliefs are likely to precede changes in literacy abilities. Sample questions from the interview follow below and are available in Appendix A.

Beliefs

1. Describe someone you know who is good at reading and writing. What makes you choose this person?

2. How good do you consider yourself to be at reading and writing? What makes you think so?

3. Describe how you would like to be in terms of reading and writing. (Probe: Could you give me some examples?)

Practices

Learners were asked orally and in the questionnaire for information about the types of reading and writing they do on the job and off the job. They were asked to rate the difficulty they had in reading each item on a list that included books, signs, training manuals, pay stubs, charts, and cartoons. They were also questioned about the frequency of their literacy-related activities: how often, for example, they read a newspaper, made a shopping list, or visited a library, as well as how many books they owned. Information was also sought about literacy practices in workplace situations which ranged from departmental meetings to handling broken equipment, from reading instruction manuals to reading a health insurance policy.

Sample interview and questionnaire items follow below and are available in Appendices A and B.

Practices

Interview item

Tell me the sorts of things you read and write away from work during a normal week. (For probe, ask: "Can you give me more examples?")

Questionnaire items

1. First check only the things you've read in the past month.

Now go back and rate your ability to read the items you've checked.

                      poor            excellent
local newspapers       1   2   3   4   5
classified ads         1   2   3   4   5
telephone bills        1   2   3   4   5
TV guide listings      1   2   3   4   5
magazines              1   2   3   4   5

In the last 7 days how many times have you read a newspaper?

0 1 2 3 4 5 6 7 8 9 10+

You talk a lot in team or department meetings, asking questions or sharing ideas.

very like me 1 2 3 4 5 very unlike me

Process

In order to seek information about the processes which learners go through when reading work materials developed as part of the job scenarios (see description on pp. 32-34 above), some questions in each scenario asked students to think aloud about the way they were reading the material. The purpose of these questions was to determine whether learners were employing sophisticated reading strategies (i.e. skimming, focusing, asking questions, etc.) and whether the choice and use of reading strategies improved as a result of training. Sample questions follow below and are available in Appendix A.

Process

I am going to show you a newspaper article about your industry.

Explain to me how you would read this story in order to find out what the writer thinks.

Describe what you would look at. What would you be thinking about? How would you go about reading this story? What would you do first, then next, then next?

Plans

Some questions in the interview sought information about the learners' plans, especially in relation to education and goals requiring increased literacy abilities. These questions asked for information about learner plans for 1 year, 5 years, and 10 years ahead. Sample questions follow below and are available in Appendix A.

Plans

Now I'd like to ask you about your plans. Explain how you seereading and education as part of these plans:

A. What are your plans for the next year?

B. What are your plans for the next 5 years?

C. What are your plans for the next 10 years?


CONCLUSION

Workplace literacy program impact is best measured using a mixture of standard assessment tests and custom-designed instruments. Standardized tests provide useful information about general reading ability, but may be misleading with regard to workplace literacy skills.

Custom designing starts with a literacy task analysis to identify aspects of job tasks which require reading and problem-solving, and in which performance needs to improve. Cloze tests based on workplace materials can be used to assess workers' abilities at job-related reading. Job scenarios can test their skills in using what they read, through process, factual, inference, and application questions.

A broader conception of adult literacy learning can be assessed by seeking information about the learners' literacy beliefs, practices, processes, and plans, using interviews and questionnaires.


CHAPTER 4

Assessing Impact on Family Literacy

OVERVIEW

Chapters 2 and 3 have considered the evaluation of workplace literacy programs in relationship to impact at the workplace. Workplace literacy programs also have effects on workers' families and children. This chapter examines means for evaluating those effects.

The chapter considers the factors which can be used to measure family literacy impact. A review of previous research on this topic is followed by a discussion of questionnaire and interview items used in the current evaluation of family impact of workplace literacy programs. The complete instruments appear in Appendices B and D.

Topics discussed are:

socio-economic level of parents;

education level of parents;

aspiration of parents for their child's education;

ability of parents to act as role models;

promotion by parents of literacy activities.

WORKPLACE LITERACY PROGRAMS AND FAMILY LITERACY

It is possible for workplace literacy programs to affect not only learners' literacy levels and productivity on the job but also literacy in the workers' families. Home literacy activities can both benefit the employees' children and also increase the employees' literacy practice time. Program descriptions provide many anecdotal examples of these benefits. A young mother in a workplace literacy program at Planters Life Savers in Virginia reports that she enrolled in the company's basic education program, not only to be able to help her seven children with their homework, but also to persuade her oldest son that it is important to finish school (Cooper et al., 1988). Gross, Lee and Zuss (1988) reported that one workplace literacy student began to help her eight-year-old son with homework and was able to leave handwritten messages for her children.

The effect of literacy programs on the children and families of workers is often neglected in evaluating program effectiveness, however. At both sites in the current study, assessment of family literacy was conducted through pre- and post-questionnaires modified from survey questions used by Greer and Mason (1988). The questions covered parental guidance, literacy artifacts, and child-initiated literacy behavior. In addition to individual questionnaires for parents, some parents were interviewed in focus groups. The Family Literacy Focus Group Interview was administered only to the participants of the program at one site and was based on the work of Fitzgerald, Spiegel, and Cunningham (1991). Samples of the questions from each of these instruments will accompany the discussion of the impact of parent literacy on their children which follows.

IMPACT MEASURES

At least five factors have been identified by research as related to the ability of parents to affect a child's achievement in literacy: the socio-economic status of the parents, their educational level, the aspirations parents have for their child's education, the ability of a parent to act as a role model, and the parents' promotion of literacy activities. Some of these factors are more easily altered than others through a workplace literacy program.

The first two, the direct correlation between the educational or socio-economic levels of parents and the child's literacy ability, have been identified by researchers as solidly linked (Chan, 1984; Laosa, 1984; Sticht, 1983; Sticht & McDonald, 1990). However, a brief workplace literacy program is not likely to affect income and general education levels directly nor very quickly. The latter three measures are more likely to be affected by a workplace program.

Parents' aspirations for the best education for their children appear to be important in the child's own aspirations, as Marjoribanks found (1984a & b). Chall and Snow (1982) have shown that children whose mothers set high educational goals for them achieve higher levels of reading comprehension and word recognition.

Some research indicates that high educational aspirations for one's children may be connected to a parent's own educational level. Laosa (1982) found a significant relationship between a mother's educational aspirations for her child and the level of schooling of both parents. However, Lujan & Stolworthy (1986) found that the educational aspirations of lower socio-economic status families were just as sincere and ambitious as those of parents from middle to higher levels. Unfortunately, as important as having aspirations may be, parents who are unable to help their children reach such goals are at a disadvantage.

The ability to model reading and to engage in interactions with a child which encourage and teach literacy is important. However high the aspirations of a parent might be, Nickse et al. (1988) suggest that illiterate adults cannot model what they do not know. Fitzgerald et al. (1991) found that low-literacy parents did not even mention adult role modeling as important in interviews about how to help their children, whereas high-literacy parents talked about the need to have their children see them reading. Work with middle school students by Fielding, Wilson, and Anderson (1986) showed that readers among students tended to have parents and siblings who read. A parent's ability to model oral language skills also seems to affect a child's ability to read in school (Sticht, 1983; Loban, 1964; Chall & Snow, 1982).

In the current study, questionnaire and interview items were developed to measure effects in these areas. Examples follow below.


Questionnaire item

In the last 7 days how many times has your child seen you reading or writing?

0 1 2 3 4 5 6 7 8 9 10+

Interview item

At home, do your children see you doing any reading or writing? (i.e. books, magazines, papers, recipes, directions, letters, lists, notes, etc.)

Closely related to the parent as a role model is the activity of a parent to encourage a child in literacy activities. Included in such activities are the creation of a literacy environment in the home and the use of a community library. According to Fielding, Wilson and Anderson (1986), readers in middle schools come from homes in which there are many books and many opportunities to go to a library. Similarly, Greer and Mason (1988) found that the children who score higher on tests of reading recall are children who frequent a library, have someone at home who reads to them often and helps them read, and who have books and magazines purchased for them.

Parents who directly promote their children's reading have children who seem to do better in school. McCormick and Mason (1986) sent simple-to-read Little Books home for parents to read to their preschool children. The parents were given instructions in helping their child learn to recite. That activity had a significant effect on the child's later reading in kindergarten and first grade. Furthermore, Chall and Snow (1982) discovered that reading comprehension was higher for the second, fourth, and sixth graders they studied if their homes provided more literacy experiences and reading materials which were both interesting and appropriate for the child. Stewart (1986) administered a reading test to 56 children and compared their scores to the answers their parents had given to a questionnaire that assessed home support for early reading. He found a significant relationship between borrowing books from a public library and the children's performance on the test.

Assessing the literacy environment in terms of reading materials available in the home or trips to the library appears in questionnaire items.

1. In the last month how many times have you bought or borrowed books for your child?

0 1 2 3 4 5 6 7 8 9 10+

2. In the last month how many times has your child gone to a public library?

0 1 2 3 4 5 6 7 8 9 10+

Being a role model and providing materials are not the only ways parents improve children's literacy. Activities in which parents and children interact together are also important. Such activities include reading aloud to a child, encouraging the child to ask questions and make predictions about the text, allowing the child to initiate a literacy event, and parental involvement with the school. Both questionnaire items and questions from the Family Literacy Focus Group Interview address such activities.

Time spent reading with a child, particularly prior to the school-age years, can affect the child's later success or failure in reading. Stewart (1986) visited the homes of four children several times over a two-month period and learned that stimulation from parents made more of an impact on children's reading abilities than merely having several books around the house. In fact, the effect of reading aloud to children has been widely studied. Chomsky (1972) revealed that the most important activity for building the knowledge required for literacy success is reading aloud to children. Laosa (1982) found significant correlations between mothers who read to their children and the child's literacy skills in preschool. Buchanan-Berrigan (1989), Anderson (1985), Teale (1984), Teale & Sulzby (1986), and Fitzgerald, Spiegel and Cunningham (1991) also indicate that reading aloud to children helps the development of preschool literacy, which, in turn, enhances school learning, especially when the child is an active participant.

Below are questionnaire items which assess such activity:

1. In the last 7 days how many times have you read/looked at books with your child or listened to him/her read?

0 1 2 3 4 5 6 7 8 9 10+

2. In the last 7 days how many times have you helped your child with homework and/or with school projects?

0 1 2 3 4 5 6 7 8 9 10+

Several studies have revealed that parents who have been observed reading to their young children are also found to encourage them to label pictures, ask questions, and relate text information to their own experiences (DeLoache & De Mendoza, 1985; Harkness & Miller, 1982; Snow & Ninio, 1986; Pellegrini, Brody, & Seigel, 1985; Yaden, 1982). As Mason & Stewart (1988) suggest, these parents are leading their children towards the use of inference and comprehension monitoring strategies. The benefits of reading aloud to children, therefore, seem to be greatest when the child is an active participant who engages in discussions about stories, learns to identify letters and words, and talks about the meaning of words (Anderson, 1985).

An interview question assessing such interactions follows below:

Do you do any reading or writing activities with your children? (i.e., visit library, hear stories, read to them, watch educational television, look at magazines or books with children, point out words to them, play school, show them how to read or write, etc.)


From reading with a child and encouraging the child's interaction with the text, a next step is to attend to whether the child ever initiates the reading activity. McCormick and Mason (1986) found that those parents provided with inexpensive books for their children reported significantly more child-initiated use of books and child-initiated attempts to print than did a control group. More importantly, they indicated that the children had invited their parents into literacy activities, such as asking to read stories to their parents and asking for help with new stories, to a greater extent than the children of a second group of parents who were not given books. Teale (1983) discovered that as children become more adept, they take over more and more of the interaction until they can read the book alone or write on their own without help.

Child-initiated behavior was more thoroughly examined by Lujan and Stolworthy (1986), who found that the most significant result from parent training was a positive change in most children's literacy behavior. For example, the children began to attend more closely to story time and parent instruction. They showed increased self-direction in organizing personal time so that there would be time at night for story reading.

Questionnaire items addressing these issues follow below.

1. In the last 7 days how many times has your child looked at or read books or magazines?

0 1 2 3 4 5 6 7 8 9 10+

2. In the last 7 days how many times has your child asked to be read to?

0 1 2 3 4 5 6 7 8 9 10+

3. In the last 7 days how many times has your child printed, made letters, or written?

0 1 2 3 4 5 6 7 8 9 10+


CONCLUSION

Workplace literacy providers want to get the most for their investment. Effective programs may be able to improve the abilities of workers on the job while at the same time providing a benefit to children in the home. Assessing the longer-term effects of increasing workers' literacy abilities can include examining the effects on a worker's family and children. Yet workplace literacy program evaluations often neglect such impacts. We know, too, that as workers are encouraged to carry newly-won literacy abilities home, they benefit from the opportunity to increase their own practice of these skills.

In assessing the effects of workplace programs on workers' families, five factors have been identified. These are:

1) socio-economic status of the parents;
2) parental educational level;
3) parents' aspirations for their child's education;
4) the ability of parents to model literacy practices;
5) parental encouragement of literacy practices with their children.

The first two are not as readily affected by short-term workplace programs and, therefore, are less desirable assessment targets.

Measurement of parental aspirations, modeling, and encouragement was conducted during the current study through questionnaires given before and after the program and through Family Literacy Focus Group Interviews.


CHAPTER 5

Assessing Impact on Productivity

OVERVIEW

A review of the literature on productivity assessment shows that little is known about the effect of workplace literacy programs on job performance, but there is some evidence of the value of such programs and of the costs associated with lack of training. Methods that can be used to assess workplace literacy programs include the following.

A training program can be assessed for its impact on productivity using employee output, and such indicators as safety, absenteeism, and retention, with these measures being taken both before and after training. Also, employees can be rated by their supervisors on various aspects of job competence and attitude, and changes in these ratings could be used in the calculation of the dollar value of the program to the company.

Such methods and others directly related to literacy were incorporated into this study. To measure changes produced by a program, the following instruments were used both before and after training:

Employee performance was assessed using records of absenteeism, safety, discipline, grievances, and suggestions.

Interviews and questionnaires were used to assess employee practices and processes of job-related literacy.

Supervisor ratings were obtained on various aspects of employee job competence and attitude.

LITERATURE ON PRODUCTIVITY ASSESSMENT

Workplace literacy programs have been offered by many organizations, both government and private, but not much is known about the effect of such programs on the job performance of the employees involved. For the most part, these organizations have regarded literacy programs more as philanthropic than as business enterprises, and so have not considered it appropriate to subject them to their usual cost-benefit analyses.

There are, however, a number of indications that such programs can have a positive influence on the effectiveness of the workers involved. Collino et al. (1988: 19, Note 17) mention a Blue Cross/Blue Shield program that has decreased turnover and improved performance and promotion prospects, as well as increasing motivation and self-confidence among employees. Also, the Federal Reserve Bank's Skills Development Center has had considerable success in training under-educated school dropouts up to a standard of job performance comparable to qualified entry-level workers (Hargroves, 1989).

In addition, there are a number of cases on record of the costs associated with employees' lack of basic skills:

the inability to read a ruler wasted $700 worth of steel in one morning;

the inaccurate use of new scheduling equipment cost $1 million to correct the resulting errors;

employees at a lumber camp imitated the illustrations on safety posters because they could not read the text describing these as dangerous practices to be avoided (Collino et al., 1988: 11-12).

However, the fact remains that, as noted earlier in this book, there has been very little systematic evaluation of workplace literacy, even of its effect on the employees' more general ability to cope with everyday literacy demands. So it is perhaps hardly surprising that two recent reports should come to the following conclusions.

"Very little research exists about the relationship of literacy tojob performance. Much of what exists is sketchy and based oninformation obtained from studies conducted in the military"(U.S. Depts of Education and Labor, 1988: 37).

Collino et al. (1988) found that, even when companies do conduct assessments of their literacy programs, the results are not made public. Furthermore, such assessments rarely involve a study of how productivity might be affected:

"At best management relied on informal feedback of supervisorsregarding employee performance" (Collino et al, 1988: 9).

METHODS OF USE FOR WORKPLACE LITERACY PROGRAMS

A workplace literacy program should have a positive and measurable impact on productivity. However, most companies do not have an evaluation methodology and therefore cannot easily recognize the impact on productivity of training workers.

Impact on productivity

Though little research exists on methods to assess the impact on productivity of workplace literacy programs, more research and discussion is available on the general topic of the impact of training upon productivity (National Research Council, 1979). When workers are producing an actual physical output, any change in the quantity or quality of that output can be measured for the same workers before and after training, or a comparison can be made between the output of trained and untrained workers. Programs that make such assessments are usually broad-range training programs which can compare the output of a trained plant, division, or work-team to a comparable control group. Assessing productivity impact at levels below the work-team is often precluded because many industries do not collect productivity information (i.e. production and defect rates) at the individual level.

A broader definition of productivity allows for some information to be collected at the individual level. For example, other factors that may be affected by a training program are

retention and promotion;
absenteeism and punctuality;
dishonesty;
accident rates;
use of suggestion boxes.

In addition, if productivity is broadly defined as supporting corporate goals, increased participation in voluntary activities (e.g., additional training or employee quality participation groups) can also be included among productivity indices. (See for example Collino et al., 1988; U.S. Depts of Education and Labor, 1988.) All of these factors can be made the subjects of comparison, both of employees before and after a program and of those employees with others not attending the program.

Supervisor ratings

Another way of obtaining information about the effect of training on individual workers is to use supervisor ratings. These can be a single score for each employee or, preferably, a set of scores covering various specific skills and attitudes associated with job performance. Depending on the nature of the work concerned, these aspects are likely to include:

setting up and operating machines;
keeping up-to-date with paperwork;
taking responsibility for their own work;
having the initiative to solve problems as they occur;
communicating with other workers;
being committed to company goals.

For each aspect, a rating scale can be set up with descriptions of worker performance at low, average, and high levels. For example, in order for supervisors to rate, on a scale from 1 to 10, workers' initiative in dealing with machine errors, the descriptions might be:

rating of 2 - ignores machine errors and lets them build up
rating of 5 - realizes machine errors and attempts a solution
rating of 8 - monitors machine errors and deals with them


These descriptions serve to anchor the rating scale to specific worker behaviors, in order to produce consistent ratings between supervisors and from the same supervisor in pre- and post-training assessments. Developing the descriptions with the help of workers and supervisors enables them to be a realistic reflection of job practice. For examples of supervisor ratings, see Appendix G.
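As a data-structure sketch, an anchored scale of this kind can be represented as a mapping from anchor points to behavior descriptions. The scale contents follow the machine-errors example above; the helper function and its name are hypothetical, not part of the study's instruments:

```python
# Anchored rating scale for workers' initiative in dealing with
# machine errors, following the example in the text.
initiative_scale = {
    2: "ignores machine errors and lets them build up",
    5: "realizes machine errors and attempts a solution",
    8: "monitors machine errors and deals with them",
}

def nearest_anchor(rating, scale):
    # Return the written anchor closest to a supervisor's 1-10 rating,
    # to help interpret scores that fall between the anchors.
    return scale[min(scale, key=lambda anchor: abs(anchor - rating))]

print(nearest_anchor(7, initiative_scale))
# -> monitors machine errors and deals with them
```

Keeping the anchor text with the numbers in one structure makes it easy to print the same wording on every supervisor's form, which is what keeps ratings consistent across raters.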

Such job performance scales anchored to validated behaviors have proven to be useful in lowering error, increasing reliability, and improving the efficiency of job performance ratings (Borman, 1977; Latham, Wexley & Pursell, 1975). Job performance scales anchored to behaviors have proven to be most effective when special care is taken in describing the job dimensions to be evaluated (Dickinson, 1977) and when unambiguous anchor descriptions are developed with involvement from job incumbents and the supervisors who are to participate in rating job performance (Norton, Balloun & Konstantinovich, 1980). Mikulecky & Winchester (1983) and Mikulecky & Ehlinger (1986) have successfully used such anchored supervisor ratings to assess job performance in the nursing profession and the electronics industry.

An alternative supervisor rating approach is to use an overall assessment of the performance of each employee, as rated by their supervisors, to calculate the "utility" of the training or literacy program in terms of its benefits minus its costs. (See Sheppeck & Cohen, 1985; Schmidt, Hunter & Pearlman, 1982; Cascio, 1982.) For this calculation, other factors required are an estimate of the difference in dollar value to the company between an outstanding and an average employee, the likely duration of the training's effect, and the cost of the program. (See the Endnote to this chapter for an example of a utility calculation and further details concerning the use of this method.)
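The utility calculation itself is simple arithmetic once those estimates are in hand. The sketch below follows the general Schmidt-Hunter style (benefit = duration × number trained × rating gain in standard-deviation units × dollar value of one standard deviation of performance, minus program costs); it is not the worked example from this chapter's Endnote, and every figure is hypothetical:

```python
def training_utility(n_trained, effect_sd, sd_dollars, years, cost_per_person):
    # benefit: dollar value of the rating gain, summed over the trained
    # employees and over the years the training's effect is expected to last
    benefit = years * n_trained * effect_sd * sd_dollars
    total_cost = n_trained * cost_per_person
    return benefit - total_cost

# 30 workers, ratings up by 0.5 SD, an outstanding employee worth
# $8,000/year more than an average one (a proxy for one SD of
# performance), the effect lasting 2 years, program cost $500/worker:
print(training_utility(30, 0.5, 8000, 2, 500))  # -> 225000.0
```

The same function shows the break-even point: if per-person cost rises to equal the per-person benefit, utility falls to zero.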


METHODS INCORPORATED INTO THIS STUDY

For this project, indicators relating to productivity were gathered on each employee both before and after the training program or, in the case of on-going programs (such as GED classes), at suitably spaced intervals. These included statistics on attendance, safety, and suggestions. Interviews with employees and questionnaires that they filled out assessed their attitudes to the workplace and various job-related skills. Also, supervisors assisted in the development of anchored rating scales which they then used to assess each employee before and after training.

Productivity data

It was not possible to obtain data on the actual output of the individual employees involved in training, for two different reasons at the two companies participating in this project. As mentioned above, companies do not gather such data for units below that of the work team; so, at both sites, the attempt was made to have a whole team take part in training at the same time. However, at one site, the composition of the class could not in the end be arranged in that way, as teams were re-organized and some individuals could not be released for training. And, at the other, although a team did go through training together, data for that team could not be separated out from plant-wide figures for individual analysis. Thus, in order for the gathering of output data to be successful, it must be possible for a company to arrange training for a whole work team and for mechanisms to be put in place, perhaps specially for this purpose, to obtain the output data for that team.

The following measures were, in fact, used to evaluate changes in employee performance, each measure being taken both early and late in each program for the employees involved, so as to assess the impact of that program. In addition, comparisons were made in one case with a control group of employees who had not yet participated in that program.
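For each measure, the early-versus-late comparison reduces to an average per-employee change, computed separately for program participants and for the control group. A minimal sketch (all figures and variable names are hypothetical, not data from the study):

```python
from statistics import mean

def mean_change(pre, post):
    # Average per-employee change between early and late measurements
    return mean(b - a for a, b in zip(pre, post))

# Hypothetical absences per quarter, measured early and late in the program
trained_pre, trained_post = [4, 6, 3, 5, 7, 4], [3, 4, 2, 4, 5, 3]
control_pre, control_post = [5, 4, 6, 3, 5, 4], [5, 5, 6, 3, 4, 4]

# A drop for the trained group but not for the controls would suggest
# the program, rather than plant-wide trends, produced the change.
print(round(mean_change(trained_pre, trained_post), 2))
print(round(mean_change(control_pre, control_post), 2))
```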


Data relating to employee attitudes were collected on:

absenteeism;
grievances submitted;
discipline records;
workplace safety records;
suggestions made;
suggestions accepted.

Interviews and questionnaires

To supplement these company records, employees were interviewed and also filled out questionnaires (see Appendices A and B). Their purpose was, in part, to assess attitudes toward the workplace and competencies associated with the workers' jobs. In the interviews, the employees were asked about the types and amounts of reading and writing they do on the job; this was to assess the quantity and quality of their workplace literacy activity. They were also asked to demonstrate specific skills related to their work, such as using a job aid or written information sheet, and a graph or table. Questions here were of two types: process and content. Process relates to how the workers go about using the item; for example,

do they use job aids regularly;
what parts of them do they look at;
how long does it take them to read one.

Content questions are more specific to the particular item, asking for information that the workers should be able to obtain from the sheet in front of them, such as:

what components do you need to make this part;
how do you carry out this procedure;
what does this graph show as the inventory value on a certain date.

Some content questions called for interpretation by the interviewee, drawing on the given information to make inferences about the situation; for example,


why do you think the value fell during this particular month?
what might have caused this type of wastage to occur?

The questionnaire dealt, in part, with reading and talking in relation to the workers' jobs, particularly their abilities and confidence in reading instructions and talking in meetings. Among items of a more general nature that they were asked to rate as easy or hard to read were a number of work-related ones:

job aids;
part specifications;
safety rules;
benefit information;
the plant newspaper.

In addition, they were asked to use a scale from "very like me" to "very unlike me" in rating such statements as:

your ideas are often discussed in meetings;

when written information is handed out, you read it to see what it is about;

when paperwork comes to you about your job, you often have trouble reading it.

Supervisor ratingsTo obtain another perspective on this information gathered

directly from the employees, supervisors were asked to assess eachworker on various aspects of job performance that contributed toproductivity and were related to task competence, communication,teamwork and paperwork skills. These assessment instruments weredeveloped with the assistance of those who would be using them,through interviews that determined what aspects should be coveredand how to describe behaviors typical of top, average and bottomperformers. Specific aspects included were the ability to

set up and calibrate a machine; use recording forms; trouble-shoot machine errors.


ASSESSING IMPACT ON PRODUCTIVITY

Also assessed were attitude indices such as:

how much they took responsibility for their own work;

how well they worked as a member of a team;

how committed they were to company goals.

For each of these aspects, anchoring descriptors for Bottom, Average, and Top performance were related to a scale from 1 to 10, guiding the supervisors in making their assessments.

Thus, the final supervisor rating form could contain instructions such as:

An average employee would be rated 5.

A top employee would be rated 8 or higher.

A bottom employee would be rated 2 or lower.

And one item on a form could appear as follows (see Appendix G for further examples).

Paperwork

Bottom: intimidated by job-related paperwork and does it poorly
Average: does job-related paperwork, simply keeping pace
Top: completes all job-related paperwork and tries to improve procedures

1 2 3 4 5 6 7 8 9 10
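An anchored rating item of this kind can be represented as a simple data structure. The sketch below is illustrative only: the anchor wording is taken from the sample form above, but the dict layout, the function name, and the exact cut-points between bands are assumptions, not part of the study's instruments.

```python
# Sketch of an anchored rating item like the "Paperwork" example above.
# Anchor wording is from the sample form; names and band boundaries
# between the stated anchor points are illustrative.

PAPERWORK_ITEM = {
    "aspect": "Paperwork",
    "anchors": {
        "Bottom": "intimidated by job-related paperwork and does it poorly",
        "Average": "does job-related paperwork, simply keeping pace",
        "Top": "completes all job-related paperwork and tries to improve procedures",
    },
}

def band(rating: int) -> str:
    """Apply the form's instructions: a bottom employee is rated 2 or
    lower, an average employee around 5, a top employee 8 or higher."""
    if not 1 <= rating <= 10:
        raise ValueError("rating must be on the 1-10 scale")
    if rating <= 2:
        return "Bottom"
    if rating >= 8:
        return "Top"
    return "Average"

print(band(5))  # Average
```

In practice the anchors guide the supervisor's judgment; the banding function merely mirrors the instructions printed on the form.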


CONCLUSION

In order to assess the impact of a workplace literacy program on employer objectives, measures of productivity should be taken before and after training. Such measures include company records, employee interviews and questionnaires, and supervisor ratings.

Company records can supply information on output, safety, dishonesty, discipline and grievances, absenteeism and punctuality, retention and promotion, and productivity suggestions.

Employee interviews and questionnaires can supply information on attitudes and job practices and skills. These include how much reading and writing employees do in the workplace, how competent they are at various types of reading, and their confidence with reading and in meetings.

Supervisor ratings can also supply information on employees' job-related skills and attitudes. Using anchoring descriptors for top, average and bottom performers, rating scales can be developed to cover such aspects as task competence, communication, teamwork and paperwork skills.

Endnote - Calculation of "Utility" of Training

Calculation of the "Utility" or cost effectiveness of a training program requires:

1. An overall measure of the job performance of each employee trained and of a comparable group of untrained workers. (This could be either a supervisor rating or be based on production outcomes.)

2. A measure of the dollar value to the company of the difference between outstanding and average employees. (This estimate of the standard deviation of performance is known as the "value".)

3. The expected duration of the training's effect.

4. The cost of the training.

As an example of a "Utility" calculation, let us suppose that the 20 employees who have completed a training program are rated by their supervisors, on average, at 65 out of 100. The untrained employees received an average of 50, with a standard deviation of 10. The trained workers are at a level of 1.5 standardized units above the untrained -- this is the performance difference. If it is estimated that the average employee is worth $18,000 to the company and an outstanding one is worth $26,000, then an estimate of the "value" or standard deviation of employee performance is $8,000 (the difference between these two amounts). Suppose also that training costs $2,000 per employee and the effect of training is likely to last 3 years. Then we have:

"Utility" =

Years duration x Number x Performance x Value - Number x Cost perof effect trained diffetrnce trained trainee_

= 3 x 2C) x 1.5 x $8,000. - 20 x $2,000

= $720,000 $40,000

= $680,000 net utility to the company
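The worked example above can be checked with a few lines of code. This is a minimal sketch of the Brogden/Schmidt et al. utility formula as stated in the text; the function and parameter names are illustrative.

```python
def training_utility(years_effect, n_trained, perf_diff_sd,
                     value_per_sd, cost_per_trainee):
    """Utility = years of effect x number trained x performance
    difference (in SD units) x dollar value per SD, minus the
    number trained x cost per trainee."""
    benefit = years_effect * n_trained * perf_diff_sd * value_per_sd
    cost = n_trained * cost_per_trainee
    return benefit - cost

# The example from the text: 20 trainees, 1.5 SD performance gain,
# $8,000 value per SD, $2,000 per trainee, 3-year effect.
print(training_utility(3, 20, 1.5, 8_000, 2_000))  # 680000.0
```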

This formula was originally developed by Brogden (1949) and revised into its present form by Schmidt, Hunter, McKenzie & Muldrow (1979). For examples of its use, see Sheppeck & Cohen (1985), Schmidt, Mack & Hunter (1984), Schmidt, Hunter & Pearlman (1982) and Cascio (1982).

Criticisms of the method are to be found in Boudreau (1983) and Cronshaw & Alexander (1985); these are refuted by Hunter, Schmidt & Coggin (1988). Modifications to the procedure are suggested by Bobko et al. (1983) and Cascio & Ramos (1986); see also Cascio (1982). Comparative studies of such modifications are contained in Greer & Cascio (1987), Burke & Frederick (1986), and Weekley et al. (1985).


Part II

THE CURRENT STUDY


CHAPTER 6

Structure of Current Study

OVERVIEW

The purposes of this study were to develop an impact assessment model for workplace literacy programs and to produce data on the impact of programs at two sites. A secondary goal was to refine the model for use at other sites.

The two sites chosen are very different, but both operate established, effective and diverse programs, involving technical and communications training, GED and ESL classes.

Pre- and post-program data were gathered on learner job productivity, learner literacy attributes, and literacy practices with their families. Instruments and methods used to gather data included:

learner interviews, tests and questionnaires, based on Lytle's "Beliefs, Practices, Process, Plans" adult literacy model;

questionnaire items based on key practices for developing home literacy;

productivity indicators such as attendance, safety and supervisor ratings of on-the-job use of literacy and communication skills.

The data were analysed using statistical comparisons of quantitative information, as well as qualitative and quantitative analyses of categories emerging from open-ended responses to interview questions.

PURPOSE

Although federal and private support funds thousands of workplace literacy programs, very few programs have been evaluated beyond a superficial level (Mikulecky and D'Adamo-Weinstein, 1991).

Typical workplace literacy program evaluations involve anecdotal reports, learner satisfaction questionnaires, or pre- and post-program results from a standardized basic skills test such as the T.A.B.E. or the A.B.L.E.

In late 1990, the National Center on Adult Literacy funded a project to develop and pilot a model for evaluating the impact of workplace literacy programs. During Year 1, parallel pilot studies of two workplace literacy programs were used to:

(1) develop an impact assessment model for workplace literacy programs;

(2) produce data on the impact of two quite different workplace literacy programs.

The goals of this first year's efforts were to refine the impact evaluation model so that it could be transferred to additional sites during subsequent years and to establish base-line data for the level of impact one could expect from established workplace literacy programs.

POPULATIONS

The evaluation model to assess the impact of workplace literacy programs was piloted at two sites: Delco Chassis of Rochester, New York and Cumberland Hardwoods of Sparta, Tennessee. Though the sites were chosen for their differences in size, demographics, location and industry, each site has a well-established workplace literacy program which addresses several different populations (i.e., technical communication and basic skills training, G.E.D. preparation, and English as a Second Language preparation at Delco). Leaders at both companies see it as necessary for survival to increase employee involvement in the decision-making processes of day-to-day business. Each firm intends that those actually producing the goods will be able to decide, for example, whether machines require adjustment or whether their production line has stockpiled a sufficient quantity of product X and should switch to product Y.

Both companies run education programs judged by regular state and federal acknowledgement to be effective, model examples of workplace literacy education. Since we were piloting new instruments for assessing the achievements of such programs, we wished to establish benchmarks with programs which had been independently judged to be good programs.

Classes and individuals at each site provided information through interviews, tests, checklists and questionnaires to assess the impact of programs upon learners, their productivity, and family literacy in learners' homes. In addition, curriculum materials were examined and classroom instruction observed.

Subjects and Locations: Delco

Site #1, Delco Chassis, is a large, unionized (International Union of Electrical Workers, Local 509) electrical motor manufacturing plant with over 3600 employees located in Rochester, New York. Employees are enrolled in an education program jointly operated by union and management in conjunction with state and regional agencies that provide some funding and help in providing instructors from the local school system. All learners were from the production teams. They participated in one of three types of class:

1) Technical Preparation - a 6-week, 7 hours per day course designed to prepare employees for subsequent technical training,

2) a GED preparation course which met in slightly varying time frames but generally 4 hours per week, and

3) an ESL course which met for 8 hours per week.

For the first of these, there was a control group, made up of workers who had not yet begun the Technical Preparation course. Each of these four groups consisted of 12 - 15 employees. In addition, a small control group (of 5) was available for the ESL group.

The Technical Preparation course is designed to prepare learners for the mathematics, reading, oral communication and blueprint reading skills judged to be prerequisite for further technical training. Readings and activities are a mixture of some plant-specific materials and carefully selected off-the-shelf materials related to course objectives. Activities in the reading component of the course include study skills exercises, reading rate exercises, and in-class activities designed to increase learner motivation to read. An instructors' manual of several hundred pages outlines course objectives and suggests materials and activities. Instructors are provided by the local school system, after screening by union and management representatives. Those who have been retained to teach the course have been able to demonstrate to these representatives that they can structure their teaching to meet course objectives and receive high instructor ratings from learners.

The GED course involves a good deal of individualized study directed toward passing regularly scheduled GED tests. Learners used published test preparation materials as well as traditional school materials and work-book exercises from an extensive in-plant library. Use of individual learner folders, seat-work, some full-class discussion, and regular individual feed-back from experienced GED instructors characterizes how class time is spent.

The ESL class is team taught by an experienced English as a Second Language instructor and a Delco employee able to speak Italian (the first language of many but not most employees in the class). Activities follow exercises in several published English as a Second Language materials available in the Delco training center. Class time includes teacher demonstration of how to do language exercises, seat-work with both instructors providing individual feedback, and full-group discussion of correct answers and why answers are correct.

Demographic data on the Technical Preparation class revealed most students to be in their late 20's, averaging more than 12 years of education. The Test of Adult Basic Education is used by Delco to screen learners for placement in classes and provide some diagnostic information to instructors. Students in the Tech Prep class averaged near the top of the reading portion of the T.A.B.E., at between 11th - 12th grade levels in ability, before entering the class.



A control group of employees, not yet enrolled in the Tech Prep course, was interviewed and tested. Analysis of demographic data revealed the control group to be slightly older, with more males and more years of plant experience than the class group. In most other ways, however, the two groups were similar. No significant difference was found for education levels or for the reading comprehension scores on a Cloze test.

Demographic Information of Tech Prep and Control Groups

Characteristic                 Tech Prep    Control
Age (mean in years)*           27.9         34.6
Sex (M : F)*                   6 : 8        11 : 1
Service (mean in years)*       5.9          10.8
Education (mean in years)      12.28        12.33
Cloze Test Scores              10.86        9.58

* Significantly different at the p < 0.05 level of significance

Subjects and Locations: Cumberland

Site #2, Cumberland Hardwoods of Sparta, Tennessee, is a non-unionized, rural wood processing plant with approximately 300 employees. The plant produces several hardwood products for the furniture industry, including drawer parts and components for kitchen cabinets. New technology and an ambitious quality assurance program have changed the nature of the work environment and of many traditional jobs. Cumberland has several small on-going training programs. Employees enrolled in the classes participating in this study were all from the plant floor.



One course at this site, entitled Communications and Collaboration, was designed to train teams of employees involved in a given phase of the firm's operation. Several teams had already completed training in communications skills needed to work cooperatively as self-directed teams. The pilot study involved assessing two learning teams, each of 10 - 12 members, which the plant CEO described as the most difficult group of learners attempted so far.

A second program at this site was an on-going GED course with 6 students currently enrolled. The class is taught by an experienced Adult Basic Education instructor employed by the company. Instruction follows the demonstration and seat-work pattern described above for the Delco GED course. Earlier cycles of the GED course had allowed nearly 20 employees to complete the GED. However, because of the small number of current students and the fact that not all of them could be tested, insufficient data for useful analysis could be obtained for this group.

The small size of this company prevented the formation of any control groups for either class. Also, because Cumberland has had an active and successful education program for more than three years, only a small fraction of employees had not yet passed through the small firm's training courses.

INSTRUMENTS

Following a literature review for instruments and techniques employed to evaluate previous workplace literacy programs, a menu was constructed of available techniques for gathering data related to program impacts on productivity, learner gain and learner families.

At each site, plant-gathered indices of productivity were surveyed and discussed until an agreed-upon list could be developed for the site. In addition, supervisors participated in developing anchored rating scales on information processing tasks which were plant specific (i.e., participation in meetings, doing quality assurance paperwork, etc.). These rating scales were used to rate learners before and after training. (See Appendix G for samples of these rating scales.)

Interview, test and classroom questionnaire data were collected for each learner before and after each course, or at suitable intervals for on-going classes. Lytle's conceptual framework for changes in adult literacy (i.e., Beliefs, Practices, Process and Plans; see Chapter 3) was used as an organizational principle for the interview and questionnaire. Information was gathered on learners' beliefs about literacy in general and their own literacy in particular. In addition, interviews and questionnaires focused upon literacy practices, the literacy processes and abilities demonstrated with workplace literacy tasks, and learners' plans for 1, 5 and 10 years ahead.

The instruments developed for the first phase involve a mixture of interview and questionnaire items which were to be used for all learners at all sites, and custom-designed tasks or job scenarios appropriate for particular sites and classes. For the Practices section of the questionnaire, site personnel added plant-specific items to a more general list of reading material, which learners were to rate for difficulty. Questions related to literacy practices in work teams and in the plant were worded to reflect local language use. Questionnaire and focus group questions reflecting literacy practices with family members were also worded to reflect local use. For the Process section of the model, personnel at each site participated in analyzing workplace literacy tasks, and constructing Cloze tests and job scenario literacy tasks (i.e., reading plant newspapers or using job aids, forms, graphs, etc.) related to that workplace. (See Appendices A - D for sample instruments. The research rationale for construction of these instruments has been provided in Chapters 3 to 5.)

A significant amount of instrument development occurred at the Delco site. Considerable time was saved at the Cumberland site by using the Delco instruments as models for modification or to stimulate the thoughts of plant personnel about what might be useful tasks for the custom-designed portion of the assessment.

Interview

An interview protocol was devised to cover the four aspects of Lytle's model. Concerning their Beliefs, learners were asked to describe a literate person they knew at work and elsewhere, as well as how literate they saw themselves now, and as becoming in future.

Concerning literacy Practices, learners were asked what reading and writing they had done recently, both at work and away from work. Literacy Process was tested using three different job-related items (i.e., a newsletter article, a graph and a job aid), which were all selected with the advice of the site coordinator. The subjects were asked to describe how they read or used the items, as well as answering questions about the specific contents. Finally, learners were asked about their future Plans, for 1, 5 and 10 years ahead, and how they saw reading and education as part of those plans.

Questionnaire and Cloze Test

A written questionnaire was also administered to participants during one of the first few class meetings, and again near the end of the course. Items dealt with the areas of literacy Beliefs and Practices, included a Cloze test based upon the local plant newspaper, and in addition contained questions about family literacy for those learners with children between the ages of 3 and 17. To complement the Beliefs questions in the interview (see above), the learners were asked to write down 4 or 5 words that described themselves as a reader and writer and to do the same for someone they saw as good at reading and writing. Further information about Practices was sought through a checklist of 20 possible types of reading material (e.g., books, signs, training manuals, pay stubs, charts, cartoons), for which the subjects were asked to rate, on a scale of 1 - 5, the difficulty they had in reading those of the items that they had read recently. They were also questioned about the frequency of literacy-related activities: how often, for example, they read a newspaper, made a shopping list or visited a library, as well as how many books they owned. In relation to literacy at work, they were asked to respond, on a scale of 1 (very like me) to 5 (very unlike me), to 10 statements such as "I just listen in meetings", "My ideas are discussed in meetings", "I read information when it is handed out", and "I have trouble reading information sent out by management".

The questions about family literacy concentrated on literacy practices, particularly frequency of literacy activities: how often does the child look at books, read or ask to be read to, or visit a library; how often does the parent read to the child or help with reading; how many books has parent or child bought in the last year.

For each site, the coordinator helped to select, from workplace materials, a suitable passage for use in a Cloze test, in which every fifth word was left blank. These passages were of a page in length, with about 25 blanks to be filled in.
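The mechanical part of this construction, deleting every fifth word, can be sketched in a few lines. This is a hypothetical helper, not the project's actual procedure, which also involved judgment in choosing a suitable one-page passage:

```python
def make_cloze(text: str, gap: int = 5, blank: str = "_____"):
    """Blank out every `gap`-th word of a passage, returning the cloze
    text and the list of deleted words (the answer key)."""
    words = text.split()
    answers = []
    for i in range(gap - 1, len(words), gap):
        answers.append(words[i])
        words[i] = blank
    return " ".join(words), answers

passage, key = make_cloze("one two three four five six seven eight nine ten")
print(passage)  # one two three four _____ six seven eight nine _____
print(key)      # ['five', 'ten']
```

Scoring then consists of counting exact (or acceptable) matches between a learner's fill-ins and the answer key.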

Family Literacy Focus Group Interview

At the Cumberland site, a group of learners with children were interviewed about literacy Beliefs and Practices in the home. They were asked, for example, why they thought some children did better at school than others and what kinds of literacy-related materials they had available for their children. Questions used in the focus group interviews reflect categories developed by Fitzgerald, Spiegel and Cunningham (1991) in assessing home and parental factors related to children's success in school.

ESL Checklist

Evaluation of English as a Second Language proficiency is not easily done with paper-and-pencil measures, since speaking, listening, reading and writing are all involved. Typically, teacher checklists of a wide variety of behaviors serve as a diagnostic record and instructional guide, and as an informal assessment of progress.

Bronstein (1991) has developed an extensive workplace-specific ESL checklist entitled Benchmarks and student learning profile for the workplace ESL program of the Labor Education Center at Southeastern Mass. University. Instructors at Delco reviewed this checklist, selected items appropriate to their site, modified other items, and added a few items specific to their classes and workplace. This resulted in a list of competencies at three levels (Beginner, Intermediate, Advanced) dealing with such areas as following instructions, looking up information, and filling out forms. (See Appendix F for a sample of this modified checklist.)

Class Observation Sheet

Classroom observations were performed by research personnel and on-site coordinators using a class observation form developed by Mikulecky (1990) and utilized by Mikulecky and Philippi (1990) and Philippi (1991) in school and workplace settings (see below and in Appendix E). The form requires observers to describe instructor activities and student activities, and make comments about the nature of class activities on a timed basis. Notations are then shared with the instructor to corroborate the accuracy of what has been observed and to make note of purposes for some activities.

Classroom Observation

Time    Student Activity    Teacher Activity    Comments
:00
:05

Make note of time spent by students actually reading or doing things. Also note time learners spend listening to the instructor. Time when learners are in small groups or working individually should be mentioned. Special note should be made when the instructor or a student demonstrates how to do something.



Productivity Information

Information on productivity needed to be custom selected for each worksite, though there was a small degree of overlap (i.e., attendance and safety records). In addition, each site participated in constructing plant-specific supervisor ratings.

Plant Gathered Productivity Indicator

Management at Delco Chassis gathers a significant amount of employee data related to achieving corporate goals. Researchers, working with management and union personnel, reviewed this data to select productivity indicators which could possibly be influenced by successful learning experiences in the workplace literacy program. Learner and control group pre- and post-program data were collected on absenteeism, suggestions submitted, suggestions approved for awards, grievances submitted, discipline records, and workplace safety records.

Supervisor ratings

Extensive interviews were conducted with supervisors and workers to determine aspects of jobs that contributed to productivity and were related to communication, teamwork and paperwork skills. Ten aspects of job performance emerged from interview data at the Delco plant, and ten aspects were also used at the Cumberland plant. Supervisors then provided examples of behaviors which separated top from middle from bottom performers on each scale. These behaviors were used to develop anchored rating scales for each of the productivity categories. Supervisors then rated each worker on these scales both before training and after training. (See Appendix G for samples of these rating scales.)

DATA GATHERING PROCEDURES

Procedures for data gathering varied from instrument to instrument. Some were written directly by the learner or indirectly from the learner's comments, and others by the learner's teacher or supervisor, or by the researcher.



The interview and job scenarios were conducted by a researcher one-on-one with a learner. The researcher asked each question and made notes on the learner's responses, pausing long enough to obtain a considered answer, and using standard non-directive prompts and probes to elicit a more extensive response. The time taken for each individual interview was in the range 20 - 30 minutes. The Family Literacy Focus Group Interview was conducted in a similar fashion, and took about 10 - 15 minutes.

The questionnaire and Cloze test were administered by the teacher during the class period. Each learner filled out the answers individually, with the teacher available to explain or clarify items when the learner was unsure what to do.

The ESL checklist was completed by the teacher of each student in an ESL class, and the class observation sheet was completed by a researcher while a class was in progress.

In some cases, supervisor productivity ratings were completed in conjunction with a researcher. At other times, rating forms needed to be left with supervisors. This divergence in procedure may have contributed to difficulties at the Delco plant in obtaining consistent supervisor ratings. (See Chapter 7 for more details on this.)

DATA ANALYSIS TECHNIQUES

Data analysis was of two types. Cloze test scores and quantifiable questionnaire and interview responses were recorded and analysed statistically. Responses to open-ended interview questions were recorded, and then methods of analysis were developed to fit the nature of the responses.

For some open-ended interview questions, categories of responses were allowed to emerge from data. These categories were then used to label subject comments. When category refinement allowed for acceptable levels of inter-rater agreement (90% or higher), category responses were recorded and statistically analysed.



For other open-ended interview questions, a holistic comparison was made between pre-test and post-test responses, and the change was rated as positive, neutral or negative. As with the category schemes, the criteria for assessing this change emerged from data, and the application of the scheme was subject to the same levels of acceptable inter-rater agreement.
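The inter-rater agreement criterion can be computed as simple percent agreement. The sketch below assumes two raters assigning one category label per response; the function name and the sample labels are illustrative, not the study's data:

```python
def percent_agreement(rater_a, rater_b):
    """Percent of responses on which two raters assigned the same
    category label. The study required 90% or higher before a
    category scheme was applied."""
    if len(rater_a) != len(rater_b):
        raise ValueError("raters must label the same set of responses")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * matches / len(rater_a)

a = ["positive", "neutral", "positive", "negative", "positive"]
b = ["positive", "neutral", "positive", "neutral",  "positive"]
print(percent_agreement(a, b))  # 80.0 -- below the 90% threshold
```

Simple percent agreement does not correct for chance agreement; a chance-corrected statistic such as Cohen's kappa is a common alternative.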

Examples of both category and holistic rating schemes arose in connection with the interview question:

"How literate do you consider yourself to be?What makes you think so?"

Responses to this open-ended question nearly always included spontaneously some kind of self-rating, using words such as "average", "very literate", "below average", "poor". These were categorized from lowest to highest on a scale of 1 - 5, to produce a score for each self-rating. In addition, a holistic rating was applied to the full response, in which change from pre-test to post-test was judged as positive, neutral or negative according to the subject's reported self-image and the reasons given for it.

For any of the responses which resulted in numerical scores, statistical tests were applied to the set of scores for each group of subjects. Pre- and post-assessments were compared for the individuals in a class using a paired-sample t-test, in order to detect gains brought about by the program. Where a class had a control group, the changes for the two groups were compared using a two-sample t-test. In addition, for the holistic change scores, the allocation of values +1, 0 and -1 to positive, neutral and negative allowed the use of a one-sample t-test to find if the changes were significantly different from 0. In all cases, as the tests were of "no difference" versus "improvement", the statistical tests were one-tailed.

CONCLUSION

This study's objective was to develop an evaluation model that can be used with most workplace literacy programs. A pilot evaluation was conducted at two very different workplaces, where data was obtained on productivity, learner literacy attributes and learners' families. This data was gathered, before and after each course, using learner interviews and questionnaires, company records, and supervisor ratings of employees. Analysis of the data included coding, scoring and categorizing items, and applying statistical tests to detect improvements that had taken place during the time learners were in class.


CHAPTER 7

Results of Current Study

OVERVIEW

The main purpose of this chapter is to indicate which of the evaluation techniques employed have been most successful in detecting pre/post program changes in learners and their families, and in employer objectives. This will be illustrated using examples from:

the Technical Preparation class at Delco (often contrasting it with its control group);

the GED class at Delco;

the ESL class at Delco (making some comparisons with its small control group);

the Communications and Collaboration class at Cumberland (to a lesser extent, as less data gathering could be done there).

Pre-test and post-test results were compared statistically and analytically for each class studied, on each aspect of measurement used: learner beliefs, practices, processes and abilities, and plans, family literacy and employer objectives. Program impact on learners from the various Delco and Cumberland classes is summarized below.

(Tabular data are available in Appendix H.)

These results provide the principal basis for revising some aspects of the evaluation instruments and retaining others, as described in Chapter 8.

Learner Literacy

Changes in Beliefs

View of a literate person - no change

View of self as literate - significant gain for Technical Preparation, but not for control



Changes in Practices

Reading and writing at work - significant gain for ESL, but not for control

Participation in meetings - significant gain for Technical Preparation, but not for control

Asking questions at work - significant gain for ESL (but not for control) and significant gain for Cumberland

Reading and writing away from work - significant gain for Technical Preparation, but not for control

Range of reading - significant gain for GED and ESL, but not for ESL control

Changes in Reported Reading Process and in Reading Comprehension

Job-related Cloze test - significant gain for Technical Preparation, but not for control

Prose reading process - significant gain in responses (particularly topics mentioned) for GED and ESL, but not for ESL control

Job scenario questions - significant gains for all Delco classes on questions of various difficulties

Use of job aids - significant gain for ESL, but not for control

Changes in Plans

Plans for 1 and 5 years - significant gain in focus and literacy goals for Technical Preparation, but not for control

Reading and education in plans - significant gain for ESL, but not for control

Family Literacy

No change for GED and ESL; number of parents in other groups too small for statistical analysis

Employer Objectives

Attendance - no significant change for Delco groups

Safety, suggestions, etc. - numbers too small for statistical analysis

Supervisor ratings - significant gain for Cumberland



CHANGES IN LEARNER LITERACY

Learner beliefs about their own literacy and about what it means to be a literate person were assessed with both questionnaire items and open-ended interview questions.

Beliefs

Subjects' views of what constitutes a literate person did not change significantly from pre-test to post-test, but it was noteworthy that their comments ranged quite widely, beyond the area of reading and writing, to include mention of broad-based intellectual and social qualities and the sorts of things literate persons were able to do. Examples of comments used to describe a literate person included such attributes as "college education", "knows a lot", "experienced", "has a better job", and such abilities as "well-organized", "competent", "helpful", "concerned", "good at solving problems". There were also more expected comments like "reads all the time", "understands what they read", and "writes well".

In response to the interview question "How literate do you consider yourself to be?", the Technical Preparation group showed a statistically significant improvement, and the ESL group also showed some numerical improvement. This was measured in two ways. Responses to this open-ended question nearly always spontaneously included some kind of self-rating, using words such as "poor", "average", and "very literate", which was scored on a scale of 1 to 5.

(The three examples just given would score 1, 3, and 5, respectively.) A holistic rating was also applied to the full response, in which change from pre-test to post-test was judged by the reported self-image and the reasons given for it. These changes were rated as negative, zero or positive. For example, one learner's responses that received a positive rating were:

Pre: "Not very literate - not much education."

Post: "I'm average. I'm not stupid. I have common sense and can read and write."


Another made gains at an apparently different level:

Pre: "Fair or average - a bit above. I understand some words, but others I don't. I'm not sure if it's literacy or memory."

Post: "I'm more literate than I was before class - I understand more. I'm getting more interested in fiction, and fact. I look up words in the dictionary and thesaurus."

Pre/post changes for the self-rating and the holistic scores were statistically significant at the p<0.02 and p<0.01 levels for the Technical Preparation group. Control group scores showed no significant change.
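The report does not name the statistical test behind these pre/post comparisons. As an illustration only, a paired comparison of this kind can be sketched with a simple sign-flip permutation test; all ratings below are invented, not taken from the study.

```python
# Sketch of a paired pre/post comparison of 1-to-5 self-ratings.
# The specific test used in the study is not stated; this sign-flip
# permutation test is one simple, assumption-light choice.
import random
import statistics

def paired_permutation_p(pre, post, n_perm=10_000, seed=0):
    """One-sided p-value: how often does randomly flipping the sign of each
    pre/post difference yield a mean gain at least as large as observed?"""
    diffs = [b - a for a, b in zip(pre, post)]
    observed = statistics.mean(diffs)
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_perm):
        flipped = [d if rng.random() < 0.5 else -d for d in diffs]
        if statistics.mean(flipped) >= observed:
            hits += 1
    return hits / n_perm

# Invented 1-to-5 literacy self-ratings for ten learners, before and after.
pre = [2, 3, 2, 1, 3, 2, 2, 3, 1, 2]
post = [3, 4, 3, 2, 3, 3, 4, 3, 2, 3]
p = paired_permutation_p(pre, post)
print(f"mean gain = {statistics.mean([b - a for a, b in zip(pre, post)]):.2f}, p = {p:.4f}")
```

With small classes like those studied here, such a resampling approach avoids distributional assumptions that a t-test would require.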

A question about literacy beliefs, corresponding to that in the interview, was included in the written questionnaire. It asked the subjects to write down several words that described themselves as a reader and writer. This format did not produce the same result. Here, pre/post variation was due more to the number of words written down than to any change in the nature of the response. This illustrates an advantage that the interview had over the questionnaire in gathering richer data - a difference to which we shall return in later sections.

Thus it appears that access to learners' beliefs about literacy is more easily obtained through interviews than questionnaires, and their beliefs about themselves are more likely to change than their beliefs about others. So it is probably most useful to see the opening question on the interview protocol in Appendix A ("Describe a person you know who is good at reading and writing") more as a warm-up question to start the learner thinking about literacy than as one likely to produce evidence of change brought about by a program. The later questions, asking how good at reading and writing subjects consider themselves now, and likely to become in future, appear more responsive to change between pre- and post-assessment, and provide a useful measure of the effect of a program on a learner's beliefs about literacy.


Practices

Learners were questioned about their literacy practices, both at work and at home. Concerning work-related activities, they were asked in the interview to describe the kinds of reading and writing that their work had involved during the past week, and in the questionnaire to rate on a scale from 1 (very like me) to 5 (very unlike me) a number of statements relating to contributions in meetings and the reading of work-related materials.

The interview responses were assessed by a count of items mentioned, and by holistic pre/post change judged by the breadth, frequency and difficulty of the reading mentioned. In general, these measures showed pre/post gains. For the ESL class (but not for its control group), the changes were statistically significant (p<0.05 and p<0.01 for the two measures). The nature of the gains is illustrated by these sample responses.

Pre: "Newspaper during break and lunch."

Post: "Read check sheets for parts, suggestions, bulletins, QUILS, monthly quality paper. Writing: Check off on sheet."

Pre: "Nothing really - just put parts on the line."

Post: "Bulletin at work. I can really read it now. The information is important. I read the magazine at work, also; it's new."

The one exception to this pattern of pre/post gains for workplace reading was the Technical Preparation group, which was in class full-time and therefore had not been doing their normal work for the duration of the course. For such full-time classes, it would be better to conduct the post-interviews a few weeks after subjects' return to normal work, in order to register any changes in work-related reading behavior resulting from their training.

Learner self-ratings on the statements about meetings and work-related reading showed very little pre/post change overall, but a few aspects are noteworthy. For the Technical Preparation group, two items showed significant increase (p<0.05 for both): those concerned with talking in meetings and with having one's ideas discussed in meetings. For the ESL group, but not its control, the following item showed a significant increase (p<0.05):

When you need to know something at work, you usually ask someone about it.

very like me 1 2 3 4 5 very unlike me

This reflects an emphasis on oral work in the ESL class and shows a gain in confidence by these workers. This item also showed significant gains (p<0.02) for the Cumberland Communications and Collaboration class, which put much emphasis on working cooperatively with other workers. Thus, in both cases, skills dealt with in class produced changes in workplace behavior.

Turning now to literacy activities away from work, the learners were asked in the interview to describe the reading and writing that they did away from work, and in the questionnaire to rate themselves on the frequency of several literacy-related activities and their ownership of reading materials.

The interview responses were assessed by a count of items mentioned, and by holistic pre/post change judged by the breadth, frequency and difficulty of the reading mentioned (as described above for workplace reading). This showed statistically significant increases for the Technical Preparation class in both measures: the count of items and the holistic rating (p<0.02 and p<0.005); the control group showed no such increases. The GED group also registered some numerical gains, but the ESL group did not. The lack of change in reading and writing for the ESL class may be due to the emphasis on oral work already mentioned, or to their separate uses of English at work and their native language outside the workplace.

For the section of the questionnaire about frequency of literacy-related activities away from work and ownership of reading materials, no items showed significant change, and it may be that changes in these areas of behavior and ownership are slow to take effect, requiring more than the few weeks of time available between pre- and post-testing. In their post-interviews, a number of the learners expressed their positive intentions in such areas, but the stimulus from attending the course was then only beginning to produce changes in behavior. Comments of this kind included:

"I have to do more reading for my daughter - especially now that I have more incentive from this class. It's like a spark."

"After this course, I'm going back to night school. I'm really impressed with this class."

Though the Technical Preparation class did not improve significantly in home literacy behaviors, their improvement scores were significantly better (p<0.01) than those of the control group, which actually reported less home reading in the post-test. Perhaps this reflects a baseline of less general reading in summer (when the post-tests were conducted), because of the availability of other leisure activities.

In the questionnaire, learners were also presented with a list of 20 types of reading -- some general (e.g., newspapers, books, bills) and some plant-specific (e.g., Delco Doings, suggestion forms, route sheets, paycheck stubs). They were asked to rate each on a scale from 1 (easy) to 5 (hard), also indicating which of the types they had read in the last month. This revealed a significantly wider range of reading in the post-assessment for the GED and ESL groups (p<0.01 and p<0.002). The ESL control group did not show such gains.
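The "range of reading" measure described above (the number of the 20 listed reading types a learner reports having read in the last month) can be sketched as a simple count; the checklist responses below are invented for illustration.

```python
# Sketch of the "range of reading" measure: the count of distinct reading
# types, from the questionnaire's list of 20, that a learner marks as read
# in the last month, compared pre- and post-program. Responses are invented.
def range_of_reading(checked):
    """Number of distinct reading types marked as read in the last month."""
    return len(set(checked))

pre_checked = ["newspapers", "bills", "paycheck stubs"]
post_checked = ["newspapers", "bills", "paycheck stubs",
                "books", "suggestion forms", "route sheets"]

gain = range_of_reading(post_checked) - range_of_reading(pre_checked)
print(f"range of reading widened by {gain} types")
```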

Few of the individual items showed significant change, but over half (about 11 of the 20 items, averaged over the 4 classes) were rated by learners to have greater perceived difficulty. This may mean that learners were being more realistic in the post-assessment, after greater exposure to reading generally, or just that they were unable to apply the scale consistently over the time gap between pre- and post-test. For those not accustomed to using scoring schemes, there may be a problem in such assessment, particularly self-assessment. (See also supervisor ratings in the Productivity section.) One aspect of this question's wording may also have contributed to the difficulty: if the learners had been asked first to indicate which of the types they had read recently and then to rate only those items, it is possible that more consistency might have been obtained.

These difficulties point up once again the inflexibility of a questionnaire and its dependence on the abilities and willingness of the person filling it out, compared with an interview in which the interviewer can explain a question and probe for further information to clarify the learner's intentions.

Process and Ability

In the interview, workers were asked to respond to both process and content questions on a plant newspaper article, a moderately complex graph, and a job instruction sheet. They were also given a cloze test constructed from plant reading material.

The cloze test used at Delco came from a plant newspaper article. The Technical Preparation class made statistically significant pre/post gains (p<0.02), while its control group did not. Nor did the GED and ESL groups, but the reason here appeared to be that the reading passage was too difficult for them. These two groups had mean scores of about 7 (out of 23 blanks to be filled), which indicates a frustrational reading level, but the Technical Preparation class and its control group had means of 10 or 11, well above the frustrational level. (50% replacement indicates an independent reading level, while below 35% indicates a frustrational reading level.) Given this range of reading ability, it would have been better to have had available two (or even three) cloze tests of different difficulty levels, to be used as appropriate.
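The reading-level cut-offs cited in parentheses above can be expressed as a small classifier. The 50% and 35% thresholds come from the text; labeling the band between them "instructional" is an assumption borrowed from standard cloze terminology, not a claim of this report.

```python
# Reading-level classification from a cloze score, using the 50% / 35%
# replacement cut-offs cited in the text. The "instructional" label for the
# middle band is an assumption (standard cloze terminology), not stated here.
def cloze_level(correct, blanks):
    """Classify a cloze score as a reading level by replacement percentage."""
    pct = 100.0 * correct / blanks
    if pct >= 50.0:
        return "independent"
    if pct < 35.0:
        return "frustrational"
    return "instructional"

# Mean scores reported for the Delco test (23 blanks):
print(cloze_level(7, 23))   # GED and ESL mean of about 7 -> frustrational
print(cloze_level(11, 23))  # Technical Preparation mean of 10-11
```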

The Cumberland cloze test used the plant safety rules, and here the mean score was 14 (of 25 blanks), showing that the test was well within the reading ability of those taking it. However, no pre/post comparison was available at Cumberland, since the on-site coordinator managed only a single administration of the test.

A portion of the interview involved responding to a newspaper article, a graph and an instruction sheet in job-related scenarios. Learners tended to answer correctly the simpler fact-level content questions on the pre-test, so no gain in this area was apparent on the post-test. Significant gains were demonstrated on more complex items, which often called for the use of inference and interpretation. For example, the Technical Preparation group showed significant gains on the most difficult article and graph questions (e.g. "What happened to the inventory value in August and September?", which required learners to describe a trend), while the ESL group did so on those of medium difficulty (e.g. "What is the inventory value for the week of August 19?", which required reading from between two scale values). Since the levels of competence of the different groups (and individuals) varied, a greater range of difficulty in the sets of questions would have allowed improvements to show at an appropriate level. This would also have been assisted by a wider range of item types in each section, from the simple factual to more difficult interpretation questions.

When asked to read a plant newspaper article and describe how they read it (i.e. the processes they were using), learners' responses covered two main areas: reading strategies and topics of interest. Strategies included skimming, starting with headings and bold print, and reading the first and last paragraphs. Topics of interest included the products manufactured by competitor companies and the wages that they paid. Responses to this process question (reproduced below) included the following examples:

Describe what you would look at. What would you be thinking about? How would you go about reading this story?

Pre: "Check each heading and decide whether to go further."

Post: "Read the headings, get ideas about the companies, skim, know what they make, and know their customers and competitors."


Pre: "Read the first paragraph, read the headlines and bold print, and pick out what hits my attention."

Post: "Read the dark print first, then break it down from there and read what gets my attention. Also, I'd find out what it's about and what they are telling in it. Then, I'd read it in depth."

Pre: "Title, first paragraph, through the whole thing - analyze it."

Post: "Title, subject ("Delco Doings"), and function and operation of companies. I would look at Asia and Europe (competitive markets) to see how their prices are lower and higher."

Pre: "First the title, then I'd read from beginning to end."

Post: "Start at the beginning. Look at the areas and read all the way through. I'd also read about how Delco is trying to compete, its main customers and the percentage of wages and benefits. I'd read "Delco Doings" to find out about Delco's further needs. Reading these things makes you familiar with other companies."

These responses were analyzed by counting the number of separate items mentioned by the interviewee. For all the Delco classes, the total number of responses and the number of topics mentioned increased numerically, and these were statistically significant for the GED and ESL groups (p<0.005 in both cases); the control groups showed no gains in these areas. The increase in the number of topics that the learners mentioned shows a greater ability to make connections between what they read and their own knowledge, as well as showing a growth in confidence arising from their time in class. For the most part, increased discussion of strategies included comments one would expect from more sophisticated readers.

In connection with the job instruction sheet, learners were asked about their use of such job aids, how long it took them to read one, and how difficult they found it. The only case of a statistically significant gain was for the ESL class (but not for its control group) in reply to a question about how likely they were to use a job aid. It appears that ESL learners' confidence in approaching job-related reading had been increased by their attendance of a class. Other questions that involved self-reporting of reading skills were not successful, because of the interviewees' inability to gauge their own capacities. A question about the length of time it took to read a job aid produced a wide spectrum of answers: from 1-2 minutes up to a week. Responses at the bottom end of this range were clearly unrealistic: just one of the content questions tended to take more than 2 minutes. And responses like a day or a week seem to refer to the length of time needed, not only to read the job aid, but also to learn the job it relates to. Ambiguity in connection with such items may well preclude their effectiveness.

In all of the job scenarios (newspaper article, graph and instruction sheet), the content questions showed a considerable variation in response, both between groups and between individuals. To accommodate this, the set of questions for any one section needs to range in difficulty and in nature (involving fact, inference and application), so that there is room for improvement from pre- to post-test, at some level, for all those interviewed.

Plans

In the interview, learners were asked about their plans for the future: 1, 5 and 10 years ahead, and to explain how reading and education formed part of those plans. Assessed on how definite and detailed the plans were, the Tech Prep students showed significant pre/post improvements for 1 and 5 years (p<0.02 and p<0.05); these did not occur with the control group. The ESL class showed a significant increase in references to reading and education as part of their plans (p<0.005), which was not repeated for its control group.

Responses to planning questions ranged from learners mentioning their prospects for advancement in the company, or out of it, to their intentions regarding marriage, children, housing and retirement. The following are typical responses to the question:

What are your plans for the next year?

Pre: "Finish degree."


Post: "Getting married. Going to school 4 nights a week in the fall."

Pre: "Have another child, learn new jobs in the same department, and get a new car."

Post: "Have another child, lose weight, take some course (I don't know what kind), and probably finish a degree in retail."

What are your plans for the next five years?

Pre: "Apprenticeship completed."

Post: "Have electrical apprenticeship, have kids - maybe, (and) perhaps buy another house."

Pre: "Go through the apprenticeship. This will take up a big amount of time."

Post: "Be done with the apprenticeship and definitely move up and off the assembly line."

Pre: "Ending an apprenticeship - journeyman or greater position."

Post: "I'd like to have a combination of school and owning my business."

The connection with literacy was made more explicit through the follow-up question about the role of reading and education in their plans, as these comments from the post-interview reveal in response to the prompt:

Explain how you see reading and education in these plans.

"Reading helps with everything. As you grow you learn. I want my life to grow."

"I need to develop and build confidence - do more reading - that's definitely important. Get my kids to do more of it."

"Get a better job by taking more classes. Help my kids read more - help them in school."

"If you can't read, you can't trouble-shoot the machines."

"The more I learn, the easier it is to make suggestions about things and to apply for better positions."


"I feel better after being in here and I want to learn more. I have to read a book on game for hunting. To retire, I need to read about benefits."

Overall, the learners were very positive about their experience in classes and saw them as opening new doors, both for further education and for a life of greater opportunity.

CHANGES IN FAMILY LITERACY

Measures of family literacy mainly involve information about how parents interact with their children in literacy-related activities. In addition, some of the parents' reading behavior away from work reflects upon changes in family literacy.

Learners with children between the ages of 3 and 17 were asked in the questionnaire about the frequency of such activities as reading to their child, helping the child with reading, and buying books for the child. Also, they were asked how often the child read alone, and what kinds of books or other materials (if any) the child borrowed from a library.

These questions revealed no statistically significant changes for the GED and ESL groups, the only ones for which the number of parents was large enough to draw any conclusions. These groups each contained 12 parents of children in the relevant age range, while the Technical Preparation class and the Cumberland group contained only 4 each.

However, the responses to these questions on family literacy did show slight overall gains, even though not statistically significant ones. It may be that the time between pre- and post-tests was not enough for any changes brought about by the classes to have much effect. It may also be that a larger sample size would have revealed the trend for improvement to be statistically significant. In addition, there is some evidence in the oral interviews about reading practices to indicate a movement towards more literacy activity for oneself and one's children. During post-test interviews, class members were more likely to report newspaper, magazine and novel reading. Two parents who had not previously mentioned reading to children mentioned, during post-interviews, "reading to my child" or "reading a children's book to my son." Another reported reading child care books and magazines on parenting. None of these had been mentioned during pre-interviews. One class member commented, "I definitely read a lot more since I started taking this course."

CHANGES IN MEETING EMPLOYER OBJECTIVES

In relation to worker productivity, measures used at Delco were attendance, safety records, suggestions submitted, suggestions approved for awards, grievances submitted, and discipline records. In addition, each site participated in constructing plant-specific supervisor ratings.

No significant changes occurred in learner attendance, but it was noted that, due to small sample size, the absence figures of a few individuals could affect the total quite markedly. For example, in the Technical Preparation group, half the absences in the post-training period are attributable to three employees. So, with samples this small, extreme caution should be used.

Other measures such as productivity suggestions and accident records involved numbers too small for statistical testing. Suggestions were made by only a few members of each group during the periods concerned, and these followed no apparent pattern. Accidents were even rarer; for example, no Technical Preparation or control group member had an accident during the six weeks prior to training. Such figures do not allow statistical analysis.

At both Delco and Cumberland, supervisor ratings were devised to measure aspects of jobs that contributed to productivity and were related to communication, teamwork and paperwork skills. Extensive interviews were conducted to determine relevant skills and, at each site, ten aspects of job performance emerged from interview data. Supervisors then provided examples of behaviors which separated top from middle from bottom performers on each scale. These behaviors were used to develop anchored rating scales for each of the productivity categories. Supervisors then rated each worker on these scales both before and after training.
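A behaviorally anchored rating scale of the kind just described can be sketched as a category paired with example behaviors at selected scale points. The category name, anchor wording, and ratings below are all invented for illustration; they are not the scales developed at either site.

```python
# Hypothetical sketch of one anchored rating scale: each productivity
# category pairs scale points with example behaviors supplied by supervisors.
# The category, anchors, and ratings here are invented, not from the study.
from dataclasses import dataclass

@dataclass
class AnchoredScale:
    category: str
    anchors: dict  # scale point (1 = bottom, 5 = top) -> example behavior

teamwork = AnchoredScale(
    category="teamwork",
    anchors={
        1: "rarely shares information with co-workers",
        3: "answers co-workers' questions when asked",
        5: "volunteers help and coordinates work across the line",
    },
)

def rating_gain(pre_rating, post_rating):
    """Pre/post change for one worker on one anchored scale."""
    return post_rating - pre_rating

print(rating_gain(2, 4))  # a worker rated 2 before training and 4 after
```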

At Delco, the supervisor ratings of the workers' job-related skills produced some anomalies that cast doubt on the consistency of the ratings from pre-test to post-test. Even though supervisors participated in scale development, some seemed to rate some workers exactly the same on all scales. Some of the ratings appeared to be carelessly done. Even with certain items and individuals removed to correct for this, no change was apparent. It may be that supervisors need more training or instruction before doing ratings, or that a time period of six weeks may be too short to register improvements.

At Cumberland, however, all ten aspects of the assessment scheme used there showed significant improvements (p<0.0001) over the 11 weeks of the classes. Here, just two individuals made the assessments and made them for the same workers in both pre- and post-tests, whereas at Delco up to four supervisors assessed the members of each group and there had been some personnel changes between pre- and post-test. Also, the Cumberland assessors had slightly more education and were not shop-floor supervisors, as at Delco; they may have had more experience in making judgments and ratings.

Another factor that may have contributed to the Cumberland results is the choice of assessment categories. These were very closely related to the objectives of the Communications and Collaboration course, covering such items as communication skills, problem-solving ability and conflict resolution. The Delco assessment referred mainly to specific job skills such as machine setting and record-keeping, but Delco courses were of a more diffuse job training nature, not relating directly to these skills. This tends to confirm the notion that learners gain knowledge and skills only in the areas that are taught.

CONCLUSION

The instruments used in this study to measure the impact of literacy programs varied in their success at detecting pre/post changes in learner literacy, family literacy and employer objectives. The main results and some observations about assessment utility follow.

Learner Literacy

Beliefs

Learners reported improvements in their view of themselves as literate, but not in their view of a literate person.

Interview questions were more successful than questionnaire items in detecting changes in self-image.

Practices

Reading practices at work improved in areas that related to the class attended.

Full-time classes need to be post-tested some time after learners return to normal work, so that changes in work-related reading can take effect.

Reading practices away from work improved for classes where home reading had been encouraged.

Interviews were more sensitive to changes in reading practices than were questionnaires.

Self-assessment of reading difficulty produced some inconsistencies that cast doubt on this questionnaire section.

Process and Ability

Cloze test scores improved only when the passage was at an appropriate reading level, suggesting a need for several different test passages.

Answers to process questions on job-related reading materials showed improvements in reading strategies and topic connections.

Answers to content questions on job-related reading materials showed improvements at various levels for different classes, suggesting a need for a range of difficulty and type in the questions.

Plans

Learners were generally more definite and detailed in their plans for the future after attending classes.

Family Literacy

Questionnaire items showed slight gains in some areas, but the time may have been too short for significant improvements to occur.

Employer Objectives

Attendance showed no significant changes; with small groups, there is a problem of a few individuals' absences distorting totals.

Safety, suggestions, etc. were too infrequent for analysis.

Supervisor ratings showed significant gains when areas covered related to the class, as well as the job.

Consistent supervisor ratings are difficult to obtain across several supervisors and when personnel change from pre- to post-test. Education and experience levels of supervisors may also be a factor in obtaining consistency.


CHAPTER 8

Discussion and Implications

OVERVIEW

This pilot study has shown that it is possible to perform a broad-scale assessment of workplace literacy programs, in order to measure the impact on learners, their families and their productivity. The results of the study demonstrate some improvement in each aspect of the assessment model. However, gains appear to be limited to what is taught; there is very little transfer to other areas not addressed by instruction.

Learner change was measured in the areas of Beliefs, Plans, Practices, and Processes and Abilities. Where these formed a part of class instruction, learners made gains in the following areas:

their literacy self-image;

their ability to articulate plans;

the amount and range of literacy activity, both at work and away from work;

reading strategies and comprehension.

Classes did not address directly issues of family literacy, and little change was evident in this area. Productivity measures proved, on the whole, to be unsatisfactory for the small numbers of learners studied, although supervisor ratings showed increases when areas assessed were closely related to instruction and company goals.

The evaluation model itself was also under scrutiny in this project. Several points of interest have arisen from the pilot assessment.

1) Questionnaires, although time-efficient, seem to be less effective than interviews in gathering accurate information.


2) Because of the range of learner abilities, workplace scenarios need to include questions of a variety of difficulties; cloze tests of varying difficulty may also be necessary.

3) It would be desirable to have direct measures of learner productivity, as well as more reliable ways of obtaining supervisor ratings.

PRINCIPAL ACHIEVEMENTS

A good deal has been learned from this pilot assessment. First, the pilot study has demonstrated that it is possible to perform a broad-scale assessment of workplace literacy programs using learner interviews, tests, and questionnaires within reasonable time-frames (i.e. 20-30 minutes of interviews and 10-15 minutes of tests and questionnaires, both before and again after instruction), as well as company records and supervisor rating scales. During subsequent studies, it will be determined if the assessment model can be transferred to additional workplace literacy programs with a minimum of technical assistance.

Second, results from the assessment provide indications of what effective workplace literacy programs can accomplish and may not be able to accomplish. Discussion of these results will reflect and substantiate two major generalizations:

1) Workplace literacy program instruction is able to demonstrate positive improvement in each area of the assessment model (i.e. beliefs, practices, processes and abilities, plans, productivity, and family literacy).

2) Gains seem to be limited to areas directly addressed by instruction (i.e. programs and classes accomplished gains only in areas where there was direct instructional activity). No clear carry-over or transfer to other areas is apparent in evaluation results.

IMPACT OF PROGRAMS AND LINK TO INSTRUCTION

Data were gathered for the learners in a range of classes: Technical Preparation, GED, ESL, and Communication and Collaboration. Program impact results will be summarized and discussed in direct relation to the types of instruction in classes where gains were made. The types of instruction in classes not demonstrating gains in particular areas of the assessment model will also be discussed for comparison purposes.

Beliefs About Self as Literate

Changing adult learners' beliefs about their own literacy abilities is important for several reasons. An adult with a negative self-impression of his or her own ability is not likely to attempt literacy away from the supportive environment of a classroom and nurturing instructor. Significant growth in literacy abilities requires hundreds of hours of practice -- more than most programs can ever provide in class time. For this reason, it is important that learners become more independent by developing more positive self-beliefs about their literacy abilities. They need to see themselves as being capable of attempting more literacy and practicing more on their own, as opposed to avoiding literacy tasks and literacy practice. (Incidentally, it is important that learner beliefs about their own literacy are also accurate, or learners will feel betrayed when they realize they have been wrong, and that betrayal can lead to abandoning altogether much of what has been learned in classes.)

Data about learner literacy beliefs were collected in all classes. Except in the GED class, learners demonstrated improved views of themselves as literates. This was mainly revealed during interviews through more positive self-descriptions and self-assessments.

In the Tech Prep class, learners were regularly able to monitor their own progress on reading comprehension and reading rate through class tests and discussions. In addition, class discussion time during seven-hour learning days often addressed future learning plans and why the skills students were mastering would be of use in future training.

The ESL class, similarly, used class discussion both to provide English practice and to highlight the relevance of what was being learned to future use. Learners were asked to share, in journals and later oral discussion, personal accomplishments in written and oral English. This served both as an instructional tool to improve language use and as a feedback mechanism for reinforcing learners' views about their own growing language and literacy competence.

The Communication and Collaboration class revolved substantially around the concept of joint and personal goal setting, planning for accomplishing goals, and monitoring effectiveness. Some goals related to direct job performance, but a substantial number related to improving individual communication abilities. Considerable time was expended on both individual and group monitoring of gains. An apparent result was expanded and improved beliefs about learners' own literacy abilities.

The GED class demonstrated no gains in the area of improved learners' beliefs about their own literacy abilities. The structure of classes did not lend itself to substantial instructor feedback, group feedback, or individual monitoring in this area. Most work was individualized and directly related to completing practice exercises for the GED tests. Interviews with learners often indicated a workmanlike attitude toward how many exercises they had gone through, with little sense of improved individual abilities beyond the class. Instruction did not focus on learners internalizing a sense of expanded personal abilities, and assessment did not reveal such changes in belief to have occurred. It is important to note that other assessment measures indicated that GED students were learning; they had actually improved in some literacy abilities and practices. The significant factor here is that little class time was directed toward identifying and reinforcing growth in this area, and concurrently no change in individual beliefs about changed literacy abilities was demonstrated.

Plans

Interview questions about future plans and the relationship of literacy to those plans were asked of learners in the Tech Prep, GED, and ESL classes. In the Tech Prep class, which was designed as a prelude to further training, a good deal of time was spent addressing study skills and the demands of future training. In this class, learners' post-class interviews revealed plans which were articulated with more focus and detail than had been true of pre-class interviews. A similar pattern occurred in the ESL class, which sometimes used discussions of learners' futures as an activity for improving the use of oral English. Learners in the GED class, who primarily focused on passing the GED test and were involved mainly with individual seat-work, demonstrated no measurable change in the clarity or focus of their plans and made no greater mention of education and literacy as parts of future plans.

Literacy Practices

Changes in the amount and types of literacy practices used by learners were assessed by a combination of interview questions, questionnaire checklists, and rating scales. The same pattern of gains directly related to classroom focus areas was revealed in assessment results in all classes.
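Where the text reports statistically significant pre/post gains on rating-scale items, the underlying comparison can be sketched as a dependent-samples t test. This is an illustrative reconstruction only, not the study's actual analysis; the helper name `paired_t` and all rating values below are invented.

```python
import math

def paired_t(pre, post):
    """Dependent-samples t statistic for pre/post scores from the same learners."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample variance of the differences (n - 1 denominator).
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / math.sqrt(var_d / n), n - 1

# Invented 1-5 questionnaire ratings for a hypothetical 12-member class:
pre  = [2, 3, 2, 3, 2, 4, 3, 2, 3, 2, 3, 3]
post = [3, 3, 3, 4, 3, 4, 4, 3, 3, 3, 4, 3]
t, df = paired_t(pre, post)
print(f"t({df}) = {t:.2f}")  # compared against a t table at the chosen alpha
```

With classes of only 12-15 learners, as in this study, such a test has little power to detect small gains, which is one reason the evaluation leans on converging interview and questionnaire evidence rather than any single measure.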

The Tech Prep classes emphasized increased reading habits for learners. One instructor even took learners to the library, read portions of books and magazines aloud in class, and emphasized improved literacy habits. In the Tech Prep classes, even though the course title implies workplace applications, learners demonstrated increased reading and writing at home in post-interview questions compared to the literacy behaviors of the control group. Employees from only a few work-teams were enrolled in these classes, and learners participated in a significant amount of group work in class with team members. Learners also reported statistically significant increases in willingness to offer ideas and discuss them in quality assurance meetings.

The ESL learners spent some class time on workplace literacy habits and practices by reading bulletin board items, newsletters, and job materials in class. Significant gains in pre/post literacy practices at work were reported by ESL learners. These gains were also significantly greater than those of a small ESL control group who served as an indicator of changes resulting from simply living for a comparable period of time in an English-speaking environment. Little class time was allocated to home literacy activities, and no significant improvement in this area was noted. Oral discussion of class exercises (such as how to convert statements to questions) took a significant amount of class time. The heavy emphasis on oral usage and asking questions led to significantly higher ratings for asking others for information in the workplace.

Learners in the Communication and Collaboration class also participated in a significant amount of group activity. They, too, showed gains on items involving question-asking and communicating in the workplace. Lack of post-assessment interviews prevented more extensive examination of changes in other literacy practices.

The GED class, which did not emphasize literacy practices at home or at work, did not show gains in these areas on interview items. Questionnaire checklists, however, revealed a statistically significant tendency for learners to report attempting a wider variety of reading and writing.

An interesting phenomenon was noted for GED and ESL learners' responses on these checklist items. In addition to indicating what types of materials they had read recently, learners were asked to indicate the difficulty they had reading these materials. Learners in both classes indicated that materials were harder to read at the end of the classes than they had been at the beginning. This suggests that low literates without much experience with reading may initially over-estimate their own abilities. It further suggests that instructors should not simply propose extra reading without providing support. When low literates realize that even simple materials are more difficult than they anticipated, they may become discouraged and retreat from reading them.



Reading Processes and Abilities

The most psychometrically rigorous measure of reading ability used in this study was the Cloze reading test. The Tech Prep class, which spent the most time in reading practice, was the only group to demonstrate statistically significant gain on this measure. GED and ESL learners, who found the newsletter story used in the Cloze test to be at their frustrational reading level when instruction began, found the high-school difficulty article still at frustrational reading level at the end of instruction. It is unlikely that learners in either group received enough reading practice to bring this brief story about General Motors vehicles within their comprehension ranges.

Job literacy scenarios provided a more diverse range of indicators of learner literacy processes and abilities across several types of workplace materials. Questions assessing the strategies used by learners in reading newsletter stories, graphs, and job aids revealed some change in the sophistication with which learners read. The Tech Prep class, which was comprised of high school graduates and several learners with some post-high-school education, scored very high on pre-class measures of how they went about reading. This class spent a good deal of time addressing study skills and reading strategies, and learners did score numerically higher on post-assessments than their already high pre-scores. Ceiling effects here made statistically significant improvements difficult to attain. ESL and GED learners demonstrated especially significant gains in topics focused upon. Even though they had not improved in reading abilities enough to do well with the earlier reported Cloze newsletter article, the reading practice received during their limited hours of instruction had led to a more sophisticated approach to reading. Control groups, who received no instruction, demonstrated no improvement from pre- to post-assessments of reading strategies.

Comprehension questions of increasing difficulty were also asked in the different job literacy scenarios. Learner gains again reflected learner instruction. Tech Prep learners, whose extensive class work addressed inference and problem-solving tasks, improved most on the more difficult scenario questions. The ESL learners, who met eight hours each week and spent some time with workplace materials, improved most on middle-level difficulty questions. The GED group, who met only four hours per week and did little with workplace materials, demonstrated gains on only one comprehension question related to workplace literacy. Once more, gains appear to be directly related to the type and amount of instruction received by learners.

Family Literacy

Some impact of instruction on home literacy has already been discussed in an earlier section on literacy practices. Learners in classes which focused upon home materials did improve home literacy practices; those in classes which did not focus on home materials made no changes. Only a relatively small number of learners had children and were therefore qualified to answer the family/parent literacy questionnaire items. No class spent direct instructional time on family literacy, and no significant gains were noted in this area. Though not statistically significant, there was some indication of a few parents taking their own children to the library more often after learners from the Tech Prep class were taken to the library by their instructor. Similarly, a few parents reported reading to children slightly more often. These accounts were infrequent. It appears that benefits of instruction do not transfer very far beyond the focus of actual instructional activities.

Productivity

For a variety of reasons discussed in Chapters 6 and 7, few of the productivity assessments proved satisfactory for groups as small as the 12-15 member classes and control groups. There is some indication, however, that some productivity gains were directly related to the type and amount of instruction received by learners. The only group to demonstrate consistently improved supervisor ratings in workplace-based communication and literacy use was the Cumberland Communication and Collaboration class. The entire class was structured to address these workplace communication demands.



Results suggested that this focused instruction was effective in these areas directly related to company goals. No comparable gains were demonstrated with Delco supervisor ratings, though difficulty in obtaining acceptable ratings clouds this finding. Questionnaire items dealing with participation in team meetings indicate that both Delco and Cumberland learners who participated in class group work and discussions did make significant gains in participation in discussions during team meetings. Again the relationship between instruction and improved performance is fairly direct.

Conclusions from Results

With workplace literacy instruction, it appears that you get what you pay for and not much more. Classes and instructors at the two sites demonstrated that what you choose to spend time on in class matters a great deal. Statistically significant gains were made by some students for every segment of the evaluation model. More detailed analysis reveals, however, that gains occurred only in areas directly addressed by instruction and class activity.

This is both good news and bad news. It is heartening to know that instruction works. Workplace literacy programs that focus on a specific goal and provide significant instruction toward that goal can help learners improve. If time is spent providing learners with feedback about their improved literacy performance and developing literacy habits at home and work, workers will improve their self-concepts about their own literacy and will read more. The bad news is that hopes for broad transfer from relatively brief programs (as nearly all workplace programs are) appear to be misplaced. Whatever effective class activity focuses upon is the major area of gain. Even improvement in literacy practice appears tightly related and limited to classroom practice. If class time focuses only on workplace activities, practices appear to improve only with workplace literacy materials. For productivity to improve, instruction needs to focus directly on activities involved with production. Extra dividends of transfer to improved family literacy seem unlikely unless instructors also spend time with family literacy activities.



This implies some hard decision-making by instructional planners. The results of this study make it much more difficult to accept the contention that any single focus of literacy instruction will bring improvement in a multitude of areas. They also suggest that diffuse instruction, which touches lightly on many areas, will not bring about gains of any significance in any particular area. The GED group improved a little in general reading strategies and may have improved in taking the GED test, but there are no indications that much more occurred. Instructional planning did not focus on much beyond the GED goal. The ESL group improved in oral activities and in some workplace activities for which they received instruction, but not in areas where they received little instruction. These same patterns of instruction being directly and narrowly related to gains hold for the Tech Prep class and the Communication and Collaboration class.

This should not be taken to mean that workplace literacy instruction should always focus upon a single workplace goal. It is likely that the most beneficial mix is instruction which expands learner practice time beyond the classroom by improving worker literacy practices and beliefs at home and in the workplace. Since 50 or even 200 hours of class time is not sufficient for many learners to reach their full potentials, the impact of precious class time must be, in part, to increase literacy practice and learner independence. If productivity is an issue, workplace materials and activities used in class should be directly related to materials and activities employed during production. If other goals are desirable, they must be planned for, and it seems likely that additional learning time will also need to be provided.

WHAT HAS BEEN LEARNED ABOUT HOW TO EVALUATE PROGRAMS

One of the major goals of this study was to develop a model for evaluating workplace literacy programs. For the most part, the pilot assessments validated the utility of a broad-based conceptual framework of adult literacy learning in the workplace. It was possible and productive to note gains in areas of learner literacy beliefs, practices, processes and abilities, plans, productivity, and family activities. A good deal was also learned about the limitations and pitfalls of particular evaluation approaches and methods.

Limitations of Questionnaires

Time is at a premium in workplace literacy programs. Many programs are only able to provide brief instruction, and still others lose money for each hour of learner time, since learners are not producing a profit while in class. To the degree that checklists and questionnaires can be used to gather information, as opposed to individual interviews, a substantial time saving can be made. This pilot assessment used overlapping oral interview and written questionnaire items to test the degree to which the assessment approaches produced similar findings. For the most part, questionnaires, though time-efficient, were much less effective and accurate than even brief face-to-face interviews. This was especially true in the areas of literacy beliefs and practices. In these areas, learner responses were very brief on written forms -- even with the more literate Tech Prep learners. Interviewers, however, waited until learners paused in speaking and then asked, "Anything else?" or "Can you think of any other examples?" until they received the answer, "No." This produced a good deal more information and more accurate representations. Questionnaire responses in these areas probably more closely reflect the degree to which learners could read and wanted to write. The questionnaire responses tended to reinforce the interview responses, but questionnaire assessment was often not sensitive enough to detect changes -- especially on global questions about literacy beliefs or practices.

Questionnaires were effective when they could be focused. For example, descriptions of literacy behaviors in team meetings, listings of recently-read materials specific to a workplace, and descriptions of literacy behaviors with one's children were able to elicit information rapidly from learners. In cases where there was an overlap between interview questions and these questionnaire items (e.g., home literacy behaviors in the interview and family literacy behaviors in the questionnaire), triangulation revealed the questionnaire items to reflect accurately the more extensive oral comments.

Job Literacy Scenarios

The job literacy scenarios were custom-designed to reflect workplace literacy tasks of importance at each worksite. They also attempted to reflect the range of reading types present in National Assessments of adult literacy (i.e., prose reading, document reading, and quantitative reading). The scenarios provided, as much as possible, a realistic purpose for reading and attempted to assess both how the learner went about reading (processes) and how well the learner understood and could use information from the reading (abilities).

These job literacy scenarios proved to be quite productive in assessing improvements in the sophistication with which learners approached reading tasks. The initial scenarios, which were limited to a very few comprehension questions, were somewhat productive, but need to be expanded to reflect more accurately gains in several types of reading (searching for facts, drawing inferences, and making applications beyond the task at hand). Instruments in Appendix A have been revised to reflect these changes.

Cloze Tests

The Cloze test was simple to construct and relatively easy to administer. Instructors reported little difficulty with the tests, which provided a sample sentence demonstrating how to fill in blanks. With the Tech Prep and Cumberland classes, the material selected was well within initial comprehension ranges. This was not true for GED and ESL learners at Delco, however. The high-school difficulty level story was beyond most learners both before and after instruction. For low-level learners, it would be desirable to construct a second Cloze test using simpler workplace materials.

Though some instructors at pilot sites were familiar with the Cloze test procedure, others were not. Directions for how to develop and interpret Cloze tests were created for instructors. These are also included in Appendix C with samples of Cloze tests developed at the pilot sites.
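The construction described above (delete words at a fixed interval, have learners restore them, score exact replacements) can be sketched as follows. This is a rough illustration under the common fifth-word-deletion convention, not the Appendix C materials; the function names and sample sentence are invented.

```python
def make_cloze(text, nth=5, blank="______"):
    """Replace every nth word with a blank; return the passage and the answer key."""
    words = text.split()
    passage, answers = [], []
    for i, word in enumerate(words, start=1):
        if i % nth == 0:
            passage.append(blank)
            answers.append(word)
        else:
            passage.append(word)
    return " ".join(passage), answers

def score_cloze(responses, answers):
    """Exact-word scoring, as in the standard Cloze procedure; returns percent correct."""
    correct = sum(r.strip().lower() == a.lower() for r, a in zip(responses, answers))
    return 100.0 * correct / len(answers)

# Invented sample sentence standing in for a plant-newsletter passage:
sample = ("The plant newsletter reported that the new assembly line "
          "will begin production early next year after operator training")
passage, key = make_cloze(sample, nth=5)
print(passage)
```

By commonly cited exact-replacement criteria, scores below roughly 44 percent are taken to indicate frustration-level material, which is consistent with the report's use of that term for the GED and ESL learners.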

Family Literacy Questions

Both oral focus group methods and written questionnaire items related to family and parent literacy were piloted for this study. Both seemed effective in gathering information. However, workplace literacy classes are small, and the number of learners with children at the pilot sites was even smaller. For most items, these small numbers precluded meaningful statistical analysis of workplace literacy impact on parent literacy practices. These measures are likely to be of more use for special programs which focus upon the workplace/family connection or for much larger groups.

Employer-Gathered Productivity Indicators

Though previous studies have discussed the need for assessing productivity impacts of workplace literacy programs, few have tried to do so. This pilot assessment attempted to use some of the indicators of productivity suggested in discussion sections of studies in the research literature (i.e., attendance, accident reports, useful productivity suggestions made by employees, etc.). The pilot revealed that it is possible to gather such data with a minimum of effort on the part of employers. It also revealed that the information is not of great use if sample sizes are small and time between assessments is not very long. If a class and control group are comprised of only 15 individuals each, the impact upon absences of a single individual with the flu can overpower all other factors. This would be less likely to occur with much larger groups, where influences of sickness would be more likely to balance out. Similarly, safety is an important indicator of productivity, and many workplace literacy programs address safety. Accidents among a group of 15 people during a 6-month period are usually rare, however, and therefore not likely to be of much use in determining program impact. This same pattern held for productivity suggestions and discipline measures as possible indicators of program impact. Neither employer maintained data on individual employee productivity, so those measures were not available. Such indicators are likely to be of worth when available.
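The flu example can be made concrete with a little invented arithmetic: in 15-member groups, adding ten sick days to one worker's record nearly doubles the class mean, swamping any plausible program effect. All absence counts below are hypothetical.

```python
def mean(xs):
    """Arithmetic mean of a list of absence counts."""
    return sum(xs) / len(xs)

# Invented six-month absence counts for two 15-member groups with
# essentially identical underlying attendance behavior:
class_group = [1, 0, 2, 1, 0, 1, 2, 0, 1, 1, 0, 2, 1, 0, 1]
control     = [1, 1, 0, 2, 1, 0, 1, 1, 2, 0, 1, 1, 0, 2, 1]

# One class member catches the flu and misses ten extra days:
class_with_flu = class_group[:-1] + [class_group[-1] + 10]

print(mean(class_group), mean(control), mean(class_with_flu))
```

One individual's illness moves the class mean from under one absence to over one and a half, a shift far larger than any gain a literacy program could plausibly produce, which is why such indicators only balance out in much larger groups.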

Supervisor Ratings

Specially constructed supervisor ratings of employee productivity with literacy and communication behaviors on the job were of greater use. Discussions with supervisors and top employees were able to identify the types of literacy, problem solving, and communication considered important on particular jobs. A careful process of developing and revising scales to reflect these discussions is available in Chapter 6 (see also Appendix G). At the Cumberland plant, supervisor ratings proved to be useful in noting employee improvements on the job. The ratings were less successful at Delco.

The reasons for this lack of success at Delco bear some examination. Delco supervisors generally had less education than the Cumberland supervisors and were less familiar with the concept of individual employee evaluation. Individual appointments for researchers to meet with supervisors to rate learners' job performance were cancelled for several legitimate reasons. As a result, supervisors sometimes rated employees without someone to remind them to think carefully about each scale. The resulting ratings seemed to reflect a desire to complete the task rapidly (e.g., many workers received exactly the same rating on each scale). It seems advisable in the future to require that supervisors make ratings with a researcher asking the questions and encouraging careful consideration of each scale and each worker.

Supervisor ratings were possible at the two pilot sites because learners came from highly similar jobs. Programs that draw into classes learners from several different jobs may not be able to use supervisor ratings to assess impact on productivity. Unless jobs have several common tasks, it will not be possible to construct scales which can be used for all learners. If several different scales need to be constructed for several different jobs, the small number of learners in each job category is likely to preclude any meaningful statistical analysis.


Learner questionnaire items related to literacy and communication practice on the job were of some use in gaining a picture of impact on productivity. These items are subject to some of the same limitations as supervisor ratings. At the two pilot sites, the expectation was for all workers to become involved in team meetings. For this reason, it was possible to have several questionnaire items related to such meetings. A workplace literacy program without such workplace commonalities would not be able to benefit from these questionnaire items.

CONCLUSION

This study has shown the feasibility of using a detailed impact assessment model with workplace literacy programs. Without requiring a large commitment of resources, it is possible to gather a great deal of information on learners' own literacy, the literacy of their families, and their job productivity.

The results of the study indicate what can be expected of effective workplace literacy programs. Instruction has produced some improvement in all of the areas assessed, but gains appear to be limited to areas directly addressed in class. There is apparently no transfer of learning into areas not covered by instruction. Because of this, it appears that program providers need to have clear goals for what they want to achieve in the limited time that learners are in class. They should also be seeking ways to extend this time beyond the classroom. One way of doing this is to use on-the-job materials in class so that learners will be practicing outside class time. Also, encouraging learner motivation and independence is likely to lead to learners' engaging more often in literacy-related activities.

The second phase of this study aims to determine whether the assessment model can be used by other workplace literacy programs with a minimum of assistance from project personnel. In addition, results from this second phase will throw more light on the conclusions reached here in the pilot assessment.


BIBLIOGRAPHY

Anderson, R. C. (1985). Becoming a nation of readers: The report of the commission on reading. ERIC Document ED: 253 865.

Auspos, P. et al. (1989). Implementing JOBSTART: A demonstration for school dropouts in the JTPA system. New York: Manpower Demonstration Research Corporation. ERIC Document ED: 311 253.

Bobko, P., Karren, R. & Parkington, J. J. (1983). Estimation of standard deviations in utility analysis: An empirical test. Journal of Applied Psychology, 68, 170-176.

Borman, W. C. (1977). Some raters are simply better than others at evaluating performance: Individual differences correlates of rating accuracy using behavior scales. Paper presented at the meeting of the American Psychological Association Convention, San Francisco.

Boudreau, J. W. (1983). Economic considerations in estimating the utility of human resource productivity improvement programs. Personnel Psychology, 36, 551-576.

Bormuth, J. (1969). Cloze tests and reading comprehension. ReadingResearch Quarterly, 2 (3), 359-367.

Brogden, H. E. (1949). When testing pays off. Personnel Psychology,2, 171-183.

Bronstein, Erica. (1991). Benchmarks and student learning profile for the workplace ESL program. Labor Education Center, Southeastern Massachusetts University.

Buchanan-Berrigan, D. L. (1989). Using children's books with adults: Negotiating literacy. Unpublished doctoral dissertation. Columbus: The Ohio State University.

Burke, Michael J. & Frederick, James T. (1986). A comparison of economic utility estimates for alternative SDy estimation procedures. Journal of Applied Psychology, 71, 334-339.

Bussert, K. (1991). Survey of workplace literacy programs in theUnited States. Unpublished manuscript. Language EducationDepartment, Indiana University, Bloomington, IN.

Cascio, Wayne F. (1982). Costing human resources: The financial impact of behavior in organizations. Boston, MA: Kent.



Cascio, Wayne F. & Ramos, Robert A. (1986). Development andapplication of a new method for assessing job performance inbehavioral/economic terms. Journal of Applied Psychology, 71,20-28.

Chall, J. & Snow, C. (1982). Families and literacy: The contribution of out-of-school experiences to children's acquisition of literacy. National Institute of Education Final Report. Cambridge, MA: Graduate School of Education, Harvard University. ERIC Document ED: 234 345.

Chall, J. S. (1984). Literacy: Trends and explanations. American Education, 20 (9), 16-22.

Chomsky, C. (1972). Stages in language development and reading exposure. Harvard Educational Review, 42, 1-33.

Collino, G. E., Aderman, E. M. & Askov, E. N. (1988). Literacy and job performance: A perspective. Pennsylvania State Institute for the Study of Adult Literacy.

Cooper, E. et al. (1988). Improving basic skills in the workplace:Workplace literacy programs in region III. Pennsylvania.Employment and training administration. Department of Labor.ERIC Document ED: 308 392.

Cronshaw, S. F. & Alexander, R. A. (1985). One answer to the demandfor accountability: Selection utility as an investment decision.Organizational Behavior and Human Performance, 11, 102-118.

DeLoache, J. S. & DeMendoza, O. A. P. (1985). Joint picturebook reading of mothers and one-year-old children. Urbana, IL: University of Illinois.

Dickinson, T. L. (1977). The discriminant validity of scales developed by retranslation. Personnel Psychology, 30, 217-228.

Drew, R. A. & Mikulecky, L. J. (1988). How to gather and develop job-specific literacy for basic skills instruction. Bloomington, IN:Office of Education and Training Resources, School of Education,Indiana University.

Fielding, L. G., Wilson, P. T. & Anderson, R. C. (1986). A new focus onfree reading: The role of trade books in reading instruction. InT. Raphael & R. Reynolds (Eds.), Contexts of Literacy. New York:Longman.


Fitzgerald, J., Spiegel, D. L. & Cunningham, J. W. (1991). The relationship between parental literacy level and perceptions of emergent literacy. Journal of Reading Behavior, 23 (2), 191-212.

Greer, Olen L. & Cascio, Wayne F. (1987). Is cost accounting the answer? Comparison of two behaviorally based methods for estimating the standard deviation of job performance in dollars with a cost-accounting-based approach. Journal of Applied Psychology, 72, 588-595.

Greer, E. A. & Mason, J. M. (1988). Effects of home literacy on children'srecall. Center for the Study of Reading, Technical Report No. 420,Urbana: Illinois University. ERIC Document ED: 292 073.

Gross, A., Lee, M., & Zuss, M. (1988). Project Reach. Final Evaluation Report. New York: City University of New York. ERIC Document ED: 314 602.

Haigler, K. (1990). The job skills education program: An experimentin technology transfer for workplace literacy. A discussion paperprepared for the Work in America Institute, Harvard Club, NewYork, June 1990.

Hargroves, J. (1989). The basic skills crisis: One bank looks at its training investment. New England Economic Review, September/October, pp. 58-68.

Harkness, Frances & Miller, Larry (1982). A description of the interaction among mother, child and books in a bedtime reading situation. Paper presented at the Annual Boston University Conference on Language Development. Boston, MA.

Hunter, John E., Schmidt, Frank L. & Coggin, T. Daniel (1988). Problems and pitfalls in using capital budgeting and financial accounting techniques in assessing the utility of personnel programs. Journal of Applied Psychology, 73, 522-528.

Kirsch, I. & Jungeblut, A. (1986). Literacy: Profiles of America's young adults. Princeton, NJ: National Assessment of Educational Progress at Educational Testing Service.

Kutner, Mark, Sherman, Rene, Webb, Lenore, & Fisher, Carcia.(1991). A review of the national workplace literacy program.Washington, D.C.: U.S. Department of Education Office ofPlanning, Budget and Evaluation.

Laosa, L. M. (1982). School, occupation, culture and family: Theimpact of parental schooling on the parent-child relationship.Journal of Educational Psychology, a (6), 791-827.

EVALUATING THE IMPACT OF WORKPLACE LITERACY PROGRAMS

Laosa, L. M. (1984). Ethnic, socioeconomic, and home language influence upon early performance on measures of abilities. Journal of Educational Psychology, 76(6), 1178-1198.

Latham, G., Wexley, K., & Pursell, E. D. (1975). Training managers to minimize rating errors in the observation of behavior. Journal of Applied Psychology, 60, 550-555.

Loban, W. (1964). Language ability: Grades seven, eight, and nine. Project No. 1131, University of California, Berkeley.

Lujan, M. E., & Stolworthy, E. (1986). A parent training early intervention program in preschool literacy. ERIC Document ED 270 988.

Lytle, S. L. (1990a). Living literacy: The practices and beliefs of adult learners. Presented at an invited symposium of the Language Development SIG entitled "Adult Literacy/Child Literacy: One World or Worlds Apart," American Educational Research Association Annual Meeting, Boston, MA.

Lytle, S. L. (1990b). Rethinking adult literacy development. Paper presented at the American Educational Research Association meeting, San Francisco, CA.

Marjoribanks, S. K. (1984a). Ethnicity, family environment and adolescents' aspirations: A follow-up study. Journal of Educational Research, 77(3), 166-167.

Marjoribanks, S. K. (1984b). Occupational status, family environments, and adolescents' aspirations: The Laosa Model. Journal of Educational Psychology, 76(4), 690-700.

Mason, J. M., & Stewart, J. (1988). Preschool children's reading and writing awareness. Technical Report No. 442. Cambridge, MA: Bolt, Beranek and Newman; Urbana: University of Illinois, Center for the Study of Reading. ERIC Document ED 302 822.

McCormick, C., & Mason, J. M. (1986). Use of little books at home: A minimal intervention strategy that fosters early reading. Center for the Study of Reading, Technical Report No. 388. Urbana: University of Illinois.

Mikulecky, L. J. (1982). The relationship between school preparation and workplace actuality. Reading Research Quarterly, 17, 400-420.


Mikulecky, L. J. (1985). Literacy task analysis: Defining and measuring occupational literacy. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL. ERIC Document ED 262 206.

Mikulecky, L. J. (1989). Second chance basic skills education. In Investing in people: Commission on workforce quality and labor force efficiency (Vol. 1, pp. 215-258). Washington, DC: U.S. Department of Labor.

Mikulecky, L. J. (1990). Basic skills impediments to communication between management and hourly employees. Management Communication Quarterly, 3(4), 452-473.

Mikulecky, L. J., & D'Adamo-Weinstein, L. (1991). How effective are workplace literacy programs? In M. Taylor, G. Lewe, & J. Draper (Eds.), Basic skills for the workplace. Toronto: Culture Concepts, Inc.

Mikulecky, L. J., & Diehl, W. A. (1980). Job literacy: A study of literacy demands, attitudes and strategies in a cross-section of occupations. Bloomington: Indiana University School of Education, Reading Research Center.

Mikulecky, L. J., & Ehlinger, J. (1986). The influence of metacognitive aspects of literacy on job performance of electronic technicians. Journal of Reading Behavior, 18(1), 41-62.

Mikulecky, L. J., & Philippi, J. (1990). An evaluation of the UAW/Ford mathematics enrichment program. Dearborn, MI: UAW/Ford National Education, Development and Training Center.

Mikulecky, L. J., & Strange, R. (1986). Effective literacy training programs for adults in business and municipal employment. In J. Orasanu (Ed.), Reading comprehension: From research to practice. Hillsdale, NJ: Lawrence Erlbaum Associates.

Mikulecky, L. J., & Winchester, D. (1983). Job literacy and job performance among nurses at varying employment levels. Adult Education Quarterly, 34, 1-15.

National Research Council. (1979). Measurement and interpretation of productivity. Washington, DC: National Academy of Sciences.

Nickse, R. S., Speicher, A. M., & Buchek, P. C. (1988). An intergenerational adult literacy project: A family intervention/prevention model. Journal of Reading, 31(7), 634-642.


Norton, S., Balloun, J., & Konstantinovich, B. (1980). The soundness of supervisory ratings as predictors of managerial success. Personnel Psychology, 33, 377-388.

Pellegrini, A., Brody, G., & Sigel, I. (1985). Parents' bookreading habits with their children. Journal of Educational Psychology, 77, 332-340.

Perkins, D. N., & Salomon, G. (1989). Are cognitive skills context-bound? Educational Researcher, 18, 16-25.

Philippi, J. W. (1988). Matching literacy to job training: Some applications from literacy programs. Journal of Reading, 31(7), 658-66.

Philippi, J. W. (1991). Literacy at work: The workbook for program developers. New York: Simon and Schuster.

Rush, T., Moe, A., & Storlie, R. (1986). Occupational literacy. Newark, DE: International Reading Association.

Schmidt, F. L., Hunter, J. E., McKenzie, R., & Muldrow, T. (1979). The impact of valid selection procedures on work-force productivity. Journal of Applied Psychology, 64, 609-626.

Schmidt, F. L., Hunter, J. E., & Pearlman, K. (1982). Assessing the economic impact of personnel programs on work-force productivity. Personnel Psychology, 35, 333-347.

Schmidt, F. L., Mack, M. J., & Hunter, J. E. (1984). Selection utility in the occupation of U.S. Park Ranger for three modes of test use. Journal of Applied Psychology, 69, 490-497.

Sheppeck, M. A., & Cohen, S. L. (1985). Put a dollar value on your training programs. Training and Development Journal, 39, 59-62.

Snow, C. E., & Ninio, A. (1986). The contracts of literacy: What children learn from learning to read books. In W. Teale & E. Sulzby (Eds.), Emergent literacy (pp. 116-138). Norwood, NJ: Ablex.

Stewart, J. P. (1986). A study of kindergarten children's awareness of how they are learning to read: Home and school perspectives. ERIC Document ED 285 120.

Sticht, T. G. (1975). Reading for working: A functional literacy anthology. Alexandria, VA: Human Resources Research Organization.


Sticht, T. G. (1982). Basic skills in defense. Alexandria, VA: Human Resources Research Organization.

Sticht, T. G. (1983). Literacy and human resources development at work: Investing in the education of adults to improve the educability of children. Alexandria, VA: Human Resources Research Organization.

Sticht, T. G., & McDonald, B. A. (1990). Teach the mother and reach the child: Literacy across generations. Literacy lessons. Geneva, Switzerland: International Bureau of Education.

Stufflebeam, D. (1974). Meta-evaluation: Paper No. 3. Occasional Paper Series. Kalamazoo: Evaluation Center, College of Education, Western Michigan University.

Teale, W. H. (1983). Toward a theory of how children learn to read and write "naturally." Division of Education, The University of Texas at San Antonio, San Antonio, TX.

Teale, W. H. (1984). Reading to young children: Its significance to literacy development. In H. Goelman, A. Oberg, & F. Smith (Eds.), Awakening to literacy. Exeter, NH: Heinemann Educational Books.

Teale, W. H., & Sulzby, E. (1986). Home background and young children's literacy development. In Emergent literacy: Writing and reading. Norwood, NJ: Ablex.

U.S. Departments of Education and Labor. (1988). The bottom line: Basic skills in the workplace. Washington, DC: Office of Public Information, Employment and Training Administration, Department of Labor.

Weekley, J. A., Blake, F., O'Connor, E. J., & Peters, L. H. (1985). A comparison of three methods of estimating the standard deviation in dollars. Journal of Applied Psychology, 70, 122-126.

Yaden, D. (1982). A categorization of two children's questions about print as they learn to read: A case study. Paper presented at the annual meeting of the Oklahoma Reading Conference of the International Reading Association, Lawton, OK.


APPENDIX A

Interview Form and Instructions for Custom-Designing the Interview

What modifications are needed?

The interview protocol that follows addresses learners' beliefs, practices, processes, and plans related to literacy activities. Most programs can use the supplied questions concerning:

beliefs

practices

plans

without any modifications.

For the process section, job-specific modifications are required to determine how well employees read material from a particular workplace. This involves selecting reading materials that are key to performance at that workplace. These will be used to develop three job reading scenarios. We recommend that you select:

prose material (e.g., a newsletter article)

a graph (e.g., a key graph or chart)

a procedure (e.g., an instruction sheet or job aid)

Guidelines below provide directions for developing process, factual, inference, and application questions for each job reading scenario.


INTERVIEW

Personal Information:

Name: Date:

What class are you in?

Job you do

I'd like to ask you some questions about reading, writing, and education. The answers to these questions will give us an idea of the way reading and writing are used here.

Beliefs

1. Describe someone you know who is good at reading and writing. What makes you choose this person?

2. How good do you consider yourself to be at reading and writing? What makes you think so?

3. Describe how you would like to be in terms of reading and writing. (Probe: Could you give me some examples?)


Practices

1. Tell me the sorts of things you read and write away from work during a normal week. (For probe, ask: "Can you give me more examples?")

2. Tell me the sorts of things you read and write on the job during a normal week. (Use probe above for more examples.)


Instructions for Custom-Designing

Process: article, graph, and procedure/job aid

(For Interview, following pages)

INSTRUCTIONS

This approach reveals far more than standardized tests about an interviewee's thinking processes while reading and how those thinking processes apply to job-related reading.

A. You need to find materials key to your workplace for the scenarios. A typical mix is:

an article, a graph, and an instruction sheet.

B. For each of these, ask 7 questions:

1 process question - what is going on in the interviewee's mind. (Use those in our examples.)

2 factual questions - based strictly on the material.

2 inference questions - deductions from the material that do not rely on a great deal of background knowledge.

2 application questions - relating information from the material to the interviewee's background knowledge (e.g., the employee's job).

C. Include a range of difficulty in your questions, finishing with an open-ended question which allows the interviewee to contribute.

Process: Article Example

Competitor Close Up

1. I am going to show you a newspaper article about your industry. Explain to me how you would read this story in order to find out what the writer thinks. (Show attached story: "Competitor Close Up.") Describe what you would look at. What would you be thinking about? How would you go about reading this story? What would you do first, then next, then next?

2. (easy factual question) How many employees does ASMO have in Statesville? (Answer: 400. Listed in article.)

3. (harder factual question) What is the only company that does not mention customers? (Answer: BG Automotive Motors, Inc. Requires the interviewee to look at all "customers" in the article.)

4. (easy inference question) From the information provided about products, what do all four companies have in common? (Answer: All of them make some sort of motor. Requires the interviewee to search for commonalities not readily apparent.)


Process: Article Example (cont.)

5. (harder inference question) Which of the companies listed is closely related to Japan and why do you think so? (Answer: ASMO or Jideco. Each has Japanese plants listed and each sells to many Japanese affiliates and main customers. Requires looking at two pieces of information and drawing deductions based on what is provided.)

6. (harder application question) What company makes products closest to your job at this facility? Why do you say so? (Answer: Relate a product on the list to what the employee makes. Requires the employee to sort through the information and then to apply it to his/her background knowledge.)

7. (easy application question to end the section) From this list, which company pays the least amount to its workers? How does this relate to your wages at Delco? (Answer: ASMO. It's more or it's less than what I get paid here. Requires the employee to apply the information to his/her background knowledge, but allows him/her to contribute more.)


Competitor Close-Up: A Year in Review

Throughout the year, the Delco Doings has brought you profiles on the companies trying to take a bite out of our business and our profits. Sometimes there were success stories, when Rochester Operations met the challenge and came out on top. Other times we had to face the fact that there are companies in Asia, Europe, and right here at home that are reaching the market better, faster or with lower prices.

Here's a quick recap of the competitors we've covered this year.

ASMO, Inc.
Location: Battle Creek, Michigan; Statesville, North Carolina; Kosai City, Japan.
Affiliate: Nippondenso
Products: wiper systems, windshield washer systems, power window lifts, antennas, retractable and blower motors.
Main Customers: Nippondenso, Ford, Chrysler, General Motors, and every Japanese transplant except Nissan.
Number of Employees: Battle Creek, 130; Statesville, 400.
Total Wage and Benefit Cost/Hour: $9.58

Jideco
Location: Bardstown, Kentucky; Yokohama City, Japan; nine production facilities throughout Japan
Affiliates: Hitachi (24%), Nissan (21%)
Products: wipers, transmissions, reservoirs, arms and blades, wiper motors, and others. Controls - wiper switches and others. Motors - power seat sliders, power window, door lock, blower and engine cooling motors and others. Accessories - air compressors, power window kits, door locks, rain-sensing intermittent wiper controls and others.
Main Customers: Nissan, Isuzu, Honda, Mitsubishi, Mazda, and Suzuki.
Number of Employees: Bardstown, 60 in 1987
Total Wage and Benefit Cost/Hour: $10.27

Power Motion
Location: Two plants in London, Ontario
Parent: Siemens Automotive of West Germany
Products: air moving motors (5,250 armatures a day)
Main Customer: GM of US & Canada
Number of Employees: 200 at main facility in London, Ontario
Total Wage and Benefit Cost/Hour: $11.50 (U.S. equivalent)

BG Automotive Motors, Inc.
Location: Hendersonville, TN.
Parents: Bosch Corporation and General Electric Company
Products: 20 different small motors including: engine cooling, modular wipers, door lock, seat back, head rest, sun-roof, washer pump, head lamp, power window.
Number of Employees: 275
Total Wage and Benefit Cost/Hour: Unknown at this time.

Every day another company steps into the automotive arena ready to try to take away our customers. Rochester Operations has an extensive communication network to keep employees informed about our competitors and what we're doing to stay ahead. Look to Delco Doings to give you the information you need to help keep Rochester Operations competitive in the '90s.

From: Delco Doings, December/January, 1991, p. 2.


Process: Graph Example

Production Problems

1. I am going to show you a graph. Explain to me how you would read this graph in order to find out what it's about. (Show attached graph: "Production Problems.") Describe what you would look at. What would you be thinking about? How would you go about reading this graph? What would you do first, then next, then next?

2. (easy factual question) What is the total number of culls? (Answer: 149. Shown at top of graph.)

3. (harder factual question) What time period is covered in this chart? (Answer: one week, or week one in May. Shown at top of graph in abbreviated form.)

4. (easy inference) What is the biggest problem here? (Answer: tear outs. Longest bar on graph.)


Process: Graph Example (cont.)

5. (harder inference) Find 3 types of problem involving measurement. (Possible answers: thickness, length, width, squareness. Requires selection from list at left of graph.)

6. (easy application question) Pick one problem and suggest at least one cause for that problem. (Possible answers: For example, tear outs are caused when the wood gets caught in the machine and is gouged; moulder burn is caused by wood getting caught in the machine and being burned. Uses interviewee's job-related knowledge.)

7. (more difficult application question) Pick a second problem and suggest both a cause and a solution for the problem. (Possible answers: tear outs, caused when the wood gets caught in the machine and is gouged, can be repaired with wood filler and sanding; or moulder knife marks can be caused by gouging of the wood in carving it and can be repaired if you can get at the gouge and sand it, provided the finish hasn't already been applied. Uses interviewee's job-related knowledge in more depth.)


[Graph: "Production Problems" - Weinig Moulder cull counts, May, week 1 (total = 149). Horizontal bar chart of cull types (tear out, glue joints, thickness, moulder burn, knots, length, width, check & windshake, moulder knife marks, splits, bow or warp, squareness, dry rot, mold side/drawer side), with counts on a 0-45 scale.]

Process: Procedure/Job Aid Example

OSHA CARD

1. The government has safety regulations and special labels in many workplaces. I am going to show you a safety card that many employees in America must keep in their pockets while working. This card shows how to understand safety labels. Explain to me how you would read this card. (Show attached card "OSHA.") Describe what you would look at. What would you be thinking about? How would you go about reading this card? What would you do first, then next, then next?

2. (easy factual question) What should you do when you see the letter "X"? (Answer: Ask my supervisor. Directly explained in the text.)

3. (harder factual question) What do all the symbols in "K" represent? (Answer: airline hood or mask, gloves, a suit and boots. Answers are in the text, but are more difficult to find.)

4. (easy inference) What is the most common type of protection from "A" to "K"? (Answer: gloves. Requires the interviewee to look through several parts of the text and then to generalize the information.)


Process: Procedure/Job Aid Example (cont.)

5. (harder inference) Name all the letters which refer to severe hazards. How did you tell this? (Answer: F, H, J, K. The top of the table says "4 Severe Hazard"; 4 probably means 4 pictures, and these letters have 4 pictures. Requires the interviewee to make deductions between different parts of the card.)

6. (harder application) If a supervisor says you are about to do a job that requires sanding, which protective items would you choose? (Answer: safety glasses and a dust respirator. Optional: gloves, combination dust/vapor respirator and a face shield. Requires the interviewee to interpret the information on the card and to relate it to a real-life situation.)

7. (easy application question to end the section) Give me two examples of how you or someone you know could use this card. (Answer: Must give 2 examples and list protections. This is more open-ended and allows the interviewee to contribute based on his/her job background.)


[Job aid: "Hazardous Materials Identification System" card (National Paint & Coatings Association, 1981; printed by American Labelmark, Chicago, IL). The card shows a Hazard Index (4 Severe Hazard, 3 Serious Hazard, 2 Moderate Hazard, 1 Slight Hazard, 0 Minimal Hazard) and a Personal Protection Index (letters A-K, each paired with pictograms for safety glasses, goggles, gloves, synthetic apron, face shield, splash hood or mask, dust respirator, vapor respirator, combination dust/vapor respirator, and full protective suit and boots). "X" means "Ask your supervisor for special handling directions."]

Plans

Now I'd like to ask you about your plans. Explain how you see reading and education as part of these plans:

A. What are your plans for the next year?

B. What are your plans for the next 5 years?

C. What are your plans for the next 10 years?


APPENDIX B

Questionnaire Form and Instructions for Custom-Designing the Questionnaire

What modifications are needed?

The questionnaire protocol that follows addresses learners' reading abilities, their literacy practices at work and away from work, and the literacy activities of their families. Most programs can use the supplied questions concerning:

literacy away from work

literacy at work

family literacy

without any modifications.

The section on self-rating of reading ability has 15 questions, 10 of which should apply to most industries and thus need no changes. However, the last 5 items should be site-specific reading materials, such as warning labels, route sheets, product lists, etc. Actual names may differ from site to site.

When you choose these last 5 items, select a mix of:

prose and graphic materials (e.g., a note from a supervisor and a blueprint)

easy and difficult reading materials (e.g., simple suggestion forms and more complex benefit information)


QUESTIONNAIRE

Name: Age: Sex:

Education: (furthest year in school) Training

Marriage Status:

Children: (number) (ages)

Practices: Self-rating reading ability

1. First check only the things you've read in the past month.

2. Now go back and rate your ability to read the items you've checked.

                                 poor              excellent
___ local newspapers              1    2    3    4    5
___ classified ads                1    2    3    4    5
___ telephone bills               1    2    3    4    5
___ TV guide listings             1    2    3    4    5
___ magazines                     1    2    3    4    5

___ training                      1    2    3    4    5
___ paycheck stubs                1    2    3    4    5
___ company newsletters           1    2    3    4    5
___ benefit information           1    2    3    4    5
___ graphs and charts             1    2    3    4    5

(five site-specific items; see the custom-designing instructions)
___ ____________________          1    2    3    4    5
___ ____________________          1    2    3    4    5
___ ____________________          1    2    3    4    5
___ ____________________          1    2    3    4    5
___ ____________________          1    2    3    4    5

Instructions for Custom-Designing

Practices: Self-rating reading ability

(For Questionnaire, preceding page)

INSTRUCTIONS

1. This component includes ten standard items followed by five examples of site-specific items.

2. In the site-specific section, replace example items with items key to your workplace.

3. Have a mixture of prose and document/chart items.

4. These should cover a range of reading difficulties.

5. Possible sources of materials: warning labels, production quotas, parts lists, product lists, department inventories.

EXAMPLE:
                                 poor              excellent
blueprints                        1    2    3    4    5
route sheets                      1    2    3    4    5
notes from supervisor             1    2    3    4    5
suggestion forms                  1    2    3    4    5
inventory graphs                  1    2    3    4    5

Practices: Reading frequency

Please check the number of times you have done the following:

1. In the last 7 days how many times have you used a TV guide listing to select programs?

0 1 2 3 4 5 6 7 8 9 10+

2. In the last 7 days how many times have you read a newspaper?

0 1 2 3 4 5 6 7 8 9 10+

3. In the last 7 days how many times have you read a magazine?

0 1 2 3 4 5 6 7 8 9 10+

4. In the last 7 days how many times have you read a book for pleasure?

0 1 2 3 4 5 6 7 8 9 10+

5. In the last 7 days how many times have you read the following types of books?

mystery: ___ times        how-to books: ___ times
novels: ___ times         factual books: ___ times
poetry: ___ times         encyclopedia: ___ times
Bible: ___ times          comic books: ___ times
other types: ___ times    ___ times


Practices: Reading frequency (cont.)

6. How often do you make a shopping list before you go to the store?
never    occasionally    often    always

7. When you're waiting in an office, how often do you read magazines?
never    occasionally    often    always

8. Do you subscribe to any magazines? yes no

If yes, which ones?

9. How many different magazine titles do you have in your home?

0 1 2 3 4 5 6 7 8 9 10+

10. How many books are in your home, either owned or borrowed?

0 10 20 30 40 50 60 70 80 90 100+


Practices: Literacy at work

Please circle the number which best describes you in the situations below:

(1) You just listen in team or department meeting discussions.

very like me 1 2 3 4 5 very unlike me

(2) You talk a lot in team or department meetings, asking questions or sharing ideas.

very like me 1 2 3 4 5 very unlike me

(3) Your ideas are often discussed in team or department meetings.

very like me 1 2 3 4 5 very unlike me

(4) You wait for others to talk about written information, just to be sure what is in it.

very like me 1 2 3 4 5 very unlike me

(5) You look for printed directions to help figure out what to do when a problem arises.

very like me 1 2 3 4 5 very unlike me

(6) You often have trouble reading paperwork from management.

very like me 1 2 3 4 5 very unlike me

(7) When the booklet about new health benefits arrived, you read it carefully.

very like me 1 2 3 4 5 very unlike me

very like me 1 2 3 4 5 very unlike me


Practices: Family Literacy

Only answer the following questions if you have a child between the ages of 3 and 17 at home.

Please answer for your youngest child in this age group and please fill in only one answer per question:

1. This child is _____ years old.

2. In the last 7 days how many times has your child looked at or read books or magazines?

0 1 2 3 4 5 6 7 8 9 10+

3. In the last 7 days how many times has your child seen you reading or writing?

0 1 2 3 4 5 6 7 8 9 10+

4. In the last 7 days how many times have you helped your child with homework and/or with school projects?

0 1 2 3 4 5 6 7 8 9 10+

5. In the last 7 days how many times have you read/looked at books with your child or listened to him/her read?

0 1 2 3 4 5 6 7 8 9 10+

6. In the last 7 days how many times has your child asked to be read to?

0 1 2 3 4 5 6 7 8 9 10+

7. In the last 7 days how many times has your child printed, made letters, or written?

0 1 2 3 4 5 6 7 8 9 10+


Practices: Family Literacy (cont.)

8. In the last month how many times has your child gone to a public library?

0 1 2 3 4 5 6 7 8 9 10+

9. In the last month how many times have you participated/helped out in your child's school?

0 1 2 3 4 5 6 7 8 9 10+

10. In the last month how many times have you hung up or displayed your child's reading and writing efforts?

0 1 2 3 4 5 6 7 8 9 10+

11. In the last month how many times have you bought or borrowed books for your child?

0 1 2 3 4 5 6 7 8 9 10+

12. (Please check only one.) I expect my child to finish at least:

___ 6th grade    ___ 9th grade    ___ high school
___ two-year college    ___ 4-year college or more

APPENDIX C

Cloze Test Samples and Instructions for Custom-Designing


Cloze Exercise

The cloze procedure is based on the psychological principle of closure, which is the human tendency to recognize and complete a pattern or sequence. It involves replacing missing words in a reading passage. This procedure can assess the ability of employees to comprehend the passage. Cloze test scores correlate very highly with standardized reading test scores. Cloze tests can be made from local workplace materials.

CLOZE PROCEDURE

1. Select a job-relevant passage of 150-200 words.

2. Leave the first and last sentences intact.

3. Starting with the second sentence, omit every fifth word. This will give about 25 blanks. Replace all omitted words with the same sized blank line; 13-15 spaces is typical.

4. Employees are to read the passage and to fill the blanks with their best guess at the word removed.

5. Instructions for the cloze test should suggest employees read the entire text before attempting to fill in the blanks and should encourage employees to answer all questions even if they have to guess. It is rare for anyone to know more than 50% of the blanks.

6. Avoid controversial or emotional topics, and topics requiring technical knowledge. Scores for such materials are less valid.


Name Date

CLOZE Exercise

In a cloze exercise, you try to guess which words are missing. For example, in the sentence below, a word is missing.

She looked before she ______________ the street.

A good guess for the missing word is "crossed."

She looked before she crossed the street.

In the story below, try to guess and replace the missing words. Don't expect to get them all. Some are nearly impossible.

G.M. Designs Safety for All Ages

We all like to think about the old days. Life seemed simpler and, in some ways, better then. But when it comes to ______________, the good old days ______________ offer the same degree ______________ safety as today's cars ______________ trucks. Advancements in technology ______________ the G.M. vehicle you ______________ today among the safest ______________ the world. Each G.M. ______________ and truck is backed ______________ thousands of dedicated men ______________ women who care about ______________ safety of their customers. ______________, as G.M. customers themselves, ______________ have a stake in ______________ G.M. vehicles the highest ______________ quality and reliability.

And ______________ you're wondering if safety ______________ improved in recent years, ______________ this: The classic 1955 ______________ would require more than ______________ major changes or additions ______________ hundreds of incremental changes ______________ be as safe as ______________ vehicles.

From: Kilborn, C. GM Today (November/December, 1990), page 1.


Cloze Exercise Answer Key

G.M. Designs Safety for All Ages

We all like to think about the old days. Life seemed simpler and, in some ways, better then. But when it comes to automobiles, the good old days didn't offer the same degree of safety as today's cars and trucks. Advancements in technology make the G.M. vehicle you purchase today among the safest in the world. Each G.M. car and truck is backed by thousands of dedicated men and women who care about the safety of their customers. And, as G.M. customers themselves, they have a stake in making G.M. vehicles the highest in quality and reliability.

And if you're wondering if safety has improved in recent years, consider this: The classic 1955 Chevrolet would require more than 60 major changes or additions and hundreds of incremental changes to be as safe as today's vehicles.


Name or ID# Date

CLOZE Exercise

In a cloze exercise, you try to guess which words are missing. For example, in the sentence below, a word is missing.

She looked before she ______________ the street.

A good guess for the missing word is "crossed."

She looked before she crossed the street.

In the story below, try to guess and replace the missing words. Don't expect to get them all. Some are nearly impossible.

Two more teams on the self-directed journey

Our workplace is taking on more change daily. So are the skills that all our employees must have in order to change with it. It is getting to be ______________ essential each day that ______________ skill gaps be filled ______________ our small business can ______________ a source of competitive ______________.

The changes all companies ______________ expect over the next ______________ -- a shrinking labor force, ______________ demand for workers in ______________ jobs, and increasingly competitive ______________ markets -- will require businesses ______________ all sizes to strengthen ______________ employee skills and training ______________.

We believe that our ______________ firm can remain competitive ______________ the large firm by ______________ a more flexible training ______________ education program. We hope ______________ be better than the ______________ firm in adapting an ______________ previous training experiences to ______________ company's needs.

Two more ______________ are now involved in self-directed ______________ team training. They are ______________ Green Team and the ______________ Team. They join the Orange Team, which completed their sessions last year.


Cloze Exercise Answer Key

Two more teams on the self-directed journey

Our workplace is taking on more change daily. So are the skills

that all our employees must have in order to change with it. It is

getting to be more essential each day that the

skill gaps be filled SO our small business can remain

a source of competitive strength .

The changes all companies can expect over the next decade -- a shrinking labor force, more demand for workers in technical jobs, and increasingly competitive world markets -- will require businesses of all sizes to strengthen their employee skills and training programs.

We believe that our small firm can remain competitive with

the large firm by having a more flexible training and

education program. We hope to be better than the large

firm in adapting an employee's previous training experiences to

the company's needs.

Two more teams are now involved in self-directed work

team training. They are the Green Team and the White

Team. They join the Orange Team, which completed their sessions

last year.
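The cloze passages in this appendix were constructed by hand, but the same kind of exercise can be generated mechanically by blanking every nth word of a passage. The sketch below is illustrative only: the every-7th-word interval and the blank marker are assumptions, not the procedure the report used.

```python
# Build a simple cloze exercise by blanking every nth word.
# The interval of 7 is an illustrative assumption; the exercises
# in this appendix were hand-edited by the program developers.

def make_cloze(text, interval=7, blank="________"):
    words = text.split()
    deleted = {}  # answer key: word position -> deleted word
    for i in range(interval - 1, len(words), interval):
        deleted[i] = words[i]
        words[i] = blank
    return " ".join(words), deleted

passage = ("Our workplace is taking on more change daily. "
           "So are the skills that all our employees must have "
           "in order to change with it.")
exercise, key = make_cloze(passage)
print(exercise)
print(key)
```

Scoring is then a matter of comparing each learner response against the answer key, as in the cloze test scores reported in Appendix H.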


Name or ID# Date

CLOZE Exercise

In a cloze exercise, you try to guess which words are missing. For example, in the sentence below, a word is missing.

She looked before she __________ the street.

A good guess for the missing word is "crossed."

She looked before she crossed the street.

In the writing below, try to guess and replace the missing words. Don't expect to get them all. Some are nearly impossible.

Cumberland Safety Rules

1. For your welfare, all injuries, no matter how slight, incurred on Company premises

must be reported immediately to your supervisor. The services of a physician __________ available and will be __________ as required. Failure to __________ such injuries may cause __________ difficulties and could affect __________ Workingmen's Compensation benefits.

2. Wear __________ and shoes suitable to __________ work. Open toed or __________ top shoes are not __________. Shorts are not permitted. __________ are required.

3. Dust your __________ only with an air __________ equipped with an air __________ nozzle.

4. Keep fire equipment __________ its proper place and __________ all fire rules.

5. Learn __________ lift properly. Keep your __________ straight and use your __________ to avoid strain.

6. All __________ guards should be kept __________ place. Unsafe machine guards __________ be reported to your __________ at once. No __________ should be removed without the __________ of your supervisor.

7. Do __________ repair machinery when it __________ in operation. Stop it and fix the switch so that it cannot be accidentally turned on.


Cloze Exercise Answer Key

Cumberland Safety Rules

1. For your welfare, all injuries, no matter how slight, incurred on Company premises

must be reported immediately to your supervisor. The services of a physician

are available and will be obtained as required. Failure to

report such injuries may cause medical difficulties and could affect

your Workingmen's Compensation benefits.

2. Wear clothing and shoes suitable to your work.

Open toed or canvas top shoes are not permitted. Shorts are not permitted.

Shirts are required.

3. Dust your clothes only with an air hose equipped with an air

restricting nozzle.

4. Keep fire equipment in its proper place and obey all fire

rules.

5. Learn to lift properly. Keep your back straight and use

your legs to avoid strain.

6. All machine guards should be kept in place. Unsafe

machine guards should be reported to your supervisor at once. No

guard should be removed without the permission of your supervisor.

7. Do not repair machinery when it is in operation.

Stop it and fix the switch so that it cannot be accidentally turned on.


APPENDIX D

Family Literacy
Focus Group Interview


Family Literacy Focus Group Interview

This interview form is designed to be used with a group of learners as the basis for a discussion about family literacy. It has been found that the comments of one member of the group will stimulate the thoughts of others, producing a wider range of ideas than will individual interviews.


Family Literacy Focus Group Interview

1. Why do you think some children learn to read and write well in school and others don't?

2. Do you think there is anything parents might do to help their children learn to read and write better?

3. Do you keep any reading or writing materials at home? (i.e., letter blocks, flashcards, paper, pens, chalkboard, books, magazines, comics, cassettes with books, encyclopedia, dictionary, newspapers, etc.)


4. Do you do any reading or writing activities with your children? (i.e., visit library, hear stories, read to them, watch educational television, look at magazines or books with children, point out words to them, play school, show them how to read or write, etc.)

5. At home, do your children see you doing any reading or writing? (i.e., books, magazines, papers, recipes, directions, letters, lists, notes, etc.)

6. What activities are you involved in at your child's school? (i.e., parent/teacher meetings, school fund-raisers, committees, notes or letters, informal talks when collecting child, assist in classroom, help child read at home, etc.)


7. Have you begun anything new related to reading and writing since you started classes here?

a. Materials

b. Activities

c. Modelling

d. School

APPENDIX E

Classroom
Observation Form


Classroom Observation Form

The classroom observation form serves as a guide for recording notes about the activities actually occurring in the classroom. It is divided into columns reflecting the time in five-minute intervals, the actual activities of both teacher and student, and comments about the overall class. The form suggests items the observer might wish to note.


Classroom Observation

Time    Student Activity    Teacher Activity    Comments
0
05
10
15
20
25
30
35
40
45
50
55

Make note of time spent by students actually reading or doing things. Also note time learners spend listening to the instructor. Time when learners are in small groups or working individually should also be mentioned. Special note should be made when the instructor or a student demonstrates how to do something.


APPENDIX F

ESL Checklist


ESL Checklist

The ESL checklist is designed for teachers to reflect upon the level of competence each student is demonstrating. Teachers will be able to note individual areas of strength and weakness. This form is helpful both in planning instruction and in suggesting areas for the student to practice on outside of the workplace.


ESL Benchmarks and Ratings*

Learner Name Teacher Name

Date of Rating

For each item, rate the learner:

3 = can do this as well or nearly as well as a native speaker
2 = can usually manage to do this, but sometimes has trouble
1 = can only sometimes manage to do this adequately
0 = cannot do this

Beginner Level

Briefly describes feelings about work

Briefly describes feelings about other life areas

Follows simple directions

Asks for clarification if something is not understood

Reads alphabet in English

Word recognition:

has access to dictionary/understands dictionary use

uses dictionary

uses roots, prefix, suffix

uses context

Looks up simple information (phone book, dictionary)

Reads simple signs

Begins short journal entries

* Modified from Bronstein, E. (1991). Benchmarks and student learning profile for the workplace ESL program of the Labor Education Center at Southeastern Massachusetts University.


Intermediate Level

Oral

Discusses feelings about work with some elaboration

Discusses feelings about other life areas with some elaboration

Gives/follows instructions at work

Gives/follows instructions in other life areas

Asks for clarification if something is not understood

Discusses industry-specific diseases/illnesses

Describes/reports dangerous conditions

Offers suggestions to supervisor

Reading

Uses dictionary (bilingual / English-English)

Locates own reading material in newspapers

Understands literal level of text

Infers information not explicitly stated

Draws conclusions from reading

Writing

Fills out more complex forms

job application social security insurance

other application forms (library card, courtesy card, credit card)

Writes short notes/memos (at work / out of work)

Writes journal entries (dialogue journal)

Uses correct punctuation


Advanced Level

Oral

Discusses feelings with more elaboration

Asks for clarification if something is not understood

Gives/follows more complex directions

Understands and can discuss basic worker rights

Understands and can discuss key contract sections

Reading

Uses index/table of contents

Locates own reading material in newspaper

Locates own reading material in encyclopedia or other reference

Reads short pieces in newspaper and simple magazines

Reads short pieces in flyers, notices, factsheets

Reads short self-selected material at home

Reads longer pieces in books or longer articles

Writing

Fills out more complex forms

job application social security insurance

other application forms (library card, courtesy card, credit card)

Writes short notes/memos (at work / out of work)

Observes differences in tone/register between formal and informal writing

Writes longer journal entries/responds to entries


APPENDIX G

Supervisor Rating Scales
Examples and Instructions

Developing Supervisor Ratings

It is best to develop ratings of employee job performance together with supervisors and possibly key employees.

1. First ask supervisors to describe how top performers use information on the job.

Encourage them to think of specific workers who are top performers. A supervisor might say, for example, that a top performer reads charts and responds with his own analysis, or sets machines correctly and checks settings thoroughly, or completes all job-related paperwork and tries to improve procedures. Continue to probe until you feel reasonably satisfied you have a complete list. From this list, you can identify important areas (i.e., communication, problem solving, paperwork, etc.).

Next ask supervisors to go through a three-step process in fleshing out these areas. The order of these steps is important.

2. Ask supervisors to:
a. describe the behavior of the top performers first;
b. then, describe the behavior of the bottom performers;
c. last, describe the average performer.

These behaviors will be used to provide descriptions and anchors for ratings. In relation to paperwork, for example, supervisors might agree on the following descriptions:

Top: completes all job-related paperwork and tries to improve procedures;

Bottom: intimidated by job-related paperwork and does it poorly;

Average: does job-related paperwork but simply keeps pace.


As supervisors develop these descriptions, new areas and categories may emerge. The supervisors may give examples related to problem-solving or to machine setting, or some other area. These may later become additional rating scales.

3. Once the descriptions of top, bottom, and average performances are completed, work with supervisors to develop acceptable labels for the categories.

For example, labels might include items like machine setting, paperwork, communication, responsibility, and problem-solving.

4. After this discussion, you will draft a rating scale and submit it to the supervisors for comment and possible revision. Sometimes during revision, complex scales split to become two separate scales.

Examples of scales appear on the following pages.
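The anchored descriptions produced in steps 1-4 amount to a small data structure: a labelled scale whose bottom, average, and top anchors describe observable behavior on a 1-10 rating range. The sketch below is illustrative only; the class and method names are not from the report, though the paperwork anchors are taken from the example above.

```python
# A behaviorally anchored rating scale as a small data structure.
# The class and its anchor_for method are an illustrative sketch,
# not the report's procedure; the anchor texts come from the
# paperwork example developed with supervisors.

from dataclasses import dataclass

@dataclass
class RatingScale:
    label: str
    bottom: str   # describes ratings of roughly 1-2
    average: str  # describes a rating of roughly 5
    top: str      # describes ratings of roughly 8-10

    def anchor_for(self, rating: int) -> str:
        # Map a 1-10 rating onto the nearest anchor description.
        if not 1 <= rating <= 10:
            raise ValueError("rating must be 1-10")
        if rating <= 2:
            return self.bottom
        if rating >= 8:
            return self.top
        return self.average

paperwork = RatingScale(
    label="Paperwork",
    bottom="intimidated by job-related paperwork and does it poorly",
    average="does job-related paperwork but simply keeps pace",
    top="completes all job-related paperwork and tries to improve procedures",
)
print(paperwork.anchor_for(9))
```

Splitting a complex scale during revision then simply means creating two such objects with narrower labels and their own anchors.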


Employee Assessment - Overall Rating

Please rate each employee on a scale of 1 - 10 for each aspect below.

An average employee would be rated 5.
A top employee would be rated 8 or higher.
A bottom employee would be rated 2 or lower.

EMPLOYEE ________ RATER ________ DATE ________

COMMUNICATION

Bottom: won't speak; can't express self; nervous; won't shake hands
Average: open, relaxed communicator; good listener and responder
Top: processes information and responds with own analysis

1 2 3 4 5 6 7 8 9 10

CONCERNS, PROBLEM-SOLVING

Bottom: doesn't consider alternative solutions; makes irrelevant suggestions; never thinks of consequences
Average: can suggest solutions, but not work through them in detail
Top: suggests solutions and analyses consequences, including a timeline

1 2 3 4 5 6 7 8 9 10


HANDLING CONFLICT

Bottom: antagonistic; turns back on others; makes abrupt denials and impolite comments
Average: cooperates with others most of the time, but some antagonism
Top: empathetic; cooperative; consistent attitude

1 2 3 4 5 6 7 8 9 10

SELF-ESTEEM

Bottom: shy; uncertain; overwhelmed by life's problems
Average: some confidence in self, but life not really under control
Top: confident; usually in control of life and of most situations

1 2 3 4 5 6 7 8 9 10

SETTING GOALS

Bottom: unable to plan ahead and set goals
Average: some short-term planning and goal setting
Top: clear plans for future; definite, reachable goals

1 2 3 4 5 6 7 8 9 10


COMMITMENT

Bottom: lacks motivation; no interest in company goals
Average: some commitment, but just doing a competent job
Top: conscientious; committed to company goals

1 2 3 4 5 6 7 8 9 10

RESPONSIBILITY

Bottom: has to be told what to do and checked on
Average: can be left to carry out routine work
Top: dependable; takes responsibility for own work

1 2 3 4 5 6 7 8 9 10

INITIATIVE

Bottom: ignores machine errors and lets them build up
Average: realizes machine errors and attempts immediate solution only
Top: monitors machine errors and deals with them through the team

1 2 3 4 5 6 7 8 9 10

PAPERWORK

Bottom: intimidated by job-related paperwork and does it poorly
Average: does job-related paperwork, simply keeping pace
Top: completes all job-related paperwork and tries to improve procedures

1 2 3 4 5 6 7 8 9 10

MACHINE SETTING

Bottom: unable to set machines correctly
Average: usually sets machines correctly, but doesn't always check settings
Top: sets machines correctly and checks settings thoroughly

1 2 3 4 5 6 7 8 9 10


APPENDIX H

Tabular Data


Glossary of Variables

In the tables that follow, the variables are given brief descriptions which may not always be entirely clear, so this glossary provides a fuller explanation for those variables that require it.

BELIEFS AND PLANS
Literacy self-rating: Learner self-rating of literacy level (on scale 1 - 5)
Change in literacy self-image: Holistic judgement of learner's change in literacy self-image (on scale -1, 0, +1)
Change in plans for 1, 5, 10 years: Holistic judgement of learner's change in plans for 1, 5, 10 years (on scale -1, 0, +1)
Change in plans for education: Holistic judgement of learner's change in plans for reading and education (on scale -1, 0, +1)

PRACTICES
Reading/writing away from work: Count of types of reading/writing away from work in last week
Reading/writing at work: Count of types of reading/writing at work in last week
Items read (in 20 item list): Count of items from given list read in last month
Frequency of reading activities: Sum of 6 frequency ratings of literacy activities (each on scale 1 = never to 5 = every day)
Ownership of reading materials: Sum of book ownership and magazine subscription (on scale 1 = 1-5 to 8 = 50+)
Self-rating on talking in meetings: Learner self-rating on their participation in meetings (on scale 1 - 5)
Self-rating on ideas discussed: Learner self-rating on how much their ideas are discussed in meetings (on scale 1 - 5)
Self-rating on asking for help: Learner self-rating on how much they ask for help at work (on scale 1 - 5)

PROCESSES
Total process responses: Count of all responses to process question
Focus responses: Count of responses to process question involving points of focus (e.g. title, bold print)
Strategy responses: Count of responses to process question involving reading strategies (e.g. skim, read through)
Topic responses: Count of responses to process question involving topics of interest (e.g. products, wages)

Delco: Technical Preparation Class (n = 14)

BELIEFS & PLANS                     Pre-test      Post-test     Change        Significance
                                    mean/s.d.*    mean/s.d.     mean/s.d.
Literacy self-rating                3.357/1.082   3.929/.829     .571/.938    p<.05
Change in literacy self-image           --            --         0.5/.65      p<.01
Change in plans for 1 year              --            --        0.429/.646    p<.05
Change in plans for 5 years             --            --        0.357/.633    p<.05
Change in plans for 10 years            --            --         0.0/.679     n.s.
Change in plans for education           --            --         .143/.949    n.s.

(The "Change in" rows are holistic judgements of change; there are no pre- and post-test scores.)

PRACTICES
Reading/writing away from work      4.786/2.082   6.571/2.709   1.786/2.86    p<.05
Reading/writing at work             2.846/1.725   2.615/2.293   -.231/3.516   n.s.
Items read (in 20 item list)        17.308/4.644  18.846/2.734  1.538/3.799   n.s.
Frequency of reading activities     16.308/3.093  16.923/3.201   .615/1.502   n.s.
Ownership of reading materials      5.154/2.035   5.154/1.951    0.0/1.354    n.s.
Self-rating on talking in meetings  2.769/1.301   3.231/1.235    .462/.776    p<.05
Self-rating on ideas discussed      2.385/1.325   3.231/1.092    .846/1.463   p<.05
Self-rating on asking for help      1.615/.768    1.615/.768     0.0/1.08     n.s.

* Values are given as mean/standard deviation.

Delco: Technical Preparation Class (cont.)

PROCESSES                           Pre-test      Post-test     Change        Significance
                                    mean/s.d.*    mean/s.d.     mean/s.d.
Total process responses             4.214/1.424   5.286/1.858   1.071/2.526   n.s.
Focus responses                     1.786/1.188   1.786/1.251    0.0/.961     n.s.
Strategy responses                  1.143/.864    1.357/.842     .214/1.051   n.s.
Topic responses                     1.286/1.437   2.143/1.748    .857/2.143   n.s.
Article question (easy factual)      .929/.267    1.0/0.0        .071/.267    n.s.
Article question (harder factual)   1.5/.519      1.714/.469     .214/.426    p<.05
Graph question (easy factual)       4.857/.363    4.857/.363     0.0/.392     n.s.
Graph question (easy factual)       4.571/.756    4.643/.633     .071/.917    n.s.
Graph question (harder factual)     4.071/1.207   3.714/1.139   -.357/1.008   n.s.
Graph question (inference)          2.071/1.072   2.786/.893     .714/.994    p<.01
Job aid question (easy factual)      .929/.267    1.0/0.0        .071/.267    n.s.
Job aid question (harder factual)   2.429/.646    2.643/.497     .214/.893    n.s.
Job aid question (inference)        2.643/.745    2.857/.535     .214/.975    n.s.
Cloze test score                    10.857/2.685  12.429/3.131  1.571/2.377   p<.05

* Values are given as mean/standard deviation.
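The significance levels in these tables can be checked from the summary statistics alone: for paired pre/post data, t = (mean change) / (s.d. of change / sqrt(n)). The sketch below is a minimal stdlib-only check that assumes a two-tailed paired t-test (the report does not state the exact test used), applied to the literacy self-rating row of the Delco Technical Preparation Class table.

```python
import math

# Paired (dependent-samples) t statistic from summary data:
# t = mean_change / (sd_change / sqrt(n)).
# Values below are the literacy self-rating row of the Delco
# Technical Preparation Class table (n = 14, change .571/.938).

def paired_t(mean_change, sd_change, n):
    return mean_change / (sd_change / math.sqrt(n))

t = paired_t(0.571, 0.938, 14)
# The two-tailed critical value for alpha = .05 at df = 13 is 2.160,
# so |t| > 2.160 is consistent with the table's "p<.05".
print(round(t, 2), abs(t) > 2.160)
```

The same check can be applied to any row that reports change mean, change s.d., and n.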

Delco: Technical Preparation Control (n = 12)

BELIEFS & PLANS                     Pre-test      Post-test     Change        Significance
                                    mean/s.d.*    mean/s.d.     mean/s.d.
Literacy self-rating                3.583/.669    3.5/.674      -.083/.515    n.s.
Change in literacy self-image           --            --         .083/.515    n.s.
Change in plans for 1 year              --            --         .167/.718    n.s.
Change in plans for 5 years             --            --         .333/.492    n.s.
Change in plans for 10 years            --            --         .083/.515    n.s.
Change in plans for education           --            --         .25/.866     n.s.

(The "Change in" rows are holistic judgements of change; there are no pre- and post-test scores.)

PRACTICES
Reading/writing away from work      4.75/1.42     4.583/1.564   -.167/2.29    n.s.
Reading/writing at work             3.0/.853      3.083/1.832    .083/1.505   n.s.
Items read (in 20 item list)        18.75/2.34    18.583/2.644  -.167/3.81    n.s.
Frequency of reading activities     18.167/2.082  16.583/1.676  -1.583/2.644  n.s.
Ownership of reading materials      4.917/2.019   4.833/2.038   -.083/1.165   n.s.
Self-rating on talking in meetings  3.333/1.614   3.083/1.311   -.25/2.094    n.s.
Self-rating on ideas discussed      3.5/1.314     3.25/1.055    -.25/1.138    n.s.
Self-rating on asking for help      1.667/1.231   2.25/1.485     .583/1.379   n.s.

* Values are given as mean/standard deviation.

Delco: Technical Preparation Control (cont.)

PROCESSES                           Pre-test      Post-test     Change        Significance
                                    mean/s.d.*    mean/s.d.     mean/s.d.
Total process responses             5.75/1.765    5.25/1.545    -.5/2.393     n.s.
Focus responses                     1.917/.9      2.083/.9       .167/.937    n.s.
Strategy responses                  1.583/1.443   1.0/.603      -.583/1.676   n.s.
Topic responses                     2.25/1.712    2.167/1.801   -.083/1.676   n.s.
Article question (easy factual)     1.0/0.0       1.0/0.0        0.0/0.0      n.s.
Article question (harder factual)   1.417/.515    1.583/.515     .167/.718    n.s.
Graph question (easy factual)       4.583/.515    4.333/.651    -.25/.754     n.s.
Graph question (easy factual)       3.917/1.311   3.917/.793     0.0/1.044    n.s.
Graph question (harder factual)     2.75/1.865    3.833/.835    1.083/1.782   p<.05
Graph question (inference)          2.75/1.138    2.583/.669    -.167/1.267   n.s.
Job aid question (easy factual)     1.0/0.0       1.0/0.0        0.0/0.0      n.s.
Job aid question (harder factual)   2.917/.289    2.833/.577    -.083/.289    n.s.
Job aid question (inference)        2.583/.793    2.667/.778     .083/.289    n.s.
Cloze test score                    9.583/2.843   10.417/2.968   .833/1.801   n.s.

* Values are given as mean/standard deviation.

Delco: GED Class (n = 15)

BELIEFS & PLANS                     Pre-test      Post-test     Change        Significance
                                    mean/s.d.*    mean/s.d.     mean/s.d.
Literacy self-rating                2.9/.568      2.8/.632      -.1/.568      n.s.
Change in literacy self-image           --            --        0.133/.64     n.s.
Change in plans for 1 year              --            --         0.0/.845     n.s.
Change in plans for 5 years             --            --         0.2/.561     n.s.
Change in plans for 10 years            --            --        0.067/.704    n.s.
Change in plans for education           --            --        0.067/.704    n.s.

(The "Change in" rows are holistic judgements of change; there are no pre- and post-test scores.)

PRACTICES
Reading/writing away from work      3.867/1.407   4.2/1.424      .333/1.447   n.s.
Reading/writing at work             2.133/1.598   2.267/1.71     .133/1.642   n.s.
Items read (in 20 item list)        18.867/1.807  19.6/1.056     .733/1.1     p<.05
Frequency of reading activities     17.0/2.746    17.357/2.56    .357/2.62    n.s.
Ownership of reading materials      5.463/2.634   5.692/2.136    .231/1.691   n.s.
Self-rating on talking in meetings  3.0/1.464     3.267/1.58     .267/1.58    n.s.
Self-rating on ideas discussed      3.067/1.28    3.4/1.242      .333/.9      n.s.
Self-rating on asking for help      1.533/1.125   1.667/1.113    .133/1.598   n.s.

* Values are given as mean/standard deviation.

Delco: GED Class (cont.)

PROCESSES                           Pre-test      Post-test     Change        Significance
                                    mean/s.d.*    mean/s.d.     mean/s.d.
Total process responses             3.333/1.175   5.133/2.1     1.8/1.821     p<.001
Focus responses                     1.533/.915    1.6/.737       .067/.594    n.s.
Strategy responses                  1.667/1.047   1.133/.743    -.533/1.125   p<.05
Topic responses                      .133/.352    2.4/2.261     2.267/2.187   p<.001
Article question (easy factual)      .933/.258     .867/.352    -.067/.258    n.s.
Article question (harder factual)   1.467/.516    1.267/.594    -.2/.676      n.s.
Graph question (easy factual)       3.8/1.082     3.933/.594     .133/1.187   n.s.
Graph question (easy factual)       3.4/1.183     3.467/.64      .067/1.223   n.s.
Graph question (harder factual)     3.133/1.407   2.667/1.496   -.467/1.356   n.s.
Graph question (inference)          1.6/1.454     2.067/1.1      .467/1.598   n.s.
Job aid question (easy factual)      .533/.516     .467/.516    -.067/.594    n.s.
Job aid question (harder factual)   2.4/.828      2.867/.352     .467/.915    p<.05
Job aid question (inference)        2.667/.724    2.4/1.121     -.267/1.438   n.s.
Cloze test score                    7.467/1.642   7.933/2.187    .467/2.232   n.s.

* Values are given as mean/standard deviation.

Delco: ESL Class (n = 15)

BELIEFS & PLANS                     Pre-test      Post-test     Change        Significance
                                    mean/s.d.*    mean/s.d.     mean/s.d.
Literacy self-rating                3.091/.302    3.455/.522     .364/.505    n.s.
Change in literacy self-image           --            --        0.067/.799    n.s.
Change in plans for 1 year              --            --         0.2/.775     n.s.
Change in plans for 5 years             --            --         0.2/.676     n.s.
Change in plans for 10 years            --            --        0.067/.704    n.s.
Change in plans for education           --            --        0.533/.64     p<.005

(The "Change in" rows are holistic judgements of change; there are no pre- and post-test scores.)

PRACTICES
Reading/writing away from work      4.8/1.897     4.6/1.805     -.2/2.077     n.s.
Reading/writing at work             1.867/1.506   2.6/1.682      .733/1.624   n.s.
Items read (in 20 item list)        16.286/3.539  19.143/1.657  2.857/3.009   p<.005
Frequency of reading activities     15.6/4.205    16.333/4.624   .733/1.907   n.s.
Ownership of reading materials      4.462/2.295   4.923/2.326    .462/1.506   n.s.
Self-rating on talking in meetings  2.333/1.175   2.6/1.242      .267/.799    n.s.
Self-rating on ideas discussed      2.5/1.401     2.571/1.016    .071/1.141   n.s.
Self-rating on asking for help      1.267/.594    1.933/1.033    .667/1.234   p<.05

* Values are given as mean/standard deviation.

Delco: ESL Class (cont.)

PROCESSES                           Pre-test      Post-test     Change        Significance
                                    mean/s.d.*    mean/s.d.     mean/s.d.
Total process responses             3.533/.99     5.0/1.414     1.467/1.356   p<.0005
Focus responses                     1.733/.884    1.667/.724    -.067/.961    n.s.
Strategy responses                  1.533/1.125   1.667/1.047    .133/1.685   n.s.
Topic responses                      .267/.594    1.667/1.543   1.4/1.765     p<.005
Article question (easy factual)      .867/.352     .867/.352     0.0/.378     n.s.
Article question (harder factual)    .8/.676      1.333/.617     .533/.99     p<.05
Graph question (easy factual)       3.733/1.1     3.733/1.1      0.0/1.363    n.s.
Graph question (easy factual)       3.0/1.195     3.467/1.125    .467/1.642   n.s.
Graph question (harder factual)     2.4/1.844     3.4/1.183     1.0/1.813     p<.05
Graph question (inference)          1.733/1.335   2.333/1.345    .6/1.454     n.s.
Job aid question (easy factual)      .667/.488     .733/.458     .067/.458    n.s.
Job aid question (harder factual)   2.2/1.014     2.533/.9       .333         n.s.
Job aid question (inference)        1.6/1.352     2.533/1.06     .933/1.163   p<.005
Cloze test score                    6.467/2.615   7.0/3.229      .533/2.642   n.s.

* Values are given as mean/standard deviation.

Cumberland: Communications and Collaboration Class (n = 21)

PRACTICES                           Pre-test      Post-test     Change        Significance
                                    mean/s.d.*    mean/s.d.     mean/s.d.
Self-rating on talking in meetings  3.2/1.424     2.933/1.335   -.267/1.033   n.s.
Self-rating on ideas discussed      3.0/1.464     3.333/1.345    .333/1.397   n.s.
Self-rating on asking for help      1.214/.579    1.643/1.008    .429/.646    p<.05

SUPERVISOR RATINGS
Communication                       2.762/.944    5.048/1.024   2.286/.644    p<.0001
Concerns, problem-solving           2.667/.796    4.857/1.014   2.19/.928     p<.0001
Handling conflict                   3.0/1.049     5.143/1.315   2.143/1.236   p<.0001
Self-esteem                         2.905/.944    5.333/1.39    2.429/1.076   p<.0001
Setting goals                       2.857/.727    4.857/1.315   2.0/1.225     p<.0001
Commitment                          3.143/1.276   4.857/1.315   1.714/1.007   p<.0001
Responsibility                      3.19/1.169    5.19/1.327    2.0/1.095     p<.0001
Initiative                          3.143/1.195   4.714/1.347   1.571/.811    p<.0001
Paperwork                           2.429/.87     4.714/1.707   2.286/1.309   p<.0001
Machine setting                     3.238/1.179   5.095/1.546   1.857/1.276   p<.0001

* Values are given as mean/standard deviation.