
Using Assessment to Improve Early Elementary Students’ Knowledge and Skills for Comprehending Informational Text


This article was downloaded by: [University of Ulster Library]
On: 03 December 2014, At: 02:49
Publisher: Routledge
Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

Journal of Applied School Psychology
Publication details, including instructions for authors and subscription information:
http://www.tandfonline.com/loi/wapp20

Using Assessment to Improve Early Elementary Students’ Knowledge and Skills for Comprehending Informational Text
Sara E. Witmer (a), Nell K. Duke (b), Alison K. Billman (c) & Joseph Betts (d)

(a) Michigan State University, East Lansing, Michigan, USA
(b) University of Michigan, Ann Arbor, Michigan, USA
(c) University of California–Berkeley, Berkeley, California, USA
(d) Center for Cultural Diversity and Minority Education, Madison, Wisconsin, USA
Published online: 01 Aug 2014.

To cite this article: Sara E. Witmer, Nell K. Duke, Alison K. Billman & Joseph Betts (2014) Using Assessment to Improve Early Elementary Students’ Knowledge and Skills for Comprehending Informational Text, Journal of Applied School Psychology, 30:3, 223-253, DOI: 10.1080/15377903.2014.924454

To link to this article: http://dx.doi.org/10.1080/15377903.2014.924454

PLEASE SCROLL DOWN FOR ARTICLE

Taylor & Francis makes every effort to ensure the accuracy of all the information (the “Content”) contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content.

This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions


Journal of Applied School Psychology, 30:223–253, 2014
Copyright © Taylor & Francis Group, LLC
ISSN: 1537-7903 print / 1537-7911 online
DOI: 10.1080/15377903.2014.924454

Using Assessment to Improve Early Elementary Students’ Knowledge and Skills for Comprehending Informational Text

SARA E. WITMER
Michigan State University, East Lansing, Michigan, USA

NELL K. DUKE
University of Michigan, Ann Arbor, Michigan, USA

ALISON K. BILLMAN
University of California–Berkeley, Berkeley, California, USA

JOSEPH BETTS
Center for Cultural Diversity and Minority Education, Madison, Wisconsin, USA

Although assessment of student progress in word reading skills is common, students’ knowledge and skills for comprehending informational text are rarely assessed. Despite research indicating the need for instruction in this area and a growing national understanding of its importance that is reflected in the Common Core State Standards, few formative assessment tools are readily available and used to assess informational text comprehension. In this study, teachers who were randomly assigned to an experimental group were provided with ongoing professional development on how to administer and interpret the newly developed Concepts of Comprehension Assessment in their classrooms. The assessment was designed to help assess and consequently encourage instructional attention to students’ skills for comprehending informational text. Those students in the experimental group showed greater growth on the Concepts of Comprehension Assessment and on a writing task.

KEYWORDS informational text, comprehension, assessment, elementary

Address correspondence to Sara E. Witmer, Michigan State University, 620 Farm Lane, Room 434, East Lansing, MI 48823, USA. E-mail: [email protected]

From elementary school to late adulthood, individuals are expected to access information from print. Social studies and science textbooks, magazines,
brochures, maps, encyclopedias, informational websites, and newspapers are just a few examples of materials containing informational text that individuals may be expected to read and comprehend across the life span. As early as fourth grade, students are expected to read and learn from a vast amount of expository text (Moss, 1997; Myers & Savage, 2005). Therefore, skills and strategies for comprehending informational text are clearly needed for students to be effective learners and consumers of information both during and after formal schooling. Unfortunately, research has suggested that U.S. students’ overall reading comprehension skills are not where they should be (U.S. Department of Education, 2012) and, more specifically, that U.S. students are particularly weak in the area of reading informational texts (Mullis, Martin, Foy, & Drucker, 2012). In the sections that follow, the need for early instruction in skills for comprehending informational text is described, along with a rationale for using assessment tools developed to measure these specific skills as a way to promote instructional improvements and student learning in this area.

The Importance of Skills for Comprehending Informational Text

The need to improve informational text comprehension is now gaining widespread attention, as evidenced by the Common Core State Standards, which refer to “extensive research establishing the need for college and career ready students to be proficient in reading complex informational text independently in a variety of content areas” (Common Core State Standards Initiative, 2012, p. 4). Improving students’ comprehension of informational text cannot occur solely through improving their general reading ability or through experience with narrative reading. In a review of studies, Duke and Roberts (2010) revealed 22 areas in which informational comprehension proved to be different from other kinds of reading. Their review summarized work that involved a variety of research methods (e.g., comparisons of student performance across genre, think-aloud studies examining the processes students use to comprehend across genre, identification of predictors of reading comprehension achievement), all of which point to differences in the knowledge and strategies needed to comprehend reading material of different genres, including informational text.

The Common Core State Standards begin early with an emphasis on informational text, calling for 50% of kindergarten through fifth-grade children’s reading to be focused on informational genres (National Governors Association Center for Best Practices, 2010, p. 5). Specific standards for the comprehension of informational text for second grade, for example, include the use of text features to navigate informational text (e.g., table of contents, index, bolded print, headings), the use of illustrations (e.g., diagrams) to understand key ideas in an informational text, and ascertaining the meaning
of words and phrases in informational text (National Governors Association Center for Best Practices, 2010, p. 13).

This attention to informational text in the elementary grades is supported by decades of research. Studies have shown that even kindergarten through second-grade students can learn from and about informational text if provided the opportunity (e.g., Baker et al., 2013; Duke & Kays, 1998; Moss, 1997; Pappas, 1993; Reutzel, Smith, & Fawson, 2005; Stahl, 2004; Williams, 2005). A federally commissioned panel on improving reading comprehension in kindergarten through third grade called for educators to teach reading comprehension of informational as well as literary text and to teach specific text structures of informational text (Shanahan et al., 2010). Knowledge of how to use features of informational text (e.g., headings, table of contents, index) supports effective navigation of the text, which in turn supports learning (Kelly & Clausen-Grace, 2010; Symons, MacLatchy-Gaudet, Stone, & Reynolds, 2001). Readers also need specific knowledge and skills to navigate the vocabulary demands of informational texts. Given evidence that instruction in vocabulary and vocabulary strategies can improve comprehension, experts recommend teaching high-utility words that are commonly used across informational texts (e.g., investigate, examine), as well as strategies for determining meanings of unknown specialized vocabulary (e.g., teaching students that bolded words often signal that a word is considered new and defined either in the section or in a glossary; Baumann, Edwards, Boland, Olejnik, & Kame’enui, 2003; Fukkink & de Glopper, 1998; Kuhn & Stahl, 1998). Unique graphical devices such as diagrams often accompany informational text (e.g., photosynthesis cycle, animal life cycles), and readers must know how to read these diagrams in conjunction with printed text in order to better understand the concepts being described and explained (Peacock, 2001; Peacock & Weedon, 2002). Altogether, instruction may need to explicitly target these specific skills in order to prepare students to effectively comprehend informational text. Given that students need these strategies to comprehend informational texts that they will encounter in later elementary school, it is essential that instruction in related strategies occurs early in their elementary school experience.

Lack of Early Comprehension Instruction

Despite the growing foundation of research on the skills and knowledge students need for comprehending informational text, research indicates that comprehension instruction in general, and for informational text specifically, is often neglected in early schooling (Duke, 2000; Jeong, Gaffney, & Choi, 2010; Pressley, 2002; Wright, 2012). For example, one study in third-grade classrooms found, on average, less than 1 min per day spent on teacher-managed reading comprehension instruction activities and only 5–6 min per
day of child-managed reading comprehension activities (Connor, Morrison, & Petrella, 2004). Another study found that first graders were provided with an average of only 3.6 min per day of experience with informational text, with students in low socioeconomic status school settings provided with even less (1.4 min; Duke, 2000).

This neglect of comprehension instruction may be rooted in the belief that children need to develop decoding, word recognition, and fluency skills before instruction can focus on the development of comprehension strategies. Although basic word reading strategies should certainly be a focus of early reading instruction, it is important to note that the most effective teachers of young children attend to comprehension as well as word reading (e.g., Taylor, Pearson, Clark, & Walpole, 2000). Furthermore, more effective approaches to primary grade instruction involve attention to comprehension as well as word reading (e.g., Vellutino & Scanlon, 2002), and comprehension instruction can have a positive effect even in young children (for reviews, see Pearson & Duke, 2002; Roberts & Duke, 2010; Shanahan et al., 2010; Stahl, 2004). The federally commissioned panel referenced earlier reached a consensus that “. . . the teaching of reading comprehension should begin in kindergarten . . .” before children have learned to decode (Shanahan et al., 2010, p. 5). Just as the development of reading fluency requires the development of fundamental skills and strategies related to phonemic awareness and the alphabetic principle, the development of informational text reading comprehension may be facilitated through the development of vocabulary and world knowledge and of certain fundamental skills, such as using comprehension strategies, understanding text structure, using strategies for identifying the meaning of unknown words, and using strategies for effective navigation of informational text (e.g., use of the table of contents, index, and diagrams).

Using Assessment to Inform Instruction

The common adage “what is measured is what is treasured” has often been used to describe the influence of testing on instruction in the era of accountability testing. Now that greater attention is focused on the development of informational text comprehension skills in the Common Core State Standards, we can expect that associated skills will be important for students to have in order to earn proficient scores on accountability assessments. To date, however, relatively little attention has been paid to the development of formative assessment tools to inform instruction and monitor student progress in the area of informational text comprehension, especially for the early grades. This may help explain the lack of attention to this important area during early literacy instruction. Although research has pointed to benefits associated with teaching educators how to administer brief assessment
tools and use the resulting information to inform instruction (Stecker, Fuchs, & Fuchs, 2005), the majority of such assessment tools for early literacy development tend to focus on word reading strategies (i.e., phonemic awareness, alphabetic principle) rather than strategies fundamental to comprehension.¹

Although some measures for monitoring the development of reading comprehension skills have been developed, these measures ultimately require that students have prerequisite decoding skills before they can demonstrate their competency in the realm of comprehension. For example, comprehension assessment tools such as maze and retell require students to decode a text before displaying any level of comprehension skill. These tools therefore cannot track fundamental skills and knowledge for comprehending text without being confounded with students’ decoding ability. An approach to assessing fundamental skills and knowledge for comprehending text that does not require a basic level of decoding skill would admittedly resemble a listening comprehension task rather than true reading comprehension. Nevertheless, such an approach to early comprehension assessment could ensure that comprehension alone can be feasibly measured and monitored in young children, and it increases the likelihood that reading instruction will address fundamental strategies for comprehension at the same time that students are developing decoding skills. As indicated by Perfetti, Landi, and Oakhill (2005), reading comprehension and broader language comprehension are related within development, with word identification being the main factor that contributes to differences in these skills across students. By removing word identification skill from the measurement of early comprehension skills, one might ensure that fundamental comprehension-focused skills are being measured and subsequently addressed through instruction.

In addition to requiring decoding skills to demonstrate comprehension, existing reading comprehension assessment tools do not reflect the growing knowledge base regarding the specific knowledge and skills students need to effectively and efficiently comprehend different types of text. In a recent article, van den Broek and Espin (2012) described the Integrated Model of Reading Comprehension (IMREC), which emphasizes the importance of considering both process and product in reading comprehension development. They suggested that the IMREC should be used to develop tools that assess both the product and processes of reading comprehension and that such assessment tools should provide information related to strengths and weaknesses in component skills and processes. They argued that such assessment information could be used to better target intervention to promote reading comprehension among students. Development and use of such tools may
increase teachers’ productive instructional attention to this important area and ultimately serve to improve students’ use of knowledge and skills for comprehending text.

¹ See the chart of progress monitoring tools evaluated through a collaborative effort between the Center on Response to Intervention and the National Center on Intensive Intervention at http://www.rti4success.org/progressMonitoringTools.

In the past few years, two tools have been developed to examine early elementary students’ development of strategies for comprehending informational text specifically: the Concepts of Comprehension Assessment (COCA; Billman et al., 2008) and the Informational Strategic Cloze Assessment (ISCA; Hilden et al., 2008). The former tool is intended for administration to first and second graders; the latter is intended for administration to first through third graders. The focus of this study was on early elementary comprehension of informational text, and so use of the COCA, rather than the ISCA, was examined.

Purpose of This Study

In the present study, first- and second-grade elementary teachers were provided with professional development on administration and use of the COCA to potentially facilitate their early elementary students’ development of fundamental knowledge and skills for comprehending informational text. More specifically, the following research question formed the foundation for this study: Does having teachers administer an assessment of students’ knowledge and skills for comprehending informational text affect their students’ growth in this area? It was hypothesized that students in classrooms in which teachers were provided with professional development in the administration and use of data from the COCA would show greater growth in related knowledge and skills. To examine whether administration of this assessment contributed to growth in student achievement beyond the specific types of questions included on the COCA, we also examined growth using a transfer measure (i.e., an informational text writing prompt). Last, we examined variation in teachers’ use of the COCA via their reported and observed instruction in COCA-related concepts and the extent to which that variation corresponded to differences in student gains on the COCA over the school year.

METHOD

Design

An experimental design at the teacher level (with a quasi-experimental design at the student level) was used to investigate a possible causal relationship between teacher administration of the assessment tool and student outcomes. Ten teachers were randomly assigned to condition (experimental vs. control) within matched pairs, with an 11th teacher randomly assigned to
the experimental group without a matched classroom (this teacher expressed interest in participating in the study a few days after the initial recruitment process and was allowed to participate even though there was no associated teacher match available). To assign one teacher from each matched pair to the experimental or control group, a coin was flipped; if the result was heads, the first teacher of the pair as listed in a file was assigned to the experimental group, with the remaining teacher in the pair assigned to the control group. To assign the 11th teacher to a group, a coin was tossed once more, and given that the coin landed on heads, this teacher was assigned to the experimental group. Teachers in the experimental group received professional development and administered the COCA at the beginning, middle, and end of the school year to a subset of six children in their classrooms (see the Participants section for a description of how students were selected for administration of the COCA). Members of the research team (i.e., graduate assistants) administered the COCA to 6 additional children in each experimental classroom and to 12 students in each control teacher’s classroom. The additional administrations by graduate assistants to students in both the experimental and control groups were included to (a) allow for comparison to a control group of students with teachers who had no knowledge of the COCA and (b) allow for detection of any differences that were due solely to examiner type (which would otherwise confound our interpretation of the reason for any detected differences between the experimental and control groups). In addition to an examination of growth on the COCA, an analysis was conducted of the experimental effect on a second measurement tool: an informational text writing prompt, administered at the beginning and end of the school year.
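
For readers who want a concrete picture of the assignment procedure, the following Python sketch reproduces its logic. This is an illustration only, not the authors’ code; the teacher identifiers and pairings are hypothetical placeholders.

```python
# Illustrative sketch of coin-flip assignment within matched pairs,
# plus one extra flip for the unmatched 11th teacher.
import random

pairs = [("teacher_1A", "teacher_1B"), ("teacher_2A", "teacher_2B")]  # matched pairs (placeholders)
unmatched = "teacher_11"  # the 11th teacher had no matched classroom

assignments = {}
for first, second in pairs:
    if random.random() < 0.5:  # "heads": first-listed teacher -> experimental
        assignments[first], assignments[second] = "experimental", "control"
    else:
        assignments[first], assignments[second] = "control", "experimental"

# A single additional flip assigns the unmatched teacher to a group.
assignments[unmatched] = "experimental" if random.random() < 0.5 else "control"
print(assignments)
```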

Participants

Four first-grade and seven second-grade teachers from five elementary schools in a Midwestern school district volunteered to participate. The district was in a locale classified as “suburb: large.” Of the children in the district, 22% qualified for free or reduced-price lunch. Ethnic/racial identification for the five participating elementary schools included 13.5% students of color. All 11 teachers were female and collectively represented a range of teaching experience (2 years to more than 25 years). Ten of the teachers were paired with another participating teacher on the following criteria: (a) they taught the same grade level, and (b) they received similar ratings of teaching quality based on information provided by the executive director of academic services, who was involved in prior observations and evaluations of each teacher.

Parental consent was sought for all students in each classroom; consent was obtained for 73.4% of students in the participating classrooms. Of these,
12–13 students per classroom were selected by the researchers to participate fully in COCA administration, representing 56% of all students in these classrooms (48 first-grade students; 85 second-grade students). The selection of students for full participation in the study was done using procedures intended to promote maximum similarity in participating students across control and experimental classrooms in terms of (a) reading level, (b) special services received (e.g., special education services, remedial reading support), and (c) mother’s education level (as a proxy for socioeconomic status). It was anticipated that if these characteristics were not similar across the experimental and control groups, differences in growth across conditions might be a result of different initial reading profiles and home support for learning, rather than the COCA professional development. Therefore, the following information was collected on each student for whom consent was obtained: (a) initial reading level as measured and reported by the teachers (see the following paragraph); (b) additional educational services received (i.e., reading specialist support, special education services); and (c) mother’s education level. Then, we selected 12–13 students from among those with parental consent such that a student with matching characteristics was selected for participation from the classroom of each teacher in a matched pair. For example, if there was a student (with parent consent) at reading level orange, with no additional services received, whose mother had an associate’s degree as her highest level of education, we would select a student from the classroom of the matched teacher who also had those characteristics. When we could identify more matched pairs of students than needed, we selected pairs that allowed for the widest range of reading levels to be included in the study. For example, if we had more matched pairs at a medium reading level than at a high reading level, when deciding among two possible remaining pairs, we selected those with a high reading level.

Teachers’ initial reports of student reading level were based on an assessment and leveling system used districtwide, which is described by Fountas and Pinnell (1999). As part of this process, each student is asked to read individually from one or more benchmark books that correspond to 17 different reading levels (kindergarten to fourth grade). The student is asked to read 100–150 words from each selected book. To be assigned to a particular reading level, the student must have read the associated benchmark book aloud with between 90% and 94% accuracy and have demonstrated an appropriate rate of reading. This categorization scheme has been found to align with the corresponding performances of students on the Qualitative Reading Inventory (Hoffman, Roser, Salas, Patterson, & Pennington, 2001; Leslie & Caldwell, 1990). The beginning-of-year scores of participating students represented a range of 13 different reading levels.

Participating student demographic information is provided in Table 1. As anticipated based on our selection criteria, student characteristics were similar across experimental and control conditions. Specifically, there were no significant differences between the experimental and control groups on the three matching criteria (initial reading level, additional services received, and maternal education), nor in age, racial/ethnic background, or gender composition. Of the students selected to participate, 93% were included in all three COCA administrations, and 81% completed both the pre- and postwriting assessment activities.

TABLE 1 Descriptive Statistics for Total Sample and for Each Group

Variable                                      Experimental (n = 73)   Control (n = 60)   Total sample (N = 133)
Female, n (%)                                 32 (44.4)               28 (33.3)          60 (45.1)
Age in years, M (SD)                          7.3 (0.7)               7.2 (0.6)          7.2 (0.7)
Students of color, n (%)                      11 (15.3)               9 (15.0)           20 (15.0)
Mother with bachelor’s degree
  or higher, n (%)                            29 (40.3)               28 (46.7)          57 (42.9)
Receiving special education
  services, n (%)                             3 (4.2)                 2 (3.3)            5 (3.8)
Reading below first-grade level
  in fall, n (%)                              19 (26.0)               18 (30.0)          37 (27.8)

Measures

COCA

The COCA (Billman et al., 2008) is an individually administered test designed to measure first- and second-grade students’ specific fundamental knowledge and skills for comprehending informational text and is intended to help inform instruction. The tool is not intended to compare students and therefore does not have a standardization sample. The four subscales of the COCA include items measuring students’ demonstration of unique skills and knowledge for comprehending informational text as discussed in the literature review (i.e., Vocabulary, Text Features, Graphics in the Context of Text, and Comprehension Strategies). Vocabulary items require students to demonstrate their knowledge of vocabulary words commonly used in informational text and use of strategies for understanding unfamiliar words. The vocabulary words represent those commonly found across informational texts that would be classified as Tier 2 words by Beck, McKeown, and Kucan (2002). Text Features items prompt the student to demonstrate knowledge of headings, labels, pronunciation guides, the table of contents, the index, and the glossary. Graphics in the Context of Text items require students to derive and use information from visual and graphical devices (e.g., diagrams, flow
charts, illustrations) as well as the written text. Last, Comprehension Strategies items require students to answer questions regarding their predictions, use of prior knowledge, generation of inferences, and summarization.

There are two COCA forms—Dragonflies and Salmon—which were developed to be equivalent, alternate forms. Both include an informational book chronicling the life cycle of a species and designed to include genre-specific features and visual and graphical devices common in informational text. Although the written text for each form is different, the overall format and instructions for the assessment are identical, and many items are identical or similar (e.g., both forms have an item asking students to predict what the book will be about, and both have an item asking students to identify and use the index). Both forms include several items to address each of the four constructs. Pilot testing and analysis that involved calculations of item difficulties and variation in item performance informed the inclusion and exclusion of items for each form. Using this information, scores on the two forms can be statistically equated.
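
To make the idea of statistical equating concrete, here is a minimal Python sketch of linear (mean-sigma) equating, one common way of placing alternate-form scores on a common scale. The COCA developers’ actual equating function is not reproduced here, and the pilot scores below are made-up placeholders.

```python
# A minimal sketch of mean-sigma linear equating under the assumption of
# equivalent pilot groups; not the COCA developers' published equation.
import statistics

salmon_pilot = [14, 18, 9, 22, 16, 11, 20, 13]        # hypothetical pilot scores, Salmon form
dragonflies_pilot = [15, 19, 10, 24, 17, 12, 21, 14]  # hypothetical pilot scores, Dragonflies form

m_s, sd_s = statistics.mean(salmon_pilot), statistics.stdev(salmon_pilot)
m_d, sd_d = statistics.mean(dragonflies_pilot), statistics.stdev(dragonflies_pilot)

def salmon_to_dragonflies(x: float) -> float:
    """Map a Salmon raw score onto the Dragonflies scale via mean-sigma equating."""
    return m_d + (sd_d / sd_s) * (x - m_s)

print(round(salmon_to_dragonflies(16), 2))
```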

Each form requires approximately 15 min of administration time. The text is read aloud to students to facilitate accurate measurement of knowledge and skills for comprehending informational text separate from students’ decoding skills. As noted earlier, most existing comprehension measures require students to be able to decode in order to gain information about their comprehension knowledge and skills, which may ultimately contribute to a lack of instruction focused on comprehension in the early elementary years. Because the COCA was explicitly developed to measure fundamental knowledge and skills for comprehension that should be taught at the same time that a student is learning how to read words, the COCA developers decided that these knowledge and skills should be measured separately from students’ decoding skills. This ensures that students can demonstrate their knowledge and skills related to comprehension of informational text without being hindered by limited decoding skills. Thus, the examiner reads the text aloud to students. Although the resulting tasks resemble listening comprehension rather than reading comprehension, this approach was deemed necessary to monitor and track the development of fundamental skills and knowledge for comprehending informational text in a way that neither confounds decoding skill with comprehension nor requires students to have reached a certain threshold of decoding skill to demonstrate their comprehension-related knowledge and skills. Throughout the reading of the text, the examiner prompts students to respond verbally to questions that assess their knowledge and skills for comprehending informational text, using specific directions and prompts indicated in the administration manual. These specific directions and prompts were developed on the basis of earlier pilot testing. Examiners record each student’s responses on a score sheet that identifies the construct associated with each item. A scoring guide provided with the test manual is
used to score the quality of student responses on each of 19 questions using a 0–2-point scale. Thus, total possible scores on the COCA range from 0 to 38.²

Confirmatory factor analysis based on prior data (N = 166 first- and second-grade students) supported the four-factor model of the COCA (Vocabulary, Text Features, Graphics in the Context of Text, and Comprehension Strategies; Dragonflies: root mean square error of approximation (RMSEA) = 0.02, comparative fit index (CFI) = 0.99, Tucker-Lewis index (TLI) = 0.99; Salmon: RMSEA = 0.04, CFI = 0.93, TLI = 0.93). As expected, the COCA correlated only moderately with a general measure of reading comprehension that does not report scores by genre (.38 to .58, depending on grade level, with the Gates-MacGinitie Reading Tests; MacGinitie, MacGinitie, Maria, Dreyer, & Hughes, 2000) but correlated more highly with a measure designed to assess informational reading comprehension specifically (.61 to .67, depending on grade level, with the ISCA; Hilden et al., 2008). Internal consistency reliability of the instrument was found in a previous investigation to be .89 for the Dragonflies form and .85 for the Salmon form. Using the data from the first administrations of the COCA in the present study, the coefficient alpha estimates were .84 for Dragonflies and .79 for Salmon.
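
The coefficient alpha estimates reported above can be computed from a students-by-items score matrix in a few lines. The sketch below assumes the 19 COCA items scored 0–2; the matrix is randomly generated placeholder data, not study data.

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / variance(total)).
import numpy as np

scores = np.random.randint(0, 3, size=(50, 19))  # 50 hypothetical students x 19 items, each 0-2

k = scores.shape[1]
item_vars = scores.var(axis=0, ddof=1).sum()  # sum of the 19 item variances
total_var = scores.sum(axis=1).var(ddof=1)    # variance of students' total scores
alpha = (k / (k - 1)) * (1 - item_vars / total_var)
print(round(alpha, 2))
```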

COCA ADMINISTRATION

In late September, the experimental group teachers administered the COCA to six of their own participating students. Graduate student assistants—who received training identical to that of the experimental group teachers—administered the COCA to an additional six students in each of these classrooms (see the information above about how the 12–13 students were selected). The graduate student assistants were enrolled in a school psychology graduate program and had prior experience with academic assessment. Within an experimental classroom, a list of participating students was created and ordered by student reading level, with the first student assigned to have the COCA administered by the teacher, the next student by the graduate assistant, the third student by the teacher, the fourth by the graduate assistant, and so forth. For the next experimental classroom, the first student in the ranked reading-level list was assigned to the graduate assistant, the second student to the teacher, and so on. This ordering was done to prevent examiner condition (i.e., teacher vs. graduate assistant) from being confounded with student reading level. The purpose of administering the COCA to 12–13 students in the experimental classrooms was to increase the sample size and also to test for examiner effects. Graduate student assistants administered the COCA to all 12 students selected from each
control group classroom. COCA forms (i.e., Dragonflies vs. Salmon) were randomly assigned such that within each classroom (a) half of the students initially took the Dragonflies form and the other half took the Salmon form, (b) forms were equally distributed across student reading levels in a given classroom, and (c) the matched student pairs described previously (see the Participants section) received the same form during the fall administration. In January, all participating students were administered the alternate of the COCA form they completed in the fall (e.g., students who were administered Dragonflies in the fall were administered Salmon in the winter), and in May, all participating students were administered the same form as in the fall. Examiner type (i.e., teacher vs. graduate assistant) remained constant across all administrations for each student.

² More information on this assessment tool, including the administration and scoring materials, is freely available at http://msularc.educ.msu.edu/what-we-do/projects/mai-coca.

PROMPTED WRITING SAMPLES

If students in experimental group classrooms were becoming more knowledgeable about and skilled with informational text, then it was reasoned that this should also be evident in their writing (Shanahan, 2006). The school district from which the sample was drawn had implemented a K–5 writing curriculum that included regular collection of writing samples, and therefore it was deemed appropriate to collect writing samples from the participating first- and second-grade students.

Two writing prompts were used: “All About Birds” and “All About Trees.” Each prompt consisted of a 10-page booklet with space for writing and drawing. Trained graduate assistants read directions to groups of students; the instructions included setting a purpose and audience for the students to write an informational text about the topic. Students were given 30 min to write independently about the given topic. Responses were scored using procedures described in the following section with six different rubrics³ (Duke et al., 2006; Purcell-Gates, Duke, & Martineau, 2007). The six scores relate to students’ demonstration of knowledge and skill associated with different aspects of informational text: informational text features, vocabulary, illustrations, organization, aspects of voice, and an overall rating of the quality of the informational text produced (a holistic score). Rubric scores were determined based on students’ inclusion of certain aspects of informational text writing in their writing sample (e.g., for text features, their use of a pronunciation guide, their use of timeless verb phrases such as “birds have wings” instead of “the bird had two wings,” and their use of a table of contents) as well as an overall assessment of the quality of the writing. Scores ranged from 0 to 6 for each rubric. Using data from the fall and spring of the present study, confirmatory factor analysis was conducted to examine the
extent to which the rubrics could be combined to form a single factor score representing a student’s composite score on the writing prompt. Results of the factor analysis (see the Results section) supported use of a single composite score.

³ The rubrics used for scoring the writing prompts can be obtained from the first author.

WRITING PROMPT ADMINISTRATION AND SCORING

The fall and spring writing prompts were group administered before the COCA administration to decrease the likelihood that recent exposure to the COCA books themselves might influence the writing. All participating students were randomly assigned (stratified by classroom) to complete one of the two forms of the writing prompt in the fall and completed the alternate prompt in the spring. Three graduate assistants, blind to timing (i.e., fall or spring) and condition (i.e., experimental or control), were trained to score each writing sample according to the six rubrics. Training included independent practice scoring of 10 writing samples selected to represent a range of writing skills, followed by group discussions of any score discrepancies. After this training, 30% of the remaining writing samples were selected to be rated by two raters such that each graduate assistant scored at least 50 writing samples that were also scored by another graduate assistant. Interscorer reliability results were as follows for each rubric (reported as percentage rated identically across raters/percentage rated within one point across raters): Holistic (47.5/90.8), Organization (46.7/85.8), Informational Features (69.2/95.0), Vocabulary (71.7/85.8), Illustrations (66.7/94.2), and Voice (48.3/89.2). On all rubrics, at least 85% of the samples that were double-scored received scores within one point across raters. This level of reliability is similar to that found in other published studies that used similar measures of informational text writing quality (e.g., Purcell-Gates et al., 2007).
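
The two agreement indices used above (exact agreement and agreement within one point) are straightforward to compute. A minimal Python sketch, with hypothetical paired ratings on the 0–6 rubric scale:

```python
# Percentage of double-scored samples rated identically and within one point.
rater_a = [3, 4, 2, 5, 1, 4, 0, 6, 3, 2]  # hypothetical ratings, rater A
rater_b = [3, 5, 2, 4, 1, 4, 1, 6, 2, 2]  # hypothetical ratings, rater B

n = len(rater_a)
exact = sum(a == b for a, b in zip(rater_a, rater_b)) / n * 100
within_one = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b)) / n * 100
print(f"exact: {exact:.1f}%, within one point: {within_one:.1f}%")
```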

SURVEYS AND OBSERVATIONS OF INSTRUCTION IN COMPREHENSION OF INFORMATIONAL TEXT

Given the use of randomization in this study, differences between experimental and control group student growth on the COCA or writing measure should be attributable to teacher administration of the COCA and concomitant changes in instruction. However, as a check that changes in instruction related to comprehension of informational text did indeed occur in the experimental group classrooms, two surveys and an observational coding scheme were developed and administered. The first survey was administered at three points across the academic year in which the study was conducted and required teachers to report on any related changes in their instruction or classroom materials and the extent to which they believed students in their classroom benefited from the professional development on
administration and use of the COCA. Survey directions indicated that teachers were to answer honestly, even if their answers might seem like they would disappoint the researchers. In addition, questions were worded so as not to assume changes or benefits (e.g., using the phrase “if any”). To keep the survey administered three times per year to a reasonable length, we did not include questions on specific text features; to address this shortcoming, a survey was administered just once during the year that asked specifically about instruction in each of the following text features: glossary, index, graphs, headings/subheadings, maps, pronunciation guides, and table of contents (1 = I don’t address this in my instruction; 7 = I incorporate this in my instruction several times a day). Last, an observational coding scheme was developed to track the number of minutes during a 70-min period in which a teacher was engaged in teaching students knowledge and skills for comprehending informational text. For the observation, teachers were asked to complete the reading lesson they had planned for the day of the observation and to make no special adjustments to their instruction. These observations were conducted late in the fall semester (resources did not permit observations across the entire academic year). Overall, the surveys and observation were intended to allow the researchers to interpret student performance in light of the extent to which teachers reported and were observed to engage in instructional practices related to what is measured on the COCA. This would allow for the identification of evidence that instructional practices associated with use of the COCA, rather than mere measurement error, may have contributed to observed changes in student scores.

Procedure

The chronological order of study events is provided in Table 2. After receiving teacher consent to participate, placing teachers in matched pairs, and randomly assigning one teacher within each pair to the experimental condition, teachers in the experimental group were provided with professional development on how to administer and use information from the COCA. The experimental teachers participated in a total of 14.5 hr of professional development across the school year, in which they were trained to administer and use COCA scores. The first two professional development sessions (3.5 hr each) occurred just before the start of the academic year. Experimental teachers were initially provided with background knowledge on the COCA and were taught how to administer and score the COCA using video models. They practiced administering and scoring the COCA, with feedback provided by the researchers. Three subsequent professional development sessions were designed to provide teachers time to examine COCA scores and discuss ways to use the scores to inform instruction. The first of these subsequent sessions occurred in November, after the first COCA administration. This was a 2-hr professional development session in which teachers were offered support in
identifying areas of individual and class-wide strength and weakness based on the subscale mean scores of the students tested. Each teacher received graphs including the individual subscale scores and the total scores of their students as compared with total possible scores. Teachers were then given time to develop and share ideas for instructional strategies that would address identified gaps in performance. In January, after the second COCA administration, a 2-hr professional development session was provided for teachers to examine students’ progress since the fall administration and to again collaboratively discuss ideas for promoting growth. Experimental teachers attended a final 2-hr professional development session in March in which the researchers provided an opportunity for teachers to further examine their students’ fall and winter scores, independently and as compared with those of other classes (with only the teacher knowing the ID number for her own class’s scores). Again, the group brainstormed and shared instructional strategies for addressing areas of need on the basis of students’ COCA scores.

TABLE 2 Chronology of Study Events

Activity                                               Timing
Professional development                               August, November, January, March
Concepts of Comprehension Assessment administration    September, January, May
Writing prompt administration                          September, May
First survey of instructional changes                  Three points across the study
Survey of informational text instruction               Once during the year
70-min classroom observation                           Late fall

Note. Two professional development sessions on the Concepts of Comprehension Assessment were held in August.

In addition to the group professional development sessions, additional efforts were made to ensure highly accurate administration and scoring. A member of the research team met individually with each teacher to review the teacher’s initial scoring. Furthermore, memos to clarify scoring were distributed to all experimental group teachers after the first and second COCA administrations. Last, before final data entry and analysis, all score sheets were carefully reviewed by a member of the research team and corrected for errors.

Control group teachers were not provided information on the COCA, nor were they provided with their students’ COCA scores during the data collection period of the study. After all data for the study were collected, the control teachers each received a complete set of COCA materials and professional development on how to administer the COCA.

Data Analyses

EXAMINATION OF POTENTIAL EXAMINER EFFECTS

A preliminary multivariate analysis of covariance was conducted to determine whether there was an effect for examiner (teacher or research assistant) at any of the COCA measurement occasions (fall, winter, spring), with each measurement occasion serving as a different dependent variable. In addition, t-tests were completed to compare mean scores of students in the experimental group by examiner type at each of the three measurement occasions.

EXAMINATION OF CONDITION EFFECTS ON STUDENT OUTCOMES

All COCA scores were initially transformed to the Dragonflies scale using the statistical equating function provided by the developers. Analysis of
covariance (ANCOVA) was used to determine the extent to which condition (i.e., experimental vs. control) had an effect on winter and spring COCA scores, using fall COCA scores as a covariate. It was expected that students in the experimental condition would show greater gains at both the mid- and endpoint of the academic year, and so a one-tailed test of significance was used with an a priori alpha of .025 to ensure a familywise Type I error rate of .05 across the two sequential statistical tests. ANCOVA was also used to examine differences in spring writing sample scores by condition, including fall writing sample scores as a covariate. Again, it was expected that there would be significantly higher adjusted mean scores for the spring scores within the experimental group, and a one-tailed test of significance with an a priori alpha of .05 was used.
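
As an illustration of the analysis just described, the sketch below fits an ANCOVA (outcome regressed on the fall covariate plus condition) with statsmodels and converts the two-tailed p value for the condition term to a one-tailed value. The DataFrame and its values are hypothetical, not study data.

```python
# ANCOVA sketch: spring score ~ fall score (covariate) + condition (factor).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "spring": [21, 24, 18, 27, 15, 20, 22, 16],
    "fall":   [12, 15, 10, 18,  9, 13, 14, 11],
    "condition": ["exp", "exp", "exp", "exp", "ctl", "ctl", "ctl", "ctl"],
})

model = smf.ols("spring ~ fall + C(condition)", data=df).fit()
coef = model.params["C(condition)[T.exp]"]     # adjusted experimental-vs-control difference
p_two = model.pvalues["C(condition)[T.exp]"]
# One-tailed p: halve the two-tailed p when the effect is in the predicted direction.
p_one = p_two / 2 if coef > 0 else 1 - p_two / 2
print(f"adjusted condition effect = {coef:.2f}, one-tailed p = {p_one:.3f}")
```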

EXAMINATION OF ALTERATIONS IN INSTRUCTIONAL PRACTICES

Survey and observation results were analyzed to (a) determine whether teachers in the experimental group reported changing their instructional practices to incorporate more instruction in the comprehension of informational text and (b) examine the extent to which teachers who reported and were observed to incorporate more instruction in this area had students who made more substantial learning gains. To address (a), we examined the number of teachers in the experimental group who reported making changes to their instruction within the first survey (which was administered at three different time points). To address (b), we used information from the observation and the second survey to categorize each teacher as showing either high use of strategies to teach informational text based on both survey and observation or low use of strategies to teach informational text based on both survey and observation. Then, we examined the average growth in COCA performance across the year for teachers in each of these categories. Through the use of both survey and observation methods to categorize teachers’ instructional practices, the potential drawbacks of each method could be minimized (i.e., a single observation may not reflect typical practices, and a survey alone may be affected by a tendency to respond in socially desirable, as opposed to accurate, ways).

RESULTS

Effects on Student Performance

PRELIMINARY ANALYSIS OF EXAMINER EFFECTS

Box’s test of covariance equality between groups was found to be nonsignificant (Box’s M = 6.99, p = .36), and equality of error variances between groups across administration occasions using Levene’s test was found to be nonsignificant: fall, F(1, 61) = .00, p = .98; winter, F(1, 61) = 2.00, p = .16; spring, F(1, 61) = 3.72, p = .06. Results indicated that the differences in COCA scores between examiner groups (teacher vs. graduate assistant) at all three measurement occasions were not significant (Wilks’ λ = .92, p = .08). Mean comparisons by examiner type at each measurement occasion for the experimental group are also provided in Table 3. No statistically significant differences were evident at any measurement occasion. These results supported combining the data collected by the two examiner types in subsequent analyses.

TABLE 3 Mean Comparisons for Experimental Group by Examiner Type and Measurement Occasion

            Teacher           Graduate assistant    Independent-samples t-test of mean differences
            M       SD        M       SD            t         p
Fall        13.1    5.6       13.1    6.3           −0.02     .98
Winter      19.7    5.9       17.9    7.2           1.18      .24
Spring      24.1    5.3       22.6    7.6           0.88      .38
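
As a minimal sketch of the examiner-type comparisons reported in Table 3, the snippet below runs an independent-samples t-test (with Levene’s test for the equality-of-variance check reported earlier). The score lists are hypothetical, not study data.

```python
# Examiner-effect check: Levene's test, then an independent-samples t-test.
from scipy import stats

teacher_scores = [13, 15, 10, 18, 12, 14]    # hypothetical COCA scores, teacher-administered
assistant_scores = [12, 16, 9, 17, 13, 11]   # hypothetical COCA scores, assistant-administered

lev_stat, lev_p = stats.levene(teacher_scores, assistant_scores)
t_stat, t_p = stats.ttest_ind(teacher_scores, assistant_scores)
print(f"Levene p = {lev_p:.2f}; t = {t_stat:.2f}, p = {t_p:.2f}")
```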

COCA EFFECTS

Table 4 provides means and standard deviations for each testing occasion by condition. We used a simple analysis of variance to examine whether fall scores for the two groups were equivalent. Results indicated that there was not a significant difference in fall scores between the groups, F(1, 132) = 0.19, p = .66, suggesting that the groups did not differ in their knowledge and skill for comprehension of informational text at the beginning of the study. This was expected given the careful attention to matching students at the onset of the study.

TABLE 4 Student Performance on the Concepts of Comprehension Assessment and Writing Prompt by Group

                              Experimental group       Control group
                              n     M       SD         n     M       SD
Fall                          73    13.3    5.1        60    13.8    6.4
Winter                        68    18.8    5.4        60    15.7    6.1
Spring                        67    21.4    6.7        60    19.3    6.0
Fall writing rubric total     67    11.2    5.5        48    11.6    6.7
Spring writing rubric total   68    18.9    5.5        53    15.5    6.0

In addition, analyses were conducted to determine whether there were differences in performance that could be attributed to COCA form. (Table 5 provides mean scores by COCA form and condition, with all scores transformed to the equivalent score on the Dragonfly scale using the equation provided by the developer.) We used t-tests to compare mean scores of students receiving different forms in the same condition at the same point in time. No significant differences were identified: Exp(fall): t(71) = .19, p = .85; Exp(winter): t(66) = .21, p = .84; Exp(spring): t(65) = .31, p = .76; Con(fall): t(58) = .17, p = .86; Con(winter): t(58) = 1.40, p = .17; Con(spring): t(58) = .35, p = .73. Furthermore, when considering changes in average scores for the experimental group over time separately by form (see Table 5), similar increases between fall and winter (i.e., a 6.5-point increase) and between winter and spring (a 2.5-point increase) are evident for each form. When considering changes in average scores for the control group over time separately by form, slight differences across forms are apparent between fall and winter and between winter and spring; however, both forms show a lower overall increase from fall to spring (approximately 6.5 points per form) than the corresponding forms in the experimental group (a 9-point increase for each form).

TABLE 5 Student Performance on the Concepts of Comprehension Assessment and Writing Prompt by Group and Form

                    Experimental group       Control group
                    n     M       SD         n     M       SD
Fall
  Form D            37    12.9    5.7        30    13.0    7.9
  Form S            36    12.6    7.9        30    13.3    5.1
Winter
  Form D            33    19.4    6.6        30    16.3    7.2
  Form S            35    19.1    5.3        30    14.0    5.4
Spring
  Form D            33    22.2    7.8        30    19.4    7.2
  Form S            34    21.6    8.1        30    20.1    5.4
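
The form comparison can be sketched as follows (Python). Note that the developer’s equating equation is not reproduced in the article, so the slope and intercept below are placeholders, and all data are synthetic; the sketch only shows the mechanics of transforming Form S scores onto the Dragonfly (Form D) scale and then comparing forms within a condition and occasion.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    form_d = rng.normal(12.9, 5.7, 37)       # Form D (Dragonfly-scale) scores
    form_s_raw = rng.normal(12.6, 7.9, 36)   # Form S raw scores

    # Placeholder linear equating onto the Dragonfly scale; the actual
    # coefficients come from the developer's equation, not shown here
    a, b = 1.0, 0.0
    form_s = a * form_s_raw + b

    # Within-condition, within-occasion form comparison, cf. Exp(fall)
    t, p = stats.ttest_ind(form_d, form_s)
    print(f"t({len(form_d) + len(form_s) - 2}) = {t:.2f}, p = {p:.2f}")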

Six experimental students had fall COCA scores but did not have spring COCA scores (in contrast to the control group, in which all students were measured at each time point). To evaluate the experimental effect across the academic year, it was necessary to examine the extent to which these missing data might affect estimates. It was expected that the data were missing completely at random (MCAR) given the research design, the care taken to match groups, and attempts to obtain measurements at each occasion for all students (Little & Rubin, 2002), but analyses were performed to evaluate the extent to which this expectation was supported.

To evaluate the extent to which the students with missing data in the spring were a random subsample of the original experimental group, the subset of students in the experimental group who had missing data were compared with the subset who had complete data. There were no significant differences between the groups on student age, F(1, 70) = 0.42, p > .05; mother’s educational level, F(1, 69) = 3.18, p > .05; or fall COCA scores, F(1, 71) = 0.80, p > .05. These results suggested that the students with complete cases were probably a random subsample of students in the original experimental condition, as none of these important variables that could affect general reading ability differed significantly between those with complete data and those with pretest data only. It was therefore assumed that the missingness was generated completely at random (i.e., unrelated to other study variables; MCAR; Little & Rubin, 2002). Thus, the final analysis was conducted on only the students with complete cases in the experimental condition (n = 63).
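
A check of this kind, comparing complete and incomplete cases on background variables, can be sketched as follows (Python; the column names and data are hypothetical). Nonsignificant results are consistent with, though they cannot prove, the MCAR assumption.

    import numpy as np
    import pandas as pd
    from scipy import stats

    rng = np.random.default_rng(3)
    df = pd.DataFrame({
        "age": rng.normal(7.0, 0.4, 73),          # hypothetical background variable
        "fall_coca": rng.normal(13.3, 5.1, 73),
        "spring_missing": np.r_[np.ones(6), np.zeros(67)].astype(bool),
    })

    # One-way ANOVA comparing complete vs. incomplete cases on each variable
    for var in ["age", "fall_coca"]:
        complete = df.loc[~df["spring_missing"], var]
        missing = df.loc[df["spring_missing"], var]
        F, p = stats.f_oneway(complete, missing)
        print(f"{var}: F = {F:.2f}, p = {p:.2f}")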

ANCOVA results indicated a significantly higher adjusted winter COCA score for the experimental group, F(1, 120) = 17.14, p < .025, partial η² = 0.13, as well as a significantly higher spring outcome, F(1, 120) = 16.68, p < .025, partial η² = 0.12. These results suggested that the experimental group showed greater growth than the control group by winter and maintained that statistically significant difference at the end of the year.
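
As a quick internal-consistency check (not a computation reported by the authors), the partial η² values follow directly from the F statistics and their degrees of freedom via the standard identity:

\[
\eta_p^2 = \frac{F \cdot df_{\text{effect}}}{F \cdot df_{\text{effect}} + df_{\text{error}}},
\qquad
\frac{17.14 \times 1}{17.14 \times 1 + 120} \approx .13,
\qquad
\frac{16.68 \times 1}{16.68 \times 1 + 120} \approx .12
\]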

EFFECT ON WRITING SAMPLES

Table 4 provides mean scores on the fall and spring writing prompts by condition. As in the preceding COCA analysis, missing data were an issue; however, considerably more data were missing for the writing prompts than for the COCA. Within the research design there was no known systematic reason for this difference in missingness, so we again evaluated the extent to which the missing data might affect estimates. There were 111 students with no missing writing prompt data, but they were not evenly split across the experimental (n = 65) and control (n = 46) groups, such that about 11% of the experimental group and about 23% of the control group had missing data. Therefore, students within the experimental and control groups were evaluated separately as to the extent to which those with complete cases differed from those with missing data on key variables measured in the fall. For the experimental group, there were no significant differences between students with and without complete cases on student age, F(1, 70) = 0.85, p > .05; mother’s educational level, F(1, 69) = 0.01, p > .05; or fall COCA scores, F(1, 71) = 2.78, p > .05. For the control group, there were also no significant differences between students with and without complete data on student age, F(1, 58) = 3.66, p > .05; mother’s educational level, F(1, 57) = 0.07, p > .05; or fall COCA scores, F(1, 58) = 0.78, p > .05. These results, in addition to the research design, suggested that the students with complete cases were probably a random subsample of students in the original experimental and control conditions, as none of the important background variables differed significantly between the groups with and without complete data at the start of the study. Therefore, it was assumed that the missingness was generated completely at random (MCAR; Little & Rubin, 2002), and the analysis was conducted on only the students with complete cases.

To evaluate the effect of condition on writing scores, two expectations were examined. First, given that classrooms were paired based on reports of teachers’ instructional quality, that one classroom from each pair was then randomly assigned to the experimental condition, and that students were matched across paired classrooms, it was expected that there would be no differences on the writing measures at the fall measurement period. Second, if there was an effect for condition, it was expected that a gain would be evident at the spring measurement.

To explore the first expectation, we computed a multivariate analysis of covariance, entering all the scores on the fall rubrics as dependent variables and condition as the independent variable. Assumptions for the test appeared to be satisfied: Box’s M = 25.91, p = .27, and Levene’s test indicated no significant differences in error variances between groups (p > .05). Multivariate tests suggested no significant differences between conditions at the fall testing on any of the rubrics, Wilks’ λ = .96, p = .64.

To explore the second expectation, a preliminary investigation involving factor analysis was conducted (as indicated in the methods section) to determine whether a single composite score was appropriate as an overall measure of student performance on the writing prompt. Factor analysis was conducted by fixing a single common factor model and using maximum likelihood estimation. Composite scores were computed from the final model estimation using the regression method. Results of the factor analysis were consistent across the fall and spring administrations. For the fall scores, the common factor model accounted for approximately 62% of the variance and appeared to have good model fit, χ²(9) = 7.35, p = .17. The common factor model accounted for approximately 52% of the variance in spring scores and also appeared to have good model fit, χ²(9) = 13.56, p = .14. These findings supported use of the composite factor score as an overall indicator of a student’s knowledge and skill related to informational text as demonstrated in their writing. ANCOVA was then used to evaluate the extent to which differences between the experimental and control groups were found. Results indicated a significant difference between conditions at the spring assessment, F(1, 108) = 9.25, p < .01, partial η² = .08, with the experimental group showing a significantly higher mean score than the control group.
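
The composite-score step can be sketched as follows (Python). The article does not name the software used; sklearn’s FactorAnalysis is a stand-in that fits the one-factor model by an EM (maximum likelihood) algorithm, and its transform() returns posterior factor scores comparable to regression-method scores. The six rubric columns and all data are hypothetical (six indicators are consistent with the reported χ²(9) model fit).

    import numpy as np
    import pandas as pd
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(4)
    ability = rng.normal(size=111)                 # latent writing ability
    rubrics = pd.DataFrame(
        {f"rubric_{i}": 2 * ability + rng.normal(size=111) for i in range(1, 7)}
    )

    # One common factor, fit by maximum likelihood (EM)
    fa = FactorAnalysis(n_components=1).fit(rubrics)
    composite = fa.transform(rubrics).ravel()      # one composite score per student

    # Proportion of variance accounted for by the common factor (cf. 52-62%)
    explained = float((fa.components_ ** 2).sum() / rubrics.var(ddof=0).sum())
    print(round(explained, 2))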

Teacher Use of Informational Text Instructional Strategies and Corresponding Student COCA Gains

Across the course of the year, teachers in the experimental group reported making many changes to their instruction based on their experiences with the COCA. At some point during the school year (i.e., on at least one of the three administrations of the first survey), all experimental group teachers reported changing the content of their instruction and their classroom materials. Most reported changing their writing activities (n = 5) and how they taught (n = 4). Teachers were also asked to rate the extent to which they believed COCA administration and the associated professional development had a positive effect on learning for each student who was tested by the teacher (1 = “no/negative gains”; 10 = “incredible gains”). The average of these reported gain scores was 6.1 (range = 1 to 9; SD = 2.1). Using the same scale, teachers rated their perceived overall class gains, with a resulting average across teachers of 7.4 (range = 4 to 9; SD = 1.9). These results provide evidence of the social validity of the COCA professional development.

Teachers in the experimental group were found to vary in their use of instructional strategies to teach the comprehension of informational text, based on both the observation and survey data. During the 70-min observations of classroom reading instruction (in which teachers were asked to teach as they would have had there been no observer), three of the six teachers were found to engage in instructional practices thought to support informational text comprehension development, ranging from 1 to 49 min of the observation. On the basis of their responses to the survey, average reported use of instruction in the associated informational text features ranged from 3.0 (approximately once a month) to 5.1 (between “several times a week” and “at least once a day”). Table 6 presents classroom growth according to the extent to which the teacher was determined to provide instruction in the comprehension of informational text. Classrooms deemed to have high levels of such instruction according to both survey and observation tended to show higher growth, and those deemed to have lower levels tended to show lower growth.

TABLE 6 Student Performance on the Concepts of Comprehension Assessment by Teacher Use of Instruction in Informational Text

                Average survey   Average minutes   Fall score    Winter score   Spring score   Fall to spring
                response         in observation    M (SD)        M (SD)         M (SD)         growth, M (SD)

High evidence of incorporating informational text strategy use during instruction
  Teacher A     5.1              49                16.7 (5.3)    23.6 (4.6)     30.6 (3.8)     13.9 (4.1)
  Teacher B     4.7              5                  9.3 (4.6)    18.3 (5.3)     23.5 (3.4)     14.3 (3.0)
  Teacher C     4.4              9                 10.4 (4.7)    15.6 (7.8)     17.2 (7.1)      6.8 (4.1)
Low evidence of incorporating informational text strategy use during instruction
  Teacher D     3.7              0                 13.3 (6.2)    17.0 (5.3)     21.3 (7.8)      8.0 (5.6)
  Teacher E     3.3              1                 17.3 (4.2)    21.8 (3.4)     24.2 (3.6)      6.8 (4.7)
  Teacher F     3.0              0                 13.3 (6.2)    19.3 (5.0)     23.5 (6.7)     10.2 (4.1)

DISCUSSION

This study examined the effect of having teachers administer the COCA, an assessment tool measuring students’ fundamental knowledge and skills for comprehending informational text, on students’ development of that knowledge and those skills. This work represents an initial step in applying, in school settings, a comprehension assessment tool that aligns with several recommendations from van den Broek and Espin (2012). Results suggest a statistically significant positive effect of teacher administration of the COCA on children’s knowledge and skills for comprehending informational text, as measured by both the COCA and a transfer measure (an informational text writing prompt). These findings, along with teachers’ reports of various changes in instructional practices and materials, seem to support the notion that professional development in the administration and use of data from the COCA can have a positive effect on student learning of related skills. The results align with the research summary provided by Stecker and colleagues (2005) indicating that providing teachers with training for monitoring student progress and making associated instructional decisions can have a positive effect on student learning.

It was anticipated that the experimentally manipulated variable (i.e., professional development on administration and use of the COCA, paired with teacher administration of the COCA to six students at three points across the academic year) would affect student performance via teachers’ targeted changes in instruction. Teachers’ responses to surveys indicated that all teachers in the experimental group did make related instructional changes across the course of the year. Observations also documented informational text instruction in several classrooms. Furthermore, when student performance gains were analyzed according to the extent to which each experimental teacher reported and was observed to engage in instruction about comprehension of informational text, the expected pattern generally emerged: in classrooms where teachers more extensively incorporated informational text instruction within their instructional practices, students tended to experience greater learning gains across the academic year.

Given research that points to the influence of examiner effects (e.g., familiarity with the examiner; Fuchs & Fuchs, 1989), it was deemed important to parse out effects that might be due to a familiar examiner type (i.e., the teacher) as opposed to those that might be due to the associated professional development on administration and use of the COCA. Thus, the study was designed to have some students assessed by their teacher while others were assessed by a (less familiar) graduate assistant. No significant differences between examiner types were identified; however, a nonsignificant pattern did emerge suggesting that students in the experimental group who were administered the assessment tool by the teacher exhibited slightly higher scores and higher gains over time than those who were administered the tool by a less familiar graduate assistant. Therefore, it is possible that part of the identified experimental effect was due to the fact that some students in the experimental group were administered the COCA by a familiar examiner (i.e., the teacher), whereas all students in the control group were administered the COCA by a less familiar graduate assistant. However, it is important to emphasize that the effect due to examiner type was not significant, whereas the effect due to COCA administration was significant. Furthermore, the identified effect favoring the experimental group transferred to a writing assessment that was administered by a graduate assistant for all students in both conditions, providing further evidence that the COCA, rather than examiner type, was primarily responsible for the significant effects. Given that one would not expect examiner effects for the writing sample analysis (i.e., all writing prompts were administered by graduate assistants and not by teachers), it is important to pay close attention to those findings. The associated effect was significant but small (partial η² = .08). In fact, it was smaller than the relatively small effect sizes on student COCA scores (partial η² = .12 and .13). Although this slight difference in effects may be due to the different task involved, it may also reflect a truer estimate of the effect, given that the writing prompt results are not confounded by the potential influence of a familiar examiner.

Given the time invested in professional development and administration of the tool by the teachers and the researchers, one might question whether the associated effects were worth the time invested. Although knowledge and skills for comprehending informational text are certainly skills that students can use to become more efficient and strategic readers in the future, additional research may be helpful to determine the long-term impact of effects of this magnitude on students’ future comprehension and achievement. In addition, it is important to note that, in order to address the research question, professional development in this study was focused narrowly on administration and use of the COCA. It is possible that the addition of broader professional development on teaching informational text comprehension, avoided in this study in order to isolate the effect of use of the assessment tool specifically, would yield more robust effects.

Average ratings from the first survey (administered at three points across the study) suggest that, after the 14.5-hr professional development experience provided across the course of the academic year, teachers considered the COCA relatively easy to administer, found that it provided useful information, made instructional changes based on it, and indicated that it was generally worth the time invested. However, anecdotal data provided on the surveys suggested that some teachers did not think they would normally have the time to analyze and interpret the scores in a way that would facilitate meaningful instructional changes.

Limitations

As a result of limited resources, the sample of teachers/classrooms used for this study was small (n = 11). This limitation was intended to be addressed in part through the study design, which involved carrying out random assignment within teacher pairs that were matched on key dimensions and then matching students across the paired teachers. Careful analyses of pretest data were also carried out, and pretest scores were used in subsequent analyses. In these ways, the likelihood increased that any effects observed in the study were due to the intervention (teacher administration and use of the COCA). However, it is important to note that available information on technical adequacy was limited for the measures used to match teachers and students. Furthermore, although reasonable procedures were used to address missing data, those procedures assumed the missing data were MCAR, and absolute justification for this assumption cannot be provided. Also, in some cases, experimental and control group teachers were in the same school, and thus diffusion of the professional development information may have occurred. If diffusion did occur, however, the analysis may have detected smaller effects than would have been found in its absence. Nonetheless, it is clear that replication of this study with a larger sample of teachers would be helpful to ensure that the results were truly due to the professional development provided and the use of the COCA assessment and not a result of alternative influences.

Implications for Future Research

It will be important to examine the long-term effects that instructing students in the knowledge and skills measured on the COCA has on their associated learning outcomes. Furthermore, research should investigate whether using the COCA to differentiate instruction for individual students results in better outcomes than adjusting instruction to address weaknesses identified in the group more broadly. Last, use of the COCA might be particularly helpful for students with certain profiles of reading strengths and weaknesses (e.g., highly fluent students with poor language skills vs. students with strong language skills and limited word reading fluency), and its use could be studied within such groups to determine whether certain students may benefit more from it.

Additional studies should investigate approaches for ensuring that teachers can effectively and efficiently collect and use student performance data. In this study, teachers were provided with 14.5 hr of professional development in COCA administration and use, including substantial time to analyze COCA results after the fall and winter administrations. Based on teachers’ responses to the Survey of COCA Use, it appears that this time was important in helping teachers make use of the data collected. Teachers commonly reported that the time available during the November and January professional development sessions was the reason they made particular instructional changes. Moreover, teachers suggested that they would not otherwise have had enough time to examine and interpret the assessment results in order to meaningfully alter instruction. Further research appears necessary on what is needed to facilitate teachers’ use of this kind of assessment data to inform instruction (e.g., changes in teachers’ assessment literacy, changes in teacher dispositions toward assessment, changes in school life to allow greater time for examining assessment results).


Furthermore, in the present study it was not possible to disentangle the effects on student performance attributable to mere administration of the assessment tool from those attributable to the corresponding changes in instruction. Although the premise of the present study was that professional development on administration and use of the COCA would lead to instructional changes that would in turn lead to improvements in student performance, the design did not allow for a separate investigation of (a) the effects of the COCA professional development on instruction and (b) the effects of instructional changes on student performance. Future research might involve more careful investigation of these potentially distinct effects. In addition, we did not fully disentangle effects attributable to administration by a familiar examiner from those attributable to the associated professional development in the COCA. Further study would be needed to more fully disentangle these effects.

Although results from this study suggest that the COCA contributed to improvements in students’ knowledge and skills for the comprehension of informational text, it is possible that other important aspects of reading instruction, such as decoding, were simultaneously neglected, or perhaps enhanced (e.g., teaching students what a pronunciation guide is and how it can be used, which is part of the COCA, may benefit children’s decoding development). Additional investigation is warranted to better understand any unintended consequences, negative or positive, of using the COCA to inform instruction.

Implications for Practice

On the basis of a review of the literature, it appears that despite a growing national concern with students’ ability to comprehend informational text, students do not have much access to informational text materials and instruction. Most early literacy assessment tools focus on monitoring student progress in the development of skills necessary for decoding (e.g., phonemic awareness, alphabetic principle), which may reduce teachers’ attention to providing instruction in the knowledge and skills necessary for students to effectively and efficiently comprehend written material, especially informational text. In this study, providing professional development to teachers to encourage their administration and use of data from the COCA appeared to help teachers pay more attention to this type of instruction and subsequently promoted student growth in this area.4

4 Associated materials, including COCA student books, score sheets, the administration and scoring manual, and video demonstrations, can be downloaded from the Internet at no charge (http://www.msularc.org/html/project_COCA_main.html).


Promoting early instruction in skills for comprehending informational text will likely pay off greatly in terms of future student learning, given the numerous encounters students are expected to have with informational texts.

At present, as schools need guidance in the implementation of multi-tiered systems of support, school psychologists have a unique opportunity to promote the use of assessment data for instructional decision making. In doing so, it is essential that school psychologists be knowledgeable about curriculum, and in particular the Common Core State Standards, so they can identify assessment approaches that are aligned with those expectations for student learning and development. Also, few early literacy assessment tools focus on fundamental knowledge and skills for comprehending informational text. This is the case despite a strong focus on informational text in the Common Core State Standards, an apparent need for students to develop such knowledge and skills early on in order to navigate informational text materials in the current age of information, and knowledge of specific instructional strategies that will ultimately help students effectively navigate and comprehend informational text. At present, early elementary instruction in comprehension, and especially informational text comprehension, is lacking. To promote such instruction (alongside instruction in word reading) during the early elementary years, school psychologists should consider encouraging teachers to administer and use early comprehension assessment tools such as the COCA.

REFERENCES

Baker, S. K., Santoro, L. E., Chard, D. J., Fien, H., Park, Y., & Otterstedt, J. (2013). An evaluation of an explicit read aloud intervention taught in whole-classroom formats in first grade. Elementary School Journal, 113, 331–358.

Baumann, J. F., Edwards, E. C., Boland, E. M., Olejnik, S., & Kame’enui, E. J. (2003). Vocabulary tricks: Effects of instruction in morphology and context on fifth-grade students’ ability to derive and infer word meanings. American Educational Research Journal, 40, 447–494. doi:10.3102/00028312040002447

Beck, I. L., McKeown, M. G., & Kucan, L. (2002). Bringing words to life. New York, NY: The Guilford Press.

Billman, A. K., Duke, N. K., Hilden, K. R., Zhang, S., Roberts, K., Halladay, J. L., . . . Schaal, A. M. (2008). Concepts of Comprehension Assessment (COCA). Retrieved from http://www.msularc.org/html/project_COCA_main.html

Common Core State Standards Initiative. (2012). Common Core State Standards for English Language Arts & Literacy in History/Social Studies, Science, and Technical Subjects. Retrieved from http://www.corestandards.org/assets/CCSSI_ELA%20Standards.pdf

Connor, C. M., Morrison, F. J., & Petrella, J. N. (2004). Effective reading comprehension instruction: Examining child by instruction interactions. Journal of Educational Psychology, 96, 682–698. doi:10.1037/0022-0663.96.4.682


Duke, N. K. (2000). 3.6 minutes per day: The scarcity of informational texts in first grade. Reading Research Quarterly, 35, 202–224.

Duke, N. K., Hilden, K. R., Billman, A. K., Halladay, J. L., Reynolds, J., Zhang, S., & Park, Y. (2006, November). The impact of the Project-based Approach to Building Informational Literacy (PABIL) on informational reading and writing development. Paper presented at the National Reading Conference, Los Angeles, CA.

Duke, N. K., & Kays, J. (1998). “Can I say ‘Once upon a time’?”: Kindergarten children developing knowledge of information book language. Early Childhood Research Quarterly, 13, 295–318.

Duke, N. K., & Roberts, K. M. (2010). The genre-specific nature of reading comprehension. In D. Wyse, R. Andrews, & J. Hoffman (Eds.), The Routledge international handbook of English, language and literacy teaching (pp. 74–86). London, UK: Routledge.

Fountas, I. C., & Pinnell, G. S. (1999). Matching books to readers: Using leveled books in guided reading, K–3. Portsmouth, NH: Heinemann.

Fuchs, D., & Fuchs, L. S. (1989). Effects of examiner familiarity on Black, Caucasian, and Hispanic children: A meta-analysis. Exceptional Children, 55, 303–308.

Fukkink, R. G., & de Glopper, K. (1998). Effects of instruction in deriving word meaning from context: A meta-analysis. Review of Educational Research, 68, 450–469. doi:10.3102/00346543068004450

Hilden, K. R., Duke, N. K., Billman, A. K., Zhang, S., Halladay, J. L., Schaal, A. M., . . . Martin, N. M. (2008). Informational Strategic Cloze Assessment (ISCA). Retrieved from http://msularc.educ.msu.edu/what-we-do/projects/isca

Hoffman, J. V., Roser, N. L., Salas, R., Patterson, B., & Pennington, J. L. (2001). Text leveling and “little books” in first-grade reading. Journal of Literacy Research, 33, 507–528. doi:10.1080/10862960109548121

Jeong, J. S., Gaffney, J. S., & Choi, J. O. (2010). Availability and use of informational text in second, third, and fourth grades. Research in the Teaching of English, 44, 435–456.

Kelly, M. J., & Clausen-Grace, N. (2010). Guiding students through expository text with text feature walks. Reading Teacher, 64, 191–195.

Kuhn, M. R., & Stahl, S. A. (1998). Teaching children to learn word meanings from context: A synthesis and some questions. Journal of Literacy Research, 30, 119–138. doi:10.1080/10862969809547983

Leslie, L., & Caldwell, J. (1990). The qualitative reading inventory. New York, NY: Harper Collins.

Little, R. J. A., & Rubin, D. B. (2002). Statistical analysis with missing data (2nd ed.). New York, NY: Wiley.

MacGinitie, W. H., MacGinitie, R. K., Maria, K., Dreyer, L. G., & Hughes, K. E. (2000). Gates–MacGinitie Reading Tests, Fourth Edition (GMRT-4). Itasca, IL: Riverside.

Moss, B. (1997). Qualitative assessment of first graders’ retelling of expository text. Reading Research and Instruction, 37, 1–13.

Mullis, I. V. S., Martin, M. O., Foy, P., & Drucker, K. T. (2012). PIRLS 2011 international results in reading. Chestnut Hill, MA: TIMSS & PIRLS International Study Center, Boston College. Retrieved from http://timss.bc.edu/pirls2011/international-results-pirls.html


Myers, M. P., & Savage, T. (2005). Enhancing student comprehension of social studies material. The Social Studies, 96, 18–24.

National Governors Association Center for Best Practices, Council of Chief State School Officers. (2010). Common Core State Standards for English Language Arts and Literacy in History/Social Studies, Science, and Technical Subjects. Washington, DC: Authors.

Pappas, C. C. (1993). Is narrative “primary”? Some insights from kindergarteners’ pretend readings of stories and information books. Journal of Reading Behavior, 25, 97–129.

Peacock, A. (2001). The potential impact of the ‘Literacy Hour’ on the teaching of science from text material. Journal of Curriculum Studies, 33, 25–42. doi:10.1080/00220270120657

Peacock, A., & Weedon, H. (2002). Children working with text in science: Disparities with ‘literacy hour’ practice. Research in Science & Technological Education, 20, 185–197. doi:10.1080/0263514022000030444

Pearson, P. D., & Duke, N. K. (2002). Comprehension instruction in the primary grades. In C. C. Block & M. Pressley (Eds.), Comprehension instruction: Research-based best practices (pp. 247–258). New York, NY: Guilford Press.

Perfetti, C. A., Landi, N., & Oakhill, J. (2005). The acquisition of reading comprehension skill. In M. J. Snowling & C. Hulme (Eds.), The science of reading: A handbook (pp. 227–247). Oxford, UK: Blackwell.

Pressley, M. (2002). Comprehension strategies instruction: A turn-of-the-century status report. In C. C. Block & M. Pressley (Eds.), Comprehension instruction: Research-based best practices (pp. 11–27). New York, NY: Guilford Press.

Purcell-Gates, V., Duke, N. K., & Martineau, J. A. (2007). Learning to read and write genre-specific text: Roles of authentic experience and explicit teaching. Reading Research Quarterly, 42, 8–45.

Reutzel, D. R., Smith, J. A., & Fawson, P. C. (2005). An evaluation of two approaches for teaching reading comprehension strategies in the primary years using science information texts. Early Childhood Research Quarterly, 20, 276–305.

Roberts, K. M., & Duke, N. K. (2010). Comprehension in the elementary grades: The research base. In K. Ganske & D. Fisher (Eds.), Comprehension across the curriculum: Perspectives and practices K–12 (pp. 23–45). New York, NY: Guilford Press.

Shanahan, T. (2006). Relations among oral language, reading, and writing development. In C. A. MacArthur, S. Graham, & J. Fitzgerald (Eds.), Handbook of writing research (pp. 171–186). New York, NY: Guilford Press.

Shanahan, T., Callison, K., Carriere, C., Duke, N. K., Pearson, P. D., Schatschneider, C., & Torgesen, J. (2010). Improving reading comprehension in kindergarten through 3rd grade: A practice guide (NCEE 2010-4038). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. Retrieved from http://whatworks.ed.gov/publications/practiceguides

Stahl, K. A. D. (2004). Proof, practice, and promise: Comprehension strategy instruction in the primary grades. The Reading Teacher, 57, 598–610.

Stecker, P. M., Fuchs, L. S., & Fuchs, D. (2005). Using curriculum-based measurement to improve student achievement: Review of research. Psychology in the Schools, 42, 795–819. doi:10.1002/pits.20113


Symons, S., MacLatchy-Gaudet, H., Stone, T. D., & Reynolds, P. L. (2001). Strategy instruction for elementary students searching informational text. Scientific Studies of Reading, 5, 1–33. doi:10.1207/S1532799XSSR0501_1

Taylor, B. M., Pearson, P. D., Clark, K., & Walpole, S. (2000). Effective schools and accomplished teachers: Lessons about primary grade reading instruction in low-income schools. Elementary School Journal, 101, 121–166.

U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics. (2012). National Assessment of Educational Progress (NAEP) various years, 1992–2011 reading assessments. Washington, DC: Author.

van den Broek, P., & Espin, C. (2012). Connecting cognitive theory and assessment: Measuring individual differences in reading comprehension. School Psychology Review, 41, 315–325.

Vellutino, F. R., & Scanlon, D. M. (2002). The Interactive Strategies approach to reading intervention. Contemporary Educational Psychology, 27, 573–635.

Williams, J. P. (2005). Instruction in reading comprehension for primary-grade students: A focus on text structure. The Journal of Special Education, 39, 6–18. doi:10.1177/00224669050390010201

Wright, T. S. (2012). What classroom observations reveal about oral vocabulary instruction in kindergarten. Reading Research Quarterly, 47, 353–355.
