Delaware School Climate Survey—Student: Its factor structure, concurrent validity, and reliability

George G. Bear, Clare Gaskins, Jessica Blank, Fang Fang Chen
University of Delaware, DE, United States

Journal of School Psychology 49 (2011) 157–174

Article history: Received 29 May 2009; received in revised form 15 January 2011; accepted 21 January 2011.

Abstract: The Delaware School Climate Survey—Student (DSCS-S) was developed to provide schools with a brief and psychometrically sound student survey for assessing school climate, particularly the dimensions of social support and structure. Confirmatory factor analyses, conducted on a sample of 11,780 students in 85 schools, showed that a bifactor model consisting of five specific factors and one general factor (School Climate) best represented the data. Those five factors are represented in five subscales of the DSCS-S: Teacher–Student Relations, Student–Student Relations, Fairness of Rules, Liking of School, and School Safety. The factor structure was shown to be stable across grade levels (i.e., elementary, middle, and high school), racial–ethnic groups (i.e., Caucasian, African American, and Hispanic), and gender. As evidence of the survey's concurrent validity, scores for each of the five subscales and the total scale correlated moderately, across groups and at the school level, with academic achievement and suspensions and expulsions. © 2011 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

Keywords: School climate; Program evaluation; Student perceptions; Schoolwide positive behavior supports

Corresponding author at: School of Education, Willard Hall Education Building, University of Delaware, Newark, DE 19709, United States. Fax: +1 302 831 4110. E-mail address: [email protected] (G.G. Bear). Action Editor: Sara Bolt. doi:10.1016/j.jsp.2011.01.001

1. Introduction

During the past several decades, a rapidly growing number of schools have implemented schoolwide programs for preventing behavior problems and promoting mental health.

These include universal-level prevention and promotion programs for social and emotional learning (Durlak, Weissberg, Dymnicki, Taylor, & Schellinger, 2011; Zins & Elias, 2006) and character education (Berkowitz & Schwartz, 2006), School-Wide Positive Behavior Support programs (SWPBS; Sailor, Dunlap, Sugai, & Horner, 2009), and universal programs that focus on preventing more specific behavior problems, such as bullying (Merrell, Gueldner, Ross, & Isava, 2008; Swearer, Espelage, Vaillancourt, & Hymel, 2010) and school violence (American Psychological Association Zero Tolerance Task Force, 2008; Jimerson & Furlong, 2006). What many of these programs have in common is the aim of promoting a positive school climate. Although a wide range of definitions of school climate exist, most refer to positive social relationships. For example, Haynes, Emmons, and Ben-Avie (1997) define school climate as "the quality and consistency of interpersonal interactions within the school community that influence children's cognitive, social, and psychological development" (p. 322). Recognizing the importance of interpersonal relationships but placing additional emphasis on safety, Cohen, McCabe, Michelli, and Pickeral (2009) recently defined school climate as "the quality and character of school life" that includes "norms, values, and expectations that support people feeling socially, emotionally, and physically safe" (p. 182).

School climate has been linked to a wide range of academic, behavioral, and socio-emotional outcomes (Anderson, 1982; Haynes et al., 1997), including academic achievement (Brand, Felner, Shim, Seitsinger, & Dumas, 2003; Griffith, 1999); student academic, social, and personal attitudes and motives (Battistich, Solomon, Kim, Watson, & Schaps, 1995); attendance and school avoidance (Brand et al., 2003; Welsh, 2000); student delinquency (Gottfredson, Gottfredson, Payne, & Gottfredson, 2005; Welsh, 2000); attitudes and use of illegal substances (Brand et al., 2003); bullying (Nansel et al., 2001); victimization (Gottfredson et al., 2005; Welsh, 2000); depression and self-esteem (Brand et al., 2003; Way, Reddy, & Rhodes, 2007); and general behavior problems (Battistich & Horn, 1997; Kuperminc, Leadbeater, & Blatt, 2001; Welsh, 2000). Although a positive school climate is a goal of most schoolwide programs for preventing behavior problems, school climate is seldom evaluated in studies of program effectiveness. The most common method of evaluating the effectiveness of programs for preventing behavior problems in schools has been the use of teacher reports of student behavior (Wilson & Lipsey, 2007). Likewise, in studies of SWPBS, office disciplinary referrals (ODRs) have been the most common outcome measured (Horner & Sugai, 2007). Both teacher ratings and ODRs have their shortcomings. A major shortcoming of teacher reports is reporter bias. That is, in rating student behavior, teachers in intervention schools often are well aware that the interventions implemented are expected to improve student behavior and that their negative ratings are likely to cast a negative light on their school's effectiveness and, in some cases, their own effectiveness. This bias may largely explain why intervention effect sizes tend to be larger when teacher reports, rather than student reports, are used in studies of program effectiveness (Wilson & Lipsey, 2007). ODRs also have multiple shortcomings (Morrison, Redding, Fisher, & Peterson, 2006). Perhaps chief among them is that decreases in ODRs may occur without improvements in student behavior. Instead of improvement in behavior, reduced ODRs may simply reflect normal fluctuations in ODRs from year to year and changes in referral policies and practices (Wright & Dusek, 1998). To be sure, both teacher ratings and ODRs also have their advantages, especially when used as part of a multimethod system of assessing program needs and effectiveness (Irvin, Tobin, Sprague, Sugai, & Vincent, 2004; McIntosh, Frank, & Spaulding, 2010). However, in addition to the disadvantages noted above, they do not assess, nor are they intended to assess, school climate and student perceptions of their schools.

The primary purpose of the present study was to develop a brief and psychometrically sound instrument for assessing student perceptions of school climate, the Delaware School Climate Survey—Student version (DSCS-S). Initiated and supported by Delaware's SWPBS project, the DSCS-S was intended to complement existing methods and measures that many schools use to indicate a school's effectiveness, such as state-wide achievement tests, suspension/expulsion rates, teacher ratings of student behavior, and ODRs. Of particular focus was the development of a valid and reliable self-report survey that schools could use to assess student perceptions of those aspects of school climate related to the aims of two program initiatives in Delaware: SWPBS, which is now implemented in approximately 60% of schools in Delaware, and bullying prevention programs, which are mandated by state law and thus implemented to one degree or another in all schools. Those program initiatives focus on improving relations among students and between teachers and students, establishing clear and fair expectations and rules, reducing student conduct problems, and increasing school safety.

1.1. Theoretical roots of the DSCS-S and supporting research

The development of the DSCS-S was guided by two theoretical frameworks: (a) authoritative discipline theory (Baumrind, 1971, 1996; Bear, 2005; Brophy, 1996; Gregory & Cornell, 2009) and (b) Stockard and Mayberry's (1992) theoretical framework of school climate.

Supported by research on childrearing (Baumrind, 1971, 1996; Lamborn, Mounts, Steinberg, & Dornbusch, 1991) and research on school discipline and school climate (Brophy, 1996; Gregory et al., 2010), authoritative discipline theory asserts that the most effective style of discipline, authoritative discipline, comprises a balance of two broad components. Those two components are responsiveness and demandingness (Baumrind, 1996), which also are called support and structure (Gregory & Cornell, 2009; Gregory et al., 2010). Responsiveness, or social support, refers to the extent to which adults (and also peers) are responsive to children's social and emotional needs; it is seen in others' demonstrations of warmth, acceptance, and caring. Demandingness, or structure, refers to the extent to which adults present clear behavioral expectations and fair rules, enforce those rules consistently and fairly, and provide necessary supervision and monitoring of student behavior. A healthy balance of responsiveness and demandingness fosters both willing compliance with rules and the social and emotional competencies that underlie self-discipline (Bear, 2010; Brophy, 1996). This combination also has been found to promote student perceptions of safety (Gregory et al., 2010) and liking of teachers and schools (Osterman, 2000).

1.1.1. Stockard and Mayberry's Theoretical Framework of School Climate

An emphasis on responsiveness and demandingness is also seen in Stockard and Mayberry's (1992) theoretical framework of school climate. Based on their comprehensive review of sociological, psychological, and economic theories and research on organizations, which included the effective schools and school climate literatures, Stockard and Mayberry concluded that school climate is best conceptualized as consisting of two broad dimensions: social action and social order. Social action is similar to responsiveness, or social support, in authoritative discipline theory, with its emphasis on the everyday social interactions among teachers, staff, and students (i.e., the presence of caring, understanding, concern, and respect). In contrast, social order is similar to demandingness, or structure, with its primary goal being to curtail behavior problems and promote safety. Several studies by Griffith (1995, 1999) have supported Stockard and Mayberry's framework, showing that elementary school students' perceptions of social action and social order, and particularly the former, were related to their self-reports of academic performance and satisfaction.

1.1.2. Related Student Surveys of School Climate

Other than the untitled school climate survey developed by Griffith (1995, 1999) specifically for research purposes to test Stockard and Mayberry's (1992) framework, we know of no school climate survey that was developed based specifically on the responsiveness–demandingness, or support–structure, dimensions of school discipline and school climate. However, to one extent or another, most surveys of school and classroom climate include these two dimensions (Griffith, 1999; Stockard & Mayberry, 1992). These surveys include three with demonstrated reliability and validity evidence: the Classroom Environment Scale (CES; Moos & Trickett, 1974, 2002; Trickett & Quinlan, 1979), the Inventory of School Climate—Student version (ISC-S; Brand et al., 2003), and the School Climate Survey (SCS; Emmons, Haynes, & Comer, 2002; Haynes, Emmons, & Comer, 1994). The CES, widely used for the past three decades to assess climate at the classroom level, consists of six subscales. Two subscales assess relationships, or social support (Affiliation and Teacher Support), and two other subscales assess system maintenance and system change, or structure (Order and Organization and Rule Clarity). The ISC-S, similar to the CES in theoretical framework, is designed specifically for middle school students. It consists of 10 subscales, 4 of which are consistent with the dimension of social support (Teacher Support, Positive Peer Interactions, Negative Peer Interactions, and Support for Cultural Pluralism) and 3 with the dimension of structure (Consistency and Clarity of Rules and Expectations, Disciplinary Harshness, and Safety Problems). Finally, the SCS, which includes an elementary and middle school version and a high school version, assesses both social support

and structure, with much greater emphasis on the former. Five of the six subscales of the most recent elementary and middle school version (Emmons et al., 2002) focus on social support: Student Interpersonal Relations, Student–Teacher Relations, Parent Involvement, Sharing of Resources, and Fairness (defined as the equal treatment of students). A sixth subscale, Order and Discipline (which assesses appropriate behavior), focuses on structure. The most recent high school version (Emmons et al., 2002) has the same factors (with the exception of the Fairness factor being replaced by a School Building factor, for reasons that are unclear).

Consistent with authoritative discipline theory and Stockard and Mayberry's (1992) theoretical framework, each of the school climate surveys above, as well as the one presented in the current study, assumes a social–ecological perspective in which an individual's perceptions of the social environment, and especially social transactions, rather than objective reality per se, are viewed as most important in understanding human behavior (Bandura, 1986, 1997; Bronfenbrenner, 1979). Both student and teacher perceptions are important; thus, many surveys of school climate, including those above, have student and teacher versions. However, student perceptions, which tend to be less positive than teacher perceptions (Anderson, 1982; Fisher & Fraser, 1983), are used most often in research on school climate. This pattern reflects not only the desire to avoid possible teacher bias in program evaluations, as noted previously, but also the wide recognition of the value of student perceptions of school climate in school reform efforts (Eccles et al., 1993; Haynes et al., 1997). It also is consistent with research and theory indicating that students' perceptions of their classroom or school environments are more important than objective indicators of those environments in understanding student motivation, adjustment, and wellbeing (Connell & Wellborn, 1991; Eccles et al., 1993).

1.1.3. Research Supporting the Proposed Factors of the DSCS-S

Guided by authoritative discipline theory and Stockard and Mayberry's (1992) theoretical framework, the DSCS-S was designed to assess components of social support and structure consistent with the primary goals of SWPBS and bullying prevention programs. Two subscales were created to assess responsiveness, or social support: Teacher–Student Relations and Student–Student Relations. Three subscales were created to assess demandingness, or structure: Fairness of Rules, School Safety, and Student Conduct Problems. A sixth subscale, Liking of School, also was included, for reasons described below.

1.1.3.1. Relations among teachers and students. Inclusion of the Teacher–Student Relations subscale was supported by research showing that students feel more comfortable and supported in schools and classrooms in which teachers are caring, respectful, and provide emotional support (e.g., Battistich, Solomon, Watson, & Schaps, 1997; Osterman, 2000). In those environments, students experience greater school completion (Croninger & Lee, 2001), on-task behavior (Battistich et al., 1997), self-reported academic initiative (Danielsen, Wiium, Wilhelmsen, & Wold, 2010), academic achievement (Fredricks, Blumenfeld, & Paris, 2004; Gregory & Weinstein, 2004), peer acceptance (Hughes, Cavell, & Wilson, 2001), and motivation to act responsibly and prosocially (Wentzel, 1996). They also engage in less oppositional and antisocial behavior (Bru, Stephens, & Torsheim, 2002; Hamre, Pianta, Downer, & Mashburn, 2008; Jessor et al., 2003), including bullying (Gregory et al., 2010).

1.1.3.2. Relations among students. In support of the Student–Student Relations subscale, research shows that students who are rejected by their peers are at increased risk for disruptive behavior, poor achievement, disliking of school, school avoidance, and not completing school (Buhs, Ladd, & Herald, 2006; Welsh, 2000). Students who engage in negative peer interactions are more likely to show delinquent and aggressive behaviors and more likely to report low self-esteem and depression (Brand et al., 2003). In contrast, social support from classmates has been shown to be related to academic initiative (Danielsen et al., 2010), to moderate victimization and distress for boys (Davidson & Demaray, 2007), and to predict externalizing and adaptive behaviors for girls (Rueger, Malecki, & Demaray, 2008).

1.1.3.3. Fairness of rules. Research supporting the Fairness of Rules subscale shows that students' perceptions of the fairness of school rules are related significantly to greater student engagement and academic achievement and to less delinquent behavior, aggression, and student victimization (Arum, 2003; Brand et al., 2003; Gottfredson et al., 2005). Research also shows that students engage in less offending and misconduct when they perceive rules to be fair (Welsh, 2000, 2003). Multiple school climate surveys include a subscale designed to assess fairness of rules (e.g., Brand et al., 2003; Furlong et al., 2005; Gottfredson, 1999).

1.1.3.4. School safety and student conduct problems. The School Safety and Student Conduct Problems subscales were included in light of research showing that students and teachers perceive school climate more favorably when they feel safe (Kitsantas, Ware, & Martinez-Arias, 2004) and when aggression and victimization are not common (Astor, Benbenishty, Zeira, & Vinokur, 2002; Goldstein, Young, & Boyd, 2008). Students who perceive fewer safety problems at school tend to be more academically adjusted, engage in less delinquent and aggressive behavior, and report greater self-esteem and fewer depressive symptoms (Brand et al., 2003; Horner et al., 2009). Whereas Teacher–Student Relations items are commonly found on student surveys of school climate, school safety and student conduct problems have, until recently (i.e., post-Columbine), generally not been included. When included, some surveys present school safety and student conduct problems items on the same subscale (e.g., Barnett, Easton, & Israel, 2002; Brand et al., 2003), whereas others (e.g., Center for Social and Emotional Education [CSEE], 2009; Emmons et al., 2002; Gottfredson, 1999) tend to include items surveying student perceptions of either school safety or student conduct problems but not both. In the current study, we developed items to tap both, expecting two distinct factors to emerge, as found on the California Safety and School Climate Survey (Furlong et al., 2005).

1.1.3.5. Liking of school. Although liking of school does not fall clearly under either the social support or structure category, it has nevertheless been shown to be an important aspect of school climate and was of specific interest in the Delaware Department of Education's (DOE) evaluation of school climate. Thus, we included a Liking of School subscale. Research shows that liking of school correlates with less delinquency, alcohol and substance use, violence, suicidality, and emotional stress (Fredricks et al., 2004; Resnick et al., 1997) and with higher levels of academic achievement (Ding & Hall, 2007; Thompson, Iachan, Overpeck, Ross, & Gross, 2006). Whereas some studies have treated liking of school as a distinct construct measured separately from school climate (e.g., Child Development Project, 1993; Ladd & Price, 1987), others have included liking of school as one of several components of the school climate or environment (e.g., Ding & Hall, 2007).

    1.2. Guiding pragmatic concerns and goals of the study

In addition to theory and research, the development of the DSCS-S was guided by more pragmatic concerns. As noted previously, the DOE desired a student survey of demonstrated validity and reliability that assessed the social support and structure aspects of school climate targeted in SWPBS and bullying prevention programs. The DOE also desired a survey that could be administered in less than 20 min (to avoid teacher resistance to its use and to limit time taken from instruction) and one that could be used for grades 3–12. The latter was to allow schools to monitor changes in scores on the same subscales from year to year and across grade levels. Finally, preference was for a survey that was not commercially produced and costly. We found no survey that fit all of those criteria. Given that 29 states currently provide or mandate school climate assessments and that only one state (i.e., Rhode Island) offers evidence of the validity and reliability of the assessment instruments used (Cohen et al., 2009), the need for a survey such as the DSCS-S is clearly evident not only in Delaware but also in most other states.

The goals of the present study were twofold. The first goal was to provide evidence of the DSCS-S's construct validity, using confirmatory factor analyses to test the proposed six-factor model of school climate as well as alternative models that might better fit the data. The second goal was to provide evidence of the survey's concurrent validity. Consistent with the research reviewed previously, we predicted that favorable student perceptions of school climate would correlate positively with academic achievement and negatively with school suspensions and expulsions.

    2. Method

2.1. Participants

The original sample consisted of 12,262 students enrolled in 85 schools in grades 3–12 in the state of Delaware. Among those students, 482 (3.9% of the sample) were deleted because responses on demographic items (i.e., gender, race–ethnicity, and grade) were either missing or unreadable by the Scantron computerized scoring system used (i.e., bubbles were not filled in dark enough or more than one response was given to the same item). Rather than estimating missing values, we deleted these cases because gender, race–ethnicity, and grade level were used as grouping variables in the multigroup confirmatory factor analyses. In this final sample, missing responses to individual items on the survey ranged from 0.1% to 2.1%. The sample, as used in all statistical analyses that follow, included 11,780 students and 85 schools. There were 7889 students in 58 elementary schools, 2522 in 17 middle schools, and 1369 in 10 high schools.1 The sample represented approximately 50% of public schools in the state, excluding alternative education schools, special education schools, and preschools, and consisted of approximately 10% of the state's public school population of 122,281 students.

Table 1 provides student demographic information on the sample, as obtained from the surveys, and the percentage of students in each category statewide as reported by the Delaware DOE. The Delaware DOE website was used to obtain the percentage of students who qualified for free or reduced lunch within each of the 85 participating schools (which included all students in each school and not just those in the study). As seen in the table, the demographics for the final sample closely approximated those for the state. However, the percentages of African Americans and Caucasians in the sample were slightly less (approximately 3% in each category) than those reported by the state. We speculate that these differences in percentages can be attributed to the differences in the racial–ethnic identification categories used in the current study and those used by the DOE in its reporting of the racial–ethnic composition of each school. That is, in addition to White, Black, Hispanic, and Asian, which are the four categories reported by the DOE based on parent registration of their children upon entering school, students in the current study were given the option of self-identifying as "Other," including mixed races. This "Other" category was chosen by 6.8% of the students. It is speculated that the "Other" category included many multiracial students identified by the DOE as either Black or White.

Table 1
Demographic information of the sample.

                      Elementary      Middle          High           Full sample     Statewide
Gender
  Boys                3873 (49.1%)    1240 (49.2%)    667 (48.7%)    5923 (50.3%)    51.8%
  Girls               4016 (50.9%)    1282 (50.8%)    702 (51.3%)    5857 (49.7%)    48.4%
Race/ethnicity
  Caucasian           3962 (50.2%)    1203 (47.7%)    803 (58.7%)    5968 (50.7%)    53.9%
  African American    2355 (29.9%)    832 (33.0%)     341 (24.9%)    3528 (29.9%)    33.0%
  Hispanic            819 (10.4%)     251 (10.0%)     93 (6.8%)      1163 (9.9%)     8.8%
  Asian               203 (2.6%)      54 (2.1%)       58 (4.2%)      315 (2.7%)      3.0%
  Other               550 (7.0%)      182 (7.2%)      74 (5.4%)      806 (6.8%)      a
Free/reduced lunch b  47%             45%             34%            44%             42%

a This category was not reported by the state.
b Average percentage of students across schools who qualified for free or reduced lunch, based on each school's report for the entire student body of the school.

    2.2. Measures

2.2.1. Delaware School Climate Survey—Student (DSCS-S)

The proposed 29 items of the DSCS-S resulted from a two-year process during which multiple items were drafted and refined, with reviews of each draft conducted by five members of Delaware's SWPBS training team. This team consisted of two university professors in school psychology, a DOE supervisor of special education, and the SWPBS project director and coordinator. Two years prior to the current study, an initial draft of the survey consisting of 29 items was field-tested in two schools.

1 For purposes of this study, and consistent with the classification used by the Delaware DOE, an elementary school was defined as a school with students below the 6th-grade level, a middle school was defined as a school with grades no higher than the 8th grade or lower than the 5th grade, and a high school was defined as a school with grades 9 to 12. Among the elementary schools contributing to this study's sample, 2 consisted of grades kindergarten (K) to 3, 11 consisted of grades K to 4, 1 consisted of grades 2 to 4, 32 consisted of grades K to 5, 5 consisted of grades 1 to 5, 3 consisted of grades K to 6, and 4 consisted of grades 4 to 6. Among the middle schools, 2 included grades 5 to 6, 2 included grades 5 to 8, 9 included grades 6 to 8, and 4 included grades 7 to 8. All 10 high schools included grades 9 to 12.

The survey was completed by 221 students in an elementary school (grades K–5) and 155 students in a middle school (grades 6–8). Items were designed to assess four school climate factors: Caring Relationships, Fairness of Rules, School Safety, and Liking of School. Following this administration, and based on feedback from the faculty at each school, six items were deleted and others were revised. As a result of the difficulty teachers observed among younger students when completing the survey, it was recommended that the survey not be given to students below third grade, even if read aloud.

Following the above revisions, and the addition of six items designed to assess student conduct problems (e.g., fighting, stealing, and cheating), the survey was re-administered to a larger number of students the following school year for purposes of conducting exploratory factor analyses. The revised 29 items were administered to 4103 students in grades 3–12 enrolled in 21 elementary schools (n = 2896), 6 middle schools (n = 899), and 2 high schools (n = 308). Results of exploratory factor analyses, conducted for all grades combined using principal axis factor analyses with both orthogonal (i.e., varimax) and oblique (i.e., oblimin) rotations, indicated five factors: Teacher–Student Relations, Student–Student Relations, Student Conduct Problems, School Safety, and Liking of School. A forced six-factor solution, with oblimin rotation, yielded the expected sixth factor, Fairness of Rules, with an eigenvalue of 1.03 (and .945 for School Safety). These results led to minor revisions in the wording of several items, with the intent of producing a more robust and consistent six-factor solution across elementary, middle, and high schools. For this purpose, and for the current study, the revised survey was administered the following year to a large number of students, and confirmatory factor analyses were conducted, as explained later in the data analyses and results sections.
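For readers who wish to reproduce this kind of exploratory step on their own item-level data, a minimal sketch follows. It is not the authors' analysis code: the data matrix `items`, the variable names, and the use of the third-party factor_analyzer Python package are illustrative assumptions.

```python
# Sketch of a principal-axis EFA with oblique rotation, paralleling the
# exploratory step described above. Assumes `items` is an (n_students x 29)
# numpy array or pandas DataFrame of Likert responses with no missing data.
import numpy as np
from factor_analyzer import FactorAnalyzer  # third-party package (assumption)

def run_efa(items, n_factors, rotation="oblimin"):
    """Fit a principal-axis factor analysis; return loadings and eigenvalues."""
    fa = FactorAnalyzer(n_factors=n_factors, method="principal", rotation=rotation)
    fa.fit(items)
    original_eigenvalues, _ = fa.get_eigenvalues()
    return fa.loadings_, original_eigenvalues

# Example use: compare a free 5-factor solution with a forced 6-factor solution,
# as in the text (the sixth factor is expected to separate Fairness of Rules).
# loadings5, eigenvalues = run_efa(items, n_factors=5)
# loadings6, _           = run_efa(items, n_factors=6)
```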

As used in the current study, and described later, the DSCS-S consisted of a total of 29 items designed to assess those aspects of school climate reviewed previously: Teacher–Student Relations (8 items), Student–Student Relations (4 items), Fairness of Rules (4 items), Student Conduct Problems (6 items), School Safety (3 items), and Liking of School (4 items). Teacher–Student Relations refers to the perceived quality of students' interactions with adults in the school; in particular, adults' demonstration of warmth, respect, recognition, and fairness. Student–Student Relations captures students' beliefs about the quality of students' interactions with other students, such as peers showing friendliness and respecting one another. Fairness of Rules assesses students' perceived fairness of school rules and their consequences. Student Conduct Problems was intended to assess the degree to which students perceive fighting, harming others, bullying, stealing, and cheating as problems in their school (e.g., "Stealing [drugs, fighting, etc.] is a problem in this school."). School Safety taps how students generally feel about the level of safety in their school. Liking of School assesses how students generally feel about their school. Consistent with the goal of producing a survey appropriate for students as early as the third grade, the readability level for the final version of the DSCS-S was 2.6 based on the Flesch–Kincaid readability formula (Kincaid, Fishburne, Rogers, & Chissom, 1975). The survey and its guidelines for interpreting scores are available from www.delawarepbs.org.
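For reference, the Flesch–Kincaid grade-level formula cited above combines average sentence length and average syllables per word (Kincaid et al., 1975):

\[ \text{FK grade level} = 0.39\,\frac{\text{total words}}{\text{total sentences}} + 11.8\,\frac{\text{total syllables}}{\text{total words}} - 15.59 \]

A value of 2.6 therefore corresponds roughly to text readable partway through second grade, below the third-grade floor recommended for administration.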

Students responded to the 29 items using a 4-point Likert scale, with 1 = Strongly Disagree, 2 = Disagree, 3 = Agree, and 4 = Strongly Agree. For computing the total school climate score and all other factor and subscale scores, items reflecting a negative school climate (e.g., "I wish I went to another school.") were reverse scored. In the current study, the items and item responses were presented on a Scantron sheet in which students filled in the bubble representing their selected response to each item. Each response sheet was scanned and scored by computer by the DOE.
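As an illustration of the scoring rule just described, the sketch below reverse-scores negatively worded items and sums items into subscale and total scores. The column names and the lists of items are hypothetical placeholders, not the actual DSCS-S item keys.

```python
# Scoring sketch: 4-point Likert items (1-4); negatively worded items are
# reverse-scored as 5 - response before summing. Names are assumptions.
import pandas as pd

REVERSED = ["wish_other_school", "school_feels_like_prison", "rules_too_harsh"]
SUBSCALES = {
    "liking_of_school": ["proud_of_school", "like_this_school",
                         "wish_other_school", "school_feels_like_prison"],
    "school_safety": ["feel_safe", "school_is_safe", "students_feel_safe"],
    # ... remaining subscales omitted for brevity
}

def score_dscs(responses: pd.DataFrame) -> pd.DataFrame:
    """Return subscale sums and a total score from raw 1-4 item responses."""
    items = responses.copy()
    items[REVERSED] = 5 - items[REVERSED]          # reverse-score negative items
    scores = pd.DataFrame(index=items.index)
    for name, cols in SUBSCALES.items():
        scores[name] = items[cols].sum(axis=1)     # subscale = sum of raw items
    scores["total_school_climate"] = scores[list(SUBSCALES)].sum(axis=1)
    return scores
```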

2.2.2. Academic achievement

Academic achievement data were obtained at the school level from the school profiles website of each school, which is maintained by the DOE. The data consisted of the percentage of students in each school that passed the English-Language Arts (reading and writing; ELA) and Mathematics portions of the Delaware Student Testing Program (DSTP) during the year that the DSCS-S was administered (March, 2007). The DSTP is a state-wide standards-based assessment of student progress toward the Delaware content standards in adherence to No Child Left Behind (NCLB) accountability requirements. Items, aligned with state standards, were developed by teachers and external consultants with expertise in each content area. Reliability exceeds .90 for each portion of the DSTP (Delaware Department of Education, 2009). A passing score indicates that a student met or exceeded the state standards. Five criterion-based performance scores are used: 1 = well below the standard, 2 = below the standard, 3 = meets the standard, 4 = exceeds the standard, and 5 = distinguished. Thus, a score of 3 or above is passing. (Note that, in this study, only dichotomized data at the school level were employed, based on availability.)

2.2.3. Suspensions and expulsions

A school-level suspension and expulsion score consisted of the percentage of students in each school (using a non-duplicated count) who were suspended from school or expelled during the school year of the study. All schools in Delaware are required to report to the DOE the number of students suspended or expelled, and those data were obtained from the DOE. The types of violations were not reported, but they generally ranged from repeated acts of disruptive behavior to criminal offenses, such as possession of drugs and fighting.

    2.3. Procedures

All 85 schools participating in the study volunteered to do so upon an invitation from the Delaware DOE in a letter sent to each school district office. In return for their participation, each school was promised a report of the survey results. Because of cost limitations, each school was sent 200 surveys. The average school return rate was 73%, with a range of 19.5% to 100%, indicating that each school returned an average of 144 completed surveys out of the 200 provided. The average school return rate was 70% for elementary schools, 79% for middle schools, and 72% for high schools. Each school was directed to administer the school climate surveys to classrooms it was to select at random and was given steps for doing so. In the written directions for administering the surveys, schools were specifically instructed to either (a) enter all regular classroom teachers' names into a hat and draw enough classes to complete the 200 surveys or (b) select teachers from the school roster by choosing teachers' names in alphabetical order (e.g., every 3rd teacher on the school roster) to obtain a sufficient number of classrooms. If students changed classrooms, as is done in middle and high schools, schools were told to select teachers in only one subject area that was required of all students (e.g., math, English, or social studies). Written instructions also were given for the administration of the survey; teachers were asked to read the survey aloud in elementary schools and to assure all students that their responses were confidential. To ensure confidentiality, students were told not to record their names or any identification numbers. Likewise, as requested by the DOE, no method was used to identify the student's classroom or teacher. However, items on the survey asked students to identify their grade, race (White, Black, Hispanic, Asian, or Other, including mixed races), and gender. All surveys were completed in late February and March.

    2.4. Data Analyses

In our analyses of the factor structure of the DSCS-S, we used confirmatory factor analysis because it is the recommended method when the factorial structure of a scale is hypothesized based on theory or previous research, including previous exploratory factor analyses of the data (Thompson, 2004). We first tested the proposed six-factor model, with each item specified as an indicator of the factor corresponding to its assigned subscale. In addition, we estimated a bifactor model with a general factor and six specific factors. After discovering that the Student Conduct Problems factor did not correlate appreciably with the other factors and that items on this factor did not load on the general factor of school climate, we eliminated this factor and examined three alternative CFA models that might better represent the structure of the DSCS-S. These alternative models included a one-factor model, a five-factor model, and a bifactor model with one general factor and five specific factors. The Satorra–Bentler scaled chi-square difference test was used to compare nested models (Asparouhov & Muthén, 2010), and Akaike Information Criterion (AIC) values were used to compare non-nested models.
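To make the model-comparison step concrete, the sketch below implements one commonly documented form of the Satorra–Bentler scaled chi-square difference test (using each model's scaled chi-square, degrees of freedom, and scaling correction factor) and a simple AIC comparison for non-nested models. It is an illustrative reconstruction, not the authors' Mplus analysis, and the formula should be checked against the software documentation before use.

```python
# Hedged sketch: Satorra-Bentler scaled chi-square difference (nested models)
# and AIC comparison (non-nested models). T0/T1 are the scaled chi-squares of
# the more/less restricted models, c0/c1 their scaling correction factors,
# d0/d1 their degrees of freedom (one commonly cited formulation).
from scipy.stats import chi2

def sb_scaled_difference(T0, d0, c0, T1, d1, c1):
    """Return the scaled difference test statistic, its df, and its p-value."""
    cd = (d0 * c0 - d1 * c1) / (d0 - d1)      # difference-test scaling correction
    trd = (T0 * c0 - T1 * c1) / cd            # scaled difference chi-square
    df = d0 - d1
    return trd, df, chi2.sf(trd, df)

def prefer_by_aic(aic_a, aic_b):
    """Lower AIC is preferred when comparing non-nested models."""
    return "A" if aic_a < aic_b else "B"

# Example use (all arguments hypothetical):
# trd, df, p = sb_scaled_difference(T0, d0, c0, T1, d1, c1)
```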

For cross-validation purposes, the sample was randomly divided into two subsamples. The first sample was used to examine model fit for the hypothesized model and the three alternative models. The second sample was used to verify and replicate the final model derived from the first sample.

Mplus 5.2 (Muthén & Muthén, 1998–2008) was used for conducting the confirmatory factor analyses. We performed missing data analysis using the full information maximum likelihood (FIML) estimator in Mplus. FIML is a recommended procedure for estimating parameters with incomplete data. Because students were nested within schools, we calculated intraclass correlations (ICCs) for each of the factor scores to assess the degree to which variability in student responses could be accounted for at the school level. The ICCs on the factor scores in elementary schools ranged from .06 to .15. In middle and high schools, ICCs ranged from .04 to .12 and from .08 to .18, respectively. ICCs were much higher when calculated without controlling for school type (e.g., elementary, middle, and high school). Because the ICCs indicated that student responses were nonindependent and a portion of the variance was accounted for at the school level, all confirmatory factor analyses accounted for the nesting of students within schools, and individual item responses were centered around the school mean by utilizing the centering command in Mplus. Group-mean centering addressed the clustering issue by removing the school mean differences from the item responses, thereby producing ICCs of zero for each item.
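The sketch below shows one standard way to obtain an ICC(1) from a one-way random-effects decomposition and to group-mean center item responses by school. It parallels, but does not reproduce, the Mplus-based procedure described above; the column names are assumptions.

```python
# ICC(1) from a one-way ANOVA decomposition, plus school-mean centering.
# Assumes a long-format DataFrame with a grouping column (e.g., "school")
# and a numeric column of scores or item responses.
import pandas as pd

def icc1(df: pd.DataFrame, group: str = "school", value: str = "score") -> float:
    """ICC(1) = (MSB - MSW) / (MSB + (k - 1) * MSW), k = average cluster size."""
    grand_mean = df[value].mean()
    groups = df.groupby(group)[value]
    n_groups = groups.ngroups
    k = len(df) / n_groups                                    # average cluster size
    ssb = (groups.size() * (groups.mean() - grand_mean) ** 2).sum()
    ssw = ((df[value] - groups.transform("mean")) ** 2).sum()
    msb = ssb / (n_groups - 1)
    msw = ssw / (len(df) - n_groups)
    return (msb - msw) / (msb + (k - 1) * msw)

def center_within_school(df: pd.DataFrame, group: str, items: list) -> pd.DataFrame:
    """Subtract each school's mean from its students' item responses."""
    out = df.copy()
    out[items] = df[items] - df.groupby(group)[items].transform("mean")
    return out
```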

Chi-square difference tests were calculated using the Satorra–Bentler scaled chi-square difference test (Asparouhov & Muthén, 2010). Given that chi-square fit statistics are sensitive to sample size and violations of the normality assumption, three other commonly used fit indices were also employed to assess model fit: the Comparative Fit Index (CFI), the Root Mean-Square Error of Approximation (RMSEA), and the Standardized Root Mean-Square Residual (SRMR). The CFI, an incremental fit index, assesses the degree to which the tested model is superior to an alternative model in reproducing the observed covariance matrix. Because this fit index evaluates goodness of fit, the larger the value, the better the model fit. Generally, a value greater than or equal to .95 is considered an adequate fit (Hu & Bentler, 1998). The RMSEA and SRMR are absolute fit indices that assess the degree to which the model-implied covariance matrix matches the observed covariance matrix. Because these indices estimate the degree of poor model fit, the smaller the value, the better the model fit. Commonly applied rules of thumb for these fit indices suggest that values less than or equal to .08 reflect an adequate fit (Hu & Bentler, 1998).
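For readers less familiar with these indices, the sketch below computes CFI and RMSEA from chi-square values and degrees of freedom using their standard formulas (SRMR requires the full residual correlation matrix and is omitted). The values passed in would come from the fitted model and a baseline (independence) model; this is a generic illustration, not the exact computation performed by Mplus.

```python
# Standard formulas for two of the fit indices described above.
import math

def cfi(chi2_model, df_model, chi2_baseline, df_baseline):
    """Comparative Fit Index: improvement over the baseline (independence) model."""
    d_model = max(chi2_model - df_model, 0.0)
    d_base = max(chi2_baseline - df_baseline, 0.0)
    return 1.0 - d_model / max(d_base, d_model, 1e-12)

def rmsea(chi2_model, df_model, n):
    """Root Mean-Square Error of Approximation for a sample of size n."""
    return math.sqrt(max(chi2_model - df_model, 0.0) / (df_model * (n - 1)))

# e.g., rmsea(1359.48, 207, 5907) returns roughly .031, in line with Table 2.
```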

In order to investigate whether the surveys were of comparable factor structure across different groups of respondents (i.e., elementary, middle, and high school students; racial–ethnic groups; and boys and girls), measurement invariance was tested in a hierarchical fashion by testing configural invariance, weak factorial invariance, and strong factorial invariance (Meredith, 1993; Widaman & Reise, 1997).2 The purpose of testing configural invariance is to investigate whether groups share the same structure (i.e., whether the same items load on the same latent factors) in the confirmatory factor analysis. When testing for this type of invariance, the pattern of freed and fixed parameters is kept the same across groups; however, the estimates for the parameters in the groups are independent. Configural invariance is supported if the fit indices for the groups are adequate. If configural invariance is not achieved, comparing groups on the same scale would be similar to comparing apples with oranges (Chen, 2007; Chen & West, 2008).

If configural invariance between groups is found, the next step is to test for weak factorial invariance to examine whether the groups use an equal unit of measurement in their responses to the survey items. This test is done by constraining the factor loadings of the groups to be equal, with all other parameters estimated independently. Because the subsequent models are nested within one another, the difference or change between the fit indices for the models was calculated and used to evaluate pattern invariance. Stringent criteria have been recommended for evaluating weak factorial invariance with total sample sizes greater than 300: a decrease in CFI of at least .010, supplemented by an increase in RMSEA of at least .015 or an increase in SRMR of at least .030, indicates noninvariance (Chen, 2007). When groups have large differences in sample size, even more stringent criteria may be imposed, in which a decrease in CFI of at least .010 alone indicates noninvariance. After weak factorial invariance is found, strong factorial invariance is tested by constraining the factor loadings and intercepts to be equal across the groups. If strong factorial invariance is found, it suggests that the point of origin for the scale is equal across groups. We used the following criteria for evaluating strong factorial invariance: a decrease in CFI of at least .010, supplemented by an increase in RMSEA of at least .015 or an increase in SRMR of at least .010, indicates noninvariance (Chen, 2007).
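The decision rule just described can be expressed compactly. The sketch below flags noninvariance from changes in CFI, RMSEA, and SRMR between two nested invariance models, using the Chen (2007) cutoffs cited above; the default thresholds shown are the ones given for weak factorial invariance with large samples.

```python
# Chen (2007)-style rule as summarized above: a drop in CFI of >= .010, paired
# with an increase in RMSEA of >= .015 or an increase in SRMR of >= .030,
# signals noninvariance (the SRMR threshold tightens to .010 for strong invariance).
def noninvariant(cfi_before, cfi_after, rmsea_before, rmsea_after,
                 srmr_before, srmr_after,
                 d_cfi=.010, d_rmsea=.015, d_srmr=.030):
    cfi_drop = cfi_before - cfi_after
    rmsea_rise = rmsea_after - rmsea_before
    srmr_rise = srmr_after - srmr_before
    return cfi_drop >= d_cfi and (rmsea_rise >= d_rmsea or srmr_rise >= d_srmr)

# Grade-level example from Table 5 (configural -> weak):
# noninvariant(.958, .958, .032, .031, .030, .034)  ->  False (invariance holds)
```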

After conducting the confirmatory factor analyses, we determined the correlation between the factors and the internal consistency of each factor. Next, using schoolwide-level data, we examined evidence of concurrent validity by correlating the mean schoolwide scores on each factor with the percentage of students earning passing scores in English-Language Arts (ELA) and Mathematics and with suspension and expulsion rates. For purposes of examining differences in correlations among variables as a function of the school's grade levels (i.e., 58 elementary versus 27 middle and high schools), middle and high school samples were combined because of the small number of middle schools (n = 17) and high schools (n = 10).

2 Strict factorial invariance, in which the residuals are constrained to be equal, is the most rigorous test of measurement invariance and is seldom found, particularly in developmental research. Thus, strict factorial invariance was not examined in this study.

3. Results

3.1. Confirmatory Factor Analysis

3.1.1. Exclusion of Student Conduct Problems subscale

The proposed six-factor model yielded adequate fit indices, χ²(362, N = 5907) = 2766.47, p < .001; CFI = .934, RMSEA = .034, and SRMR = .042. However, the Student Conduct Problems factor did not correlate significantly with two of the factors: Teacher–Student Relations, r = .02 (p = .390), and Fairness of Rules, r = .02 (p = .450). It was only weakly correlated with the three other factors: Student–Student Relations, r = .24 (p < .001); Liking of School, r = .10 (p < .001); and School Safety, r = .24 (p < .001). These weak correlations were particularly striking given that the estimated correlations between the five other factors ranged from .50 to .75. A bifactor model with six specific factors and a general factor was also estimated, χ²(348, N = 5907) = 2513.61, p < .001; CFI = .942, RMSEA = .032, and SRMR = .037. Notably, two of the six items from the proposed Student Conduct Problems factor did not load significantly onto the general factor (with loadings of .02 and .03), and although the loadings for the other four items were statistically significant (p < .05), their values were low, ranging from .05 to .26. The absence of strong relations between the Student Conduct Problems factor and the other factors, together with the low factor loadings of items from the Student Conduct Problems factor on the general factor, provided strong evidence that the Student Conduct Problems factor did not measure the same construct measured by the other five factors. Thus, the Student Conduct Problems subscale was excluded from subsequent analyses.

3.1.2. Testing alternative models

As illustrated in Table 2, a one-factor model, the first and most parsimonious of the three alternative models, yielded poor fit statistics. When compared to a nested five-factor model that yielded better fit statistics, the Satorra–Bentler scaled chi-square difference test indicated that the five-factor model fit significantly better than the one-factor model, scaled Δχ² = 340.71 (df = 14), p < .001. Finally, a bifactor model with one general factor (i.e., School Climate) and five specific factors was estimated. Each of the fit indices for this model exceeded the recommended cutoff criteria. The Akaike Information Criterion (AIC) values from the five-factor model (AIC = 284,305.73) and the bifactor model (AIC = 284,047.44) were compared. Because the bifactor model had a lower AIC value, it was selected as the final model.

3.1.3. Confirming fit of final model

Confirmatory factor analyses on the second randomly selected half of the sample also generated robust fit statistics for the bifactor model with one general factor and five specific factors, χ²(207, N = 5873) = 1179.08, p < .001; CFI = .965, RMSEA = .028, and SRMR = .028. The completely standardized factor loadings were also compared to ensure that there were no large differences across the randomly selected samples. As illustrated in Table 3, the 23 items generally had similar factor loadings on the five specific factors and the general factor in the two random samples. Because no appreciable differences in the fit indices or factor loadings were found for the two halves of the sample, all subsequent analyses were run with the full sample. A summary of the fit statistics for the bifactor model with the full sample and subsamples is presented in Table 4.

Table 2
Fit statistics for models tested.

Model               χ²*       df    CFI    SRMR   RMSEA
One-factor model    7226.12   234   .765   .064   .071
Five-factor model   1547.41   220   .955   .029   .032
Bifactor model      1359.48   207   .961   .028   .031

Note. Models were tested on one half of the sample, randomly selected. N = 5907. χ² = chi-square statistic; df = degrees of freedom; CFI = Comparative Fit Index; SRMR = Standardized Root Mean Square Residual; RMSEA = Root Mean Square Error of Approximation.
* p < .001.

Table 3
Confirmatory factor analysis of the DSCS-S: bifactor model.

Item                                                      Sample 1 (Loading, SE, z)    Sample 2 (Loading, SE, z)

General factor
I am proud of my school.                                  .70  .01  51.60               .69  .01  50.11
I like this school.                                       .69  .01  55.54               .66  .01  57.93
I feel safe in this school.                               .62  .02  40.68               .63  .01  46.47
Adults in this school treat students fairly.              .62  .01  46.24               .61  .01  49.09
Teachers listen to you when you have a problem.           .59  .01  48.64               .59  .01  46.99
The school's Code of Conduct is fair.                     .57  .01  39.27               .55  .01  38.15
School rules are fair.                                    .55  .02  36.70               .55  .01  37.50
Adults who work in this school care about the students.   .54  .01  40.15               .55  .01  38.91
This school feels like a prison.                          .53  .02  34.23               .51  .02  26.84
I like my teachers.                                       .53  .02  31.54               .51  .02  31.45
This school is safe.                                      .53  .02  35.56               .53  .01  45.03
Students feel safe in this school.                        .53  .02  32.67               .55  .01  38.85
Teachers treat students with respect.                     .52  .01  37.78               .52  .02  31.08
Teachers are fair when correcting misbehavior.            .52  .02  31.70               .51  .02  31.41
Teachers care about their students.                       .51  .01  35.41               .51  .02  31.96
Students treat each other with respect.                   .50  .02  29.65               .51  .01  41.64
Students really care about each other.                    .48  .02  31.10               .48  .01  33.54
I wish I went to another school.                          .48  .02  27.01               .48  .02  29.28
Students are friendly towards most other students.        .45  .02  29.62               .42  .02  26.59
Teachers let you know when you are doing a good job.      .41  .02  22.24               .40  .02  25.29
Students get along with one another.                      .40  .02  23.74               .38  .02  23.06
Consequences of breaking rules are fair.                  .35  .02  20.98               .32  .02  16.03
The rules are too harsh.                                  .32  .02  13.92               .33  .02  14.55

Factor 1: Teacher–Student Relations
Teachers care about their students.                       .53  .02  27.35               .53  .02  23.81
Adults who work in this school care about the students.   .40  .02  18.01               .40  .02  17.62
Teachers treat students with respect.                     .38  .02  18.00               .40  .02  16.51
I like my teachers.                                       .34  .02  15.12               .33  .02  15.43
Teachers listen to you when you have a problem.           .28  .02  11.81               .31  .02  13.65
Adults in this school treat students fairly.              .26  .03  10.44               .29  .02  13.32
Teachers let you know when you are doing a good job.      .22  .03  8.57                .25  .03  10.11
Teachers are fair when correcting misbehavior.            .21  .02  9.11                .26  .02  13.46

Factor 2: Student–Student Relations
Students really care about each other.                    .56  .02  29.60               .53  .02  26.98
Students get along with one another.                      .54  .02  29.67               .56  .02  30.95
Students treat each other with respect.                   .52  .02  25.54               .47  .02  30.14
Students are friendly towards most other students.        .46  .02  27.04               .45  .02  26.38

Factor 3: Fairness of Rules
School rules are fair.                                    .54  .03  18.39               .47  .03  16.42
The rules are too harsh.                                  .33  .03  12.12               .32  .03  11.94
The school's Code of Conduct is fair.                     .28  .03  8.65                .35  .03  12.72
Consequences of breaking rules are fair.                  .21  .03  6.28                .28  .03  9.77

Factor 4: School Safety
Students feel safe in this school.                        .49  .02  20.89               .43  .02  19.58
This school is safe.                                      .47  .02  22.16               .47  .02  21.66
I feel safe in this school.                               .47  .02  20.65               .45  .02  19.97

Factor 5: Liking of School
I wish I went to another school.                          .54  .03  18.39               .52  .03  19.76
I like this school.                                       .50  .02  20.42               .55  .03  23.78
I am proud of my school.                                  .30  .02  13.49               .30  .02  15.58
This school feels like a prison.                          .17  .02  7.79                .12  .02  6.13

Note. Loading = standardized factor loading; SE = standard error; z = robust z score.

3.1.4. Measurement invariance across grade level

A model testing the configural invariance of the confirmatory factor analysis across elementary, middle, and high school grade levels yielded fit statistics suggesting adequate model fit (see Table 5). The difference between test statistics for the weak factorial (Model 2) and configural (Model 1) invariance models indicated that there was weak factorial invariance across grade level, Satorra–Bentler scaled chi-square difference test = 137.70 (df = 46), p < .001; ΔCFI = .000, ΔRMSEA = .001, and ΔSRMR = .004. When the test statistics for the strong factorial (Model 3) and weak factorial (Model 2) invariance models were compared, invariance in the starting point of origin for the subscale was found across grade level, Satorra–Bentler scaled chi-square difference test = 748 (df = 34), p < .001; ΔCFI = .002, ΔRMSEA = .000, and ΔSRMR = .000.

Table 4
Fit statistics between groups for bifactor model.

Model             N        χ²        df    CFI    SRMR   RMSEA
Full sample       11,780   2250.69   207   .963   .027   .029
Elementary        7889     1493.26   207   .965   .026   .028
Middle            2522     816.46    207   .955   .038   .034
High              1369     589.85    207   .950   .037   .037
Caucasian         5968     1298.51   207   .964   .028   .030
African American  3528     803.68    207   .963   .028   .029
Hispanic          1163     472.05    207   .952   .034   .033
Boys              5923     1242.63   207   .965   .028   .029
Girls             5857     1244.32   207   .963   .027   .029

Note. χ² = chi-square statistic; df = degrees of freedom; CFI = Comparative Fit Index; SRMR = Standardized Root Mean-Square Residual; RMSEA = Root Mean-Square Error of Approximation.

3.1.5. Measurement invariance across racial–ethnic groups

A model testing the configural invariance of the confirmatory factor analysis across three different racial–ethnic groups (i.e., Caucasian, African American, and Hispanic) yielded fit statistics suggesting adequate model fit (see Table 5). Reports from students who indicated Asian or Other racial–ethnic identity were excluded from the racial–ethnic group measurement invariance analyses due to small sample sizes. The difference between test statistics for the weak factorial (Model 2) and configural (Model 1) invariance models indicated that there was weak factorial invariance across race–ethnicity, Satorra–Bentler scaled chi-square difference test = 82.43 (df = 46), p < .001; ΔCFI = .001, ΔRMSEA = .001, and ΔSRMR = .003. When the test statistics for the strong factorial (Model 3) and weak factorial (Model 2) invariance models were compared, invariance in the starting point of origin for the subscale was found across race, Satorra–Bentler scaled chi-square difference test = 61.93 (df = 34), p = .002; ΔCFI = .002, ΔRMSEA = .000, and ΔSRMR = .000.

Table 5
Fit statistics for confirmatory factor analysis of bifactor model testing measurement invariance across grade level, race, and gender.

              χ²        df    CFI    SRMR   RMSEA
Grade level
  Model 1     3280.40   655   .958   .030   .032
  Model 2     3340.96   701   .958   .034   .031
  Model 3     3502.17   735   .956   .034   .031
Race
  Model 1     2747.42   655   .961   .029   .030
  Model 2     2738.29   701   .962   .032   .029
  Model 3     2870.23   735   .960   .032   .029
Gender
  Model 1     2588.45   431   .963   .028   .029
  Model 2     2540.06   454   .964   .029   .028
  Model 3     2634.62   471   .962   .029   .028

Note. Model 1: configural invariance. Model 2: weak factorial invariance. Model 3: strong factorial invariance. χ² = chi-square statistic; df = degrees of freedom; CFI = Comparative Fit Index; SRMR = Standardized Root Mean Square Residual; RMSEA = Root Mean Square Error of Approximation.

3.1.6. Measurement invariance across gender

The test statistics for configural invariance (Model 1) across gender indicated adequate model fit (see Table 5). The weak factorial invariance model (Model 2) was nested within Model 1. The difference between test statistics for the two models indicated that there was weak factorial invariance across gender, Satorra–Bentler scaled chi-square difference test = 28.99 (df = 23), p = ns; ΔCFI = .001, ΔRMSEA = .001, and ΔSRMR = .001. The strong factorial model (Model 3) was nested within Model 2. The difference between test statistics for the two models indicated that there was strong factorial invariance across gender, Satorra–Bentler scaled chi-square difference test = 40.38 (df = 17), p = .001; ΔCFI = .002, ΔRMSEA = .000, and ΔSRMR = .000.

3.2. Correlations among Factors and Internal Consistency

To examine the relative independence of scores for the five subscales supported by the results of confirmatory factor analyses, and the extent to which they assess the school climate construct, correlations between subscale scores were computed. For these analyses, and all other analyses that follow, we used manifest indicators of the factors (i.e., sums of raw scores of items on the derived subscales and total scale). As shown in Table 6, for all students combined, correlation coefficients ranged in strength of value (i.e., absolute value) from .42 to .66, with a median of .58. Those results indicate that 56% (1 − .66² = .56) to 83% (1 − .42² = .83) of the variance in each subscale score is independent of the scores on the other subscales. Correlational analyses between scores on the five subscales also were computed for each subgroup across race–ethnicity, gender, and grade level, resulting in 80 correlations (10 subscale pairs for each of 3 racial–ethnic groups, 2 genders, and 3 grade levels). Coefficients among subscales ranged from .26 to .68, with a median of .53. All correlation coefficients were statistically significant at the .001 level.

Table 6
Internal consistency coefficients and correlation coefficients between subscale and total scale scores for the full sample.

                               1      2      3      4      5      6
1. Teacher–Student Relations   (.88)
2. Student–Student Relations   .51    (.81)
3. Fairness of Rules           .64    .42    (.70)
4. School Safety               .61    .57    .49    (.83)
5. Liking of School            .66    .52    .59    .65    (.83)
6. Total Scale                 .89    .73    .77    .79    .85    (.94)

Note. Values in parentheses are coefficients of internal consistency (Cronbach's alpha) for each subscale. All correlations are significant at p < .001.

With respect to the reliability of DSCS-S scores (see Table 6), for all students combined across grade levels, internal consistency coefficients ranged from .70 to .88. The reliability of scores for each of the five subscales also was computed for each subgroup (3 racial–ethnic groups, 2 genders, and 3 grade levels). Among the 40 reliability analyses computed, the median coefficient was .81, with coefficients ranging from .63 (Fairness of Rules for elementary students) to .89 (Teacher–Student Relations for Caucasian students). There were negligible differences between the alpha coefficients for elementary school (range of .63 to .84, median = .80), middle school (range of .70 to .87, median = .80), and high school (range of .70 to .86, median = .79) students; between Caucasian (range of .73 to .89, median = .84), African American (range of .67 to .87, median = .79), and Hispanic (range of .66 to .88, median = .81) students; and between boys (range of .71 to .88, median = .82) and girls (range of .69 to .89, median = .84). Across all subgroups, the lowest alpha coefficients were for Fairness of Rules and the highest were for Teacher–Student Relations.3

For the total score of the DSCS–S, consisting of the sum of raw scores on all items of the five subscales (while reverse scoring items reflecting a negative climate), high reliability was found across racial–ethnic, gender, and grade-level groups (range .91 to .94, with an overall alpha of .94 for all students combined).
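For practitioners scoring the survey locally, the sketch below shows one way a total-scale score and its Cronbach's alpha could be computed from raw item responses. The synthetic data, the 1–4 response coding, and the two reverse-keyed item names are hypothetical placeholders; the actual items and scoring key are described earlier in the article.

import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha from a respondents-by-items data frame."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Synthetic stand-in data: 200 respondents, 29 items coded 1-4.
# A real analysis would load the administered survey file instead.
rng = np.random.default_rng(0)
responses = pd.DataFrame(rng.integers(1, 5, size=(200, 29)),
                         columns=[f"item_{i:02d}" for i in range(1, 30)])

# Hypothetical reverse-keyed items; the published scoring key identifies
# which items are worded toward a negative climate.
reverse_keyed = ["item_07", "item_12"]
responses[reverse_keyed] = 5 - responses[reverse_keyed]   # 1-4 scale assumed

total_scale = responses.sum(axis=1)                        # total climate score
print(round(cronbach_alpha(responses), 2))

With real item responses, the same function can be applied to each subscale's columns to reproduce the subscale alphas reported in Table 6.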

³ We also examined reliability coefficients at each grade level, 3–12, with a particular concern about reliability at the lowest grades. Compared to all higher grades, at grades 3 and 4 alpha coefficients were lower for the Teacher–Student Relationships, Fairness of Rules, and School Safety subscales. However, for grades 3 and 4, respectively, those coefficients were .81 and .83 for Teacher–Student Relationships and .78 and .77 for School Safety. It was only for Fairness of Rules that coefficients fell below .70 (.57 for grade 3 and .62 for grade 4), but, as noted previously, coefficients on this subscale tended to be low at all grade levels (below .70 at all but two grade levels). Thus, there was little evidence that the subscales were less reliable in the lower grades.

Table 7
Aggregated means and standard deviations of school climate, suspensions/expulsions, and achievement data by school level.

                                Elementary a          Middle and high b      Full sample
                                M        SD           M        SD            M        SD
School climate scores
  Teacher–Student Relations     26.52    1.30         21.54    1.56          24.93    2.71
  Student–Student Relations     10.38    1.07          8.95     .64           9.93    1.16
  Fairness of Rules             12.13     .69          9.96     .77          11.44    1.24
  School Safety                  9.60     .88          7.85     .79           9.04    1.18
  Liking of School              12.63    1.21         10.31     .93          11.89    1.56
  Total School Climate          71.56    4.89         58.79    4.09          67.50    7.56
Suspensions/expulsions           5.62    5.90         23.16   10.60          11.19   11.22
Academic achievement
  ELA achievement               79.66    8.45         72.56    9.25          77.40    9.27
  Math achievement              78.86    9.65         60.63   12.56          73.00   13.57

ELA = English-Language Arts.
a n = 58 schools.
b n = 27 schools.

3.3. Evidence of Concurrent Validity

3.3.1. Correlations with academic achievement and suspensions/expulsions

At the schoolwide level, using aggregated scores across all students within each school, we examined correlations between DSCS–S scores, suspension and expulsion rates, and academic achievement. Means and standard deviations are reported in Table 7. Due to small sample sizes, middle schools (n = 17) and high schools (n = 10) were combined. As shown in Table 8, scores on the five subscales and the total scale tended to correlate moderately, and in the expected directions, with academic achievement and suspension and expulsion rates. With the exception of scores on the Student–Student Relations subscale, which correlated .24 with ELA achievement at the middle and high school level, scores on all five subscales and the total scale correlated significantly with academic achievement and with suspensions and expulsions.
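At the school level, this analysis amounts to averaging student scores within each school and correlating the aggregates with each school's achievement and discipline figures. The sketch below illustrates that aggregation with a few made-up schools; the column names and values are hypothetical and are not data from the study.

import pandas as pd

# Hypothetical student- and school-level records; names and values are illustrative.
students = pd.DataFrame({
    "school_id":     [1, 1, 2, 2, 3, 3],
    "total_climate": [70, 74, 60, 58, 66, 69],
})
schools = pd.DataFrame({
    "school_id": [1, 2, 3],
    "ela_pct":   [82.0, 71.5, 78.0],   # % of students meeting the ELA standard
    "susp_rate": [4.2, 25.0, 9.8],     # suspensions/expulsions per 100 students
})

# Aggregate student reports to the school level, then correlate.
climate = (students.groupby("school_id")["total_climate"]
           .mean().rename("climate").reset_index())
merged = schools.merge(climate, on="school_id")
print(merged[["climate", "ela_pct", "susp_rate"]].corr().round(2))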

    4. Discussion

The aim of this study was to develop a brief and psychometrically sound survey for school psychologists and other educators to use in assessing school climate. In achieving this aim, multigroup confirmatory factor analyses showed that the DSCS–S is best represented by a bifactor model consisting of five specific factors and one general factor. The five specific factors are Teacher–Student Relations, Student–Student Relations, Fairness of Rules, School Safety, and Liking of School. Results demonstrated configural, weak, and strong factorial invariance across three grade levels (i.e., elementary, middle, and high school), racial–ethnic groups (i.e., Caucasian, African American, and Hispanic), and gender. Scores for each of the five subscales and the total score also were found to be reliable across grade-level, racial–ethnic, and gender groups, with internal consistency coefficients across those groups ranging from .63 to .89 for the subscales and from .91 to .94 for the total scale.

Table 8
Correlations between school climate and academic achievement and suspensions/expulsions.

                              Elementary school a       Middle and high schools b     Full sample
                              ELA    Math   S/E         ELA    Math   S/E             ELA    Math   S/E
Teacher–Student Relations     .59    .53    -.72        .40    .64    -.48            .56    .77    -.83
Student–Student Relations     .72    .63    -.67        .24    .40    -.68            .66    .70    -.75
Fairness of Rules             .63    .55    -.67        .42*   .58    -.42            .59    .76    -.81
School Safety                 .76    .68    -.74        .45    .66    -.70            .69    .81    -.84
Liking of School              .72    .66    -.71        .44    .69    -.60            .68    .80    -.81
Total School Climate          .72    .65    -.73        .47    .72    -.66            .65    .81    -.85

ELA = English-Language Arts. S/E = suspensions and expulsions.
a n = 58 schools.
b n = 27 schools.
** p < .01. * p < .05.


In support of the DSCS–S's concurrent validity, at the schoolwide level, school climate scores on nearly all of the subscales correlated positively with academic achievement and negatively with suspension and expulsion rates. At the elementary level and for all grades combined, all correlations were appreciable and statistically significant. All but one correlation (i.e., Student–Student Relations with ELA achievement) was significant at the middle and high school level. Most impressive were the correlation coefficients for the full sample, which ranged from .56 to .85 in absolute value, and particularly the correlations between the school climate subscales and suspension and expulsion rates. For the full sample, Total School Climate scores correlated -.85 with suspension and expulsion rates, reflecting a strong inverse relation between the two variables. This correlation is confounded by middle and high schools tending to have lower school climate scores and higher suspension and expulsion rates than elementary schools. However, when this confound is controlled by examining elementary schools separately from middle and high schools, the correlation coefficients decrease in magnitude but remain appreciable (i.e., -.73 for elementary schools and -.66 for middle and high schools).

Results did not support the inclusion of a proposed sixth subscale, Student Conduct Problems, with items assessing bullying, fighting, harming others, stealing, cheating, and drugs. Not only did a bifactor model with five specific factors (excluding Student Conduct Problems) and a general factor provide the strongest model fit, but the proposed Student Conduct Problems factor also correlated poorly with the other five factors (with rs ranging from .02 to .24), indicating that the proposed subscale did not measure the same construct measured by the other five subscales. Thus, we discourage schools from including the items on the original DSCS–S that were intended to measure student perceptions of schoolwide student conduct problems as a dimension of school climate. We plan to delete those items in a future revision of the survey and to use a scale separate from school climate to assess student conduct problems, when such assessment is desired.

    4.1. Limitations and Future Research

Whereas the limited number of items (29) on the DSCS–S is a practical strength, requiring only about 15–20 min for most students to complete, this limited number of items also presents limitations to the survey with respect to its scope and reliability. The DSCS–S assesses only five aspects of school climate, primarily aspects of social support and structure that are of particular focus in SWPBS and bullying prevention programs. It was not designed to tap aspects of school climate included in some other surveys. As discussed previously, social support and structure are the two most common dimensions found on surveys of school climate. However, two other dimensions also are often found: (a) teaching and learning and (b) environmental–structural (Cohen et al., 2009). With respect to providing a comprehensive survey of school climate, which was not the intention of the current study, the exclusion of those two dimensions may be viewed as a limitation of the DSCS–S.

The small number of items also limits the DSCS–S's internal reliability, especially for the subscales with the fewest items (e.g., one subscale has three items and three subscales have four items). Despite the small number of items, for the total sample, internal reliability coefficients ranged from .70 to .88 across the five subscales, and the coefficient was .94 for the total scale. Internal reliabilities above .60 are considered marginal, above .70 adequate, and above .80 high (Cicchetti & Sparrow, 1981). Although it consists of only three items, the School Safety subscale yielded a coefficient of .83. When examining coefficients for each of the five subscales across subgroups based on grade level, gender, and race, coefficients fell below .70 only for the Fairness of Rules subscale (.63 for elementary students, .67 for African Americans, .66 for Hispanics, and .69 for girls). We plan to add items to the Fairness of Rules subscale, including several that will tap clarity of expectations and rules, to strengthen its internal reliability.
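One way to gauge how much lengthening the Fairness of Rules subscale might raise its reliability is the Spearman–Brown prophecy formula. The projection below is purely illustrative, assumes any added items behave like the existing ones, and is not an estimate reported in this study:

\rho_{k} = \frac{k\,\alpha}{1 + (k - 1)\,\alpha}, \qquad \text{e.g., } k = 2,\ \alpha = .63 \;\Rightarrow\; \rho_{2} = \frac{2(.63)}{1 + .63} \approx .77 .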

Another limitation of the study concerns the sampling procedures employed. Schools were not randomly selected; rather, they volunteered to participate. Moreover, within-school sampling occurred at the classroom level, not the individual student level. Although schools were directed to select classrooms randomly, the extent to which this occurred is unknown. Because each school was given 200 surveys to administer, irrespective of school enrollment, a larger percentage of students participated in smaller schools than in larger schools. Thus, it is unlikely that the study included a truly random sample of students in Delaware. However, as noted previously, the demographics of the final sample closely matched those of the general student population in the state.

Other limitations of the DSCS–S, recommendations for its use, and suggestions for future research are the same ones that apply to many other studies and surveys of school climate. Caution is warranted in generalizing the results to other populations, and one should not interpret the correlations among variables as implying causation. Longitudinal studies are needed to examine causal relations as well as to establish the survey's stability over time. Likewise, additional reliability and validity studies are needed to support the DSCS–S's psychometric properties, including test–retest reliability, convergent validity (e.g., correlations with other surveys of school climate), predictive validity (e.g., predicting changes in achievement over time), discriminant validity (e.g., examining whether school climate scores discriminate between schools with high and low achievement or school suspensions), and additional studies of concurrent validity (e.g., correlations with other social and emotional outcomes such as bullying, friendships, and self-concept). Studies employing multilevel analyses are especially needed to examine differences in variance in school climate across individual, school, and classroom levels. Finally, the survey's sensitivity to changes in school climate should be explored. Related to this recommendation, another possible limitation of the current study, which applies to most other studies of school climate, is that it is unclear to what extent scores were affected by prevention programs being implemented in the participating schools. Although many schools in Delaware were implementing SWPBS, and all had some type of bullying prevention program, we do not believe that prevention efforts differed from those found in most other states, with schools varying greatly in types of programs (including variations within SWPBS schools) and the fidelity of implementation.

    4.2. Implications for School Psychologists

The DSCS–S offers researchers and practitioners in school psychology and education a brief survey of school climate that is valid, reliable, and free to the public. The survey should be particularly useful in evaluating the effectiveness of schoolwide prevention and intervention programs designed to improve relationships among teachers and students, school safety, the fairness of school rules, and student liking of school. These programs would include ones for preventing bullying and school violence, as well as more general programs for school and self-discipline (Bear, 2010), social and emotional learning (see CASEL.org, the website for the Collaborative for Academic, Social, and Emotional Learning), schoolwide positive behavioral supports and interventions (see PBIS.org, the website for the Technical Assistance Center for Positive Behavioral Interventions and Supports), and character education (see character.org, the website for the Character Education Partnership). Each of those programs targets school climate in its interventions and also views school climate as an important measurable outcome.

With training in prevention, consultation, child and adolescent development, and program evaluation, school psychologists should be active members of schoolwide teams charged with implementing those programs. As members of those teams, school psychologists are in an excellent position to advocate for evidence-based interventions that improve school climate as a means of achieving other positive outcomes, such as decreased bullying and violence and enhanced social, emotional, and academic learning. Likewise, they are in an excellent position to argue that, in many school prevention programs, student perceptions of school climate are as important as, or more important than, other outcomes more commonly evaluated in the schools, such as office disciplinary referrals, suspensions and expulsions, and teacher ratings of student behavior. That is, regardless of the number of behavior problems reported by the school, students may or may not like their school or perceive it favorably with respect to safety, fairness, and relationships with teachers and students (Arum, 2003; Bear, 2010), perceptions that have been shown to be related to a number of important academic, social, and emotional outcomes.

    References

American Psychological Association Zero Tolerance Task Force (2008). Are zero tolerance policies effective in the schools? An evidentiary review and recommendations. American Psychologist, 63, 852–862.
Anderson, C. S. (1982). The search for school climate: A review of the research. Review of Educational Research, 52, 368–420.
Arum, R. (2003). Judging school discipline: The crisis of moral authority. Cambridge, MA: Cambridge University Press.
Asparouhov, T., & Muthén, B. (2010). Computing the strictly positive Satorra–Bentler chi-square test in Mplus. Mplus Web Notes: No. 12. Retrieved November 20, 2010, from http://www.statmodel.com/examples/webnotes/webnote12.pdf
Astor, R. A., Benbenishty, R., Zeira, A., & Vinokur, A. (2002). School climate, observed risky behaviors, and victimization as predictors of high school students' fear and judgments of school violence as a problem. Health Education & Behavior, 29, 716–736.
Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Upper Saddle River, NJ: Prentice-Hall.
Bandura, A. (1997). Self-efficacy: The exercise of control. New York: Freeman.
Barnett, R. V., Easton, J., & Israel, G. D. (2002). Keeping Florida's children safe in school: How one state designed a model safe school climate survey. School Business Affairs, 68, 31–38.
Battistich, V., & Horn, A. (1997). The relationship between students' sense of their school as a community and their involvement in problem behaviors. American Journal of Public Health, 87, 1997–2001.
Battistich, V., Solomon, D., Kim, D., Watson, M., & Schaps, E. (1995). Schools as communities, poverty levels of student populations, and students' attitudes, motives, and performance: A multilevel analysis. American Educational Research Journal, 32, 627–658.
Battistich, V., Solomon, D., Watson, M., & Schaps, E. (1997). Caring school communities. Educational Psychologist, 32, 137–151.
Baumrind, D. (1971). Current patterns of parental authority. Developmental Psychology Monographs, 4, 1–103.
Baumrind, D. (1996). The discipline controversy revisited. Family Relations, 45, 405–414.
Bear, G. G. (2010). School discipline and self-discipline: A practical guide to promoting prosocial student behavior. New York: Guilford Press.
Bear, G. G. (with Cavalier, A., & Manning, M.) (2005). Developing self-discipline and preventing and correcting misbehavior. Boston, MA: Allyn & Bacon.
Berkowitz, M. W., & Schwartz, M. (2006). Character education. In G. G. Bear & K. M. Minke (Eds.), Children's needs III: Development, prevention, and intervention (pp. 15–27). Bethesda, MD: National Association of School Psychologists.
Brand, S., Felner, R., Shim, M., Seitsinger, A., & Dumas, T. (2003). Middle school improvement and reform: Development and validation of a school-level assessment of climate, cultural pluralism, and school safety. Journal of Educational Psychology, 95, 570–588.
Bronfenbrenner, U. (1979). The ecology of human development. Cambridge, MA: Harvard University Press.
Brophy, J. E. (1996). Teaching problem students. New York: Guilford Press.
Bru, E., Stephens, P., & Torsheim, T. (2002). Students' perceptions of class management and reports of their own misbehavior. Journal of School Psychology, 40, 287–307.
Buhs, E. S., Ladd, G. W., & Herald, S. L. (2006). Peer exclusion and victimization: Processes that mediate the relation between peer group rejection and children's classroom engagement and achievement? Journal of Educational Psychology, 98, 1–13.
Center for Social and Emotional Education (2009). Comprehensive school climate inventory. Retrieved September 11, 2009, from http://www.schoolclimate.org/programs/csci.php
Chen, F. F. (2007). Sensitivity of goodness of fit indexes to lack of measurement invariance. Structural Equation Modeling: A Multidisciplinary Journal, 14, 464–504.
Chen, F. F., & West, S. G. (2008). Measuring individualism and collectivism: The importance of considering differential components, reference groups, and measurement invariance. Journal of Research in Personality, 42, 259–294.
Child Development Project (1993). Liking for school. Oakland, CA: Developmental Studies Center.
Cicchetti, D. V., & Sparrow, S. S. (1981). Developing criteria for establishing interrater reliability of specific items: Applications to assessment of adaptive behavior. American Journal of Mental Deficiency, 86, 127–137.
Cohen, J., McCabe, E. M., Michelli, N. M., & Pickeral, T. (2009). School climate: Research, policy, practice, and teacher education. Teachers College Record, 111, 180–213.
Connell, J. P., & Wellborn, J. G. (1991). Competence, autonomy, and relatedness: A motivational analysis of self-system processes. In M. R. Gunnar & L. A. Sroufe (Eds.), Self processes and development: The Minnesota symposium on child psychology (pp. 43–77). Mahwah, NJ: Erlbaum.
Croninger, R. G., & Lee, V. E. (2001). Social capital and dropping out of high school: Benefits to at-risk students of teachers' support and guidance. Teachers College Record, 103, 548–581.
Danielsen, A. G., Wiium, N., Wilhelmsen, B. U., & Wold, B. (2010). Perceived support provided by teachers and classmates and students' self-reported academic initiative. Journal of School Psychology, 48, 247–267.
Davidson, L. M., & Demaray, M. K. (2007). Social support as a moderator between victimization and internalizing–externalizing distress from bullying. School Psychology Review, 36, 383–405.
Delaware Department of Education (2009). Delaware Student Testing Program. Retrieved May 4, 2009, from http://www.doe.k12.de.us/aab/
Ding, C., & Hall, A. (2007). Gender, ethnicity, and grade differences in perceptions of school experiences among adolescents. Studies in Educational Evaluation, 33, 159–174.
Durlak, J. A., Weissberg, R. P., Dymnicki, A. B., Taylor, R. D., & Schellinger, K. B. (2011). The impact of enhancing students' social and emotional learning: A meta-analysis of school-based universal interventions. Child Development, 82, 474–501.
Eccles, J. S., Midgley, C., Buchanan, C. M., Wigfield, A., Reuman, D., & Mac Iver, D. (1993). Development during adolescence: The impact of stage/environment fit. American Psychologist, 48, 90–101.
Emmons, C., Haynes, N. M., & Comer, J. P. (2002). School climate survey: Elementary and middle school version (Rev. ed.). New Haven, CT: Yale University Child Study Center.
Fisher, D. L., & Fraser, B. J. (1983). A comparison of actual and preferred classroom environments as perceived by science teachers and students. Journal of Research in Science Teaching, 20, 55–61.
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74, 59–109.
Furlong, M. J., Greif, J. L., Bates, M. P., Whipple, A. D., Jimenez, T. C., & Morrison, R. (2005). Development of the California School Climate and Safety Survey–Short Form. Psychology in the Schools, 42, 137–149.
Goldstein, S. E., Young, A., & Boyd, C. (2008). Relational aggression at school: Associations with school safety and social climate. Journal of Youth and Adolescence, 37, 641–654.
Gottfredson, G. D. (1999). User's manual for the Effective School Battery. Ellicott City, MD: Gottfredson Associates.
Gottfredson, G. D., Gottfredson, D. C., Payne, A. A., & Gottfredson, N. C. (2005). School climate predictors of school disorder: Results from a national study of delinquency prevention in schools. Journal of Research in Crime and Delinquency, 42, 412–444.
Gregory, A., & Cornell, D. (2009). Tolerating adolescent needs: Moving beyond zero tolerance policies in high school. Theory into Practice, 48, 106–113.
Gregory, A., Cornell, D., Fan, X., Sheras, P., Shih, T., & Huang, F. (2010). High school practices associated with lower student bullying and victimization. Journal of Educational Psychology, 102, 483–496.
Gregory, A., & Weinstein, R. S. (2004). Connection and regulation at home and in school: Predicting growth in achievement for adolescents. Journal of Adolescent Research, 19, 405–427.
Griffith, J. (1995). An empirical examination of a model of social climate in elementary schools. Basic and Applied Social Psychology, 17, 97–117.
Griffith, J. (1999). School climate as social order and social action: A multi-level analysis of public elementary school student perceptions. Social Psychology of Education, 2, 339–369.
Hamre, B. K., Pianta, R. C., Downer, J. T., & Mashburn, A. J. (2008). Teachers' perceptions of conflict with young students: Looking beyond problem behaviors. Social Development, 17, 115–136.
Haynes, N. M., Emmons, C., & Ben-Avie, M. (1997). School climate as a factor in student adjustment and achievement. Journal of Educational and Psychological Consultation, 8, 321–329.
Haynes, N. M., Emmons, C., & Comer, J. P. (1994). School Climate Survey: Elementary and middle school version. New Haven, CT: Yale University Child Study Center.
Horner, R., & Sugai, G. (2007). Is School-wide Positive Behavior Support an evidence-based practice? Retrieved March 18, 2008, from http://www.pbis.org
Horner, R. H., Sugai, G., Smolkowski, K., Eber, L., Nakasato, J., Todd, A. W., et al. (2009). A randomized, wait-list controlled effectiveness trial assessing School-wide Positive Behavior Support in elementary schools. Journal of Positive Behavior Interventions, 11, 133–144.
Hu, L., & Bentler, P. M. (1998). Fit indices in covariance structure modeling: Sensitivity to underparameterized model misspecification. Psychological Methods, 3, 424–453.
Hughes, J. N., Cavell, T. A., & Wilson, V. (2001). Further support for the developmental significance of the quality of the teacher–student relationship. Journal of School Psychology, 39, 289–301.
Irvin, L. K., Tobin, T. J., Sprague, J. R., Sugai, G., & Vincent, C. G. (2004). Validity of office discipline referral measures as indices of school-wide behavioral status and effects of school-wide behavioral interventions. Journal of Positive Behavior Interventions, 6, 131–147.
Jessor, R., Turbin, M. S., Costa, F. M., Dong, Q., Zhang, H., & Wang, C. (2003). Adolescent problem behavior in China and the United States: A cross-national study of psychosocial protective factors. Journal of Research on Adolescence, 13, 329–360.
Jimerson, S. R., & Furlong, M. (Eds.). (2006). Handbook of school violence and school safety: From research to practice. Mahwah, NJ: Erlbaum.
Kincaid, J. P., Fishburne, R. P., Jr., Rogers, R. L., & Chissom, B. S. (1975). Derivation of new readability formulas (Automated Readability Index, Fog Count and Flesch Reading Ease Formula) for Navy enlisted personnel (Research Branch Report 8-75). Millington, TN: Naval Technical Training, U.S. Naval Air Station, Memphis.
Kitsantas, A., Ware, H. W., & Martinez-Arias, R. (2004). Students' perceptions of school safety: Effects by community, school environment, and substance use variables. The Journal of Early Adolescence, 24, 412–430.
Kuperminc, G. P., Leadbeater, B. J., & Blatt, S. J. (2001). School social climate and individual differences in vulnerability to psychopathology among middle school students. Journal of School Psychology, 39, 141–159.
Ladd, G. W., & Price, J. M. (1987). Predicting children's social and school adjustment following the transition from preschool to kindergarten. Child Development, 58, 1168–1189.
Lamborn, S. D., Mounts, N. S., Steinberg, L., & Dornbusch, S. M. (1991). Patterns of competence and adjustment among adolescents from authoritative, authoritarian, indulgent, and neglectful families. Child Development, 62, 1049–1065.
McIntosh, K., Frank, J. L., & Spaulding, S. A. (2010). Establishing research-based trajectories of office discipline referrals for individual students. School Psychology Review, 39, 380–394.
Meredith, W. (1993). Measurement invariance, factor analysis, and factorial invariance. Psychometrika, 58, 525–543.
Merrell, K. W., Gueldner, B. A., Ross, S. W., & Isava, D. M. (2008). How effective are school bullying intervention programs? A meta-analysis of intervention research. School Psychology Quarterly, 23, 26–42.
Moos, R. H., & Trickett, E. (1974). The Classroom Environment Scale manual. Palo Alto, CA: Consulting Psychologists Press.
Moos, R. H., & Trickett, E. (2002). The Classroom Environment Scale (CES) manual (3rd ed.). Menlo Park, CA: Mind Garden.
Morrison, G. M., Redding, M., Fisher, E., & Peterson, R. (2006). Assessing school discipline. In S. R. Jimerson & M. J. Furlong (Eds.), Handbook of school violence and school safety: From research to practice (pp. 211–220). Mahwah, NJ: Erlbaum.
Muthén, L. K., & Muthén, B. O. (1998–2008). Mplus Version 5.2 [Computer software].
Nansel, T. R., Overpeck, M., Pilla, R. S., Ruan, W. J., Simons-Morton, B., & Scheidt, P. (2001). Bullying behaviors among US youth: Prevalence and associations with psychosocial adjustment. Journal of the American Medical Association, 285, 2094–2100.
Osterman, K. F. (2000). Students' need for belonging in the school community. Review of Educational Research, 70, 323–367.
Resnick, M. D., Bearman, P. S., Blum, R. W., Bauman, K. E., Harris, K. M., Jones, J., et al. (1997). Protecting adolescents from harm: Findings from the National Longitudinal Study on Adolescent Health. Journal of the American Medical Association, 278, 823–832.
Rueger, S. Y., Malecki, C. K., & Demaray, M. K. (2008). Gender differences in the relationship between perceived social support and student adjustment during early adolescence. School Psychology Quarterly, 23, 496–514.
Sailor, W., Dunlap, G., Sugai, G., & Horner, R. (Eds.). (2009). Handbook of positive behavior support. New York: Springer.
Stockard, J., & Mayberry, M. (1992). Effective educational environments. Newbury Park, CA: Corwin.
Swearer, S. M., Espelage, D. L., Vaillancourt, T., & Hymel, S. (2010). What can be done about school bullying? Linking research to educational practice. Educational Researcher, 39, 38–47.
Thompson, B. (2004). Exploratory and confirmatory factor analysis: Understanding concepts and applications. Washington, DC: American Psychological Association.
Thompson, D. R., Iachan, R., Overpeck, M., Ross, J. G., & Gross, L. A. (2006). School connectedness in the Health Behavior in School-Aged Children Study: The role of student, school, and school neighborhood characteristics. Journal of School Health, 76, 379–386.
Trickett, E. J., & Quinlan, D. M. (1979). Three domains of classroom environment: Factor analysis of the Classroom Environment Scale. American Journal of Community Psychology, 7, 279–291.
Way, N., Reddy, R., & Rhodes, J. (2007). Students' perceptions of school climate during the middle school years: Associations with trajectories of psychological and behavioral adjustment. American Journal of Community Psychology, 40, 194–213.
Welsh, W. N. (2000). The effects of school climate on school disorder. Annals of the American Academy of Political and Social Science, 567, 88–107.
Welsh, W. N. (2003). Individual and institutional predictors of school disorder. Youth Violence and Juvenile Justice, 1, 346–368.
Wentzel, K. R. (1996). Social and academic motivation in middle school: Concurrent and long-term relations to academic effort. Journal of Early Adolescence, 16, 390–406.
Widaman, K. F., & Reise, S. P. (1997). Exploring the measurement invariance of psychological instruments: Applications in the substance use domain. In K. J. Bryant, M. Windle, & S. G. West (Eds.), The science of prevention: Methodological advances from alcohol and substance abuse research (pp. 281–324). Washington, DC: American Psychological Association.
Wilson, S. J., & Lipsey, M. W. (2007). School-based interventions for aggressive and disruptive behavior: Update of a meta-analysis. American Journal of Preventive Medicine, 33(Suppl. 2S), 130–143.
Wright, J. A., & Dusek, J. B. (1998). Compiling school base rates for disruptive behaviors from student disciplinary referral data. School Psychology Review, 27, 138–147.
Zins, J. E., & Elias, M. J. (2006). Social and emotional learning. In G. G. Bear & K. M. Minke (Eds.), Children's needs III: Development, prevention, and intervention (pp. 1–13). Bethesda, MD: National Association of School Psychologists.
