
www.sciencemag.org/content/351/6274/664/suppl/DC1

Supplementary Materials for

Climate confusion among U.S. teachers

Eric Plutzer,* Mark McCaffrey, A. Lee Hannah, Joshua Rosenau, Minda Berbeco,

Ann H. Reid

*Corresponding author. E-mail: [email protected]

Published 12 February 2016, Science 351, 664 (2016)

DOI: 10.1126/science.aab3907

This PDF file includes

Materials and Methods

Supplementary Text

Fig. S1

Tables S1 to S8

References


Materials and Methods

Data Availability

The entire survey data set, documentation, and analysis files are archived at:

https://dataverse.harvard.edu/dataverse/Plutzer

Survey Methodology

Methodology Overview

The survey was designed and conducted by the Penn State Survey Research Center (SRC) and

the National Center for Science Education, which provided the funding for the study.

The analyses reported here are based on 1,500 surveys completed by teachers from fifty states – 1,299 paper-and-pencil surveys returned by mail and 201 completed online. Surveys were received between November 10, 2014 and February 18, 2015, when we officially closed the survey (questionnaires continued to arrive after February 18; we logged, but did not enter data for, 13 additional paper questionnaires).

The samples were generated by Market Data Retrieval (MDR), a division of Dun & Bradstreet that maintains a continuously updated database of 3.9 million K-12 educators who are identifiable by job title, type of school, and educational sector, as well as name and contact information. Based on SRC specifications, MDR extracted a disproportionate, stratified probability sample of 5,000 middle and high school science teachers and provided the SRC with their contact information.

This sample is designed to represent all middle school science teachers and all biology, life

science, earth science, chemistry and physics teachers in U.S. high schools.

The Penn State IRB reviewed all study materials and determined that the research met the

criteria for exempt research according to the policies of this institution and the provisions of

applicable federal regulations (IRB study # 00960, approved 9/24/2014).

The Penn State SRC then implemented a five-contact mail survey – with a Web option. After

completed interviews were logged and data verified, we calculated a response rate of 37%, using

AAPOR response rate formula RR4 (18).

Before calculation of statistics reported in the paper, SRC calculated inverse probability weights.

The weighting is accomplished in two separate stages. The first stage of weighting adjusts for

different probabilities of selection associated with the number of teachers in each job title. In the

second stage, sample demographics and school characteristics are used to model response

propensity and rebalance the sample to correct for differential non-response (in particular, the

lower response rates of teachers from majority-minority schools).


Methodology Details

Sampling design

In order to have sufficient statistical power to compare middle to high school teachers, and to

compare among high school teachers in different subjects, we employed a disproportionate

stratified sampling design. Our target was to mail out 5,000 survey invitations to five educator

segments, in the following amounts:

1,000 surveys to high school biology and life science teachers

1,000 surveys to high school earth science teachers

500 surveys to high school chemistry teachers

500 surveys to high school physics teachers

2,000 surveys to middle school science teachers.

To achieve these targets, we employed the services of Market Data Retrieval (MDR), a firm that maintains a continuously updated database of 3.9 million K-12 educators who are identifiable by job title, type of school, and educational sector. Included are 69,755 science teachers in public middle schools and 105,477 public high school science teachers.

From this database, MDR staff selected teacher names at random according to the PI's specifications, using a constant sampling fraction for each segment. The sampling fractions needed to achieve the target sample sizes differed across segments because there are, for example, many more biology teachers than earth science teachers.

Stratification by job title and state. The five sampling fractions were applied independently in

each of the 50 states and the District of Columbia. Thus there are 5 segments × 51 geographic

units = 255 strata.

Applying the appropriate sampling fraction to each stratum resulted in an initial mail-out to middle school science teachers (N = 2,000) and to high school teachers of biology (N = 1,001), earth science (N = 1,006), physics (N = 494), and chemistry (N = 492).
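
To make the stratified draw concrete, the sketch below (Python, using pandas) shows how a constant per-segment sampling fraction could be applied within each segment-by-state stratum. The frame DataFrame, its column names, and the use of pandas are illustrative assumptions; this is not MDR's actual extraction code.

    import pandas as pd

    # Hypothetical sampling frame: one row per teacher in the database, with
    # 'segment' (one of the five job-title segments) and 'state' (51 units).
    targets = {"middle": 2000, "biology": 1000, "earth": 1000,
               "chemistry": 500, "physics": 500}

    def draw_sample(frame: pd.DataFrame, seed: int = 1) -> pd.DataFrame:
        # Constant fraction per segment: target mail-out / segment population.
        fractions = {seg: n / (frame["segment"] == seg).sum()
                     for seg, n in targets.items()}
        # Apply each segment's fraction independently in all 5 x 51 = 255 strata.
        return (frame.groupby(["segment", "state"], group_keys=False)
                     .apply(lambda g: g.sample(frac=fractions[g.name[0]],
                                               random_state=seed)))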

Although the research design called for email reminder messages to teachers, the sampling was

implemented without regard for whether an educator had a valid email address in the marketing

database. Teachers with available emails might differ from those without – for example, some

emails are harvested from educators who place orders for educational products from Scholastic;

others from teachers who volunteer to receive marketing information – and we did not want to

bias the initial sample by requiring that MDR had a confirmed, working email address on file.


Questionnaire design

Questionnaires were designed to meet several goals. The first was to cover key topics of interest,

such as topical coverage related to climate change, reports on controversies and potential

pressures from the community, teachers’ personal attitudes and values, as well as their scientific

training and educational background. To the extent possible, we employed question wording used in prior studies; the critical questions regarding the amount of time devoted to global warming, topics covered, personal opinions, and course emphasis were all adapted directly from prior published work by ourselves and other scholars.

Because of the heavy reliance on previously used instruments (e.g., 6, 10), a pre-test was not considered necessary. Only one instrument was entirely new – the use of "foils" embedded in a question about topics for a hypothetical unit on greenhouse gases (reported in Table S8).

An iterative process involving the PI, sponsor, and research staff subjected questions to revision

for content, readability, question order, and formatting. A former high school teacher pre-tested

the questionnaire to confirm readability and length.

The final questionnaires were printed as 12-page booklets and subsequently formatted for a Web version that replicated the visual "look and feel" of the paper survey as closely as possible.

Survey printing

A random subset of the survey booklets for middle school, biology, and earth science teachers was printed in color (1,000 middle school, 500 biology, and 500 earth science) as part of a methodological experiment. This subset of surveys had a color cover page as well as colored table shading. All other survey booklets were printed using black ink and gray tones.

As we detail below, the impact of a full-color cover on response rates was positive overall (but not consistent across segments). However, because this potential confounder was randomly assigned and uncorrelated with all other variables, we do not discuss it further in this report.

Mailings and reminders

Following the recommendations of Dillman, Smyth, and Christian (19), this project used a five-contact postal mail implementation strategy, supplemented by two e-mail contacts.

1. The first postal mailing, sent October 20, 2014, was a pre-notification letter that

explained the importance of the study and alerted teachers that they would receive a

survey within the next week.

2. The second mailing was sent October 31, 2014, and consisted of a cover letter, a survey

booklet, a postage-paid Business Reply Envelope, as well as a $2 bill as a token pre-

incentive.

3. The third mailing consisted of a reminder postcard, sent to non-respondents (N = 3,908)

on November 14, 2014.


4. Approximately 67% of the initial sample had valid emails within the MDR marketing database, and these teachers received an e-mail reminder on November 19, 2014. These messages contained hyperlinks taking them directly to the Survey Research Center's Web survey platform. Those opting to complete the survey over the Web were asked to enter a five-digit ID number, which ensured that no teacher could submit answers more than once.

a. There were, however, 12 teachers who submitted a paper survey and answered at

least one question on the Web survey. In most cases, the Web survey was not

completed and we discarded that record. When both were complete, we retained

the survey that was logged earliest and discarded the second.

5. A second email reminder was sent on December 4, 2014.

6. A replacement survey packet was sent on December 12, 2014. This consisted of a cover

letter, a new survey booklet, and a business reply envelope (but not an additional

incentive). The cover letter accompanying the follow-up survey mailing also included

instructions on how to access the survey via the Internet.

7. A final postcard reminder was sent to teachers on December 15, 2014. This postcard

noted that the study would be ending soon and urged them to complete the survey as soon

as possible.

Response dispositions and response rate

As detailed in Table S2, a total of 1,167 teachers returned the questionnaire that was mailed on October 31st, and an additional 132 teachers returned the questionnaires included in the replacement packets mailed December 12th. In addition, 206 educators completed the survey online.

We also received a small number of outright refusals (N = 13). However, as is typical of mail surveys, most refusals are implicit, taking the form of unreturned questionnaires (N = 3,396).

In addition, we received communication from the intended respondent (N = 61) or from someone else at their school (N = 11) indicating that the selected individual was not eligible because they were not currently teaching the subject for which they were selected. Ineligibility could be due to retirement, having left the school, having shifted to an administrative job, or having been incorrectly classified by job title.

Finally, eight envelopes were returned undelivered by the US Postal Service, and we presume the intended teachers are ineligible.

To estimate the percentage of ineligible respondents among the non-returns, we undertook a non-response audit, detailed below in the section "Non-response & Eligibility Audit." From this audit, we estimate that between 68% and 73% of the unreturned questionnaires reached an eligible respondent who did not respond.


Applying formula RR4 from the American Association for Public Opinion Research (18), and using our most conservative estimate of eligibility (73%), we calculate the overall response rate as 37%. A less conservative estimate of eligibility would lead to a response rate of 39%, as reported in the bottom row of Table S2.
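
As a concrete check, the short Python sketch below reproduces these figures from the total column of Table S2. The grouping of dispositions into completes, refusals, and cases of unknown eligibility is our reading of the table, so treat it as illustrative rather than the SRC's exact calculation.

    completes = 1167 + 131 + 202   # paper + replacement packet + online (Table S2)
    refusals = 10 + 13             # explicit refusals + returned blank
    unknown = 3373 + 18            # never returned + returned after data production

    def rr4(e: float) -> float:
        # AAPOR RR4: completes / (completes + refusals + e * unknown-eligibility cases)
        return completes / (completes + refusals + e * unknown)

    print(f"{rr4(0.73):.1%}")      # ~37% with the conservative e = 73%
    print(f"{rr4(0.68):.1%}")      # ~39% with e = 68%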

Non-response bias and representativeness

As an initial step in assessing systematic non-response, we modeled the probability of responding using logistic regression.

Table S3 reports odds ratios, logits (B), and the standard errors and significance of the logit estimates. Three variables were significant at the 0.05 level. The first was our experimental treatment of color printing: the odds of completing a survey were 17% higher if the initial mailing included a questionnaire packet printed in color. This translates into roughly a 3-point increase in response propensity. Since teachers were assigned color booklets at random, this will not affect any substantive analyses.
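
The translation from an odds ratio to a change in response propensity can be verified directly. A minimal worked example, assuming a baseline response probability near the overall 31% return rate:

    # Convert a 17% increase in odds into a change in probability.
    baseline_p = 0.31                      # approximate overall return rate
    odds = baseline_p / (1 - baseline_p)   # ~0.45
    new_odds = 1.17 * odds                 # odds ratio for the color cover
    new_p = new_odds / (1 + new_odds)      # ~0.345
    print(round(new_p - baseline_p, 3))    # ~0.03, i.e., about 3 points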

Of greater concern is a substantively large impact of racial composition of the student body.

Teachers at schools with large African American or Hispanic enrollments responded at much

lower rates than those at predominantly white schools.

The patterns can be seen more clearly in Table S4. The table shows that the decline in response with increasing Hispanic enrollment is modest, but teachers in majority-Black schools responded at a rate of only 19%. For reasons we do not yet understand, the voice of science teachers at majority-Black schools is diminished in our data set. Weighting can help adjust for this so that analyses reflect the composition of our original random sample of 5,000 teachers.


Supplementary Text

Weighting

The weighting is accomplished in two separate stages. The first stage of weighting corrects for

different probabilities of selection associated with the number of teachers in each job title, while

the second stage adjusts for non-response.

Stage 1 design weight. Using inverse probability weights, the first stage design weight accounts for the differing sample sizes and the differing population sizes. For example, there are roughly four times as many biology/life science teachers as earth science teachers in US high schools (NSTA 2005). Thus extracting addresses for equal numbers of biology and earth science teachers requires sampling with different rates of random selection.

In order to project results to the total population of science teachers, data are weighted in proportion to the inverse of the sampling fraction: w1j = 1/Fj, where Fj is the sampling fraction for segment j.

With 100% response, this projection weight would allow the sample to generalize to the

population of all biology, life science, earth science, chemistry, physics and middle school

science teachers. This weight has no impact on results reported separately for each subject, as it

is a constant within segments.
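
A minimal sketch of this design weight follows. The segment population counts are illustrative assumptions; only the rough four-to-one ratio of biology to earth science teachers comes from the text.

    # Sampling fraction F_j = target mail-out / segment population (illustrative).
    F = {"biology": 1000 / 40000,   # hypothetical population counts
         "earth":   1000 / 10000}
    design_weight = {seg: 1 / f for seg, f in F.items()}
    # Each biology respondent stands in for ~4x as many teachers as an earth
    # science respondent, mirroring the ~4:1 population ratio noted above.
    print(design_weight)            # {'biology': 40.0, 'earth': 10.0}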

Stage 2 non-response weight. In the second stage, sample demographics and school

characteristics are used to model response propensity and rebalance the sample to correct for

differential non-response (in particular, the lower response rates of teachers from majority-

minority schools).

We estimated a logistic regression that modeled response as a function of sample segment (i.e.,

subject and school level), whether or not the teacher had a valid email in our database, four

characteristics of the school’s student body (total enrollment, percent African American, percent

Hispanic, and percent eligible for free lunch), and four characteristics of the county in which the school was located (the percentage of adults 25 and older who hold a graduate or professional

degree, the percentage of adults affiliated with conservative Christian churches, the percentage of

private sector employment that is in the fossil fuel industries, and the percentage of the

presidential vote that was Democratic). In this way, we explored whether non-response might be

correlated with social, economic, and political factors that could be related to climate science

instruction. The models suggested that response propensity was lower among teachers in

majority-minority schools.

The model estimates allow us to calculate a response propensity for all respondents and non-respondents:

p̂i = exp(Xi β) / (1 + exp(Xi β)),

where p̂i is the probability that sampled teacher i completes the survey, Xi is a vector of individual, school-level, and county-level predictors, and β is a vector of logistic regression coefficients estimated by maximum likelihood.

The second stage non-response weight is the inverse of the response propensity: w2i = 1/p̂i.

The final weight is the product of the design weight and the non-response weight, normalized so that it has a mean equal to 1. The resulting weight runs from 0.25 to 3.1, with a standard deviation of 0.42. Typically, weights below 0.2 or greater than 5.0 can generate design effects that effectively reduce the statistical power of the sample, so the relatively narrow range of weights does not indicate a need for weight "trimming."
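
A minimal sketch of this two-stage computation is below, assuming hypothetical arrays X (the predictors described above for all 5,000 sampled teachers), responded (a 0/1 outcome), and design_w (the stage-1 weight). The SRC's actual implementation may differ in details such as the normalization step.

    import numpy as np
    import statsmodels.api as sm

    def final_weights(X: np.ndarray, responded: np.ndarray,
                      design_w: np.ndarray) -> np.ndarray:
        # Stage 2: model response propensity with logistic regression.
        exog = sm.add_constant(X)
        p_hat = sm.Logit(responded, exog).fit(disp=0).predict(exog)
        w = design_w / p_hat           # design weight x (1 / propensity)
        w = w[responded == 1]          # weights are applied to respondents
        return w / w.mean()            # normalize so the mean equals 1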

The impact of weights on analysis

In general, the weights have only minimal impact on the analyses. This is illustrated in Table S1, which shows that our estimates of the percentage of teachers covering recent global warming, and of the hours devoted to it, are virtually the same for each teacher segment. The unweighted data in the "total" row are slightly biased upwards, by roughly 5%, due to the disproportionate number of earth science teachers in the overall sample.

Similarly small impacts are seen for other statistics reported in the paper. For example:

The weighted estimate of the percentage of teachers who report giving equal time to climate-skeptic positions in class is 28.19%; the unweighted estimate is 30.88%.

The weighted estimate of the percentage of teachers who personally believe that global warming is caused mostly by humans is 67.69%; the unweighted estimate is 66.92%.

In general, the impact of the weighting is very small, but the weighted analyses are theoretically

unbiased and we report these throughout the paper.


Supplementary Text

Non-response & Eligibility Audit

Calculating the response rate and assessing response bias for mail surveys is complicated because we have information on only a very small percentage of the incomplete surveys.

In this study, a total of 61 teachers replied to the Survey Research Center that they were not eligible for the survey – typically because they were no longer teaching, or were teaching a different subject. In addition, staff or colleagues at their schools informed us that 11 more teachers were not eligible. These 72 educators represent 1.4% of the total number of mailed-out questionnaires and 2% of all non-respondents.

However, it would be inappropriate to assume that all other non-returned surveys were delivered

to eligible teachers who declined to participate.

To get a better estimate of the ineligibility rate, we undertook a detailed audit of non-respondents' eligibility during the spring of 2015. To do this we selected two samples: a random sample of 200 non-respondents, and a supplemental sample of 38 teachers from majority-minority schools in the 50 states, plus all 13 sampled teachers from the District of Columbia (all in majority-minority schools), from which we had zero returns.

Eligibility Audit Procedures

We enlisted three research assistants to look up each teacher's school online and determine whether the teacher was still listed on the faculty. The detailed instructions provided to the research assistants are included below, in the section "Eligibility Audit Codebook and Instructions."

The first step was to confirm that the sampled teacher was on the staff of the school to which the questionnaire was mailed. The right-hand column of Table S5 summarizes our initial findings:

We could not locate a directory of teachers at the schools of 3.6% of the non-respondents.

For an additional 2% of non-respondents, we located a teacher directory and found a teacher who might be the sampled individual, but we could not confirm this with confidence. These were primarily teachers with the correct subject and first name, but with a last name that could reflect a recent name change (e.g., due to marriage).

We were able to confirm presence on the staff for 70.9% of the sampled individuals.

We presumed that the balance (23.5%) were ineligible because the staff directory appeared to be thorough and up to date.


For the 70.9% of non-responding teachers who were successfully located, we next determined whether they taught the same subject for which they were sampled – that is, whether they were eligible to complete the version of the questionnaire that we sent to them.

At the middle school level, we determined whether they were a science teacher. At the high

school level, we determined whether they taught the specific subject of the survey they received

(biology, earth science, physics, or chemistry).

To assess inter-coder reliability, we had a random sample of 12 teachers coded by all three

assistants and had total agreement on 10 of the 12 (83%). The results of this stage of the audit are

reported in Table S6.

We were able to positively confirm that 39.8% were certainly eligible (they taught in the sampled school and were listed in the correct department).

An additional 19.9% were very probably eligible (in the sampled school and in a science department).

3.2% were deemed ineligible due to incorrect subject (either not teaching science at all, or teaching a science subject for which they were not sampled).


Estimating the eligibility rate for non-returned questionnaires

If we combine the 23.5% not teaching at the school and the 3.2% clearly ineligible by subject, we get a low (most conservative) estimate that 26.7% of non-respondents were not eligible (AAPOR parameter "e" = 73%).

An additional 15.5% were confirmed as teaching at the school but could not be confirmed as eligible science teachers. Rather than calculate an unrealistic, though theoretically possible, "upper limit," we calculated a "high estimate" of ineligibility by combining:

the 26.7% confirmed ineligible;

one quarter of the 15.5% who could not be confirmed as eligible science teachers; and

one quarter of the 5.6% who could not be confirmed as teaching at the school.

This gives us a high estimate of 32% (AAPOR parameter "e" = 68%).
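
The arithmetic behind the two bounds can be checked directly; a short worked example using the audit percentages above:

    # Low estimate: confirmed ineligible only.
    low = 23.5 + 3.2                        # 26.7% -> e = 100 - 26.7 ~ 73%
    # High estimate: add a quarter of each unconfirmed group.
    high = low + 0.25 * 15.5 + 0.25 * 5.6   # ~32.0% -> e = 68%
    print(low, round(high))                 # 26.7 32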

Racial differences. Tables S5 and S6 break down the sample of audited names based on the racial composition of the sampled school. The tables show that sampled educators at majority-minority schools were (a) less likely to be confirmed as being on the school's faculty (63% compared to 78% of those teaching at majority-white schools), and (b) less likely to be confirmed as eligible science teachers (50% compared to 68% of those at majority-white schools). Combined, these findings help to account for the much lower response rates found among teachers at majority-minority schools.


Eligibility Audit Codebook and Instructions (as provided to raters)

Online Search

I have included a random sample of science teachers that did not respond to our recent survey. We are trying to determine whether some of these teachers didn't receive our packet because they are ineligible (retired, left school, not a science teacher, moved to administration, etc.).

Directions:

1. Google search the name of the school. Make sure that you identify the school in the location (city, state, and street address are included in the spreadsheet for confirmation).

2. Once you click on the appropriate Web page, copy and paste the hyperlink into the first

column “School Web”.

3. Search for the staff directory or department listing. Sometimes this will be very clear and easy and other times the websites are difficult to navigate.

4. Once you find a staff or department directory, copy and paste the hyperlink into the

second column “Dept. Web”.

NOTE: If you have not found a school website or a department website after 5 minutes of searching, then give up and enter ‘9’ in the next two columns.

5. Search for the teacher’s name in the staff directory. If the teacher is listed in the staff

directory, enter ‘1’ in the “Confirmed Teach” column. If not listed, enter ‘0’, if you are unsure enter ‘8’ and if you cannot find a listing, enter ‘9’. For example, you might be unsure if a teacher has a unique first name but a different last name (perhaps they have married or divorced and changed last names). Make a note in the ‘Notes’ column if you see something like this.

0 = no, teacher is not listed in department/faculty directory
1 = yes, teacher in school
8 = Not sure
9 = school does not list any teachers


6. From this same directory, next determine whether the teacher teaches the appropriate

subject and code the “Confirmed Subject” column as:

0 = Confirmed do not teach (from step 5)

1 = Confirmed Science

2 = Confirmed Specific Subject (Biology, Chemistry, Earth Science, Physics, or Middle School – NOTE: Science is sufficient for middle school teachers but not for high school teachers; we want to determine if they teach the specific subject)

6 = Confirmed administration

7 = Confirmed teach different science subject (you have to look at the title column to determine which subject they should be teaching – biology, chemistry, earth science, physics, or middle school)

8 = Confirmed teach different non-science subject (and list which subject they teach in “Notes” column)

9 = Cannot confirm either way

7. Add any additional notes in the notes column – including anything that might be

confusing or need clarification. If you find a teacher’s personal website, copy the link there.

8. Finally add a phone number for the school and extension for the teacher (if applicable).

9. I left a tab in called “CompletedbyLH” for an example. Also, the first row of the

“Incomplete” tab has been filled in as an example. Just start right after that row and complete each as best as you can.


List of Columns in Excel File:

School Web: Web address of teacher's school.
Dept. Web: Department/faculty or individual Web page, if available.
Confirmed Teach: Does the person we are searching for teach at the school?
    0 = no, teacher is not listed in department/faculty directory
    1 = yes, teacher in school
    8 = Not sure
    9 = school does not list any teachers
Confirmed Subject: Does the person teach the subject that we have them listed for?
    0 = Confirmed do not teach
    1 = Confirmed Science
    2 = Confirmed Specific Subject (Chem, Physics, etc. – note Science is sufficient for middle school teachers)
    6 = Confirmed administration
    7 = Confirmed teach different science subject
    8 = Confirmed teach different non-science subject
    9 = Cannot confirm either way
Notes: Other things worth noting from the search, including a link to the teacher's individual page, if any.
Phone: Phone number for the school. We will only contact teachers at the school.
Extension: Extension to teacher's classroom, if available.
Name: Name of teacher.
Title: Lists whether the teacher teaches biology, chemistry, earth science, physics, or middle school.
Inst: Name of school.
Mstreet: Street address of school.
MCity: City where school is located.
MState: State where school is located.


Supplementary Text

OLS Regression Analysis of Teaching Emphasis

The body of the article reports that a measure of political ideology is the strongest predictor of teaching emphasis. This appendix reports the analyses that are the basis of that conclusion.

The outcome variable is constructed as the difference between the two variables used to construct the 2x2 table in Figure 1. As originally scaled, "strongly agree" was coded as 1 and "strongly disagree" as 4. The two questions are:

I emphasize the scientific consensus that recent global warming is

primarily being caused by human release of greenhouse gases

from fossil fuels.

and

I emphasize that many scientists believe that recent increases in

temperature is likely due to natural causes.

By subtracting the numerical score of the first from the score of the second, and dividing by 3, we get a new variable that ranges from -1 (emphasizes natural causes to the exclusion of greenhouse gases) to +1 (emphasizes only greenhouse gases from fossil fuels).

The resulting distribution of scores is illustrated in Figure S1.
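
A minimal sketch of this construction, assuming hypothetical variable names human and natural for the two items (each coded 1 = strongly agree to 4 = strongly disagree):

    def emphasis(human: int, natural: int) -> float:
        # Subtract the human-causes item from the natural-causes item and
        # rescale the -3..+3 difference to the -1..+1 range.
        return (natural - human) / 3

    emphasis(human=1, natural=4)   # +1.0: emphasizes only greenhouse gases
    emphasis(human=4, natural=1)   # -1.0: emphasizes only natural causes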

Government role. The critical independent variable in this case is a question that has been

employed frequently in the study of risk assessment. Teachers were given two statements at

opposite ends of a seven-point scale and asked to locate themselves. At one end (scored 1) was

“It's not the government's business to try to protect people from themselves,” and at the other

(scored 7) was, “Sometimes government needs to make laws that keep people from harming

themselves.”

Additional variables in the model are:

Stratum: Binary indicators of the sample stratum (school level and subject; Biology is the

omitted category)

Sex (coded female = 1, male = 0)

Seniority: Number of years teaching (integer)


Previous coursework on climate change. Answers to two questions were combined to generate the following categories:

No formal exposure in college (omitted)

One class that devoted one or more class sessions to climate change

Two or more classes that devoted one or more class sessions to climate change

One course entirely focused on climate change

Two or more courses entirely focused on climate change

Priority given to foils. Teachers were asked, “Imagine that you were asked to teach a 2-3 day

unit on greenhouse gases and recent global warming. What priority would you give to including

each of the following possible topics?” Four of the topics are not essential or relevant to

greenhouse gases and their effects: questions about aerosols, ozone, pesticides and the launching

of rockets into space. We calculated a score for each respondent, rescaled to run from zero to one

with high scores assigned to those who said that one or more of these topics were a high priority

and low scores to those saying these are not a priority.
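
One plausible way to compute such a score is sketched below; the 1-4 priority coding is an assumption for illustration, as the questionnaire's exact response options are not reproduced here.

    def foil_score(ratings: list[int]) -> float:
        # Mean priority across the four foil topics, rescaled from the
        # assumed 1-4 response range to run from 0 to 1.
        return (sum(ratings) / len(ratings) - 1) / 3

    foil_score([1, 1, 1, 1])   # 0.0: no foil is a priority
    foil_score([4, 4, 4, 4])   # 1.0: every foil rated a high priority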

Perception of consensus: Whether the respondent selected "81-100%" in response to a question about the percentage of climate scientists who believe that "global warming is caused mostly by human activities" (yes = 1, all other responses = 0).

Self-rating of expertise on climate change models, where the options were:

I know less about this topic than many other high school science teachers

Typical of most high school science teachers

Very good compared to most high school science teachers

Exceptional: On par with many college-level instructors

The model estimates show that several variables measuring content knowledge have no impact on teaching emphasis (self-rating, formal coursework, and selection of non-relevant topics). A large effect is observed for selecting "81-100%" on the consensus question, with an estimated effect of 0.32 (essentially moving from one bar to an adjacent bar in Figure S1). We get a very similar estimate (β = 0.33) for the government role question, but since this variable has a greater range (–1 to +1), the maximum possible effect is larger.


Figure S1. Distribution of responses (%) to teaching emphasis scale.


Table S1. Percentage devoting at least one class hour to recent global warming, and the mean number of class hours devoted, among those who taught the subject, weighted versus unweighted data.

                    Percentage teaching about      Mean hours among
                    recent global warming          those teaching
                    Weighted    Unweighted         Weighted    Unweighted
Biology                86.64         85.86             5.32          5.25
Earth science          95.46         94.98             7.32          7.41
Physics                48.65         49.02             4.75          4.61
Chemistry              53.54         53.04             4.39          4.17
Middle school          70.25         70.66             5.64          5.64
Total                  70.98         73.98             5.47          5.78


Table S2. Final dispositions, return rate, and response rate (AAPOR disposition codes in parentheses).

                                              Biology   Earth Sci   Physics   Chem   Middle    Total
1. Completed paper (1.1000)                       242         250       126     145      404    1,167
2. Completed replacement (1.1000)                  18          18        11      13       71      131
3. Completed online (1.1000)                       45          26        20      17       94      202
4. Ineligible, reported by teacher (4.1000)        11          23         1       6       20       61
5. Ineligible, reported by school (4.1000)          3           5         0       1        2       11
6. Presumed ineligible, USPS return (3.1400)        3           2         0       0        3        8
7. Refused (2.1000)                                 3           0         0       3        4       10
8. Refused, returned blank* (2.1000)                3           3         3       1        3       13
9. Not returned (3.1900)                          670         675       333     304    1,391    3,373
10. Returned after data production (3.1900)         3           4         0       3        8       18
Total                                           1,001       1,006       494     493    2,000    4,994

Return rate among presumed eligible (%)            31          30        32      36       29       31
Assumed eligibility rate among non-returns (RR4 parameter "e") (%): 73
AAPOR response rate #4 (%)                         38          37        39      43       35       37
RR4 using e = 68% (%)                              40          39        41      45       37       39

*Returned blank includes individuals who began the Web survey but answered zero questions.


Table S3. Logistic regression of response propensity (listwise n = 4,844). Odds of a questionnaire being completed and returned, and unstandardized logistic regression coefficients.

                                          Odds ratio       B    SE(B)       t      p
Biology sample (omitted)
Earth Science sample                            0.94   -0.06     0.10   -0.58   0.56
Physics sample                                  1.06    0.05     0.13    0.43   0.67
Chemistry sample                                1.31    0.27     0.12    2.19   0.03
Middle school sample                            0.93   -0.08     0.09   -0.87   0.39
Received color questionnaire                    1.17    0.16     0.07    2.21   0.03
School enrollment (1000s)                       1.09    0.09     0.05    1.77   0.08
School: Proportion free lunch                   0.95   -0.05     0.20   -0.24   0.81
School: Prop African American                   0.37   -0.99     0.22   -4.49   0.00
School: Prop Hispanic                           0.49   -0.71     0.19   -3.76   0.00
County: Prop Evangelical                        0.56   -0.58     0.33   -1.78   0.07
County: Jobs in fossil fuels industry           0.76   -0.27     1.55   -0.18   0.86
County: Prop with grad/prof degree              0.77   -0.26     0.83   -0.32   0.75
Proportion Obama vote 2008                      0.63   -0.45     0.31   -1.48   0.14
Intercept                                       0.73   -0.31     0.22   -1.44   0.15

Note: Bold entries indicate that the probability that B = 0 is less than 0.05.


Table S4. Return rates among eligible respondents, by racial and ethnic composition of the school (sample size shown beneath each percentage).

                                 African-American enrollment
Hispanic enrollment         <10%      10–50%      >50%      Total
<10%       Percentage         35          32        16         33
           n                1851         545       246       2642
10–50%     Percentage         32          26        16         28
           n                 822         662       118       1602
>50%       Percentage         27          25        33         27
           n                 429         168        73        670
Total      Percentage         33          29        19         31
           n                3102        1375       437       4914


Table S5: Results of nonresponse online audit – teachers in the school.

                                                   Teachers in        Teachers in
                                                   majority-white     majority-minority    All
Result of online search                            schools            schools              teachers
Teacher not listed in the school directory         23 (18.3%)         36 (28.8%)           59 (23.5%)
Listed in department or faculty directory          99 (78.6%)         79 (63.2%)           178 (70.9%)
Unable to determine from the faculty listing*      2 (1.6%)           3 (2.4%)             5 (2.0%)
School does not have an online teacher directory   2 (1.6%)           7 (5.6%)             9 (3.6%)
Total                                              126 (100%)         125† (100%)          251 (100%)

*This category was selected when a teacher could neither be found nor ruled out with complete confidence (e.g., maiden names, nicknames, etc.).

†Although our nonresponse sample is 49% majority-minority schools, only 35% of the entire dataset fit this category.


Table S6: Results of nonresponse online audit – teachers teaching the correct subject.

                                                   Teachers in        Teachers in
                                                   majority-white     majority-minority    All
Result of online search                            schools            schools              teachers
Confirmed teacher teaches the correct subject      54 (42.9%)         46 (36.8%)           100 (39.8%)
Confirmed teacher is in a science department       29 (23.0%)         16 (12.8%)           45 (17.9%)
Confirmed teacher is an administrator              –                  –                    –
Confirmed teacher teaches a different
  science subject                                  2 (1.6%)           1 (0.8%)             3 (1.2%)
Confirmed teacher teaches a different
  non-science subject                              1 (0.8%)           4 (3.2%)             5 (2.0%)
Cannot confirm either way                          17 (13.5%)         22 (17.6%)           39 (15.5%)
Teacher not listed in the school directory         23 (18.3%)         36 (28.8%)           59 (23.5%)
Total                                              126 (100%)         125 (100%)           251 (100%)


Table S7. Formal class hours devoted to recent global warming.

                          Teachers devoting one or     Number of hours devoted to recent
                          more class lessons to        global warming (among those with
                          RGW (%)                      one or more class lessons)
                                                       Mean        Median
Middle School (n=568)           70.2                    4.4          1.5
Earth Science (n=285)           95.5                    6.0          4.0
Biology (n=308)                 86.6                    4.1          1.5
Chemistry (n=183)               53.5                    3.5          1.5
Physics (n=156)                 48.6                    4.1          1.5
Total (n=1500)                  74.3                    4.7          1.5


Table S8. Regression predicting emphasis scale (all independent variables are scored 0–1 unless otherwise indicated; B is the unstandardized regression slope).

                                             B         t
Biology (omitted)
Earth Science                            -0.10     -2.95 **
Physics                                  -0.04     -0.84
Chemistry                                -0.11     -2.58 **
Middle School                            -0.06     -2.02 *
Female                                    0.05      2.10 *
Number of decades teaching (integer)     -0.03     -2.71 **
I know less than many other teachers (omitted)
Typical of most HS science teachers       0.02      0.51
Very good                                 0.07      1.79
Exceptional                               0.13      1.60
No formal exposure to climate change (omitted)
One exposure                              0.03      0.88
Two or more exposures                     0.01      0.30
One entire class                          0.01      0.27
2+ classes                               -0.04     -0.31
Priority given to foils                   0.09      1.17
Selected 81–100%                          0.32     12.33 **
Government role scale (–1 to +1)          0.34     13.04 **
Constant                                  0.04      0.70

N                                        1,074
R2                                        0.41

*P < 0.05; **P < 0.01


References

1. P. T. Doran, M. K. Zimmerman, Eos 90, 22 (2009). doi:10.1029/2009EO030002

2. J. Cook et al., Environ. Res. Lett. 8, 024024 (2013). doi:10.1088/1748-9326/8/2/024024

3. C. Funk, L. Rainie, "Public and scientists' views on science and society" (Pew Research Center, Washington, DC, 29 January 2015); www.pewinternet.org/2015/01/29/public-and-scientists-views-on-science-and-society/.

4. Ipsos MORI, Global Trends Survey (Ipsos MORI, London, 2014); www.ipsosglobaltrends.com/environment.html.

5. R. Johnson, M. Holzer, Executive Summary: National Earth Science Teachers Association K-12 Climate Change Education Survey (National Earth Science Teachers Association, Fort Collins, CO, 2011); www.nestanet.org/cms/sites/default/files/documents/ExecutiveSummaryClimateChangeEducationSurveyDecember2011.pdf.

6. S. Wise, J. Geosci. Educ. 58, 297 (2010). doi:10.5408/1.3559695

7. National Center for Education Statistics, High School Transcript Study (U.S. Department of Education, Washington, DC, 1990 and 2009).

8. National Center for Education Statistics, Digest of Education Statistics 2012, Table 179 (U.S. Department of Education, Washington, DC, 2013), p. 252.

9. National Research Council, A Framework for K–12 Science Education: Practices, Crosscutting Concepts, and Core Ideas (Board on Science Education, National Academies Press, Washington, DC, 2012).

10. M. Berkman, E. Plutzer, Evolution, Creationism, and the Battle to Control America's Classrooms (Cambridge Univ. Press, Cambridge, 2010).

11. E. Hestness, R. McDonald, W. Breslyn, J. McGinnis, C. Mouza, J. Geosci. Educ. 62, 319 (2014). doi:10.5408/13-049.1

12. L. M. Desimone, A. C. Porter, M. S. Garet, K. S. Yoon, B. F. Birman, Educ. Eval. Policy Anal. 24, 81 (2002). doi:10.3102/01623737024002081

13. Transition Region and Coronal Explorer (TrACE) (NASA, 2016); http://science.nasa.gov/missions/trace/.

14. NOAA, Teaching resources, Climate.gov (NOAA, 2016); https://www.climate.gov/teaching.

15. CLEAN: Climate Literacy and Energy Awareness Network (2016); http://clean.org.

16. D. Kahan, Adv. Polit. Psychol. 36, 1 (2015). doi:10.1111/pops.12244

17. T. Dietz, Proc. Natl. Acad. Sci. U.S.A. 110 (suppl. 3), 14081 (2013). doi:10.1073/pnas.1212740110

18. D. M. Kahan, in Culture, Politics and Climate Change, M. Boykoff, D. Crow, Eds. (Routledge Press, New York, 2014), pp. 203–220.