CHAPTER THREE
Elementary Teachers’ Science Education Computer Simulation Use: How Can
Professional Development Promote Instructional Adoption?
Amanda L. Gonczi, Jennifer L. Maeng, and Randy L. Bell
Abstract
The purpose of this study was to characterize and compare 67 elementary science
teachers’ computer simulation use prior to and following computer simulation
professional development aligned with Innovation Adoption Theory. The professional
development highlighted computer simulation affordances that elementary teachers might
find particularly useful. Qualitative and quantitative data, including perceptions surveys,
participant interviews, Quarterly Lesson Reports, and videotaped lessons, were analyzed
to identify changes in participants’ computer simulation use. Variables that hindered or
promoted instructional computer simulation use were also identified. Baseline
participant data indicated elementary teachers did not commonly use simulations during
science instruction. There was a significant increase in the number of participants that
used computer simulations pre- (17%) to post- (52%) professional development.
Computer simulation implementation patterns following the professional development
demonstrated participants consciously took advantage of the tool’s content-based and
pedagogical benefits for inquiry-based instruction. The primary barrier to instructional
computer simulation use was participants’ belief that computer simulations
are most effective when used by students independently or in small groups. Findings
illuminate Innovation Adoption Theory’s potential and limitations for use when
designing educational technology professional development. A modified six-stage
adoption model is recommended to address participants’ beliefs.
Introduction
Elementary science teachers are tasked with teaching all school subjects.
However, they may have limited background knowledge in certain subjects, including
science (Ginns & Watters, 1995). Furthermore, due to the lack of coursework in science
content areas, elementary teachers may hold alternative conceptions about science
content, scientific inquiry, and the nature of science (Ireland, Watters, Brownlee, &
Lupton, 2012; Schoon & Boone, 1998). Thus, elementary teacher professional
development must work to improve science instruction quality by providing curricular
options that help prevent the perpetuation of alternative conceptions and bridge potential
gaps in elementary teachers’ content knowledge.
Science education computer simulations (hereafter referred to as simulations) are
an instructional technology option that facilitates the achievement of many desirable science
instruction outcomes (National Research Council [NRC], 2011). Simulations are
interactive, simplified virtual models of scientific phenomena that allow students to
observe the relationships between variables. In addition to developing students’ science
content understanding, simulations can improve students’ scientific inquiry skills (NRC,
2011). As a result of elementary teachers' limited content knowledge or inquiry
experience, the data-based nature of simulations may be especially appealing and
valuable to elementary teachers.
Simulations have unique characteristics that may facilitate or hinder elementary
teachers’ instructional integration and should be considered during professional
development to promote effective instructional use (NRC, 2011). Educational
technology professional development is usually general and may not address either the
affordances of simulations that elementary teachers should take advantage of or the
challenges to critical student use (Chiu & Linn, 2012; Guzey & Roehrig, 2009).
Utilizing educational technologies is not simple; it requires that teachers have specific
knowledge about how, when, and why to use a particular educational technology.
Furthermore, the beliefs individual teachers hold regarding educational technologies
influence whether teachers are willing to incorporate them and how they incorporate
them (Morrison, 2013). As a result, research is needed that examines how a technology-specific
professional development program shapes elementary teachers' beliefs about simulations and
their instructional simulation use. The complexity of teaching that emerges from
student, teacher, context, and educational technology characteristics and their
interactions, demands a nuanced examination of teacher educational technology use and
professional development outcomes.
What are Science Education Computer Simulations?
In this study, we build upon the definition of computer simulations previously
developed as “dynamic models of scientific phenomena and processes” (Smetana & Bell,
2014; Smetana & Bell, 2011). Our working simulation definition includes three
additional criteria. First, simulations are specifically designed and intended to help
science students understand a specific natural phenomenon. Therefore, they are simplified
models of the actual phenomena. Second, they include some degree of student
interactivity. Finally, simulations can potentially foster science understanding in one of
three ways: (a) student engagement in scientific inquiry through manipulating variables
and measuring outcomes either qualitatively or quantitatively, (b) building virtual
models, or (c) engaging in unique behaviors representative of specific types of scientists
(e.g., using data to forecast future weather as a meteorologist would).
These additional definitional components are necessary to specifically identify
science education computer simulations because the diversity and number of online simulations
have expanded into various career areas, including medicine and mathematics. In addition,
the expanded definition prevents conflation among dynamic visualizations, games, and
simulations (Aldrich, 2009; NRC, 2011). Simulations differ from dynamic visualizations
because the latter do not necessarily permit student interactivity although they allow
students to make observations of abstract science content such as photosynthetic
processes (Chiu & Linn, 2012). Simulations may have some game-like qualities but can
be distinguished from digital games by identifying the primary development and use
goal. The primary goal in the design and use of a simulation is for students to understand
the scientific phenomena or process underlying the software, not for the student to “win.”
By comparison, computer games clearly have a desired outcome that students are focused
on rather than understanding a science concept or scientific process. In summary, a
simulation is an interactive, simplified virtual model of scientific phenomena designed
and used to foster students’ scientific skill development and/or content and nature of
science understanding.
Simulation Professional Development: A Design Strategy
Simulations can foster students’ content understanding, engage students in
scientific inquiry, and help students develop accurate nature of science conceptions
during instruction (NRC, 2011). Simulations promote conceptual understanding and
achievement in Physics (Dega, Kriek, & Mogese, 2013; Zacharia, 2007), Chemistry
(Plass et al., 2012), Earth Science (Trundle & Bell, 2010), Biology (Kinzie, Strauss, &
Foss, 1993) and engineering design (Klahr, Triona, & Williams, 2007). However,
simulations' greater value may lie in their ability to involve students in scientific inquiry
(Kubicek, 2005; NRC, 2011; Windschitl, 2000).
Ideally, simulations should be used to engage students in inquiry instruction
(NRC, 2011). Inquiry instruction is a student-centered pedagogy that involves students in
one or more inquiry-related skills as students seek to answer a research question in ways
similar to a scientist (NRC, 1996; NRC, 2012). These skills include asking questions,
developing and using models, designing and carrying out investigations, analyzing data,
constructing explanations, engaging in evidence-based argumentation, and
communicating scientifically (NRC, 1996; NRC, 2012). Inquiry-based simulation use
accomplishes several desirable goals. Inquiry-based simulation use facilitates students’
deep conceptual understanding and promotes scientific inquiry skills (Finkelstein et al.,
2005; Winberg & Berg, 2007; Trundle & Bell, 2010). It also mimics the process
scientists undergo to generate knowledge (Abd-El-Khalick et al., 2004). When science
instruction affords students opportunities to behave like scientists, students become more
motivated and interested in science (Gibson & Chase, 2002). As a result, simulation use
to support inquiry instruction is desirable to foster students’ immediate academic
achievement and long-term interest and success in scientific fields (Liao & Chen, 2007;
Sun, Lin, & Yu, 2008).
There appears to be an initial assumption in many professional development
studies that all instructional digital technology types share similar features and therefore
generalized instructional technology professional development and research are justified
(Gerard, Varma, Corliss, & Linn, 2011; Roehrig & Guzey, 2009). However, this
underlying assumption is problematic. For example, Roehrig and Guzey (2009)
described four beginning secondary teachers’ instructional technology practices and
experiences following a year-long professional development program that emphasized
inquiry instruction and technology integration. They found the teachers encountered
unique integration challenges with different instructional technology types. In particular,
simulation use posed unique classroom management issues. Therefore, generalized
educational technology professional development may not effectively prepare elementary
teachers to integrate specific digital tools, especially simulations. Furthermore,
elementary teachers’ limited science content knowledge and experience with scientific
inquiry means simulation professional development should attend to these teachers’
possible alternative inquiry conceptions and support accurate nature of science
understanding (Ireland et al., 2012).
Elementary teachers’ educational and science backgrounds offer simulation
professional development programs unique opportunities for both positive outcomes and
implementation challenges. On the one hand, because of limited science background
knowledge, elementary teachers may be amenable to incorporating educational
technology that complements gaps in their own understanding and helps them provide
students with accurate representations of scientific phenomena (Pope, Jayroe, Franz, &
Hamil, 2008; Trundle & Bell, 2010). Therefore, professional development with
elementary teachers that highlights this instructional benefit may result in widespread
simulation adoption. On the other hand, many elementary science teachers have a vague
notion of scientific inquiry as “finding things out” or manipulating materials without
understanding the evidence-based nature of knowledge generation in science (Ireland et
al., 2012; Morrison, 2013). Therefore, limited or alternative scientific inquiry and nature
of science conceptions also need to be attended to during simulation professional
development before elementary teachers can be expected to marry desirable pedagogy
with educational technology (Ireland et al., 2012).
Innovation Adoption Theory
Innovation Adoption (IA) Theory (Rogers, 1985) explains why professional
development participants might ultimately adopt innovations including simulations. The
theory describes a five-stage process an individual progresses through when an
innovation arises and the individual chooses either to adopt or to reject it. Stage one
is marked by the individual’s initial awareness of the innovation. Stage two is
characterized by a growth in the individual’s knowledge about the innovation,
particularly its benefits. Stage three is achieved when the individual makes the decision
to attempt to utilize the innovation as a result of being persuaded of its benefits in stage
two. In stage four, the innovation is used for the first time. Finally, in stage five the
individual reflects on their experiences with the innovation and decides to either fully
adopt or discontinue using the innovation.
In education, professional development provides teachers opportunities to learn
about new educational innovations such as simulations (stage 1). Based on IA Theory it
is the responsibility of professional development implementers to convince the
participants of the innovation’s instructional value (stage 2) in order for subsequent
adoption to occur (stage 3). It is essential that professional development programs help
move participants beyond stage 2 for two reasons. First, without pedagogical and
technology-related support, teachers often find it easier to continue using strategies and
educational tools that they are familiar with rather than trying new ones (Gerard et al.,
2012). Second, without being convinced that innovative educational technology should
be utilized for reform-based science instruction, participants are likely to use new tools
for traditional teacher-centered pedagogy (Dunleavy, Dexter, & Heinecke, 2007; Gerard
et al., 2011; Waight & Abd-El-Khalick, 2007). The professional development program
that served as the context of this investigation was designed to provide participants with
opportunities to use simulations within inquiry-based lessons after they learned about
their instructional benefits. This design might help participants not only adopt an
educational tool, but also adopt student-centered practices. The professional development
program is described in the Methodology section below.
Purpose
Educational technology professional development needs to take into account participant
characteristics, the characteristics of the educational technology itself, and the process of
innovation adoption. Unfortunately, educational technology professional development
opportunities do not often take into consideration participant characteristics and needs
(Zhao & Bryant, 2006). In addition, professional development often provides superficial
coverage of many curricular choices rather than helping teachers develop deep
technological pedagogical content knowledge about one educational technology (Graham
et al., 2009; Guzey & Roehrig, 2009). While it may seem appealing to introduce
participants to as many different innovative educational technologies as possible, this
approach may not permit simulation implementation in desirable instructional contexts,
including inquiry-based learning. Thus, the following research questions guided this
investigation:
1. To what extent did participants adopt simulations following professional
development aligned with IA Theory and utilize them for inquiry and nature of
science instruction?
2. What fostered participants’ simulation adoption?
3. What limited participants’ simulation adoption?
Study Context
Participants
The participants in this study were a subset of elementary teacher participants in
the Virginia Initiative for Science Teaching and Achievement (VISTA) Elementary
Science Institute (ESI) professional development program. Teachers applied to VISTA
and were accepted in teams of 2-5 from the same school. Two cohorts of elementary
teachers (N = 67) (Cohort 1: 2 male, 25 female; Cohort 2: 6 male, 34 female) over the
span of three years participated in the computer simulation professional development
study. The participants ranged in science teaching experience from 0 to 23 years
(M = 12.21). Seven teachers (10.4%) held bachelor's degrees in either Earth Science or
Biology. None of the participants had degrees in Chemistry or Physics. Fifty-four
participants (80.6%) held education-related degrees.
For each participant, data were collected for two years. The first year constituted
baseline data that reflected the teachers’ instructional practices prior to the professional
development. The second year of data collection occurred following the professional
development. Thus, changes in any instructional practices could be more confidently
ascribed to the professional development.
Data across the two cohorts were combined to attain a large enough sample size
that might clarify simulation use differences pre-and post-professional development.
Independent samples t-tests ensured both cohorts were equivalent in their simulation use
confidence pre- and post-professional development. Levene's test confirmed the
assumption of equal variances (p > .05). No significant differences between Cohort 1 and
Cohort 2 participants' self-reported simulation use confidence existed at the beginning of the
baseline data collection year (Table 1). Cohort 1 and 2 participants' simulation use
confidence means were also statistically similar immediately prior to the professional
development and following the professional development. This indicates the participants
in each cohort had similar simulation use confidence during the baseline data collection
year and that the professional development was implemented with fidelity across cohorts.
As a result, Cohort 1 and Cohort 2 participants were combined for subsequent
quantitative data analysis.
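To illustrate the cohort-equivalence check described above, the following sketch shows how Levene's test and an independent samples t-test could be run in Python with SciPy. The file name (simulation_confidence.csv) and the column names are hypothetical placeholders, not the study's actual data files.

    # Sketch of the cohort-equivalence check: Levene's test for equal variances
    # followed by an independent samples t-test on self-reported confidence.
    # File name and column names are hypothetical placeholders.
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("simulation_confidence.csv")  # one row per participant
    cohort1 = df.loc[df["cohort"] == 1, "baseline_confidence"].dropna()
    cohort2 = df.loc[df["cohort"] == 2, "baseline_confidence"].dropna()

    # Levene's test: p > .05 supports the equal-variance assumption
    levene_stat, levene_p = stats.levene(cohort1, cohort2)

    # Independent samples t-test comparing the two cohorts' mean confidence
    t_stat, t_p = stats.ttest_ind(cohort1, cohort2, equal_var=levene_p > 0.05)

    print(f"Levene: W = {levene_stat:.3f}, p = {levene_p:.3f}")
    print(f"t-test: t = {t_stat:.3f}, p = {t_p:.3f}")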
Table 1
Cohort 1 and 2 Self-Reported Computer Simulation Use Confidence (Perceptions
Surveys)
                               Cohort 1     Cohort 2
                               M (SD)       M (SD)       df    t      Sig. (2-tailed)
Year 1, Baseline confidence    2.4 (0.9)    2.3 (1.3)    61    .516   .607
Year 2, Pre-PD confidence      2.7 (1.0)    2.2 (1.2)    61    1.70   .082
Year 2, Post-PD confidence     3.8 (1.0)    3.6 (1.0)    62    .654   .516
Note: 1 = not very confident; 5 = very confident
Virginia Initiative for Science Teaching and Achievement
The Virginia Initiative for Science Teaching and Achievement (VISTA) provided
professional development to elementary (grades 4-6) science teachers. The professional
development included a four-week summer institute (ESI) and follow-up academic year
support. The VISTA professional development had five foci designed to increase
students’ conceptual understanding, scientific literacy, and interest in science. VISTA
constructs included: (a) problem-based learning (PBL), (b) inquiry instruction, (c) nature
of science instruction (NOS), (d) hands-on learning (HOS), and (e) instructional
technology integration (Sterling & Frazier, 2010; Sterling, Matkins, Frazier, &
Logerwell, 2007). The first four constructs are defined in Table 2.
To facilitate instructional technology use within these instructional contexts,
VISTA introduced participants to simulations. In addition to free web-based simulations,
participants were given ExploreLearning® accounts that provided access to the
company’s evidence-based commercial simulations (Gizmos®). ExploreLearning®
Gizmos® are designed for students in grades 3-12. Many Gizmos® allow students to
manipulate variables and measure outcomes to develop conceptual understanding in the
earth, biological, physical, and life sciences. VISTA implementers encouraged
participants to utilize simulations that facilitated students’ understanding of abstract
science concepts and engaged students in inquiry learning. By focusing on these
benefits, professional development implementers identified general tool affordances and
helped the elementary participants understand how to capitalize on these tools given the
level of their individual content and inquiry-based knowledge.
Table 2
The VISTA Constructs
Construct   Definition
PBL         Students work over time to solve a real-world problem by engaging in scientific inquiry.
Inquiry     Students ask questions, collect and analyze data, and use evidence to solve problems or answer questions.
NOS         Students understand the values and assumptions inherent to the development of scientific knowledge through explicit instruction.
All treatment participants received three hours of simulation professional
development designed to move participants quickly through adoption stages 1-3. The
professional development first provided an overview of simulations and web access
(stage 1). Implementers subsequently demonstrated simulation use for inquiry instruction
and identified relevant science content addressed with the tool (stage 2). Participants
were then provided content-relevant lesson planning time (stage 3).
During the initial simulation professional development module, implementers
emphasized the value of simulations for science content that is difficult for students to
visualize or experience in the classroom and for inquiry-based instruction. Subsequently,
participants had the opportunity to use simulations during inquiry-based lessons in a
summer camp setting surrounded by supportive school team members and professional
development implementers. Thus, potentially negative ramifications for a less-than-perfect
lesson were virtually nonexistent in the camp compared with the participants'
usual school setting. The benign camp setting reduced the professional risk participants
might have perceived in their school contexts, a risk that can prevent first attempts with an innovation
(Rogers, 1985). In addition, debriefs with participants and professional development
implementers at the end of each camp day allowed participants to reflect on what went
well or did not go well during the camp lesson and consider changes they could make to
improve future instruction. Many, though not all, elementary participants used
simulations in their camp lessons. Those that did not actually use them likely observed
other participants implement them in their camp lessons. Thus, the VISTA professional
development was designed to help the participants move through stages 1-4 of IA Theory
to facilitate simulation adoption once the participants returned to their schools in the fall.
Methodology
Successful educational technology adoption and integration methods depend on
teachers’ beliefs, technology comfort, instructional context, and professional
development characteristics (Dawson & Heinecke, 2004; Gerard et al., 2011). Thus,
research examining teachers’ adoption of new educational technologies requires multiple
data sources. In addition to determining whether participants utilized simulations
following professional development, this study sought to understand why the
innovation was or was not successfully adopted. As a result, multiple data sources were collected
either concurrently or sequentially to answer each research question (Hesse-Biber, 2010).
Lesson observations and survey data provided a means for participants to express the
meanings they created regarding instructional simulation use that might have influenced
their adoption (Schwartz-Shea & Yanow, 2012). Because teachers’ beliefs are an
underlying variable that can foster or hinder educational technology adoption, participant
interviews were employed to capture this element. Participants’ interviews also served to
triangulate or explain emergent data patterns derived from other sources. As a result, to
the extent possible, interviews were conducted after other data sources had been collected
and initially examined to clarify patterns, inconsistencies, or individual beliefs (Merriam,
2009). The purpose of each data source and means of analysis are described below.
Professional development observations. Observations of the computer simulation
professional development captured implementation and participant experiences. The
initial professional development module was examined for evidence that participants
moved through innovation adoption stages 1 and 2 and to ensure fidelity of the
professional development experienced by Cohorts 1 and 2. During camp planning and
camp lesson observations, evidence of a participant’s planning to use or actual initial use
of simulations was documented as evidence that the participant reached stage 3 in the
adoption model.
Observation notes were used to construct detailed, descriptive write-ups that
included inferences. Next, professional development observation write-ups were coded
for evidence of the innovation adoption model stages participants moved through,
participant engagement, and implementer emphasis on four different simulation
implementation purposes: content, inquiry, NOS, and PBL. The focus on the purpose
for instructional simulation use reflects the researchers’ conceptual framework that
considered the unique characteristics of elementary teachers and how those
characteristics (e.g., limited science content knowledge and inquiry-based experiences)
may have ultimately influenced implementation patterns.
ExploreLearning® weekly login reports. These reports indicated whether participants
logged into their ExploreLearning® account during the previous week. These reports
were used in conjunction with Quarterly Lesson Reports to identify participants that used
simulations and had reached stage 3, 4, or 5 in the innovation adoption model.
ExploreLearning® use surveys (Appendix A). Treatment participants identified as
users of ExploreLearning® resources via weekly login reports and Quarterly Lesson
Reports completed ExploreLearning® Use Surveys (EL Use Survey). The EL Use
Survey was emailed to participants in November, March, and May following the
professional development to provide participants multiple completion opportunities. The
survey was completed by 37 of the 42 participants who received the survey (88.1%
response rate). This survey captured detailed information regarding participants'
self-reported frequency of simulation use and instructional use methods. The survey also
identified participants that reached stage 4 in the innovation adoption model and then
chose not to fully adopt the innovation.
Perceptions surveys. Participants completed Perception Surveys electronically twice
during their baseline year (beginning and end of academic year) and three times during
their professional development year (pre-, post-, delayed-post professional development).
In addition to general questions regarding participants' implementation of VISTA
constructs, Perceptions Surveys asked participants to rate their confidence using
simulations on a 5-point Likert scale and to describe barriers to simulation
implementation. An expert panel representing qualitative research and science education
fields validated the survey.
Barriers to simulation use described on delayed-post Perceptions Surveys and EL
Use Surveys were first read for emergent themes. If more than one participant described
a technology integration challenge, a barrier category was created. Initially, 10 barrier
categories emerged from the data. These were subsequently revised based on code
similarities and differences. For example, an initial code termed “inadequate computer
access” was later divided into two categories: (a) insufficient computers, and (b) limited
access to computers due to testing, when it became clear testing influenced computer
access in some schools but not in others. These barrier descriptions helped explain why
some participants may not have reached adoption stages three, four, or five.
Interviews (Appendix B). Seven randomly selected participants (11%) were interviewed to
understand how and why they implemented simulations into science instruction. The
interview protocol was validated by three experts in the fields of qualitative and science
education research. Interviews helped triangulate self-report survey data and clarify
obstacles to simulation adoption. Interview transcripts were read multiple times for
emergent themes beyond any captured in the other data sources. In addition, transcripts
were examined for confirming, disconfirming, or explanatory comments of codes created
in the other data sources. Thus, interview analysis primarily served to deepen the
understanding of already emergent data themes. This analysis method reflects the data
collection process that utilized interviews to elucidate participants’ beliefs and computer
simulation implementation patterns beyond what had already been captured with the
other data sources.
Lesson observations. All participants were videotaped teaching a science lesson
at four evenly distributed times throughout their baseline and post- professional
development years for a total of eight observations. Videotaped lessons that included
simulations were watched, field-notes taken, and subsequent detailed write-ups
completed. Write-ups of participants’ lessons with simulations were coded as content,
NOS, inquiry, and/or PBL-focused. Codes were quantized (0 = not present, 1 = present) to
help discern any differences in participants' computer simulation use for different
instructional purposes (Hesse-Biber, 2010).
Quarterly lesson reports (Appendix C). Quarterly Lesson Reports (QLRs)
recorded elements of instruction during an observed class period as well as three science
classes prior to and following the observed lesson to provide context. The instrument
was previously validated to capture instructional practices (Lawrenz, Huffman,
Appeldoorn, & Sun, 2002). On QLR’s, participants described instructional objectives for
the observed lesson. In addition, the participant indicated whether the observed and
neighboring lessons included inquiry-based, PBL, or NOS instruction and why they
believed those elements existed. QLRs identified lessons with simulations, triangulated
the extent to which participants took advantage of simulation affordances, and provided insight
into participants’ beliefs regarding inquiry-based, problem-based, and NOS instruction.
Quarterly Lesson Reports were analyzed pre- and post-professional development
for evidence of participants’ simulation use. Binomial codes were ascribed to each QLR
based on evidence in the QLR that the participant used simulations during the academic
year prior to and following the professional development (0 = no use; 1 = simulation
use). This coding scheme permitted inferential analysis to determine statistical changes
in the extent of simulation use.
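As a concrete illustration of this coding and analysis approach, the sketch below shows how such binary pre/post codes could be compared with a paired t-test in Python using SciPy. The column names and the inline example data are hypothetical; in particular, the pairing of pre and post codes is arbitrary here, so the sketch will not reproduce the exact statistic reported in the Results.

    # Sketch of the QLR coding analysis: each participant receives a binary code
    # (0 = no simulation use, 1 = simulation use) for the pre- and post-PD years,
    # and the paired codes are compared with a paired t-test.
    # The inline data and its pairing are illustrative placeholders only.
    import pandas as pd
    from scipy import stats

    qlr_codes = pd.DataFrame({
        "teacher_id": range(1, 68),          # 67 participants
        "pre_use":  [1] * 11 + [0] * 56,     # 11 participants (17%) coded as users pre-PD
        "post_use": [1] * 34 + [0] * 33,     # 34 participants (52%) coded as users post-PD
    })

    # Paired t-test on the binary codes
    t_stat, p_value = stats.ttest_rel(qlr_codes["post_use"], qlr_codes["pre_use"])
    print(f"t({len(qlr_codes) - 1}) = {t_stat:.3f}, p = {p_value:.3f}")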
Data Interpretation
Data interpretation occurred after each data source was independently analyzed.
During interpretation the emergent patterns from each data source were compared for
consistencies or lack thereof between sources. It was during analysis that qualitative and
quantitative data were holistically utilized to support each other in meaning generation
(Hesse-Biber, 2010). In the event of inconsistencies between two data sources,
participant interviews and QLR’s were examined for explanations. These two data
sources illuminated participants’ beliefs and helped explain differences in self-report and
researcher-coded data. Table 3 overviews the overlap or unique use of data sources, data
collection, and analysis procedures. The results reported in the following section reflect
the final interpretation of all data sources and are organized by research question.
Table 3
Data Collection and Analysis Procedures
RQ #1: Extent and purpose of simulation use
   Data sources: EL Log-In Reports, EL Use Surveys, Quarterly Lesson Reports, Lesson Observations, Participant Interviews
   Data collection process: Concurrent/Sequential
   Data analysis: Paired t-tests; Systematic Data Analysis (Miles & Huberman, 1984)

RQ #2: Supporting variables
   Data sources: Perceptions Surveys, Lesson Observations, Professional Development Observations, Quarterly Lesson Reports, Participant Interviews
   Data collection process: Concurrent/Sequential
   Data analysis: Repeated paired t-tests; Systematic Data Analysis (Miles & Huberman, 1984)

RQ #3: Hindering variables
   Data sources: Post-Perceptions Surveys, Participant Interviews
   Data collection process: Concurrent
   Data analysis: Systematic Data Analysis (Miles & Huberman, 1984)
Results
A significant number of participants adopted simulations post-professional
development. Data indicated the professional development helped participants perceive
certain simulation use benefits. Barriers to simulation integration were widespread and
were both personal and school-based. These themes are elaborated on under the appropriate
research question.
To what extent did participants adopt simulations following the professional
development and utilize them for inquiry, PBL, and NOS instruction?
During the baseline data collection year, QLR’s referenced simulation use by 11
(17%) of participants. Following the professional development, 34 (52%) of participants
referenced simulation use on one or more QLR’s. Paired t-tests demonstrated this was a
significant increase in the number of participants that used simulations following
professional development, t(66) = 4.897, p =.001). This indicates the professional
development successfully supported participants’ simulation adoption.
Participants’ baseline and post-professional development lesson observations
reflected they used simulations primarily for content-focused and inquiry-based
instruction and rarely for NOS or PBL instruction. In all observed lessons, there was an
obvious concept-driven purpose for student simulation use. No evidence of simulation
use within a PBL unit or to support explicit NOS instruction existed in any of the
observed baseline lessons. Only two post-professional development lesson observations
demonstrated simulation use for PBL or explicit NOS instruction (Table 4).
Simulation use for inquiry-based teaching was more evident in lessons following
the professional development. Of the four observed baseline year lessons, students were
never observed engaging in data collection for the purpose of answering a research
question. In contrast, students in 10 out of 15 post-professional development lessons
were either given or designed research questions to guide simulation use within the
context of specific science content (Table 4). For example, Carolyn explained,
One of the [simulations] that we worked on was the Doppler effect. There was a police car with a siren and it shows the sound waves and you could turn on and off the different sound waves and different variations with that. And the kids …were going through and changing the position of the car and really thinking about how the position of the car, or the speed of the car would effect the Doppler effect. (Interview)
During the described lesson, Carolyn allowed students to choose their own independent
variables and pursue their own line of scientific inquiry. In other lessons, students in
Carolyn’s class also engaged in more structured inquiry investigations by answering a
given research question following a set of prescribed procedures. For example, to answer
the question, “What causes particles to move back and forth as a sound wave passes?” a
worksheet explicitly instructed the students “to change the pressure and acceleration of
dividers in the simulation.”
Table 4
Observed Instructional Computer Simulation Use Patterns
Baseline observations
   Observed lessons incorporating simulations: 5
   Simulations used in observed lessons: Edheads (1), ScienceJoy (1), Gizmos® (3)
   Concept-focused: 5 (100%)
   Reform-based instruction: Inquiry 0 (0%); PBL 0 (0%); NOS 0 (0%)

Post-PD observations
   Observed lessons incorporating simulations: 15
   Simulations used in observed lessons: PhET (1), Gizmos® (14)
   Concept-focused: 15 (100%)
   Reform-based instruction: Inquiry 10 (67%); PBL 2 (13.3%); NOS 2 (13.3%)
Data from participants’ post-professional development EL Use surveys were used
to triangulate the lesson observation data. Out of four possible reasons for simulation
use, participants reported using simulations to help teach science concepts most often (M
= 3.39, SD = 1.4). To a lesser degree participants reported using simulations for inquiry
(M = 3.06, SD = 1.43), PBL (M = 2.31, SD = 1.31), and NOS (M = 2.74, SD = 1.26).
Participants reported using simulations for inquiry and content-based instruction
significantly more often than in the context of a PBL unit (Table 5). However, participants
similarly rated their frequency of simulation use for NOS and inquiry instruction.
Table 5
Comparison of Self-report Purpose for Simulation Use Post-Professional Development
Pair                   Pairwise means (SD)                          df    t         Sig. (2-tailed)
Inquiry & PBL          Inquiry 2.965 (1.426); PBL 2.31 (1.31)       28    2.575     .016*
Inquiry & NOS          Inquiry 3.03 (1.45); NOS 2.73 (1.29)         29    1.329     .194
PBL & NOS              PBL 2.31 (1.31); NOS 2.66 (1.23)             28    -1.672    .106
Concepts & Inquiry     Concept 3.39 (1.45); Inquiry 3.06 (1.43)     30    1.108     .277
Concepts & PBL         Concept 3.45 (1.43); PBL 2.31 (1.31)         28    4.278     .000*
Concepts & NOS         Concept 3.39 (1.45); NOS 2.74 (1.26)         28    -2.133    .041
Note: Means are based on the number of participant item survey responses; α = .05/3
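To make the analysis behind Table 5 concrete, the sketch below shows how Bonferroni-adjusted pairwise paired t-tests could be computed in Python with SciPy. The CSV file and its column names (concept, inquiry, pbl, nos) are hypothetical placeholders for the EL Use Survey ratings.

    # Sketch of pairwise paired t-tests on self-reported frequency-of-use ratings
    # for four instructional purposes, evaluated against a Bonferroni-adjusted alpha.
    # The CSV file and column names are hypothetical placeholders.
    from itertools import combinations

    import pandas as pd
    from scipy import stats

    ratings = pd.read_csv("el_use_survey_ratings.csv")   # columns: concept, inquiry, pbl, nos
    purposes = ["concept", "inquiry", "pbl", "nos"]
    alpha = 0.05 / 3                                      # adjusted alpha noted in Table 5

    for a, b in combinations(purposes, 2):
        pair = ratings[[a, b]].dropna()                   # keep participants who rated both items
        t_stat, p_value = stats.ttest_rel(pair[a], pair[b])
        flag = "*" if p_value < alpha else ""
        print(f"{a} vs {b}: t({len(pair) - 1}) = {t_stat:.3f}, p = {p_value:.3f}{flag}")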
During interviews, participants were asked about their reform-based practices
with simulations. Only 1 of the 7 participants interviewed described using simulations
for explicit NOS instruction. However, 2 other participants described simulation use to
support implicit NOS instruction. Phoebe explained why she indicated on her EL Use
Survey that she used simulations for NOS instruction, “They [simulations] are a social
activity. So they were constantly working together.” In other words, Phoebe felt that by
working together, students were learning scientific behaviors. Implicit NOS instruction
was also evident on QLR’s. For example, Shauna described how she planned to integrate
NOS instruction into a simulation-supported lesson, “Students will make observations,
base their conclusions on evidence they gather from the simulated model, and change
their ideas when the scientific evidence disproves their beliefs” (Shauna, QLR #8).
Shauna described plans for students to engage in scientific behavior and therefore learn
about NOS. However, implicit NOS instruction does not guarantee students will
understand science is evidence-based or that as a result of new, discrepant evidence,
scientific ideas can be revised. Participants’ perceptions that supporting student
engagement in scientific behaviors constituted NOS instruction may partially explain
why more participants reported integrating NOS instruction with simulations than lesson
observations suggested. It is also possible that large standard deviations in survey
responses and the relatively small sample size made differences in self-reported instructional
purposes undetectable, or that such differences were overemphasized in lesson observations.
A significant number of participants adopted simulations for instructional use
following the professional development. Most often, participants incorporated
simulations to address students' conceptual understanding, but they also reported
frequently using them for inquiry-based instruction. Participants reported using simulations for
NOS instruction more often than suggested by lesson observations. Interview data
suggested this disparity resulted from participants’ simulation use for implicit, rather than
explicit NOS instruction.
In conclusion, the professional development successfully moved a significant
number of participants through the IA stages. In addition, participants appropriately used
simulations to address students’ science content understanding and to engage students in
scientific inquiry. Although participants recognized that students’ NOS understandings
could also be strengthened with simulations, participants rarely explicitly addressed how
students were acting like true scientists during simulation use. Simulation use within
PBL units was the least frequently self-reported purpose.
What Variables Fostered Participants’ Simulation Adoption Following Professional
Development?
Qualitative and quantitative data sources revealed three primary variables that
contributed to increased participant simulation use following the professional
development. Positive influences included: increased participant simulation use
confidence, increased participant awareness that simulations could develop students’
standards-based content understanding, and participants’ belief that simulations engaged
students in authentic science experiences.
Increased confidence. Participants’ confidence using simulations increased
significantly from pre- (M = 2.4, SD = 1.1) to post-professional development (M = 3.7,
SD = 1.0), t(61) = 10.465, p < .001. There were no significant differences in
participants' self-reported simulation use confidence from post- to delayed-post
professional development (M = 3.7, SD = 1.1), t(62) = .354, p = .725. These results
indicate the professional development increased participants’ confidence in simulation
use and their newly acquired confidence was maintained throughout the academic year.
Content instructional support. QLR’s, lesson observations, and interviews
demonstrated the perceived importance participants placed on simulations for supporting
students’ conceptual understanding. Participants utilized simulations for lessons that
provided concrete representations of abstract content and to complement gaps in
participants’ own knowledge. Instructional goals on QLR’s for all lessons that
incorporated simulations following the professional development described one or more
concepts participants’ intended for students to understand. For example Hal expected a
circuit builder simulation to help students “understand the characteristics of electricity”
and be able to “differentiate between a parallel and series circuit” (QLR#7).
Lesson observations also reflected participants’ emphasis on conceptual
understanding. Participants primarily asked questions relating to content rather than
engaging students in scientific inquiry skills. For example, following small group work,
Eve displayed the ocean tide simulation students used and asked the class, “Was there a
regular pattern [in the tides]?” Although a software-generated data table could be used
to justify conclusions, Eve did not ask students to provide evidence to support their
comments. Rather, Eve wanted to ensure that the students could articulate the
appropriate tide-related pattern depicted in the simulation. How the students arrived at
the correct understanding was less important to Eve than making sure students had
achieved standards-based conceptual understanding.
Some participants had evidence that simulation integration directly benefited students'
content understanding. Carolyn explained that the teachers at her school,
…compare data once a week when we finish a test. They look at what kids have passed and not and I’ve seen some improvement in some of my kids who weren’t using [simulations before], because they had more practice and it was a different modality and learning. (Carolyn, Interview)
Positive changes in students’ test scores encouraged Carolyn’s continued simulation use
for concept-focused instruction.
During interviews, many participants described the value of simulations for
teaching abstract, difficult to visualize, or micro/macro-scale phenomena. Phoebe
explained, “You can't show the kids molecules evaporating like you can show them on
the simulations. I can't go and show them how the deer populations are increasing or
decreasing.” Lesson observation data confirmed that the simulations participants utilized
frequently addressed content that was difficult for students to easily observe. For
example, targeted content with simulations often addressed cell structure, molecular
movement, seasonal changes, ocean tides, and moon phases. Only one lesson following
the professional development incorporated a simulation to engage students in scientific
skills that may have been more effectively and appropriately addressed with physical
manipulatives. In this lesson, Ciara used a virtual dichotomous key activity to foster
students’ observational skills and ability to use dichotomous keys for organism
identification. Ciara could have easily implemented this activity with physical and/or
audio materials, which would have allowed students to use all their senses instead of
using sight alone as dictated by the limits of a virtual classification activity. For example,
Ciara could have taken students outside to identify trees, which requires observing and
feeling leaf and bark textures. In addition, Ciara could have played audio recordings of
different bird songs to give students practice using a dichotomous key based on sound.
At the end of the lesson Ciara asked students to summarize what they had learned and
give their own definitions of a dichotomous key. One student described a dichotomous
key as a “picture dictionary” (Ciara Observation #8). This student's definition
demonstrates the understanding she came to via the simulation: that dichotomous keys
rely only on the sense of sight to identify organisms. In fact, a botanist would also use the
sense of touch, and an ornithologist would use sound, to identify unknown plants and birds with a
dichotomous key. Although this lesson was an exception, it highlights how important a
teacher’s knowledge about and choice of curricular options can be in fostering students’
knowledge generation.
Of the 7 interviewed participants, 2 explained that simulations potentially
strengthened their science instruction by complementing gaps in their own content
understanding. Christine explained, “Sometimes I know what the answers are, but I don't
necessarily understand why those are. I think that the simulations are great for that”
(italics added by authors). Similarly, Cheryl utilized a simulation for a particular lesson
because, “This was actually my first year with rocks… [the simulation] actually had more
knowledge about rocks than I did, so that was good” (Interview). Christine realized
students could use simulations to manipulate more than one variable and compare
differences in outcomes leading to a deeper understanding of the science content than
they might otherwise have. In addition, Christine noted simulations help students
understand theoretical underpinnings for observed phenomena that she might be
unfamiliar with. Cheryl simply recognized her own sparse geology-related knowledge
and took advantage of an evidence-based simulation to provide a foundation for students
to learn the topic. In both cases, participants’ awareness of themselves and the
affordances of simulations resulted in them making instructional choices that might
benefit students optimally.
Together, these data sources demonstrate that participants perceived simulations as an
educational technology tool valuable for strengthening and developing students’ science
concept-based understanding. Evidence of student learning and participants’ realization
that simulations had the capacity to complement their own content knowledge likely
encouraged continued simulation implementation.
Authentic and simple curricular option. Many participants recognized and
consciously incorporated simulations into their lessons because they provided students
opportunities to engage in scientific behaviors and use scientifically-relevant technology.
For example, Christine explained that her instructional simulation use for earth and space
science content helped her students understand abstract science content and also was an
example of good pedagogical practice because, “being able to use computer model
simulations - that's what scientists would do. So, it was kind of tying in the nature of
science also.” Christine elaborated on how simulation use strengthened students’ NOS
and inquiry skills:
You can run more repeated trials. And therefore, you would have more data, and then as a group, collectively look at that, talk about it, confirm, negate it, whatever. …and that you can get more done in that period of time if you are repeating trials and showing different ways of doing things, where it takes a lot more time if you're doing it with the hands-on material. (Interview)
Christine realized that for inquiry-based learning, simulations are sometimes a better
alternative than analogous hands-on activities because of affordances that allow rapid
data collection. In addition, Christine understood simulations, especially when used for
certain lines of inquiry, introduced students to data collection methods of actual
scientists. Similarly, Lily described a lesson in which she incorporated a density-related
simulation and found, “It's a lot faster, more effective, because they have all the different
items they can drop, but you don't have a big mess that you're cleaning up.” Both
Christine and Lily described instructional simulation implementation that allowed
students to collect data and engage in scientific inquiry. These lessons could have been
implemented using non-virtual materials that allowed students to practice these same
inquiry-based skills. However, as Christine noted, simulations may represent the most
authentic material for inquiry for certain science content. In addition, lessons utilizing
simulations permit greater time for student engagement in scientific inquiry since time for
hands-on lab cleanup is not necessary.
The data indicate that the professional development did indeed convince
participants of the instructional value of simulations, resulting in adoption by an additional 35% of
participants. As a result, there was a significant increase in the participants who used
simulations pre- (17%) to post- (52%) professional development. Participants utilized
simulations to develop students’ content understanding, especially when other curricular
options were unavailable or overly cumbersome to implement. In addition, participants
used simulations to foster students’ accurate scientific inquiry and to implicitly encourage
accurate NOS understandings. Improved participant confidence and the belief that
simulations fostered science content understanding led to a significant increase in the
number of participants that used simulations during science instruction following
professional development.
What Variables Hindered Simulation Adoption?
Although 52% of participants used simulations in the year following their
participation in the summer component of the VISTA PD, no evidence existed that the
other 48% did. Year-end Perceptions Survey responses revealed technological,
pedagogical, and contextual barriers to participants’ simulation adoption. In several
instances, members within a school team perceived and experienced these barriers alike.
However, in other instances school team members did not experience technology
integration barriers similarly.
Of the 67 participants who completed the year-end Perceptions Surveys, 52
(77.6%) described at least one barrier to simulation use. The most common barriers
reported were insufficient computer access (68%), limited instructional time (17%), and
lack of age or content appropriate simulations or accompanying instructional materials
(14%). During interviews, participants described their experiences with these barriers.
For example, Carolyn explained her challenge accessing a desirable number of computers
for instructional simulation use:
Each grade level, 3rd-5th, has two laptop carts per grade level. So that’s about a class set. But in our school, we have seven to eight teachers on all three teams. So that’s not even getting those laptops once a week. (Carolyn, Interview)
Several participants also lamented they only had students for a maximum of 30 minutes
for science instruction on a given day, which made deep engagement with simulations
difficult and served to discourage their implementation. However, Christine designed a
means of overcoming this barrier, “[Because] the time for science instruction has been
reduced I have decided to move to a model where the simulation is explained in class, but
students will need to access it as part of homework” (Christine, EL Use Survey).
Christine explained that her student population had home internet access, which allowed
her to creatively overcome the instructional time-related barrier.
Many participants also described difficulty finding simulations that covered
specific standards-based content or had accompanying instructional materials designed
for a specific grade level. Felicia indicated she did not use simulations as often as she
wanted because of limited instructional time and “simulations are not as elementary
friendly as they could be. Most of the templates are more geared toward middle and high
school” (Year-end Perceptions Survey). Felicia’s use of the word “template” reveals that
it may not necessarily be the simulation itself that exceeded the comprehension level of
her students, but that the existing student hand-outs exceeded her students’ ability.
Felicia was one of several participants who conflated the actual simulation with
accompanying worksheets when they were available. Participant dependence on pre-
made student worksheets and conflation of worksheets with the actual simulation may
have, in some instances, prevented simulation integration even though participants could
have made their own supporting resources or modified existing resources.
Often, participants from the same school expressed similar struggles incorporating
simulations. This suggests that at some schools certain barriers were institutionalized
enough to be experienced by all team members. At other schools with multiple
participants, differences in participants’ beliefs and experiences led them to identify and
perceive potential barriers differently. Table 6 uses two representative schools to convey
the difference in barriers perceived by participants at the same school.
The two teachers at School X described different barriers to simulation use.
Personal beliefs regarding optimal integration methods and problems with wireless
connections defined Jasmine’s challenges. Jasmine’s colleague Olive also referenced
technical difficulties that may or may not have been related to wireless connectivity.
Olive also highlighted limited instructional time to be a difficult barrier to overcome
when trying to integrate simulations. However, both Jasmine and Olive utilized
simulations in their science instruction, which indicated they did not perceive these
barriers to be overwhelming.
Table 6
Computer Simulation Integration Barriers (Year-end Perception Surveys)
School X (simulation use by both participants)

Jasmine
   Barrier(s): Simulations are great but when you can have all students on a computer it is more effective. For each student to have an in-depth experience it is better for them to interact with the simulation…Computer wireless is not very reliable in our school
   Barrier type: Computer Access; Technical Infrastructure

Olive
   Barrier(s): The factors that affect my ability to use computer simulations are the limited time I actually get to spend with my students and the sporadic technical difficulties
   Barrier type: Instructional Time; Technical Infrastructure

School Y (simulation use by 2 of 5 participants)

Ciara
   Barrier(s): I loved using simulations and so did the students! My limitations were availability of the computers... I prefer to allow the students to either work alone or with a partner on a specific simulation... I do not think whole group is as effective, but it is still much better than not having access at all
   Barrier type: Computer Access

Lonnie
   Barrier(s): I love computer simulations and feel they are important for the students… It is difficult to use with students other than whole group because of the availability of computers
   Barrier type: Computer Access

Max
   Barrier(s): The problem I had using computer simulations was being able to get in to use the computer lab. I was also told by the assistant principal that we should be doing reading and math when in the computer lab, not computer simulations for science
   Barrier type: Computer Access; Administration/School Policy

Jessie
   Barrier(s): I have the desire to use simulations but with such a large class of 33 students we didn't have adequate computers
   Barrier type: Computer Access

Rory
   Barrier(s): One of the main factors was time and that the administration did not totally understand how the methods we learned through VISTA could benefit students…The pacing is too fast. Our main focus was not science. When the students had individual time to use the computer, they were only allowed to do JLAB or RTI's
   Barrier type: Instructional Time; Administration/School Policy
At School Y, 4 of the 5 teachers mentioned inadequate computer access and 2
referenced an administrative policy that they perceived prevented simulation use. There
was no evidence that Jessie and Rory used simulations during the academic year, while the
other 3 participants at School Y did. In fact, Ciara and Lonnie took the time to extol the
value of simulations for science instruction despite the perceived barriers. For Jessie and
Rory, it is possible the educational value they ascribed to simulations did not warrant the
additional effort it would take to overcome perceived implementation barriers. These
examples highlight the institutional and personal nature of implementation barriers
participants experienced. Furthermore, the instructional value participants ascribed to
simulations either ameliorated or solidified certain perceived implementation barriers.
Additional evidence existed that participants’ beliefs about simulations were a
barrier to implementation. At both schools, participants mentioned a desire to use
simulations for individual or paired student work rather than whole group instruction.
Jasmine, Ciara, and Lonnie believed that whole group instruction was not as effective or
engaging for students as individual computer use. Because of this pedagogical
belief, the availability of computers became a barrier. If participants believed whole
group instruction was equal to or more effective than individual computer use, limited
access to computers would not be a barrier assuming the teacher had access to a
projection screen and at least one classroom computer.
Summary
Twenty-three (35%) additional participants used simulations post-professional
development compared with pre-professional development. More than half of all
observed post-professional development lessons involved inquiry-based teaching.
Participants spoke about several perceived benefits of simulation use, including the
content-related support simulations provided, their ease of use, and the opportunity
they provided for students to use technology to engage in scientific work. The majority
of participants described at least one barrier to simulation use. However, participants
experienced similar barriers differently, highlighting the individual and subjective
nature of technology integration barriers, particularly access to computers. In addition,
participants at different schools identified different integration barriers, indicating that
each school represents a context that hinders or supports innovation adoption in unique
ways.
Discussion/Implications
The purpose of this study was to determine whether Innovation Adoption Theory
could serve as a useful guide to encourage elementary science teachers’ simulation use.
Although a number of studies have examined classroom-based outcomes of technology-
related professional development (Gerard et al., 2011; Graham et al., 2009; Guzey &
Roehrig, 2009), few have examined elementary teachers specifically, and to our
knowledge, none has examined simulation use exclusively. In addition, IA Theory is
rarely applied to educational settings (Rogers, 2003) and to the researchers’ knowledge
has not been specifically applied to educational computer simulation use and professional
development. Therefore, the results of this investigation begin to fill a research void by
documenting simulation use among 67 elementary teachers for one year prior to and one
year following professional development aligned with IA theory. This pre-/post-research
design that included lesson observations is rare in professional development studies
(Lawless & Pellegrino, 2007) and allowed for changes in participants’ instruction to be
more directly attributed to the professional development.
Innovation Adoption Theory in Professional Development
Unfortunately, little educational technology professional development research
actually documents change in teachers’ practices, which makes it difficult to compare this
study’s outcomes with those of other studies (Higgins & Spitulinik, 2008; Watson, 2006). However, a
few case studies suggest diffuse education technology professional development may not
yield sustained educational technology integration (Graham et al., 2009; Guzey &
Roehrig, 2009; Zhao & Bryant, 2006). Limited change in teachers’ beliefs and practices
may result from professional development that does not convince participants of the
value of each instructional tool.
This study applied IA Theory in a research domain where it is not commonly
used and in a novel way that may further advance the theory’s utility. Only 8%
of all IA studies have been completed within educational settings (Rogers, 2003). In
those studies, authors described differences between educational settings and innovation
adoption patterns and drew inferences about school characteristics that might
predict innovativeness (Rogers, 2003; Straub, 2009). Our study attempted to determine
how to facilitate adoption rather than just describe why adoption occurs. As a result of
this novel application, we were able to identify, within the context of science education
computer simulations, the utility of IA Theory in professional development as well as its
shortfalls.
As described in the methods, the elements of the VISTA PD were consistent with
IA Theory and best professional development practices (Ketelhut & Schifter, 2011; Pope
et al., 2008). IA Theory suggests professional development that focuses specifically on a
single educational technology innovation allows implementers to highlight the benefits
and provide specific support for that particular innovation as teachers attempt to use the
technology tool for the first time. The professional development in the present study
initially made participants aware of simulations (stage 1), highlighted their instructional
benefits for certain science content and inquiry-based pedagogy (stage 2), provided
content-relevant lesson planning opportunities and encouraged initial use within a camp-
setting in which traditional school context concerns were eliminated (stages 3 and 4) to
increase the likelihood the participants would reach the final stage of simulation
adoption. This study demonstrated that professional development that explicitly
addresses the five stages of the innovation adoption model within the context of a single
educational technology can result in significant adoption and instructional integration.
Elementary Teachers and Simulations
Teachers often find inquiry instruction challenging to incorporate, and therefore
reform is difficult even with the best-designed professional development (Anderson,
2002; Waight & Abd-El-Khalick, 2007; Windschitl, 2002). Further, teachers often need
to first get accustomed to and comfortable with new technologies before attempting to
use them for inquiry-based teaching (Varma, Husic, & Linn, 2008; Schnittka & Bell,
2009). In some instances, participants in other educational technology preparation
programs may not have implemented educational technologies for desirable inquiry-based
instruction because those programs’ diffuse nature prevented them from addressing the
specific affordances and appropriate instructional uses of each educational technology
(Dunleavy et al., 2007; Graham et al., 2009). For example,
Graham et al. (2009) found participants reported greater confidence using technology to
teach about science rather than to do science after they learned about and were given
access to digital microscopes, simulations, and other digital technologies. In contrast, the
professional development described in the present investigation increased participants’
confidence using simulations and their subsequent inquiry-based implementation by helping
participants reach stage 3 in the innovation adoption model. Most participants were
convinced simulations could improve their science instruction and facilitate inquiry
instruction.
This finding is important in light of the fact that most participants did not have
science-related degrees and likely did not have extensive scientific inquiry experience.
Previous research documents elementary teachers’ limited content knowledge and inquiry
experience (Ginns & Watters, 1995; Schoon & Boone, 1998). This study suggests that
when elementary teachers are explicitly made aware that simulations support
inquiry-based learning of difficult-to-visualize science phenomena, they will judiciously
take advantage of these affordances to strengthen their science instruction.
Barriers to Simulation Adoption
A comparison of Schools X and Y (presented in the results) and analysis of interview data
reveal two important barriers to simulation adoption: school context and teachers’
beliefs. Although participants at both schools mentioned poor technology infrastructure
or inadequate computer access, only participants at School Y mentioned administrative
policies that they perceived impeded instructional simulation use. This indicates that while the
professional development treated all participants similarly, differences in school cultures
may have influenced adoption in unique ways. In particular, a top-down
administrative approach that did not reflect high regard for science instructional time or
technology integration may have overridden participants’ desire to use
simulations.
Teachers’ beliefs about educational technologies can facilitate or hinder
technology adoption and desirable instructional use (Neiss, 2008). Previous educational
technology integration research has focused on teachers’ beliefs regarding the benefits of
educational technologies for different instructional purposes (Dawson & Heinecke, 2004;
Neiss, 2008). This study revealed an additional belief that may serve as a barrier to
educational technology use, especially computer simulations. Despite emerging evidence
that whole group use of simulations fosters more exploration, critical thinking, and talk
between teachers and students (Smetana & Bell, 2014), many teachers in the present
study expressed a belief that it was more productive for students to use simulations in a
one-to-one student-to-computer ratio. Whole group instruction, in addition to providing
opportunities for the teacher to probe and identify students’ thinking, allows the teacher to
pace student movement through a simulation, direct student attention to critical
components, and model scientific thinking and reasoning. As a result, whole group
instruction has the potential to overcome other difficulties that may accompany
educational technology use, including student computer use habits and classroom
management concerns (Guzey & Roehrig, 2009; Smetana & Bell, 2014; Wecker, Kohnle,
& Fischer, 2007).
Participants’ belief that whole group instruction with simulations is less effective
than small group or individual instruction led to the perception that there were
insufficient computers available to support curricular simulation integration. This finding
highlights the subjective nature of many technology integration barriers and identifies a
common belief that should be addressed during computer simulation professional
development. Teachers need to identify and confront their beliefs about optimal means
of instructional simulation integration. As simulations become more common, are
integrated into standardized tests (Quellmalz, Timms, Silberglitt, & Buckley, 2012), and
pressure for instructional use grows (NRC, 2011), teachers will need to confront and
overcome the belief that a one-to-one student to computer ratio is ideal and that computer
access that does not support this ratio is an integration barrier.
One limitation of Innovation Adoption Theory for professional development
design is that it does not consider participants’ beliefs. During stage 2 of the
adoption process, teachers need to become aware of the benefits of simulations and the
benefits of whole group vs. individual or small group use. A one-to-one student to
computer ratio can offer its own advantages, but these do not necessarily supersede those
of whole group instruction or other implementation structures (Dunleavy et al., 2007;
NRC, 2011). Unless simulation professional development programs address this
prevalent belief, insufficient computer access will continue to be the most commonly
cited barrier to educational technology use (Pennell & Ewing-Taylor, 2012). Thus,
professional development implementers should facilitate conceptual change by modeling
whole group simulation use, explicitly addressing the benefits of whole group
instructional use, providing participants opportunities to practice using simulations for
whole group instruction, and giving participants opportunities to reflect on these practice
lessons.
Limitations
One of the major limitations of the study is that instructional simulation use could
not be documented for a longer duration after the professional development. Other
studies have documented a sigmoidal curve that depicts a social group’s adoption of an
innovation (Rogers, 2003). Adoption typically begins slowly among a small number of
early adopters, accelerates as awareness of the innovation spreads through the population,
and then slows until a plateau is reached in the percent of the population using the
innovation (Rogers, 2003). It is possible our study only captured adoption among the
early adopters and not the fuller adoption that may have occurred across the entire
teacher participant group over a more extended time.
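The S-shaped pattern described above is often sketched with a logistic curve; the form below is offered only as an illustration of that shape (it is not drawn from Rogers, 2003, and its parameters are generic rather than estimated from this study's data):

$$F(t) = \frac{K}{1 + e^{-r(t - t_0)}}$$

where $F(t)$ is the cumulative percent of the population that has adopted the innovation by time $t$, $K$ is the plateau value, $r$ governs how quickly adoption accelerates, and $t_0$ is the inflection point at which the adoption rate is greatest. Capturing only the early portion of such a curve would understate eventual adoption, which is the concern raised here.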
Although 52% of participants used simulations following the professional
development, 48% did not. Critics may argue that the study was not as
successful in facilitating participants’ simulation adoption as one would like. However,
innovation adoption is a process that occurs within social settings (Rogers, 2003). In
addition, some people adopt an innovation more quickly than others.
Innovations diffuse within social networks as individuals communicate about the
innovation and as late adopters observe the innovation’s use among their peers and
gradually develop a positive attitude that ultimately leads to adoption (Rogers, 2003).
Another limitation of the study is that our data collection methods may not have
accurately reflected the percent of participants that used simulations.
Quarterly Lesson Reports documented each participant’s instruction for 28 days in the
academic year both pre- and post-professional development and were used to determine
the percent of participants that used simulations. While this data collection method may
have cataloged many teachers’ breadth of instructional practices, it may not have been
sufficient to capture simulation use among all participants. Thus, the percent of
participants that used simulations pre- and/or post-professional development may be
conservative. Future studies should consider surveying participants to triangulate other
data sources used to document simulation use.
Implications/Future Research
This study applied a novel framework, Innovation Adoption Theory, traditionally
used in business rather than education settings. Baseline data demonstrated
elementary teachers do not widely use simulations during instruction and that there are
very real opportunities for curricular knowledge growth. The results of this investigation
demonstrated how professional development aligned with Innovation Adoption Theory
can foster elementary teachers’ simulation adoption. Future research should examine
secondary teachers’ simulation use and professional development outcomes guided by
Innovation Adoption Theory. In addition, it is likely that Innovation Adoption Theory,
applied to professional development programs involving other educational
technologies, will yield positive outcomes. This assumption should be examined using a
range of innovations.
Despite a significant increase in the percent of participants that used simulations
post-professional development, there was no evidence that 48% of the participants used
simulations following the professional development. The belief that a one-to-one student-
to-computer ratio is optimal resulted in many participants perceiving a barrier related to
limited computer availability. Thus, we recommend the innovation adoption model be
augmented to address participants’ beliefs when applied to professional development. In
addition to the five adoption stages outlined by Rogers (1985), we posit that an additional
stage and a modification to the reflective stage are necessary to explicitly identify and
confront participants’ beliefs (Table 7). This augmented model includes an opportunity
for professional development implementers to help participants articulate their beliefs
about the innovation and its best pedagogical use (new stage 2) so that during stage 3 these
beliefs can be considered as implementers work to convince participants of the
innovation’s value. Conceptual change research indicates that initial conceptions, when
not wholly accurate, need to be subsequently reconsidered after a discrepant event
(Posner, Strike, Hewson, & Gertzog, 1982; Vosniadou, 1994). After implementers lead
participants through an initial implementation attempt, the final reflection should afford
an opportunity for participants to re-examine their initial beliefs. Thus, the final
innovation adoption stage should not only be a reflection of the success/failure of the
initial implementation experience, but also a re-examination of initial beliefs identified in
stage 2. The challenge to this proposed model is that it necessitates at least two
professional development meetings or an alternative means for participants to revisit and
discuss their initial beliefs with implementers or instructional coaches.
Table 7
Comparison of Original and Modified Innovation Adoption Models
Original Five Stage Model

Stage One: Participant Awareness/Introduction. Implementers make participants aware of the innovation.
Stage Two: Overview Benefits. Implementer overviews general benefits.
Stage Three: Decision to Use/Not Use. Participant decides whether or not to attempt to use the innovation.
Stage Four: First Innovation Use. Participant attempts to use the innovation for the first time.
Stage Five: Reflection. Participant reflects on innovation use and decides to either fully adopt or reject the innovation.

Modified Six Stage Model for PD

Stage One: Participant Awareness/Introduction. Implementers make participants aware of the innovation.
Stage Two: Beliefs. Participant articulates beliefs about the innovation, including when and how it should be used.
Stage Three: Overview Benefits. Implementers overview the innovation’s benefits and address participants’ alternative conceptions about how and when to use the innovation through modeling and explicit instruction.
Stage Four: Decision to Use. Participant decides whether or not to attempt to use the innovation.
Stage Five: First Innovation Use. Participant attempts to use the innovation for the first time.
Stage Six: Reflection. Participant reflects on the initial innovation use experience and revisits initial beliefs identified in stage two, then decides to either fully adopt or reject the innovation.
The participants in the study explicitly recognized the value of simulations for
compensating for gaps in their content knowledge and thus improving science instruction.
Furthermore, in contrast to previous research, participants used simulations to help
students do science rather than simply learn about it. Students in participants’ post-
professional development lessons that included simulations engaged in data collection
and analysis to answer an overarching research question and to explore science concepts
that would have been difficult to investigate with traditional materials. The participants
in the present study used simulations both to facilitate inquiry teaching in general and to
expand the breadth of science content with which that pedagogy could be paired. These
outcomes indicate that professional development highlighting the affordances of
simulations for inquiry learning may help overcome common inquiry teaching barriers.
Even greater adoption of simulations may be possible if participants are urged to
articulate their beliefs about optimal implementation structure (e.g., a one-to-one student-
to-computer ratio) and are given opportunities to confront those beliefs using the
Modified Innovation Adoption Model.
This research was supported by funding from the U.S. Department of Education Investing in Innovation (I3) grant program. However, the results presented here do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal government.
References
Abd-El-Khalick, A., Boujaoude, S., Duschl, R., Lederman, N., Mamlok-Naaman, R.,
Hofstein, A., Treagust, D., & Tuan, Hsiao-Lin. (2004). Inquiry in science
education: International perspectives. Culture and Comparative Studies, 397-418.
Aldrich, C. (2009). Learning online with games, simulations and virtual worlds. San
Francisco, CA: Jossey-Bass.
Anderson, R. (2002). Reforming science teaching: What research says about inquiry.
Journal of Science Teacher Education, 13, 1-12.
Chiu, J. L., & Linn, M. C. (2012). The role of self-monitoring in learning chemistry
with dynamic visualizations. In Zohar, A., Dori, Y., J. (Eds), Metacognition in
science education (pp. 133-164). New York: Springer.
Dawson, K., & Heinecke, W. (2004). Conditions, processes and consequences of
technology use: A case study. Technology, Pedagogy and Education, 13, 61-82
de Jong, T. & van Joolingen, W. R. (1998). Scientific discovery learning with computer
simulations of conceptual domains. Review of Educational Research, 68, 179-
201.
Dega, B. G., Kriek, J., & Mogese, T. F. (2013). Students conceptual change in
electricity and magnetism using simulations: A comparison of cognitive
perturbation and cognitive conflict. Journal of Research in Science Teaching, 50,
677-698.
Dunleavy, M., Dexter, S., & Heinecke, W. F. (2007). What added value does a 1:1
student to laptop ratio bring to technology-supported teaching and learning?
Journal of Computer Assisted Learning, 23, 440-452.
Finkelstein, N. D., Adams, W. K., Keller, C. J., Kohl, P. B. Perkins, K. K., Podolofsky,
N. S., & Reid, S. (2005). When learning about the real world is better done
virtually: A study of substituting computer simulations for laboratory equipment.
Physical Review Special Topics – Physics Education Research, 1(010103), 1-8.
Gerard, L. F., Varma, K., Corliss, S. B., & Linn, M. C. (2011). Professional
development for technology-enhanced inquiry science. Review of Educational
Research, 81, 408-448.
Gibson, H. L., & Chase, C. (2002). Longitudinal impact of an inquiry-based science
program on middle school students’ attitudes towards science. Science
Education, 86, 693-705.
Ginns, I. S., & Watters, J. J. (1995). An analysis of scientific understandings of
preservice elementary teacher education students. Journal of Research in Science
Teaching, 32, 205-222.
Graham, C. R., Burgoyne, N., Cantrell, P., Smith L., St. Clair, L., & Harris, R. (2009).
TPACK development in science teaching: Measuring the TPACK confidence of
inservice science teachers. TechTrends, 53(5), 70-79.
Guzey, S. S., & Roehrig, G. H. (2009). Teaching science with technology: Case studies
of science teachers’ development of technology, pedagogy, and content
knowledge. Contemporary Issues in Technology and Teacher Education, 9, 25-
45.
Hesse-Biber, S. N. (2010). Mixed methods research: Merging methods with practice.
New York, NY: Guilford Press.
Higgins, T. E., & Spitulinik, M. W. (2008). Supporting teachers' use of technology in
science instruction through professional development: A literature review.
Journal of Science Education and Technology, 17, 511-521.
Ireland, J. E., Watters, J. J., Brownlee, J., & Lupton, M. (2012). Elementary teachers'
conceptions of inquiry teaching: Messages for teacher development. Journal of
Science Teacher Education, 23, 159-175.
Ketelhut, D. J., & Schifter, C. C. (2011). Teachers and game-based learning: Improving
understanding of how to increase efficacy of adoption. Computers & Education,
56, 529-546.
Kinzie, M., Strauss, R., & Foss, J. (1993). The effects of an interactive dissection
simulation on the performance and achievement of high school biology students.
Journal of Research in Science Teaching, 30, 989-1000.
Klahr, D., Triona, L. M., & Williams, C. (2007). Hands on what? The relative
effectiveness of physical versus virtual materials in an engineering design project
by middle school children. Journal of Research in Science Teaching, 44, 183-
203.
Kubicek, J. P. (2005). Inquiry-based learning, the nature of science, and computer
technology: New possibilities in science education. Canadian Journal of
Learning and Technology, 31(1).
Lawless, K. A., & Pellegrino, J. W. (2007). Professional development in integrating
technology into teaching and learning: Knowns, unknowns, and ways to pursue
better questions and answers. Review of Educational Research, 77, 575-614.
Liao, Y. & Chen, Y. (2007). The effect of computer simulation instruction on student
learning: A meta-analysis of studies in Taiwan. Journal of Information
Technology and Application, 2, 66-79.
Merriam, S. B. (2009). Qualitative research: A guide to design and implementation. San
Francisco, CA: Jossey-Bass
Morrison, J. A. (2013). Exploring exemplary elementary teachers' conceptions and
implementation of inquiry science. Journal of Science Teacher Education, 24,
573-588.
National Research Council (NRC). (1996). The national science education standards.
Washington, DC: National Academy Press.
National Research Council (NRC). (2011). Learning science through computer games
and simulations. Washington, DC: National Academy Press.
National Research Council (NRC). (2012). A framework for K-12 science instruction.
Washington, DC: National Academy Press.
Neiss, M. L. (2008). Guiding preservice teachers in PCK. In AACTE Committee on
Innovation and Technology (Ed.), Technological pedagogical content knowledge
for educators (pp. 223-250). New York, NY: Routledge.
Pennell, S., & Ewing-Taylor, J. (2012, March). 21st Century Learning and Teaching
through Integrating Technology into Inquiry-Based Professional Development. In
Society for Information Technology & Teacher Education International
Conference (Vol. 2012, No. 1, pp. 2055-2060).
Plass, J. L., Milne, C., Homer, D., Schwartz, R. N., Hayward, E. O., Jordan, T.,
Verkuilen, J., Florre, N., Wang, Y., & Barrientos, J. (2012). Investigating the
effectiveness of computer simulations for chemistry learning. Journal of
Research in Science Teaching, 49, 394-419.
Pope, M., Jayroe, T., Franz, D., & Hamil, B. (2008). Teacher candidates and
technology: making integration happen. National Forum of Applied Educational
Research Journal, 21(3), 1-9.
Posner, G. J., Strike, K. A., Hewson, P. W., & Gertzog, W. A. (1982). Accommodation
of a scientific conception: Toward a theory of conceptual change. Science
Education, 66, 211-227.
Quellmalz, E. S., Timms, M. J., Silberglitt, M., & Buckley, B. C. (2012). Science
assessment for all: Integrating science simulations into balanced state science
assessment systems. Journal of Research in Science Teaching, 49, 363-393.
Rogers, E. M. (1985). New product adoption and diffusion. Journal of Consumer
Research, 2, 290-301.
Rogers, E. M. (2003). Diffusion of Innovations (5th Edition). New York, NY: Free
Press.
Schnittka, C., & Bell, R. L. (2009). Preservice biology teachers’ use of interactive
display systems to support reforms-based science instruction. Contemporary
Issues in Technology and Teacher Education, 9(2), 131-159.
Schoon, K. J., & Boone, W. J. (1998). Self-efficacy and alternative conceptions of
science preservice elementary teachers. Science Education, 82, 553-568.
Schwartz-Shea, P., & Yanow, D. (2012). Interpretive research design: Concepts and
processes. New York, NY: Routledge.
Smetana, L., & Bell, R. L. (2011). Computer simulations to support science instruction
and learning: A critical review of the literature. International Journal of Science
Education, 1-34.
Smetana, L. K., & Bell, R. L. (2014). Which setting to choose: Comparison of whole-
class vs. small-group computer simulation use. Journal of Science Education and
Technology, 23, 481-495.
Sterling, D. R., & Frazier, W. M. (2010). Maximizing uncertified teachers' potential.
Principal Leadership, 10, 48-52.
Sterling, D. R., Matkins, J. J., Frazier, W. M., & Logerwell, M. G. (2007). Science camp
as a transformative experience for students, parents, and teachers in the urban
setting. School Science and Mathematics, 10, 134-148.
Straub, E. T. (2009). Understanding Technology Adoption: Theory and Future
Directions for Informal Learning. Review of Educational Research, 79, 625-649.
Sun, K., Lin, Y., & Yu, C. (2008). A study on learning effect among different learning
styles in a web-based lab of science for elementary students. Computers &
Education, 50, 1411-1422.
Trundle, K., & Bell, R. (2010). The use of a computer simulation to promote conceptual
change: A quasi-experimental study. Computers & Education, 54, 1078-1088.
Varma, K., Husic, F., & Linn, M. C. (2008). Targeted support for using technology
enhanced science inquiry modules. Journal of Science Educational Technology,
17, 341-354.
Vosniadou, S. (1994). Capturing and modeling the process of conceptual change.
Learning and Instruction, 4, 45-69.
Waight, N., & Abd-El-Khalick, F. (2007). The impact of technology on the enactment
of "inquiry" in a technology enthusiast's sixth grade science classroom. Journal
of Research in Science Teaching, 44, 154-182.
Watson, G. (2006). Technology professional development: Long-term effects on teacher
self-efficacy. Journal of Technology and Teacher Education, 14, 151-165.
Wecker, C., Kohnle, C., & Fischer, F. (2007). Computer literacy and inquiry learning:
When geeks learn less. Journal of Computer Assisted Learning, 23, 133-144.
Winberg, T. M., & Berg, A. R. (2007). Students’ cognitive focus during a chemistry
laboratory exercise: Effects of a computer simulated pre-lab. Journal of Research
in Science Teaching, 44, 1108-1133.
Windschitl, M. (2000). Supporting the development of science inquiry skills with
special classes of software. Educational Technology Research and Development,
48, 81-95.
Windschitl, M. (2002). Framing constructivism in practice as the negotiation of
dilemmas: An analysis of the conceptual, pedagogical, cultural, and political
changes facing teachers. Review of Educational Research, 72, 131-175.
Zacharia, Z. C. (2007). Comparing and combining real and virtual experimentation: An
effort to enhance students' conceptual understanding of electric circuits. Journal
of Computer Assisted Learning, 23, 120-132.
Zhao, Y., & Bryant, F. L. (2006). Can teacher technology integration training alone lead
to high levels of technology integration? A qualitative look at teachers’
technology integration after state mandated technology training. Electronic
Journal for the Integration of Technology in Education, 5(1), 53-62.
Appendix A
VISTA ExploreLearning® Use Survey Questions
This survey is designed to learn more about your use of ExploreLearning® Gizmos®. Your answers will be blinded.
Name:
School:
Date:
1. Did you use computer simulations during science instruction prior to this year?

Never   Seldom   Frequently   Very Frequently
2. What aspect(s) of the ExploreLearning website do you use:
A) Gizmos 1 2 3 4 5
B) Teacher Guides 1 2 3 4 5
C) Student Exploration Sheets 1 2 3 4 5
D) vocabulary guides 1 2 3 4 5
E) assessment questions 1 2 3 4 5
F) “class" features 1 2 3 4 5
G) textbook correlations 1 2 3 4 5
H) state/province standards 1 2 3 4 5
I) help center 1 2 3 4 5
J) user submitted materials 1 2 3 4 5
k) sharing lists 1 2 3 4 5
l) Gizmo recommendations 1 2 3 4 5
m) other (open-ended response)
Never Seldom Frequently Very Frequently
3. What types of lessons do you use Gizmos for?
A) Inquiry-based instruction 1 2 3 4 5
B) Problem-based instruction 1 2 3 4 5
C) Nature of science instruction 1 2 3 4 5
D) Content/Concept-focused instruction 1 2 3 4 5
Never Seldom Frequently Very Frequently
4. Of the times you use Gizmos rate the frequency with which you use them in each of the following formats:
A) Whole class (teacher presents and manipulates Gizmo in front of entire class) 1 2 3 4 5
B) Small group/pairs of students working together 1 2 3 4 5
C) Individual (students work individually) 1 2 3 4 5
D) Rotating centers 1 2 3 4 5
E) Other (please describe)

Not Effective   Somewhat Effective   Very Effective
5. Rate the effectiveness of the ExploreLearning VISTA professional development in preparing you to incorporate Gizmos in your science instruction. 1 2 3 4 5
6. Rate the user-friendliness of the ExploreLearning website. 1 2 3 4 5
7. Approximately, how often do you use ExploreLearning Gizmos in instruction?
Daily Several times/week Once a week Twice a month Monthly Once a year Never
8. List all of the Gizmos you have used in the last 3 months.
9. Can a member of the VISTA Research and Evaluation team contact you to clarify a survey response or learn more about your instructional Gizmo use?
No Yes
Appendix B
Interview Protocol
Topic 1: Science Education Computer Simulation Adoption
1. Tell me about your experiences using simulations.
2. What are some of the simulations you have used during your science instruction?
Topic 2: Science Content and Computer Simulation Use
1. Describe any science content you have used simulations to help you address in your science instruction. Do you think the simulation was effective in helping teach the content? Why?
Topic 3: Implementation Patterns
1. Describe any instances you have used computer simulations during your science instruction for inquiry teaching. Probes: Why would you describe this lesson as inquiry-based? Why did you choose to use a simulation instead of another activity for this lesson?
2. Describe any instances you have used computer simulations during your science instruction to help students understand the nature of science. Probes: Are there any particular nature of science aspects you find simulations most helpful in addressing? Why? Are there any nature of science aspects simulations are not useful in addressing? Why?
3. Have you used a simulation within a problem-based learning unit? If so, describe the unit and how the simulation was used. Probes: Why would you say this was a problem-based unit? Do you think the simulation helped students solve the overarching problem? If so, how?
4. Have you ever used a computer simulation to help students prepare for a future activity, such as a presentation or hands-on lab? If so, explain. Probe:
Do you think the simulation helped students be successful in the other activity? Explain.
Topic 4: Advantages/Disadvantages of Science Education Computer Simulations
1. What are some of the advantages or disadvantages you have found to using simulations? Probes: In what ways have they affected the content you teach? In what ways have they affected the science skills you teach?
2. Why do you choose to use a simulation during science instruction instead of a hands-on lab?
3. Describe how you incorporate a simulation into a lesson. Probes: Do you have students use the simulation before introducing the subject matter or afterwards? Do you use handouts or other supplementary materials to guide them through the simulations? If so, where have these supplemental materials come from? Do you use one computer/projector to give whole class instruction or do students work individually or in groups/pairs? What factors do you consider when making these choices?
4. Describe your role when students are using a simulation. Probes: What sorts of comments might you make to students? How often do you visit a student using a simulation? Describe the amount of and type of support/guidance you give individual students or the class. How is this support similar and/or different to the support you provide during hands on labs? Why do you provide the instructional support that you do?
5. Describe any evidence you have that simulations were not successful in helping students learn the intended content/skills. Probes: Describe comments students made that indicated lack of understanding. Describe instances when students asked classmates for help. How did assessments indicate understanding or lack of understanding of content/skills? Why do you think the simulation was not helpful in the lesson you described?
6. Do some of your students have different experiences or responses to computer simulations? If there are differences, why do you believe these exist? Explain.
7. What are some simulation characteristics you consider when choosing one to incorporate into your lesson?
Topic 5: Barriers
Describe any factors that would make you more likely to use more simulations in future lessons. Probes: Describe any recommendations you have for further professional development? How would you like that training to occur (where,
duration, content)? Would changes in access to computer technology change your use of computer simulations?
Appendix C
Quarterly Lesson Reports
Section I. Background Information
Observer:
Observation # (bold one): 1 2 3 4
Teacher Name:
School:
Grade Level/Content Area:
Date:
Start Time:
End Time:
Total number of students in class:

Section II. Contextual Background
Ask teacher before observing:
A. Objective(s) for lesson:
B. How does the lesson fit in the current context of instruction? (e.g., connection to previous and other lessons; What topics/activities/lessons occurred in the three science lessons prior to this lesson? What topics/activities/lessons will be covered in the three science lessons following this lesson?) All blanks should be completed and answers should be based on the teacher’s interpretation of the lesson, not the coach’s.
Y = yes, the lesson includes this criterion; N = no, the lesson does not include this criterion; DK = participant indicates they either don’t know what the criterion means or whether the lesson meets the criterion.
[Grid: Columns are Days Preceding (Day 1, Day 2, Day 3), Today, and Days Following (Day 1, Day 2, Day 3). Rows are Topic(s); Activities; Problem-based Learning?; Nature of Science?; Inquiry?; Technology?]
56
Note: If you indicated “yes” for PBL, NOS, Inquiry, Tech briefly describe below what made it (why you think it is) a PBL/NOS/Inq/Tech lesson.
C. Classroom setting. Describe anything about the classroom layout that would constrain the teaching of science.
D. Other relevant details about the time, day, students, or teacher that you think are important? (i.e.: teacher bad day, day before spring break, pep rally previous hour, etc.)
Section III. Description of events over time (indicate time when the activity changes). (You may complete this section or include the notes you took on this lesson.) Make sure that you describe the activity.
Time Description of events
Please attach any other documentation from the classroom observation.