Running head: CONTINUING TO ASSESS TECHNOLOGY
Continuing to Assess Technology in Math Classroom for State Assessments
Jeremy Hendrix
Brett Tracy
Copyright Jeremy Hendrix, Brett Tracy 2009
Abstract
Every Arizona student must pass the state-mandated Arizona Instrument to Measure
Standards (AIMS) test before graduating from high school. Ensuring that weaker students pass
this test can seem impossible due to their lack of involvement in their own learning, their
teachers' lack of caring, or the failure to conduct a proper review before the test. A central
problem in preparing for the AIMS test is providing immediate feedback on practice tests in the
classroom setting. This feedback is vital to all teachers and students because it makes them
aware of deficiencies as students prepare to take the AIMS test. A dull and uninviting
environment causes student participation and engagement in lessons to be less than desirable.
Students need to engage with material in a way they find stimulating and entertaining if they are
to succeed. To assist in preparation for AIMS, interactive technology was implemented in a
math classroom in an Arizona high school for a two-year period to determine if a specific
combination of technology would affect student success on the AIMS assessment. Results over
the study period showed that when students received immediate feedback and remediation
during AIMS preparation, their engagement increased, which led to higher assessment scores
and ratings.
Introduction
While teachers prepare students to take state-mandated tests, both teachers and students
are usually unaware of how well the students are acquiring the skills and knowledge necessary
to pass these tests. Long stretches during which no feedback occurs can be detrimental to
students' development and progress while preparing for state-mandated testing. Once it is
discovered that students have fallen behind, the time needed to go back and review the material
until mastery occurs delays both teachers' and students' assessment of students' knowledge of
the material. The teacher can become frustrated at having to review material again and feel
pressured to get back on a previously set timeline for covering all the important concepts
required for state-mandated testing. The students, in addition to having their development and
progress stifled, can become frustrated with the material and stop learning.
With the development of a new interactive technology, student engagement,
participation, and performance can be monitored and assessed quickly. These technology tools,
such as interactive whiteboards (SMART Technologies' SMART Board) and Student Response Systems
(eInstruction’s Classroom Performance System), allow teachers to create dynamic lessons and
assess student learning immediately. These new tools encourage students to become engaged as
active participants in their own learning, and allow them to maintain a comfort level with the
material that is being presented.
There have been several research studies conducted on student engagement and
participation in education over the past several years. Such research has demonstrated a direct
correlation between student success and the amount of time they are actively engaged (Wu &
Huang, 2007). However, research studies focused on outcomes when technology is infused into
the course have occurred primarily at the post-secondary level. At the conclusion of a previous
study on the topic (Tracy & Hendrix, 2008), the researchers recommended a study in which
…one teacher collects all of the data … [in order to] take the element of different teachers and their styles out of the equation. By incorporating this modification, the data should solidify the idea that technology tools which focus on engagement and participation will have an impact on student assessment. (Tracy & Hendrix, 2008)
The purpose of this study is to incorporate this recommendation in a high school math class to
determine if interactive technology affects student learning at the secondary level and
successfully prepares students for state-mandated testing.
Context
The community where the study took place is located in central Arizona in the
southwestern United States. In 2002, the State of Arizona mandated the Arizona Instrument to
Measure Standards (AIMS) test as a requirement to ensure that its student population met a
satisfactory performance level in order to graduate from high school (Arizona Revised Statutes,
Title 15, Section 701).
The study was conducted in a school district comprising nine traditional high schools and
one alternative high school. The student population of this district is approximately 15,000 students
with 900 teachers and administrators. The school district’s mission statement is to “Empower
All Students for the Choices and Challenges of the Twenty-first Century.” In the 2008 Arizona
LEARNS School profiles, seven of the high schools in this district were designated “Excelling,”
the highest possible ranking. The remaining two traditional schools received “Highly
Performing” labels, the second-highest ranking.
The school where the study took place has approximately 1700 students on its campus
and was classified as "Excelling." It was recently re-classified as a "Title I school," in which
60% of the students qualify for free and reduced lunch. The school's mission statement is
"Every student can learn."
There were approximately 450 sophomore students -- the focal group of the study -- in the
school each of the last two years. The mathematics department -- the academic area focused on in
the study -- consists of twelve highly qualified mathematics teachers, five of whom teach
sophomore-level mathematics. The AIMS review materials used during the study were used by
all five teachers.
During the first year of research, 54 students participated in the study out of the 515
students who took the AIMS math test. In the second year of research, 56 students participated
out of the 460 students who took the AIMS math test. In the study, the two "experimental"
groups were compared to each other as well as to the rest of the student population -- the
"control group" -- during the appropriate academic years.
Most of the technologies utilized by the sophomore teachers are the same: classrooms
have a computer at the teacher's station, a projector, and an interactive whiteboard. The only
exception is one technology, a Student Response System, which was used by the students in the
experimental groups only.
The Student Response System manufactured by eInstruction is a classroom set of
handheld devices that allows students to communicate acquired concepts to their teachers in real
time. Teachers can assess these acquired skills in test format or during a classroom lecture. This
interactive technology promotes student engagement, participation, and knowledge acquisition.
Currently, eInstruction's Classroom Performance System is being used by over 4 million students
in more than 100,000 K-12 and higher education settings (einstruction.com, 2009).
Theoretical Framework
Change is constantly occurring in all areas of life. School districts and individual
schools are no exception, and there have been many changes in school districts, schools, and
classrooms. One of the more prominent changes has focused on technology availability and
integration. In many areas, money for technology integration has not been distributed equally
among states, school districts, schools, or classrooms (Snyder & Prinsloo, 2007). Due to this
disparity of technology within schools and classrooms, the pressing issues of what technology to
purchase, how it is implemented, and how it can change instruction for students and teachers
have begun to come to the fore. In 2002, the United States federal government enacted the No
Child Left Behind Act (Snyder & Prinsloo, 2007), which includes (Title II, Section D) a stated
goal of assisting all students to cross the "digital divide" and ensuring that they become
technologically literate (Snyder & Prinsloo, 2007). How can school districts and schools help
students become technologically literate if they cannot purchase and implement technology in
their classrooms?
These events place the successful use and implementation of technology in schools and
classrooms squarely on the shoulders of administrators, teachers, and instructors. Teachers and
instructors will benefit from using technology in their classrooms if their students are motivated
and use technology in class. Great care must be taken when adopting new technologies into
classrooms because they can interfere with the learning process (Hashemzadeh & Wilson, 2007).
In order to avoid this interference, some classroom materials and philosophies would have to be
abandoned in favor of a total redesign of the course (Hedberg, 2006). These new courses would
have to extend beyond the current implementation -- moving beyond just passing information to
students -- into making the students more interactive with the information (Hedberg, 2006).
Los Angeles-based educator and 1-to-1 computing advocate Gary Stager believes that when
educators become aware of new ways for students to learn, they also realize that many of the
traditional ways we expect students to learn are ineffective (Wambach, 2006). Christensen's
idea of disruptive innovations (as cited in Hedberg, 2006) describes how these changes with
technology can occur:
A disruptive innovation or technology is one that eventually takes over the existing dominant technology in the market, despite the fact that the disruptive technology is both radically different from the leading technology and that it often initially performs less successfully than the leading technology according to existing measures of performance; but over time the functionality or the attributes of the new way of doing things replace the other technologies.
This idea of disruptive innovation suggests that, in order to achieve specific desired effects,
many tools, instead of one, may need to be employed by the teacher to create a range of
interactive activities using technical resources, motivating the student to commit more time and
energy to learning (Hedberg, 2006).
There have been many research studies conducted on student engagement, participation,
and performance in education over the past several years. Research that has focused specifically
on student participation and engagement has shown a direct correlation between student success
and the time students are actively engaged (Wu & Huang, 2007). However, research studies
that investigated student participation, engagement, and performance with technology infused
into the course have occurred primarily at the post-secondary level. These studies have focused
only on the implementation of a specific form of technology, instead of integrating multiple
technologies into the course. In order to determine if technology can help improve student
engagement, participation, and ultimately performance, all of the technologies need to be
investigated as a whole.
One of the systems researched to investigate student engagement, participation, and
performance is a set of remote devices that students use to answer questions and provide
personal thoughts to the teacher regarding information being presented during class activities
and presentations. These systems, commonly referred to as Student Response Systems, have
begun to be implemented in educational settings with varying results. Pemberton, Borrego, and
Cohen (2006) noted that with the use of student response systems at Texas Tech University,
final grades were not significantly different between similar groups, but the value of the learning
and teaching was much higher in the psychology classes that used the Student Response System
than in the psychology classes that did not. In addition, a study conducted by Texas Instruments
and the University of Texas demonstrated that the ability to respond to questions anonymously
created a non-threatening environment in which students felt they were equals, leading to
greater participation and engagement in class activities and discussions (Davis, 2003).
Interactive whiteboards are a second form of technology that has been researched to
determine its effects on student engagement, participation, and performance. Interactive
whiteboards enable a different form of interaction between the teacher and students. Haldane
(2007) noted that although the teachers in her study initially did not feel interactive whiteboards
would affect their planning and teaching of lessons, by the end of the study they reported that the
interactive whiteboard had significantly changed their preparation and teaching practices. She
continued, "It is the user of the board who chooses whether or not to take full advantage of the
digital whiteboard's interactive potential. The digital board simply provides an opportunity for
interactivity to occur" (p. 259). It is this interactivity that allows questions and suggestions to be
posed by students, helping to ensure that everyone in the classroom
has grasped the concept and can further their own knowledge base. In addition, interactive
whiteboards help keep the momentum of the class consistent. Teachers can simply tap an icon to
bring up previous or new material to review at any time during class activities. This is in
contrast to having to make sure the information is consistently available on a static whiteboard,
which may be erased at any time.
Lastly, many forms of technology are still relatively new to education (Moallem,
Kermani, & Chen, 2005), which is especially the case for online testing systems. Traditional
paper-and-pencil tests do not show whether students are acquiring the needed knowledge and
skills through the instructional methods their teachers are currently practicing (Woodfield &
Lewis, 2003). Due to the time it takes to assess results from paper-and-pencil tests, many
important pieces of information are lost in order to move on in the class. If the information were
immediately available, teachers could look at test scores quickly and decide whether the students
had met expectations and new topics could be explored, or whether concepts needed to be
re-taught using different methods before moving on, while the information still makes sense at
that moment (Woodfield & Lewis, 2003). The ability to facilitate instantaneous feedback
through online testing allows teachers and students to quickly assess their own strengths and
weaknesses in their teaching and learning (Moallem, Kermani, & Chen, 2005). This form of
testing has proven to be more motivating than traditional test forms (Woodfield & Lewis, 2003).
Teachers who have implemented this form of testing have additionally reported that student
interest and attention increase when using the computer to take class tests (Woodfield &
Lewis, 2003). In most cases, teachers have reported that purchased online tests are appropriately
challenging to all students, measure individual student performance, provide data that can be
compared and analyzed for specific purposes, and closely engage students and teachers in the
educational process (Woodfield & Lewis, 2003).
Instead of looking at each of the technical systems listed above individually, looking at
them as a whole, in light of Christensen's idea of disruptive innovation, allows for an
investigation into the impact on student motivation, participation, and performance. In order for
successful tasks to occur with multiple forms of technology, the following properties need to be
in place within the classroom and supported by the teacher: 1) the tasks to be completed by the
student are complex enough to engage students in upper-level thinking and performance skills; 2)
they exemplify authentic work and learning; 3) they are open enough to encourage different
learning approaches, but constrained enough that they do not get out of hand; and 4) records
and information can be collected and calculated quickly for assessment (Collins, Hawkins, &
Frederiksen, 1993). By integrating interactive whiteboards, anonymous response systems, and
immediate feedback systems, students can accomplish these tasks in addition to interacting with
the information as individuals or as a group, practicing self-assessments, taking assessments,
and receiving feedback immediately (Collins, Hawkins, & Frederiksen, 1993). All of these
pieces together can help facilitate non-dominating conversations, debates, and arguments so that
ideas and knowledge can be shaped or reshaped, allowing students to own the information and
ensuring success in the future (Masters & Oberprieler, 2004). To confirm that learning,
participation, and motivation are occurring, it is suggested that the work discussed be tied to a
classroom assignment; this serves as encouragement for students to participate more in class
(Masters & Oberprieler, 2004).
With the teachers integrating multiple tools to increase participation, motivation, and
performance in their classrooms, they now have vast amounts of data about their students at a
moment's notice. Teachers who use all of these tools can track how students are interacting and
performing in class (Morris, Finnegan, & Wu, 2005). With this information in hand, teachers
can then direct students toward areas where refinement and review need to occur (Morris,
Finnegan, & Wu, 2005). This immediate directional change by the teacher demonstrates to the
students that the teacher is invested in their success in class. Knowing the teacher is invested in
them, students begin to develop an excitement for learning, and their confidence continues to
grow so they are prepared for success on the next homework assignment, quiz, test, and beyond.
Teachers can have an enormous impact on their students by holding them accountable for the
material that is being presented.
Method
The district has implemented various forms of technology into classrooms to better prepare
students for the AIMS test, including interactive whiteboards and LCD projectors. However,
with these technologies, student success on the AIMS math assessment has not improved and, in
fact, has remained relatively unchanged over the last five years, with a few moderate increases
and decreases in the rate at which students pass. There have been no formal comparison studies
to see whether these technologies have made an impact on students' AIMS test scores, and the
technologies have been implemented sparingly rather than used as a whole.
The high school district to which the study school belongs has implemented a three-week
AIMS preparation module in the curriculum of all sophomore math classes, in conjunction with
the technology it had previously implemented. This preparation came into effect due to low
scores by too many students on prior AIMS tests. The experimental investigation coincided with
the delivery of the AIMS preparation module, which lasts three weeks and includes a pre-test, a
post-test, and four review packets. The pre-test and post-test, although proctored by the teachers
in their classrooms, are graded and analyzed by the high school district. Upon completion of the
analysis, pre-test and post-test scores are reported back to the teachers after a delay. The four
review packets consist of multiple-choice questions designed to simulate the AIMS math
assessment. Their intended purpose is for the student to review questions that have a format,
scope and sequence, and level of difficulty similar to the actual AIMS math assessment.
Based on a previous study (Tracy & Hendrix, 2008), the best technology build-out was
to incorporate all the technologies previously implemented by the district with the addition of the
Student Response System. With the addition of this interactive technology, teachers and students
were able to communicate directly about which concepts had been mastered in preparation for
the AIMS math assessment. During the study the Student Response System was used in testing
mode to instantaneously show the students and teacher which concepts from the district-created
review module had been mastered. From this information the teacher was able to immediately
review only the concepts that were not at a mastery level, so that students could continue to
progress through the material in a timely fashion. Whereas other teachers had to manually grade
the review packets and return them the next day, the Student Response System accomplished
this time-intensive task immediately. In addition, the Student Response System provided
immediate analysis of student progress, helping the teacher better prepare the students by
focusing only on weaker concepts, while the teachers who graded by hand were at a
disadvantage because they had to compile this information on their own.
In the two study groups, students would spend approximately 20 to 30 minutes of the
allotted class time working on small portions of the review packets and inputting their answers
using the Student Response System. The remaining 20 to 30 minutes were spent covering the
questions that students had missed. With the Student Response System the teacher knew exactly
which students missed which questions, as illustrated in the sketch below. This allowed the
teacher to direct attention toward the students who needed remediation and away from the
students who had already mastered the concepts.
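To make this targeting concrete, the following minimal Python sketch shows one way the response data could be grouped by missed question. The data structures and the function are hypothetical illustrations, not eInstruction's actual data format or API.

```python
# Hypothetical sketch: group students by the review-packet questions they
# missed, so remediation time is directed only where it is needed.
# (Illustrative only; not eInstruction's actual software.)

def students_needing_review(responses, answer_key):
    """Map each question number to the students who answered it incorrectly.

    responses  -- {student: {question_number: chosen_answer}}
    answer_key -- {question_number: correct_answer}
    """
    missed = {question: [] for question in answer_key}
    for student, answers in responses.items():
        for question, correct in answer_key.items():
            if answers.get(question) != correct:
                missed[question].append(student)
    # Keep only the questions that at least one student missed.
    return {q: names for q, names in missed.items() if names}

# Example with made-up responses to a three-question review packet.
answer_key = {1: "B", 2: "D", 3: "A"}
responses = {
    "Student 1": {1: "B", 2: "C", 3: "A"},
    "Student 2": {1: "B", 2: "D", 3: "C"},
}
print(students_needing_review(responses, answer_key))
# -> {2: ['Student 1'], 3: ['Student 2']}
```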
During this remediation time the teacher would call on students who missed particular
questions to work the problems on the interactive whiteboard, and would then poll the class
using the Student Response System so the student could get peer feedback on whether they had
done the problem correctly. If the problem was done correctly, the teacher would move on to the next
problem. If the problem was done incorrectly, another student was randomly selected to rework
the same problem on the interactive whiteboard and fix any mistakes that were made. The class
was then polled again to confirm mastery of that particular problem. The original student
called to the interactive whiteboard was then questioned to make sure that they understood the
mistake they had made and why the correction was needed.
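This cycle is essentially a repeat-until-mastery loop, which the sketch below models. The `poll_class` callable is a hypothetical stand-in for a live Student Response System poll, and all names here are illustrative only.

```python
import random

# Hypothetical model of the remediation cycle described above: a student
# reworks a missed problem at the whiteboard, the class is polled, and if
# the class judges the work incorrect another student reworks it.

def remediate(question, original_student, all_students, poll_class):
    """Repeat the rework-and-poll cycle until the class confirms mastery.

    poll_class -- callable taking the question and returning True when the
                  polled class agrees the worked problem is correct.
    """
    worker = original_student
    while True:
        print(f"{worker} reworks question {question} on the whiteboard.")
        if poll_class(question):
            break  # the class confirms the problem was done correctly
        # Another student is randomly selected to rework the same problem.
        worker = random.choice([s for s in all_students if s != worker])
    # Close the loop with the student who originally missed the question,
    # confirming they understand the mistake and the correction.
    print(f"Teacher follows up with {original_student}.")

# Example run: the class approves the first rework.
remediate(7, "Student 1", ["Student 1", "Student 2", "Student 3"],
          poll_class=lambda q: True)
```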
Once an entire review packet was completed, the teacher would post the grades of all
students for the class to see. When the teacher posted these scores, students' names were
displayed next to their scores for the whole class to see. The teacher made sure to tell the class
that every student was there to learn and that nobody was to be made fun of for their score.
After posting the overall scores for the review packet, the teacher then posted the individual
question report, which also displayed students' names along with what they had previously
answered on each question. The teacher then went over the packet again question by question,
filling in any gaps that the students may still have had on these problems. This process was
repeated for the four review packets as well as the pre- and post-tests.
Findings
It was hypothesized that implementing technology in reviews for state-mandated testing
would reveal that the students were engaged, that the students would participate, and that their
scores would increase by at least 10% over the three-week period. The goal of this study was to
demonstrate that students improve in preparation for state-mandated testing at a faster rate when
technology drives the review sessions than without such technology.
The use of several technology tools acting as one would not only increase student engagement
and participation, but would also increase performance on the state-mandated AIMS math
assessment. The students would receive immediate feedback on their learning so they could ask
questions for clarification and refinement. All students would at least achieve the "meets" rating
on their AIMS assessment, and thirty percent of all the students would receive an "exceeding"
rating.
The study centered on a group of 54 students (the Year 1 grouping) and a group of 56 students
(the Year 2 grouping) who took a district pre- and post-assessment. The study measured growth
over the three-week periods in which the district review module was implemented. Minimal
growth was defined as zero to ten percent, whereas substantial growth was any growth over
10%. Ultimate success would be measured by the AIMS ratings for each student: the number of
students who achieved a "meets" or "exceeds" rating on the AIMS assessment would determine
the success of the project. After the state reported the AIMS results back to the district, the
researchers would be able to determine whether the intervention was a success.
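To make the growth definition concrete, the short sketch below computes growth as the raw change in percent score from pre-test to post-test (the convention used in the Appendix A-D tables) and labels it minimal (0 to 10) or substantial (over 10). The sample scores are hypothetical, not the study data.

```python
# Growth is the raw change in percent score from the district pre-test to
# the post-test: 0-10 counts as minimal, anything over 10 as substantial.
# The sample scores below are hypothetical, not the study data.

def classify_growth(pre, post, threshold=10):
    growth = post - pre
    label = "substantial" if growth > threshold else "minimal"
    return growth, label

sample_scores = [(62, 70), (55, 80), (70, 71)]  # (pre-test %, post-test %)
for pre, post in sample_scores:
    growth, label = classify_growth(pre, post)
    print(f"pre={pre}%  post={post}%  growth={growth}%  ({label})")
# pre=62%  post=70%  growth=8%  (minimal)
# pre=55%  post=80%  growth=25%  (substantial)
# pre=70%  post=71%  growth=1%  (minimal)
```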
The study participants not only met the desired outcomes that were set forth, they
completely exceeded them. With the substantial growth goal set at 10%, only one student
out of the 54 students in study group #1 (Appendices A & B), and 3 students out of the 56 students
in study group #2 (Appendices C & D), showed less than substantial growth. The highest growth
in study group #1 was 50%, while the highest in study group #2 was a 61% increase in scores
during the three-week study window. The average growth for all the students in study group #1
was 23%, and in study group #2 it was 29.97% (see Appendices E-J for comparison data). This
result was rather unexpected, but it suggests that the use of technology the students were familiar
with had a profound impact on the district-reported scores.
The last part of this study was to see whether this interactive technology would ultimately
have a profound impact on the AIMS math assessment. While there was no pre-test to show
growth on the AIMS math assessment, one can compare the study group with the entire school
population. Quantitative data analysis demonstrated that the reviews associated with the
implemented technologies helped students receive a high score on their AIMS math assessment
when compared to students who received reviews without the implemented technology. Of the
54 students who participated in the study during year 1, zero failed or "approached" on the AIMS
assessment. All 54 students (100%) met the minimum passing score on the AIMS assessment.
The average score of this group was 747, with 900 being a perfect score. Of the 54 students, 25
received an "exceeding" rating, the highest rating on the assessment. The remaining 29 students
received the "meets" rating, the second-highest rating on the assessment; 11 of them missed the
"exceeding" rating by one correct response. Forty-six percent of the students "exceeded" on the
AIMS math assessment. Schoolwide, only 87 students received an "exceeding" rating on the
AIMS test, which means that 29% of the school's total exceeding population came directly from
the study group (see Appendices K, L, M for AIMS data).
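The reported percentages follow directly from these counts; a quick arithmetic check using only the figures given above:

```python
# Arithmetic check of the Year 1 figures reported above.
study_group = 54           # students in the Year 1 study group
exceeding_in_group = 25    # study-group students rated "exceeding"
exceeding_schoolwide = 87  # all "exceeding" ratings at the school

print(f"{exceeding_in_group / study_group:.0%} of the study group exceeded")
print(f"{exceeding_in_group / exceeding_schoolwide:.0%} of the school's "
      "exceeding students came from the study group")
# -> 46% of the study group exceeded
# -> 29% of the school's exceeding students came from the study group
```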
Of the 56 students who participated in the study during year 2, zero failed or "approached"
on the AIMS assessment. All 56 students (100%) met the minimum passing score on the AIMS
assessment. The average score of this group was 756, with 900 being a perfect score. Of the 56
students, 27 received an "exceeding" rating, the highest rating on the assessment. The remaining
29 students received the "meets" rating, the second-highest rating on the assessment; 7 of them
missed the "exceeding" rating by one correct response. Forty-eight percent of the students
"exceeded" on the AIMS math assessment. Included in these scores was one student who scored
a perfect 900 on the AIMS math assessment. Due to a state embargo on the school and district
data, the researchers do not at this time have access to any other students' scores (see
Appendices K, L, M for AIMS data).
Comparing study group #1 to study group #2, there was an average increase of nine
points. In study group #2 the average score was six points above the score of 750 required for an
"exceeding" rating, whereas in study group #1 the average score was three points below it.
Qualitative data analysis suggests that this increase in average raw score was due to higher
expectations from the teacher at the beginning of the review module, more familiarity with the
technology and method, a change from a teacher-centered model to a student-centered model,
and class buy-in due to knowledge carried over from study group #1.
The initial growth data was striking enough to consider this research a success, and after
the actual AIMS numbers were added to the study, there was a remarkable difference in how this
technology affected the way students can prepare for state-mandated testing. The research shows
that the study groups learned and retained the information presented to them for a longer period
of time. The study groups were able to take the
information and apply it to a high-stakes test regardless of any predisposed fear of testing. They
used the knowledge gained through the technology to achieve something the researchers had
never thought possible.
Implications
The researchers learned many things from this study. First, the researchers learned that
regardless of which students you have, if you give them the necessary tools to be successful, they
can and will be successful. Educators need to look at which tools students can benefit from the
most and try to create a learning environment in which students can achieve greater success.
Secondly, the researchers learned that technology is already all around us, and we must embrace
it to help our technology-savvy students learn in the same way they play. Students use
technology all the time; as teachers, we do our students a disservice by taking away the one thing
that keeps them going: technology.
Although the use of technology, the desired outcomes, and the way the data was obtained
never changed, the class structure and instruction were modified over time to meet student needs.
In the first year of research, the researcher followed a teacher-centered model in which he would
constantly try to help students by simply explaining how to do the problems. In the second year,
the researchers let the students explain more of how they did the problems and why they did
them that way. Instead of a teacher-driven review session as in year one, the second year featured
a student-centered review session in which students provided the input to other students. This
change of instruction may explain why students performed better earlier in the study window in
the second year than in the first.
The benefits of the student-centered model also enhanced student accountability. The
students in the first year of the study were accountable only to themselves, and it takes a special
student to remain on task and keep focused for the entire time of the study. From the
researchers' perspective, posting student scores without names for all to see
created a level of accountability without the humiliation of other students seeing one's name.
Doing this raised the anxiety level of the students just one step more and allowed them to feel
a sense of accountability placed on them by the other students in the room. It goes back to the
old saying: it takes a community to raise a child. In this respect, the students are allowed to help
each other out. The researchers foresaw weaker students seeking out stronger students for
assistance and clarification in order to improve their own scores. This idea of transparency for
all students is generally not well received by the educational community and is looked down
upon. The key is to maintain an open learning environment in which all students can be
successful.
For further research using this same type of technology and method, the researchers
recommend creating a learning environment where students feel comfortable sharing ideas with
the instructor. A positive learning climate must be maintained in order to achieve the same
degree of success. Classroom management is a key to the successful results obtained in this
study: keeping students on task and free from misbehaving is absolutely vital to maintaining the
engagement level this approach requires. Lastly, being a good teacher and adapting to the variety
of students is necessary. The teacher cannot be rigid in their presentation and must be willing to
stop and take the time to answer any questions that the students have.
References
Collins, A., Hawkins, J., & Frederiksen, J. (1993/1994). Three different views of students: The
role of technology in assessing student performance. Journal of the Learning Sciences,
3(2), 205. Retrieved October 30, 2007, from Academic Search Premier database.
Davis, S. (2003). Observations in classrooms using a network of handheld devices. Journal of
Computer Assisted Learning, 19(3), 298.
eInstruction. (2009). Simple solutions. Real results. Retrieved June 23, 2009, from
http://www.einstruction.com/
Haldane, M. (2007, September). Interactivity and the digital whiteboard: Weaving the fabric of
learning. Learning, Media and Technology, 32(3), 257. (ERIC Document Reproduction
Service No. EJ772462) Retrieved September 30, 2007, from ERIC database.
Hashemzadeh, N., & Wilson, L. (2007, September). Teaching with the lights out: What do we
really know about the impact of technology intensive instruction? College Student
Journal, 41(3), 601-612. Retrieved October 30, 2007, from Academic Search Premier
database.
Hedberg, J. (2006, July). E-learning futures? Speculations for a time yet to come. Studies in
Continuing Education, 28(2), 171-183. Retrieved October 30, 2007, from Academic
Search Premier database.
Hodge, S., & Anderson, B. (2007, September). Teaching and learning with an interactive
whiteboard: A teacher's journey. Learning, Media and Technology, 32(3), 271. (ERIC
Document Reproduction Service No. EJ772451) Retrieved September 30, 2007, from
ERIC database.
Masters, K., & Oberprieler, G. (2004, May). Encouraging equitable online participation through
curriculum articulation. Computers & Education, 42(4), 319. Retrieved October 30,
2007, from Academic Search Premier database.
Moallem, M., Kermani, H., & Chen, S. (2005). Handheld, wireless computers: Can they
improve learning and instruction? Computers in the Schools, 22(3-4), 93. (ERIC
Document Reproduction Service No. EJ736525) Retrieved September 30, 2007, from
ERIC database.
Morris, L., Finnegan, C., & Wu, S. (2005, July). Tracking student behavior, persistence, and
achievement in online courses. Internet & Higher Education, 8(3), 221-231. Retrieved
October 30, 2007, from Academic Search Premier database.
Pemberton, J., Borrego, J., & Cohen, L. (2006). Using interactive computer technology to
enhance learning. Teaching of Psychology, 33(2), 145. (ERIC Document Reproduction
Service No. EJ736342) Retrieved September 30, 2007, from ERIC database.
Snyder, I., & Prinsloo, M. (2007). Young people's engagement with digital literacies in
marginal contexts in a globalised world. Language & Education: An International
Journal, 21(3), 171-179. Retrieved October 30, 2007, from Academic Search Premier
database.
Tracy, B., & Hendrix, J. (2008). Using technology in the secondary classroom to assess
engagement, participation, and performance. Masters of technology education action
research project, Northern Arizona University.
Wambach, C. (2006, September). From revolutionary to evolutionary: 10 years of 1-to-1
computing. T.H.E. Journal, 33(14), 58-59. Retrieved October 30, 2007, from Academic
Search Premier database.
Woodfield, K., & Lewis, J. (2003, January). Getting on board with online testing. T.H.E.
Journal, 30(6), 32. Retrieved November 4, 2007, from Academic Search Premier
database.
Wu, H., & Huang, Y. (2007, September). Ninth-grade student engagement in teacher-centered
and student-centered technology-enhanced learning environments. Science Education,
91(5), 727. (ERIC Document Reproduction Service No. EJ772556) Retrieved
September 30, 2007, from ERIC database.
Appendix A
Class #1 (Year 1) from pre-test to post-test
Pre-test   In Class #1   Growth #1   In Class #2   Growth #2   Post-Test   Growth Post   Overall Growth
78         84            6           88            4           88          0             10
60         72            12          82            10          90          8             30
72         80            8           82            2           82          0             10
52         66            14          78            12          88          10            36
62         76            14          86            10          88          2             26
68         74            6           90            16          92          2             24
36         66            30          82            16          86          4             50
48         51            3           54            3           72          18            34
72         78            6           92            14          96          4             24
46         82            36          90            8           96          6             50
86         90            4           92            2           98          6             12
76         76            0           86            10          96          10            20
76         86            10          86            0           96          10            20
52         65            13          66            1           80          14            28
54         58            4           62            4           67          5             13
74         76            2           86            10          88          2             14
34         56            22          80            24          82          2             48
64         70            6           86            16          90          4             26
36         50            14          65            15          78          13            42
82         88            6           94            6           98          4             16
80         82            2           90            8           94          4             14
82         80            -2          86            6           94          8             12
64         84            20          84            0           90          6             26
46         76            30          82            6           84          2             38
38         48            10          65            17          76          11            38
Average:
61.52%     72.56%        11.04%      81.36%        8.80%       87.56%      6.20%         26.44%
Appendix B
Class #2 (Year 1) from pre-test to post-test
Pre-test   In Class #1   Growth #1   In Class #2   Growth #2   Post-Test   Growth Post   Overall Growth
64         74            10          86            12          92          6             28
60         68            8           73            5           84          11            24
52         59            7           66            7           68          2             16
70         75            5           80            5           84          4             14
76         88            12          88            0           92          4             16
62         63            1           72            9           80          8             18
54         62            8           92            30          96          4             42
40         53            13          54            1           58          4             18
54         67            13          78            11          80          2             26
60         68            8           71            3           86          15            26
74         76            2           82            6           88          6             14
80         84            4           90            6           92          2             12
46         60            14          73            13          80          7             34
82         82            0           94            12          100         6             18
46         62            16          69            7           84          15            38
52         66            14          73            7           82          9             30
86         86            0           94            8           98          4             12
64         69            5           76            7           80          4             16
74         86            12          86            0           92          6             18
78         78            0           80            2           86          6             8
74         82            8           86            4           90          4             16
Average:
64.19%     71.81%        7.62%       79.19%        7.38%       85.33%      6.14%         21.14%
Appendix C
Class #1 (Year 2) from pre-test to post-test
Pre-test   In Class Test #1   Growth #1   In Class Test #2   Growth #2   Post-Test   Growth Post   Overall Growth
52         70                 18          74                 4           82          8             30
58         90                 32          86                 -4          98          12            40
64         88                 --          --                 --          96          8             32
53         76                 23          80                 4           90          10            37
78         82                 4           92                 10          98          6             20
75         78                 3           90                 12          84          -6            9
58         86                 28          84                 -2          96          12            38
78         86                 8           86                 0           92          6             14
72         74                 2           78                 4           88          10            16
44         68                 24          78                 10          76          -2            32
67         74                 7           82                 8           84          2             17
53         78                 --          --                 --          90          12            37
47         78                 31          80                 2           94          14            47
81         88                 7           96                 8           88          -8            7
67         82                 15          92                 10          92          0             25
72         84                 12          86                 2           92          6             20
67         80                 13          82                 2           92          10            25
64         68                 4           72                 4           88          16            24
81         92                 11          84                 -8          98          14            17
69         90                 21          90                 0           100         10            31
72         88                 16          82                 -6          92          10            20
58         68                 10          64                 -4          86          22            28
61         78                 17          82                 4           88          6             27
58         86                 28          84                 -2          96          12            38
56         74                 18          76                 2           80          4             24
53         80                 27          80                 0           90          10            37
75         82                 7           88                 6           100         12            25
Average:
64.19      80.08              15.89       82.74              2.66        90.74       8             26.56
Appendix D
Class #2 (Year 2) from pre-test to post-test
Pre-test   In Class Test #1   Growth #1   In Class Test #2   Growth #2   Post-Test   Growth Post   Overall Growth
58         86                 28          70                 -16         86          16            28
53         80                 27          78                 -2          90          12            37
50         84                 34          78                 -6          96          18            46
75         86                 11          94                 8           98          4             23
78         82                 4           88                 6           94          6             16
50         64                 14          72                 8           82          10            32
44         74                 30          74                 0           84          10            40
64         86                 22          88                 2           88          0             24
33         70                 37          82                 12          94          12            61
58         78                 20          88                 10          92          4             34
56         72                 16          82                 10          92          10            36
72         80                 8           84                 4           94          10            22
47         68                 21          84                 16          80          -4            33
42         78                 --          --                 --          90          12            48
64         64                 0           74                 10          96          22            32
39         74                 35          72                 -2          94          22            55
75         82                 7           88                 6           92          4             17
50         82                 32          82                 0           88          6             38
75         78                 3           82                 4           90          8             15
53         70                 17          70                 0           90          20            37
42         78                 36          66                 -12         86          20            44
75         90                 15          92                 2           98          6             23
58         74                 16          78                 4           92          14            34
50         68                 18          74                 6           86          12            36
42         66                 24          78                 12          90          12            48
56         84                 28          82                 -2          88          6             32
47         82                 35          86                 4           94          8             47
86         94                 8           88                 -6          92          4             6
58         64                 6           82                 18          82          0             24
Average:
56.90      77.14              20.25       80.48              3.34        90.28       9.79          33.38
Appendix E
[Figure: Growth of math classes when studying for the Arizona State Math Assessment (percent scores), 1st year. Chart comparing Group 1 - Year 1 and Group 2 - Year 1 across the pre-test, In-Class Test #1, In-Class Test #2, and post-test.]
Appendix F
Appendix G
[Figure: Growth comparison in math classes when studying for the Arizona State Math Assessment (raw score), 1st year. Chart comparing Group 1 - Year 1 and Group 2 - Year 1 on growth between the pre-test and In-Class Test #1, between In-Class Tests #1 and #2, between In-Class Test #2 and the post-test, and between the pre-test and post-test.]
Appendix H
Appendix I
Appendix J
Appendix K
Appendix L
Appendix M
Appendix N
Appendix O