Real-Time Analysis of Student Comprehension: An Assessment of Electronic Student Response Technology in an Introductory Earth Science Course

Lisa Greer, Washington and Lee University, Geology Department, Lexington, VA 24450, [email protected]

Peter J. Heaney, Penn State University, Department of Geosciences, University Park, PA 16802, [email protected]

ABSTRACT

Electronic student response technologies (SRT) are capable of assessing teaching and learning methods in real time, and they offer an exceptional means of introducing active learning protocols in classes with large enrollments. These wireless systems allow students to key in responses with remote control units to questions posed by an instructor in the classroom. Student responses then are displayed in real time, allowing both students and instructors to gauge student comprehension instantaneously. From Spring 2002 to Spring 2003, we utilized SRT in 4 sections of a high-enrollment introductory Earth Science course (Geosc 020: Planet Earth) at Penn State University. We conducted a multi-faceted assessment of the use of SRT in our course that included quantitative and qualitative perception data from students enrolled in the course and faculty/administrator visitors to our classroom. Our preliminary assessment of the pedagogical merits of SRT in our course suggests that this technology is an effective tool for introductory geoscience education.

INTRODUCTION

A number of studies have shown that traditional methods for teaching science to non-science majors are limited in their effectiveness (Pinet, 1995; Dufresne et al., 1996; Mazur, 1997; DeCaprariis, 1997; Gerace et al., 1999; McManus, 2002; McConnell et al., 2003). In the conventional approach, instructors present information in a lecture format and the students passively gather that information by listening and taking notes. Several newer teaching models stress active student participation, in which students process material presented in class via discussion with the instructor and peers, problem solving, and group activities interspersed with periods of listening and note-taking (Angelo and Cross, 1993; Bykerk-Kauffman, 1995; Mazur, 1997; Yuretich et al., 2001; Crouch and Mazur, 2001). However, even when more progressive teaching methods are employed, it often is difficult to gauge student involvement, interest, and level of comprehension.

Electronic student response systems present one method for overcoming these barriers to engagement and effective learning (Dufresne et al., 1996; Mestre et al., 1997; Wenk et al., 1997; Gerace et al., 1999; Judson and Sawada, 2002; Meltzer and Manivannan, 2002; McConnell et al., 2003). When an instructor can query and collect responses from every individual in his or her classroom, the instructor can gauge instantaneously what students understand and, more important, which concepts they are failing to grasp. This feedback allows the teacher to spend less time on material that already has been processed in favor of focusing on ‘problem areas’.

Inspired by our on-site observation of classroom response systems used in physics and biology courses at the University of Massachusetts at Amherst in 2001, we decided to test similar technologies in Geosc 020: Planet Earth, a general education course at Penn State with mean semester enrollments of ~145 students per section (Table 1). In Spring 2002 we initiated our experimentation with a commercially manufactured electronic response technology produced by eInstruction called the Classroom Performance System (CPS). While response systems have been successful instruments of active learning in university-level physics, chemistry, and biology courses for well over a decade (Littauer, 1972; Dufresne et al., 1996; Mestre et al., 1997; Gerace et al., 1999; Judson and Sawada, 2002; Meltzer and Manivannan, 2002), SRT use in the collegiate Earth Sciences is in the early stages of implementation (McConnell et al., 2003).

The primary pedagogical goal of our SRT integration project was the creation of an active-learning environment in a large introductory Geoscience course (Geosc 020), which previously had been taught in traditional lecture format. Specific objectives of the project were to encourage active student participation in Geosc 020, to enhance student communication and collaboration skills, to develop problem-solving skills during class time, and to increase student attendance in class from the historic mean attendance of ~50% by mid semester.

As a necessary component of this effort, we wished to evaluate the perceived effectiveness of SRT technology in enhancing both student learning and instructor teaching. To address this requirement, we conducted a multi-faceted assessment of student opinions on the benefits and disadvantages of using this technology in our class. We compiled quantitative data via extensive student surveys and student attendance data. We collected qualitative data from oral student surveys and from interviews with faculty peers who observed our operation of SRT in our classes (Greer et al., 2002; Heaney and Greer, 2003).

METHODS

What is the Classroom Performance System? - Student response technology has been in development for several years. The ClassTalk system (Dufresne et al., 1996; Webking, 1998; Mestre, pers. comm., 2001), pioneered primarily by the physics faculty at the University of Massachusetts at Amherst, is a hard-wired response system with software that allows an instructor to view a real-time ‘map’ of the classroom. Each student response is color-coded by seat and student name or identification number. Unfortunately, installation of hard-wired electronic response systems is prohibitively expensive for many universities.

Greer and Heaney - Real-time Analysis of Student Comprehension 345

Fortunately, recent advances in student response technology now offer sophisticated systems at dramatically reduced costs. After researching a variety of commercially produced applications, we selected eInstruction’s Classroom Performance System. The latest version of CPS is relatively inexpensive to both the instructor and the student. Expenditures for the CPS package from the instructor’s side include the purchase of the eInstruction application software, an infrared receiver (which is encapsulated in a plastic disk measuring 4 inches in diameter), and a central processor that sorts and relays the receiver output to a standard personal computer. In addition, each student is required to purchase a remote control unit and register it with eInstruction through the company’s website (www.eInstruction.com). Several SRT manufacturers, including eInstruction, have developed agreements with academic publishing companies to package their systems with specific textbooks at markedly reduced costs to the instructor and to the student.

Because the CPS technology is wireless, it is highly portable, and it can accommodate responses from up to ~500 students (170 student responses per signal receiver). Thus, the operation of these units in the classroom is logistically straightforward. Before the lecture, the instructor enters the day’s lecture questions into the CPS application database. At the appropriate point during the lecture, the instructor can call up the CPS program and present the desired question to the entire class using a standard LCD projector connected to the computer. Students may be asked individually to key in their answers with their remote control units before or after extensive discussion in peer groups. The student response data are then displayed in a histogram or other format using the LCD projector. Total collection times typically fall under one minute. This technology thus allows the instructor to gauge student perception and understanding of a given topic in real time and to alter lecture content accordingly. In addition, students can measure their level of understanding of course material and concepts relative to their peers.
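The collect-and-display cycle described above can be sketched as a simple tally-and-histogram loop. The function names and data structures below are our own hypothetical illustration of the idea, not part of the actual CPS software:

```python
from collections import Counter


def tally_responses(responses, choices="ABCDE"):
    """Count keyed-in answers for each multiple-choice option (hypothetical sketch)."""
    counts = Counter(responses)
    return {choice: counts.get(choice, 0) for choice in choices}


def display_histogram(counts):
    """Render a text histogram of the kind CPS projects to the class."""
    total = sum(counts.values()) or 1
    for choice, n in counts.items():
        bar = "#" * round(40 * n / total)
        print(f"{choice}: {bar} {n} ({100 * n / total:.0f}%)")


# Example: 145 students key in answers to one question
responses = ["A"] * 20 + ["B"] * 90 + ["C"] * 25 + ["D"] * 10
display_histogram(tally_responses(responses))
```

Displaying the tally immediately after collection is what lets both instructor and students see, within a minute, how the class as a whole answered.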

The system software is upgraded frequently, and it is adaptable to a variety of instructional modes, including student examinations. Response data can be analyzed in a number of ways. Instructors can monitor individual or group response success from one lecture to the next, and they can cross-correlate performance on individual questions that are offered repeatedly over the history of the course. Thus far, we have explored only the in-class question-and-answer capabilities of the CPS application.

HOW DO WE USE SRT IN OUR CLASSROOMS?

To date, SRT has been employed in 4 of our class sections of Geosc 020 over 3 semesters. We introduced the technology simultaneously in two separate sections of Geosc 020 during Spring 2002. Subsequently, Greer continued its use in Fall 2002, and Heaney employed SRT in the Spring 2003 semester (Table 1).

The questions that we posed can be classified into categories that include Quantitative Problems, Popular Misconceptions of Science, Applied Reasoning, and Creative Thinking, as outlined below:

Quantitative problems require students to perform mathematical calculations based on a formula or physical model presented in class, as exemplified by radioactive decay (Figure 1). Many of our non-science majors tend to ‘freeze’ when confronted with questions of this sort on exams. A major virtue of the SRT is that the instructor may better gauge the ability of all students to handle these exercises in a format where the correct response does not affect a student’s grade.
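As an illustration of this category, a radioactive-decay question typically asks students to compute the fraction of a parent isotope remaining after some elapsed time. The worked numbers below are our own example, not the actual Figure 1 question:

```python
def fraction_remaining(elapsed_time, half_life):
    """Fraction of a radioactive parent isotope remaining: N/N0 = (1/2)**(t / t_half)."""
    return 0.5 ** (elapsed_time / half_life)


# Example: carbon-14 (half-life ~5,730 years) after 11,460 years, i.e. two half-lives
print(fraction_remaining(11_460, 5_730))  # 0.25
```

With the SRT, every student attempts this calculation anonymously, so the instructor sees at once what fraction of the class can carry it out.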

Popular misconceptions of science questions offer an excellent means of ‘hooking’ indifferent students into a new lecture topic. Paradoxical observations, such as the Earth-Sun distance in winter and summer, surprise students and force them to reconsider their assumptions about the natural world. The SRT reveals how widely distributed such misconceptions are.

Applied reasoning questions are especially appropriate for visually based interpretations of geologic phenomena, such as the Laws of Superposition and Cross-Cutting Relationships (Figure 2). Additionally, these problems are useful when a framework for understanding a geological concept has been provided (e.g., the definition of a mineral), and student understanding of that framework must be immediately gauged (e.g., Is a snowflake a mineral?).

Creative thinking issues, which may include multiple or subjective responses, may be less well suited for SRT-assisted instruction in small class environments. The free-wheeling thought process that attends a

346 Journal of Geoscience Education, v. 52, n. 4, September, 2004, p. 345-351

Term          Enrollment*   Instructor
Spring 2002   155           Heaney
Spring 2002   125           Greer
Fall 2002     210           Greer
Spring 2003   92            Heaney

Table 1. Enrollment data for the 4 semesters of SRT use in Geosc 020. *Note: Fluctuations in enrollment are historically determined by scheduled class times.

Figure 1. Example of a quantitative analysis question presented in class with SRT.


consideration of why gemstones are more valuable than water, for example, is tricky to encapsulate in the multiple-choice SRT format, but in large classes the SRT question helps to launch class-wide discussion.

The questions shown in Figures 1 and 2 demonstrate the ability of the SRT software to incorporate accessory illustrations by inputting files in common graphics formats (e.g., jpeg, tiff). Although we experimented with questions having multiple correct answers or no correct answer, students clearly indicated that they were frustrated unless only one of our proposed multiple-choice answers was correct. In addition, in the first stages of our SRT experimentation, we asked anywhere from 0 to 5 questions per 50-minute class period. In response to student feedback, we soon settled on 2 to 3 as the optimal number per 50-minute class session for this particular course.

Even though we constructed our lectures independently, we shared the same course syllabus, lecture outline, course structure, and grading scheme throughout each semester. However, we each developed our own database of questions, generally sharing questions and responses only after each class period. This approach allowed us to maintain our own classroom ‘style’ during the semester. Typically, we adopted both a simple question-and-response and a modified think-pair-share approach to SRT presentation (Mazur, 1997). We generally would present material, pose a problem, and then ask for an immediate response or allow students to discuss the SRT question, often after asking each student to write down an initial individual response. After students had responded electronically to the question at hand, the student response results were discussed in an instructor-facilitated group format to ensure that answers were interpreted correctly.

ASSESSMENT PROTOCOL

During each semester of SRT integration, we attempted to assess the student perception of SRT effectiveness using a variety of feedback methods. These initial efforts at assessment of SRT effectiveness included: attendance data; quantitative student survey data; qualitative visitor feedback; and qualitative input from selected Geosc 020 students, as outlined below.

Attendance data - The CPS software records student attendance as it registers individual student responses. We have compared our attendance rates with the historical values for Geosc 020.

Quantitative student surveys - Student surveys that solicited the students’ impressions of the effectiveness of SRT were conducted in Spring 2002 (Greer and Heaney), Fall 2002 (Greer), and Spring 2003 (Heaney). Students completed these surveys at mid semester on a voluntary and anonymous basis in class. The survey for both sections of Geosc 020 in Spring 2002 contained 30 questions. Nine of these questions dealt with technical aspects of the CPS system and were no longer pertinent following upgrades by eInstruction. Consequently, these questions were eliminated for the evaluations in the Fall 2002 and Spring 2003 semesters. The resulting 21-question survey is reproduced in Figure 3.

Qualitative data from Geosc 020 students - In addition to the quantitative student survey data, most students submitted qualitative commentary on perceived SRT effectiveness as part of the anonymous student surveys described above (Figure 3) and the university-mandated course assessment forms at Penn


Figure 2. Example of an applied reasoning question presented in class with SRT. Modified from Figure 4.16, Press and Siever, 2001 [Permission of W.H. Freeman].

Figure 3. Midterm student survey form used to assess perceived SRT effectiveness in Geosc 020. Questions not pertaining to SRT are excluded.


State known as the Student Ratings of Teaching Effectiveness (SRTEs).

Qualitative visitor feedback - During the first semester of SRT integration, Greer sent an open invitation to Penn State faculty and administrators to visit her classroom via the Center for Excellence in Learning and Teaching (CELT) at Penn State. Approximately 30 faculty, administrators, and graduate students from across the University Park campus observed her class in Spring 2002. Visitors included representatives from a variety of departments (physics, biology, chemistry, economics, political science, English, and others) as well as CELT, the Undergraduate Studies Program, and the Penn State Teaching and Learning Consortium. Visitors were encouraged to provide any feedback on an informal basis.

RESULTS OF SRT ASSESSMENT

Attendance - Over the past 10 years, attendance in Geosc 020 had fallen well below an estimated 50% by the midpoint of each semester, despite the use of daily or weekly quizzes (in part as attendance monitors), which have counted for as much as 10-15% of the course grade. By the final month of classes, student representation generally ranged between 30% and 40%, as measured by

the return of the university-mandated student evaluations.

Our use of the CPS system enabled us to measure attendance on a day-to-day basis. We encouraged student attendance by counting SRT responses (regardless of correct/incorrect answers) as 15% of the final grade. This practice was not exceptionally different from the weekly or daily quizzes that had been used to encourage attendance in previous years. We observed that daily lecture attendance rates were as high as 90% in the middle of the semester, and mean attendance for Geosc 020 ranged from 81% to 84% over the four semesters assessed. These numbers were confirmed by head counts in order to allow for the possibility that absent students had handed their remote control units to friends who entered responses on their behalf. Typically the discrepancies between the attendance numbers calculated by CPS and the direct headcounts were on the order of +/-2%. Often headcounts revealed a higher number of students than remotes, as students sometimes forgot to bring their remotes to class.
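The cross-check described above amounts to a signed percentage difference between the CPS-logged remote count and a direct headcount. This sketch is our own illustration (the example counts are hypothetical, not measured values from the course):

```python
def attendance_discrepancy(cps_count, headcount):
    """Signed percentage difference between remotes logged by CPS and a direct headcount.

    A negative value means fewer remotes responded than students present,
    e.g. because some students forgot to bring their remotes to class.
    """
    return 100 * (cps_count - headcount) / headcount


# Hypothetical example: 118 remotes logged, 120 students counted in the room
print(round(attendance_discrepancy(118, 120), 1))  # -1.7
```

Values consistently within about +/-2% of zero, as we observed, indicate that the CPS log is a reliable proxy for physical attendance.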

Quantitative Student Surveys - Based on our late-term assessment questionnaires, a majority of students believed that the integration of SRT into the class lecture increased the quality of course content and facilitated higher-order learning during class time. Results for five key survey questions are highlighted in Figure 4.

The majority of students surveyed in each section (65-77%) felt that the SRT helped them gauge their level of understanding of course material. An even higher percentage of students (71-85%) agreed that SRT use reinforced important concepts presented in lecture. A slightly lower number of students believed SRT to be an effective teaching and learning tool in Spring 2002 (54-57%, with 13-16% in disagreement and the rest remaining neutral). However, the positive response to this question increased dramatically in Fall 2002 and identically in Spring 2003, to 75% in agreement and only 8% in disagreement. This improvement might have been attributable to a refinement of our pedagogical approach and to upgrades in the CPS hardware. Overall, between


Figure 4. Results from selected student survey questions.

Figure 5. Weighted response correlation for each assessment question for each instructor.


65% and 81% of students believed that SRT helped them learn.

The percentage of student respondents who would recommend SRT use in Geosc 020 (63-67%) as well as in other courses at Penn State (67-71%) was remarkably high in Spring 2002, especially in light of the numerous technical difficulties encountered during the first semester of SRT adoption (as this was the first use of the CPS system in a university-level course; the company has since addressed these early software and hardware problems). These numbers increased significantly in Fall 2002 and Spring 2003, with 88% and 89%, respectively, recommending SRT use in future sections of Geosc 020. Similarly, 86% in both of these sections desired the SRT in their other courses at Penn State.

We are attempting to determine the degree to which our assessment responses reflect the merits of the SRT rather than the idiosyncrasies of the instructors. A rigorous statistical analysis of these data is in process and will be treated in a separate paper. Nevertheless, a graphical presentation (Figure 5) demonstrates that the SRT system elicited similar responses from the students despite differences in teaching style. To construct Figure 5, we calculated a weighted response for each assessment question for each instructor for each semester. For example, if a given assessment question yielded the following results: 10% Strongly Agree (1); 30% Agree (2); 30% Neutral (3); 20% Disagree (4); and 10% Strongly Disagree (5), then the weighted mean response calculated for that question is 2.9. The closeness of the response sets for the two instructors is a strong indication that the student reaction to the SRT was independent of the instructional style.
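The weighted-mean calculation described above can be reproduced directly; the percentages in the example are the worked numbers from the text, while the function itself is our own sketch:

```python
def weighted_mean_response(percentages):
    """Weighted mean of Likert-scale responses.

    Keys are scale values (1 = Strongly Agree ... 5 = Strongly Disagree);
    values are the percentage of students choosing each response.
    """
    total = sum(percentages.values())
    return sum(value * pct for value, pct in percentages.items()) / total


# Worked example from the text: 10% SA, 30% A, 30% N, 20% D, 10% SD
example = {1: 10, 2: 30, 3: 30, 4: 20, 5: 10}
print(weighted_mean_response(example))  # 2.9
```

Computing this single number per question per instructor per semester is what allows the response sets for the two instructors to be compared on one plot, as in Figure 5.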

QUALITATIVE OBSERVATION

Students - Some of the more enlightening insights into the advantages and disadvantages of the SRT came from anonymous written student comments. Many of these were effusively positive. Almost all negative comments were related to the expense of the remotes and to the use of SRT as a monitor of attendance, since SRT participation was counted as 15% of the final grade. Of the completed surveys, over 75% of students expressed qualitative opinions. Below we have included a sampling of the comments received, which we judge to be representative.

What do you like best about the CPS system?

‘I liked the CPS system because it gave shy kids the chance to participate. It also helped in tracking one’s own progress. It helped everyone to get involved in such a large class.’

‘I liked the CPS questions, I felt they gave you an incentive to a) go to class and b) pay some attention throughout the course instead of just studying for the exams.’

‘The CPS questions were fun! They encouraged involvement with the lecture and made me problem solve.’

‘Although at first I thought the CPS was a little odd (i.e. more suited for jr/sr high school) I learned to appreciate the way it brought the class together and made us think.’

‘I’m anti-modern technology so I thought that I would hate it at first. However, it has really increased my level of participation and I don’t feel put on the spot when doing so.’

‘I like that it lets me see if I am understanding the lecture or not and it truly does give a nice break from straight lecturing. Also, since it [my answer] doesn’t count as a grade, I don’t have to stress about it - instead, I can relax, listen, and try my best to answer correctly.’

‘Dude, it’s like playing video games. It has made me stop bringing my Game Boy to class.’

‘It makes solving problems more fun.’

What do you like least about the CPS system?

‘I think the CPS System is an extremely expensive way to take attendance. This class is one of the most expensive classes I have ever taken at PSU.’

‘It forces me to come to class.’

‘It takes up class time when she should be lecturing.’

‘I thought the CPS is overrated and expensive.’

‘It takes attendance and that determines if you were in class and sometimes I forgot my remote and was still in class but didn’t get credit for being there.’

‘Man, it’s expensive. It cut into my beer budget the first week of class. Bummer.’

‘I do not so much dislike the system as I dislike the way the teacher uses the system’ [i.e., as an attendance monitor].

‘Probably the fact that it’s not used in more classes.’

Faculty - Overall visitor feedback was extraordinarily positive, although several visitors shared insightful observations concerning SRT logistics and presentation style. Observations focused on how students appeared to respond to the system, how much of the class time might successfully be devoted to SRT question periods, which groups of students seemed to be responding more or less willingly to the process, etc. These encounters led to discussions of such issues as the appropriate time for collection and display of SRT results and methods for encouraging active group discussions of SRT questions.

DISCUSSION

Impacts on Learning - Whether the SRT system actually improves student understanding and retention of the material presented in lecture is not resolved by the assessment results of this study. Neither of us taught Geosc 020 prior to Spring 2002. Thus we cannot compare student performance with the SRT to that without this technology. On the other hand, our assessment results



demonstrate unequivocally that our students perceived the SRT as a significant enhancement to the learning process (Figure 4). The strongly positive reaction to the electronic response technology is especially striking in light of the out-of-pocket expenses associated with SRT use. On average, 33% responded that the purchase and registration of the remotes fell outside their school budgets. That nearly 90% of our respondents in Fall 2002 and Spring 2003 nevertheless favored the continued usage of the SRT in Geosc 020 is a convincing argument for student belief in the effectiveness of this tool. We believe the increase in classroom attendance from previous years can be attributed to SRT use. This phenomenon has been noted by other users of SRT technology as well (Littauer, 1972; Judson and Sawada, 2002).

This study also cannot resolve the degree to which an interactive classroom setting, peer instruction, and think-pair-share techniques may have enhanced our course independent of the SRT system itself. The literature certainly supports the merits of these and other teaching techniques for use without interactive technology (Mazur, 1997; Cummings et al., 1999; Meltzer and Manivannan, 2002). However, the quantitative and qualitative data presented indicate that students believe that SRT is a useful teaching tool.

We suggest that students are attracted to the SRT system because it promotes active learning in a large-class environment. When the SRT was used in concert with peer-instruction techniques, students became involved in the teaching process. They were eager to argue over possible answers with each other as well as with the instructor. In addition, student comments suggest that the SRT offers students a unique sense of ownership of course content. After a student ‘vote’, students appeared emotionally invested in the results. They often cheered when they learned that they picked the correct answer or groaned when they discovered that they were incorrect. This technology significantly increased the level of student-instructor interaction, making the course a more enjoyable experience for us as well as for our students.

Impacts on Teaching - The incorporation of SRT requires some front-end labor on the part of the faculty member. Setting up the hardware before class is fairly simple and can be accomplished within 10 minutes, though we always employed undergraduate interns to assist with this process. Learning how to manipulate the CPS software also is straightforward, and loading 2-3 questions typically took only 5 minutes once those questions had been developed. Perhaps the greatest challenge for each of us involved the creation of questions that are challenging, geared to the appropriate level, and intellectually valuable. Since our assessments revealed that students enjoy SRT in part because the questions provide timely breaks from the routine of lecture, we tried to separate our question-and-response exercises by 15- or 20-minute intervals. Consequently, the construction of our lectures had to be tightly integrated with the development of our question sets. In addition, we attempted to engage the students in problems that could be solved deductively in group-learning situations, and thus the SRT process influenced the material that we chose to teach and the way in which we presented it. Rather than introducing a topic with a detailed description of the currently accepted model for a given geologic process, the electronic response system inspired us to construct our

lectures using an inductive and often historicalperspective. This approach laid the foundation for thestudents to infer the important conclusions on their own.Lastly, we believe SRT is best used when instructors arewilling to remain flexible in the classroom. We wereoften surprised by the student response results and hadto adjust our lecture presentation accordingly. A similarphenomenon has been documented by other SRT users(Meltzer and Manivannan, 2002).

Impacts on Course Enrollment - Undergraduate students rarely enroll in introductory Earth Science courses with the intention of becoming professional geoscientists or even of majoring in the subject. Too often, courses in the Earth Sciences are viewed as the path of least resistance towards satisfying a university's general education requirement. Thus, students in Geosc 020 perceive themselves as unlikely scientists, either by ability or by desire. Many have vague (and often misguided) notions of what 'science' is. Courses such as ours often provide the only chance to expose these students to scientific philosophy at the university level. We hope that one of the successful outcomes of incorporating SRT in our courses will be an increase in enrollment, and that the popularity of this technique will drive an increase in geoscience majors.

CONCLUSIONS

Our experiences with electronic student response systems at Penn State University have convinced us that the SRT is capable of creating a rapport between the professor and the student in large classroom environments. Both our quantitative and our qualitative assessments strongly support this conclusion. Further testing is required to gauge whether SRT is superior to the traditional lecture approach as a means of improving class comprehension. Nevertheless, keeping general education students in physical and intellectual attendance is the largest challenge for introductory-level courses, and the fact that students perceive the system as engaging and effective has persuaded us that this technology justifies the initial expense in time and labor. We are eager to see this technology in widespread use so that a large pool of Earth science instructors can share their insights to maximize the impact of electronic response systems.

FUTURE WORK

While preliminary assessment results indicate that the integration of SRT in the Geosc 020 curriculum has been successful in its initial phase of development, we believe that the full extent to which SRT can be an effective active-learning tool has not yet been adequately assessed. We hope to extend the development of SRT use to other science instructors at Penn State and are currently integrating SRT in a Historical Geology course at Washington and Lee University. We hypothesize that SRT technology can be a highly effective learning tool in the different learning environments that characterize state-funded Research-1 universities and small private liberal arts colleges. However, it is less clear whether the system can be used in both settings in an identical fashion, or how the usage must be tailored to the audience in order to maximize its effectiveness. We cautiously predict that the benefits of electronic response technology will translate across a broad array of university-level Earth Science curricula.

Journal of Geoscience Education, v. 52, n. 4, September, 2004, p. 345-351

ACKNOWLEDGEMENTS

This project was funded in part by the Penn State Center for Excellence in Learning and Teaching, the Penn State College of Earth and Mineral Sciences, and the Penn State Department of Geosciences. We wish to thank J. Mestre, R. Phillis, and R. Yuretich at the University of Massachusetts at Amherst for sharing their experiences and observations of electronic student technology. We are sincerely grateful to Dustin Houseman, Dan Fay, and Meredith Hill for help with data compilation and development and operation of SRT in our classrooms. We thank Stephen Piazza, Diane Enerson, Rudy Slingerland, and Tanya Furman for helpful discussions. We are most grateful to S. Linneman, D. Meltzer, and C. Cervato, whose detailed reviews significantly improved this manuscript.

REFERENCES

Angelo, T.A. and Cross, K.P., 1993, Classroom assessment techniques: A handbook for college teachers, 2nd edition, Jossey-Bass.

Bykerk-Kauffman, A., 1995, Using cooperative learning in college geology classes, Journal of Geoscience Education, v. 43, p. 309-316.

Crouch, C.H. and Mazur, E., 2001, Peer instruction: Ten years of experience and results, American Journal of Physics, v. 69, p. 970-977.

Cummings, K., Marx, J., Thornton, R., and Kuhl, D., 1999, Evaluating innovations in studio physics, Physics Education Research, American Journal of Physics Supplement, v. 67, p. S38-S44.

DeCaprariis, P.P., 1997, Impediments to providing scientific literacy to students in introductory survey courses, Journal of Geoscience Education, v. 45, p. 207-210.

Dufresne, R.J., Gerace, W.J., Leonard, W.J., Mestre, J.P., and Wenk, L., 1996, Classtalk: A classroom communication system for active learning, Journal of Computing in Higher Education, v. 7, p. 3-47.

Gerace, W.J., Dufresne, R.J., and Leonard, W.J., 1999, Using technology to implement active learning in large classes, University of Massachusetts Physics Education Research Group Technical Report PERG-1999#11-Nov#2, 22 p.

Greer, L., Heaney, P.J., and Houseman, D., 2002, Integration and assessment of electronic student response technology in the geoscience classroom, Geological Society of America Program with Abstracts, v. 34, p. 296.

Heaney, P.J. and Greer, L., 2003, Using electronic response technology to teach mineralogy in large class settings, Clay Minerals Society Annual Meeting, Program and Abstracts, p. 73.

Judson, E. and Sawada, D., 2002, Learning from past and present: Electronic response systems in college lecture halls, Journal of Computers in Mathematics and Science Teaching, v. 21, p. 167-181.

Littauer, R., 1972, Instructional implications of a low-cost electronic student response system, Educational Technology, p. 69.

Mazur, E., 1997, Peer Instruction: A User's Manual, Prentice Hall, 253 p.

McConnell, D., Steer, D., and Owens, K., 2003, Assessment and active learning strategies for introductory geology courses, Journal of Geoscience Education, v. 51, p. 205-216.

McManus, D.A., 2002, The two paradigms of education and peer review of teaching, Journal of Geoscience Education, v. 49, p. 423-434.

Meltzer, D.E. and Manivannan, K., 2002, Transforming the lecture-hall environment: The fully interactive physics lecture, American Journal of Physics, v. 70, p. 639-654.

Mestre, J.P., Gerace, W.J., Dufresne, R.J., and Leonard, W.J., 1997, Promoting active learning in large classes using a classroom communication system, in: The changing role of physics departments in modern universities: Proceedings of the International Conference on Undergraduate Education (Redish and Rigden, eds.), American Institute of Physics Conference Proceedings, 399, p. 1019-1021.

Pinet, P.R., 1995, Rediscovering geological principles by collaborative learning, Journal of Geoscience Education, v. 43, p. 371-376.

Press, F. and Siever, R., 2001, Understanding Earth, 3rd edition, W.H. Freeman and Company, New York, 573 p.

Webking, R.H., 1998, Classtalk in two distinctly different settings, from: 'Classtalk at University of Texas El Paso', CD-ROM authored at the Department of Political Science, University of Texas at El Paso, September 1998.

Wenk, L., Dufresne, R., Gerace, W., Leonard, W., and Mestre, J., 1997, Technology-assisted active learning in large lectures, in: Student-active science: Models of innovation in college teaching, A.P. McNeal and C. D'Avanzo (eds.), Fort Worth, TX, Saunders College Publishing, p. 431-451.

Yuretich, R.F., 2002, Assessing higher-order thinking in large introductory science classes, Journal of College Science Teaching.

Yuretich, R.F., Khan, S.A., Leckie, R.M., and Clement, J.J., 2001, Active-learning methods to improve student performance and scientific interest in a large introductory oceanography course, Journal of Geoscience Education, v. 49, p. 11-19.

Greer and Heaney - Real-time Analysis of Student Comprehension 351