Giving Students Time for the Academic Resources that Work


July 2004 Journal of Engineering Education 205

ROMAN TARABAN
Department of Psychology
Texas Tech University

MATTHEW W. HAYES
Department of Psychology
Texas Tech University

EDWARD E. ANDERSON
Department of Mechanical Engineering
Texas Tech University

M. P. SHARMA
Department of Chemical and Petroleum Engineering
University of Wyoming

ABSTRACT

Engineering courses offer students multiple resources for learning; however, it is not clear how much time students devote to these resources or how effective they are for mastering the course material. We examined students’ use of learning resources in introductory thermodynamics through the use of activity logs, and the relation of these self-reports to objective measures of course performance. Self-report data revealed that students generally favored some resources over others, that these resources were mutually supportive of instruction, and that measures of resource use were significantly correlated with each other and with academic performance. We suggest that knowledge of how students allocate time to course resources, and correlations between the use of these resources and course performance, could assist instructors in course and curriculum planning.

Keywords: study time, learning resources, thermodynamics

I. INTRODUCTION

Technology has greatly expanded the possibilities for instruction. Course instructors can place syllabi, lecture notes, interactive homework problems, virtual labs, and other course resources on the World Wide Web (henceforth the Web). Textbooks come with tutorials, electronic tools, and Web sites offering supplemental material and activities [1]. Several questions face instructors when implementing these resources: will students use the resources; do the resources place appropriate demands on students; and will use of the resources assist in learning the course material [2, 3]? In this paper we describe a study in which we asked introductory thermodynamics students to create a record of the resources that they used and the time spent with each resource. We hypothesized that these self-reports could be used to identify which learning resources students used, which in turn would allow us to examine the resources that aided learning, and ways in which resources were mutually supportive in the learning process. If these self-reports were successful in revealing what students did and how effective those behaviors were, then they could be used for course planning.

How students spend their time has been of interest to educators since the early 1900s. College educators of the time wondered: “Are they [students] overworked or is college a country club where young people may find rest and pleasant relaxation between social engagements?” [4]. One of the earliest studies [5] was published in 1915, in which questionnaires were distributed to 1,715 students in 56 different college courses. Often, students were asked to keep a careful record of all their activities for a week—to record the distribution of their time in study, extra-curricular activities, leisure, and sleep—but in other cases they were asked to simply indicate how much time they spent in various activities, including study, for the entire week. Variations of the activity log method continue to be used today [3, 6, 7].

Monitoring students’ allocation of time for academic work has also been applied to engineering. Ressler and Lenox [3] described a time survey that is used extensively in civil engineering courses at the United States Military Academy. For each of the 40 lessons in the course, students anonymously indicate the number of minutes they spent outside of class to prepare for the lesson. The faculty use the survey results for course management, evaluation, and change. The survey allows faculty to ask whether students are spending an appropriate amount of time preparing for class, and to monitor, evaluate, and modify course demands as deemed appropriate to the goals of the course and program. The time survey has been in continuous use since 1988. Over the long term it has provided objective depictions of courses in terms of temporal demands on students. One outcome was to reduce course demands from about 120 minutes to about 60 minutes per lesson so that the course might fit in more effectively with the overall program. (See also [7].)

The research presented here is part of a joint effort of Texas Tech University, the University of Wyoming, and other partners to develop computer-based materials for introductory thermodynamics. The two primary additions to the thermodynamics course that have been implemented over the last several years are required weekly on-line homework exercises and a CD with supplemental learning materials [8, 9] that span the content of the entire course. The on-line homework and CD materials were readily accessible to all students, and we wanted to know how much they would use the materials, and whether the materials had a measurable effect on learning. We were interested in understanding the use of these materials in the context of a detailed picture of students’ academic behaviors in this course. Therefore, students provided considerably more information about their activities than is required in an instrument like the time survey [3, 7].



In this study, we asked the following questions:

• How much time did students allocate to each of the learning resources available to them in introductory thermodynamics? This included the materials that we had developed, as well as more traditional academic resources, like the class lecture, lecture notes, and the textbook.

• How did the use of specific resources correlate with the use of other resources?

• Did the use of learning resources correlate with homework and in-class test grades?

In the Conclusions section that follows, we consider how data like these can be used to inform curriculum development and the adjustment of learning activities to students’ needs and preferences.

II. CASE STUDY

A. Participants
Students from the University of Wyoming (UWyo) and Texas Tech University (TTU) who were enrolled in Thermodynamics (UWyo: ES2310) or Engineering Thermodynamics I (TTU: ME2322) participated in this study. Both are three-credit courses. Data were collected from six cohorts of students over three semesters. Out of the 256 students who completed the courses, a total of 211 submitted activity logs, for a response rate of 82 percent. Over half of the respondents (52.5 percent) were juniors, and the remainder were either sophomores (27.0 percent) or seniors (20.5 percent). The mean number of engineering credits completed was 36.46 hours (h) [standard deviation (SD) = 28.14], and the mean course load during the semester the log was completed was 15.04 h (SD = 2.31).

B. Materials and Procedure
Course materials included a textbook, Thermodynamics: An Engineering Approach [10], an interactive CD that was provided with the textbook, on-line homework problems, and 2.5–2.7 h of lecture each week (Tuesday and Thursday meetings: TTU; Monday, Wednesday, Friday meetings: UWyo). The CD and on-line homework problems covered the first ten chapters of the textbook, which were the chapters covered in the course. The CD incorporated active learning methods, and included interactive exercises, graphical modeling, and physical world simulations, as well as narrated content pages with illustrations, equations, graphs, and tables [9, 11]. Each on-line homework problem set contained up to five multiple-choice questions, most of which contained randomly selected problem parameters, and provided feedback indicating the correct responses. The problem sets drew on a database of 670 questions. The homework problems were made available over the Web. Students also had access to a course Web site that included a syllabus, abbreviated lecture notes, and links to on-line homework problems and other Web sites. The materials used were the same in each semester, with only minor changes. Students at both universities were assigned problems from the textbook and on-line homework problems for part of the course grade. They took three to four in-class tests and a final examination.
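On-line problem sets of the kind described above, with randomly drawn parameters and feedback on the correct response, can be sketched as follows. This is a minimal illustration under assumptions of our own: the physical setup (constant-volume heating of air), the parameter sets, and the function name are invented for the sketch, not the authors’ actual item bank.

```python
import random

def make_problem(seed=None):
    """Generate one multiple-choice item with randomized parameters.
    Setup and values are illustrative, not the course's real questions."""
    rng = random.Random(seed)
    # Randomly selected problem parameters
    m = rng.choice([1.0, 2.0, 3.0])   # mass of air, kg
    dT = rng.choice([50, 100, 150])   # temperature rise, K
    cv = 0.718                        # cv of air, kJ/(kg*K)
    answer = m * cv * dT              # Q = m * cv * dT at constant volume
    # Build distractors around the correct value, then shuffle
    choices = [answer, answer * 1.4, answer * 0.5, answer + 25.0]
    rng.shuffle(choices)
    stem = (f"Air ({m:.1f} kg) in a rigid tank is heated so its temperature "
            f"rises by {dT} K. How much heat is added, in kJ?")
    # Returning the correct index supports immediate feedback to the student
    return stem, choices, choices.index(answer)

stem, choices, correct = make_problem(seed=42)
print(stem)
print(choices, "correct index:", correct)
```

Because the parameters are re-drawn per student, two students rarely see identical numbers, while the grading key is computed alongside the stem.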

An activity log was constructed, which consisted of an instruction page detailing how participants were to use the log and a set of labels and definitions for the students to use to code their study-related activities. The labels, which were developed in earlier pilot studies, were mutually exclusive and covered all possible student study behaviors—see Table 1. The log provided space for students to record the date, start time, end time, and activity they engaged in. The instructions explained that it was possible to use multiple resources simultaneously, and students were instructed to list all the resources they used at any given time. Background questions were included at the end of the log packet, including a question about the student’s expected grade for the course.

Table 1. Labels and definitions used for the activity logs.

The course instructor distributed activity logs throughout the semester in order to collect representative samples of data. Students recorded all activities related to the thermodynamics course for one week (seven days) using the activity labels shown in Table 1. A student completed the log only once during the semester. Four of the six cohorts submitted logs anonymously; the remaining two included names in order to allow us to correlate the time spent on the various academic activities with performance measures.

C. Results and Discussion
The 211 respondents made a total of 1,505 log entries, or 7.13 entries on average per student for the logged week. The overall response rate (82 percent) to the activity log and the average number of entries supported our initial hypothesis that detailed activity logs could be used in thermodynamics to learn more about students’ academic behaviors. Globally, the data revealed that mean class attendance per week was 2.07 h (SD = 1.05), and mean study time was 6.91 h (SD = 3.96). The reasonableness of the class-attendance figure (based on a 2.5–2.7 hour maximum) suggested that the latter statistic for mean study time was not exaggerated. If so, then these students exceeded the standard expectation of two hours of study per week for each course credit.

In the statistical analyses, significance was defined as a probability value of p < 0.05, and marginal significance as 0.05 ≤ p < 0.10. An analysis of the six cohorts of students using a 2 (School: TTU, UWyo) × 3 (Academic Term) analysis of variance (ANOVA) showed significant variation in study time by school [TTU (N = 118): 7.78 h (SD = 4.06), UWyo (N = 93): 5.81 h (SD = 3.57); F(1, 205) = 11.63, p = 0.002]; marginally significant variation by term [Term 1 (N = 62): 5.86 h (SD = 4.06), Term 2 (N = 56): 7.56 h (SD = 3.87), Term 3 (N = 93): 7.22 h (SD = 3.86); F(2, 205) = 2.74, p = 0.067]; and a significant interaction between the two factors [F(2, 205) = 3.49, p = 0.04], the latter due largely to the greater variation across semesters at UWyo compared to TTU. These significant differences in study time across semesters and schools were noteworthy because the two instructors were attempting to teach in a relatively consistent manner. We may attribute some of these differences to variation in student characteristics across cohorts of students (e.g., background knowledge, level of ability, level of interest), and part to the normal variation in course implementation from semester to semester.
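An analysis of this shape can be reproduced in outline. A full 2 (School) × 3 (Term) factorial ANOVA would typically be run with statsmodels (`ols` plus `anova_lm`); the sketch below uses only scipy and tests just the School main effect. The group sizes, means, and SDs mirror the text, but the individual data points are randomly generated, not the authors’ data.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
# Simulated weekly study times (h); sizes, means, SDs follow the text.
ttu = rng.normal(loc=7.78, scale=4.06, size=118)
uwyo = rng.normal(loc=5.81, scale=3.57, size=93)

# One-way ANOVA on the School factor (TTU vs. UWyo)
f_stat, p_value = f_oneway(ttu, uwyo)
print(f"F(1, {len(ttu) + len(uwyo) - 2}) = {f_stat:.2f}, p = {p_value:.4f}")
```

With two groups this F test is equivalent to a two-sample t test (F = t²); the factorial version additionally partitions variance into Term and School × Term components.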

How much time did students allocate to each of the learning resources available to them in introductory thermodynamics? In order to address our first question, the raw data were compiled in several steps. In the first step, times from the log entries were summed into 84 mutually exclusive categories that corresponded to unique single behaviors, like “Reading Text,” and unique multiple behaviors, like “Reading Text & Text Problems,” that participants had recorded in their logs. The categories were rank ordered according to the number of participants (labeled Users in Figure 1) who contributed to the category (range = 0 for Supplemental to 176 for Lecture). The display of frequencies in Figure 1 is for the overall sample of 211 and for the largest single cohort of 64.
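This first compilation step, summing entries into single- and multiple-behavior categories and rank ordering the categories by number of users, can be sketched like this. The log entries below are invented for illustration; only the aggregation logic follows the text.

```python
# Hypothetical log entries: (student_id, labels used simultaneously, hours).
entries = [
    (1, frozenset({"Lecture"}), 1.25),
    (1, frozenset({"Reading Text", "Text Problems"}), 2.0),
    (2, frozenset({"Lecture"}), 2.5),
    (2, frozenset({"On-line Problems"}), 0.5),
    (3, frozenset({"Reading Text", "Text Problems"}), 1.5),
]

# Each unique combination of simultaneous behaviors forms its own category;
# sum hours and count distinct contributing students ("Users") per category.
hours, users = {}, {}
for student, category, h in entries:
    hours[category] = hours.get(category, 0.0) + h
    users.setdefault(category, set()).add(student)

# Rank order categories by number of users, as in Figure 1.
ranked = sorted(users, key=lambda c: len(users[c]), reverse=True)
for category in ranked:
    print(" & ".join(sorted(category)), "-> users:", len(users[category]),
          "hours:", hours[category])
```

Treating each combination as its own category keeps the categories mutually exclusive, at the cost of many rarely used composite categories, which matches the long tail of single-user behaviors reported below.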

Figure 1. Number of students reporting use of learning resources.

The distribution of frequencies revealed several important points. First, there were only a few behaviors that students consistently reported. The five most frequently reported were attending lecture, completing textbook problems, completing on-line problems, reading the textbook, and using the CD. This was true in the overall sample as well as in the TTU sample. Second, related to the purpose of this study, the two newly developed resources that were added to the curriculum—the required on-line homework problems and the supplemental CD—were among the resources that students frequently reported using. This indicated that these new elements were successfully incorporated into the curriculum, at least insofar as students used them. Finally, many of the behaviors (not shown in Figure 1) were unique to a single individual. In the overall sample, 42 of the coded behaviors were mentioned by a single student, and in the TTU sample, 23 were mentioned by only one student. Thus, the data showed that much of the time students perceived themselves as engaging in a predominant activity, like solving textbook problems, but at other times they reported more complicated uses of multiple resources, like solving textbook problems, reading the textbook, and reviewing lecture notes.

How did the use of specific resources correlate with the use of other resources? Before addressing the second and third questions posed in this study, several points should be made clear. The remaining analyses were conducted using the Texas Tech sample displayed in Figure 1. This sample was chosen because it had a relatively high number of participants (N = 64) compared to the other samples (the next largest sample had 32 participants), and because students in this sample submitted their logs with names; thus the relation of learning resources to course grades could also be examined. This sample was from the third semester of on-line homework and CD implementation, so that most of the “bugs” in these resources had been eliminated. Correlation analysis was applied to these data. This type of analysis was appropriate because it could reveal the strength and also the direction of the associations between learning resources. Nonparametric correlation analyses (i.e., calculation of Spearman’s rho) were applied because the data did not follow normal-curve distributions; nonparametric techniques are recommended in this context [12, 13]. Finally, only the five most frequent behaviors were analyzed, simply because it did not make sense to form generalizations about behaviors that most students did not engage in (i.e., those with low Users in Figure 1), nor was it prudent from a statistical perspective to correlate data distributions in which most data points were zero.
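The nonparametric analysis can be sketched with scipy’s `spearmanr`, which ranks each variable and correlates the ranks, so it does not assume normal-curve distributions. The weekly-hours vectors below are invented; the actual values behind Table 2 are not reproduced.

```python
from scipy.stats import spearmanr

# Invented weekly hours for two resources across eight students.
text_problems = [2.0, 4.5, 1.0, 3.0, 5.5, 0.5, 2.5, 4.0]
online_problems = [0.3, 0.8, 0.1, 0.5, 0.9, 0.1, 0.4, 0.7]

# Spearman's rho: Pearson correlation of the rank-transformed data.
rho, p = spearmanr(text_problems, online_problems)
print(f"Spearman's rho = {rho:.2f}, p = {p:.4f}")
```

Because rho works on ranks, it is robust to the skewed, zero-heavy time distributions that activity logs typically produce, which is why a nonparametric statistic is the natural choice here.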

Table 2. Means (Standard Deviations) for weekly use of learning resources (in hours), course grades, and Spearman’s rho for the use of resources and grades for the Texas Tech sample.

The results are summarized in Table 2. Statistically significant inter-correlations between resources suggested that students were using these resources in a mutually supportive fashion, not in a compensatory manner. The strongest association was between solving textbook problems and solving on-line problems. Students who did relatively more of one did relatively more of the other. Relatedly, solving on-line problems was positively and significantly correlated with using the CD. Relatively higher lecture attendance was also associated with relatively more use of the CD. Using resources in a compensatory manner would have resulted in negative correlations. None of the negative correlations in these data reached statistical significance.

Did the use of learning resources correlate with homework and in-class test grades? In the Texas Tech sample, course grades, provided by the instructor, were used as a measure of course performance, and consisted of a weighted homework grade (for textbook and on-line homework problems), the weighted average of four in-class tests, and the final exam grade. (The weightings were those used by the instructor to assign course grades.)

By far, the strongest correlation in Table 2 was between on-line homework problems and the final exam grade. There was also a marginally significant correlation between on-line problems and the in-class test average. The on-line homework problems were designed to give students quick, but effective, feedback about their performance. The fact that students spent less than 30 minutes (0.43 h) on average per week solving on-line homework suggested that on-line homework provided a good indicator of what students were learning; however, much of that learning may have been taking place through the coordinated use of other learning resources. Consistent with this possibility, note the strong positive correlations in Table 2 between solving on-line homework problems, solving textbook problems, and using the CD. Because of these strong cross-correlations, a likely possibility is that students used these several resources in conjunction in order to achieve higher test and final scores. We also found that the time students spent solving text problems was positively correlated, albeit marginally, with the homework grade. This correlation made sense, indicating that the more time students devoted to text problems, the higher their homework grade.

As a test of the reliability of the activity logs, we correlated participants’ expected course grade, which they provided on the activity logs, with the measures in Table 2. Expected course grades correlated with actual homework grades and final-exam grades, and marginally with test averages, indicating that students had a reasonably good sense of their grade potential. The survey was able to capture these intuitions. Expected grades also were significantly correlated with the times students reported working on text problems, and marginally with their times on on-line problems, suggesting an underlying belief among the students that academic effort led to academic reward, which the activity logs and survey question were able to capture.

In summary, this case study was important in several ways. It demonstrated the use of activity logs for monitoring students’ learning behaviors. The data yielded significant statistical effects that were interpretable and that made sense. An analysis of the contribution of the dominant behaviors in the course suggested that not all efforts of the student were equally beneficial. At a more specific level, the activity-log data were helpful in understanding the role of newly introduced curriculum elements—the on-line homework problems and the CD—in the context of the full gamut of learning resources available to the students. The time that students devoted to solving on-line homework problems, to solving textbook problems, and to related activities was predictive of performance. Together, these effects suggested that there might be ways to adjust the course curriculum to make it more efficient in bringing about content proficiency in students.

IV. CONCLUSIONS

In the introduction, we suggested that course planning, particularly incorporating new resources into a course, raises several questions: will students use the resources; do the resources place appropriate demands on students; and will use of the resources assist in learning the course material? In this paper we demonstrated a general method of asking students to report on their academic behaviors in order to use those data for curriculum decisions.

We believe that there is a need to draw attention to the time factor in curriculum decisions because instructors tend not to think in those terms. Pauk [14] noted that professors did not make assignments in terms of time. They would not say to students “spend two hours reading,” but instead “Read Chapters 1 and 2.” However, in order to avoid overloading the curriculum [2] and give students the time they need for the academic resources that work, time must become a factor in the curriculum; see also [6, 7]. Because of the potential variability in time allocated by students per course, by students in different majors, and at different points in an academic program, it is not possible to determine exactly how much time students should spend in a course; however, instructors can establish a mean and range for study times in their courses as a baseline for curriculum planning [6, 7].

The data revealed what we might call the “good student” syndrome—the good student draws on several learning resources in order to learn the course material. There was a clear pattern of using multiple learning resources. We cannot generalize these findings to other groups without additional data. However, we believe that we have demonstrated a method that could be useful in beginning to learn more about what students actually do in general and in the context of specific courses. In the present study, it was clear that major course modifications could not involve adding additional requirements without dropping existing requirements, because students were already meeting or exceeding the normal demands for study outside of class. The results suggested that shifting emphasis to computer-based resources could require reducing the number of assigned textbook problems. Getting students to spend more time reading the textbook would also require some modification to the structure and demands of the course [15].

In the sample investigated here, study times were correlated with objective measures of performance. This finding is consistent with other research in which students’ use of computer-based materials in an upper-division psychology course was correlated with in-class test performance at a magnitude of about r = 0.30—a correlation that was relatively consistent and reliable across multiple tests [16]. Thus, there is growing evidence that academic time is linked to academic performance. There are three related ways in which this relation between time and learning can aid in curriculum design. First, instructors must allocate sufficient time to a resource or activity if it is to have the intended effect on learning. Second, if students do not use a resource, or if there are weak correlations between a resource and performance, this may be a signal to the instructor that the resource is not effective or that it is poorly implemented. Third, by designing a course that provides students with sufficient time to complete valued activities, and by selecting the most effective materials and tasks for students, instructors can devise more efficient learning possibilities for students, and can avoid overloading the curriculum [2].

There are several limitations to the present case study. We only sampled one week of a student’s semester, self-reports are subject to inaccuracies, and sample sizes were limited. Distance education venues may be able to overcome many of these limitations through more consistent, unobtrusive time-stamping of student interactions, and the potential to collect data from large samples, thus providing further validation of the present methodology and the importance of study time as a factor in learning. We believe that our approach has good prospects for future applications, especially in contexts in which instructors are developing state-of-the-art instructional materials [1] but not necessarily considering the practical constraints on students to engage those materials. Our criterion measures were homework and test grades; however, the method of tracking academic behaviors may also be applied when the criteria are measures of course objectives related to ABET engineering criteria [17], in order to discover which behaviors are reliably correlated with the achievement of those objectives.

ACKNOWLEDGMENTS

This research was supported, in part, by a grant from the National Science Foundation (NSF-CCLI 0088947) and a grant from the Texas Tech University Graduate School.

REFERENCES

[1] Özer, T., Kenworthy, J.G., Brisson, E., Cravalho, G., and G.H. McKinley, “On Developments in Interactive Web-based Learning Modules in a Thermal-Fluids Engineering Course,” International Journal of Engineering Education, Vol. 19, No. 2, 2003, pp. 305–315.

[2] Zielinski, T.J., Brooks, D.W., Crippen, K.J., and J.L. March, “Time and Teaching,” Journal of Chemical Education, Vol. 78, No. 6, 2001, pp. 714–715.

[3] Ressler, S.J., and T.A. Lenox, “The Time Survey: A Course Development Tool That Works!,” Proceedings of the 1996 American Society for Engineering Education Annual Conference and Exposition, Washington, D.C.

[4] Hutchinson, R.G., and M.H. Connard, “What’s in a College Week?,” School and Society, Vol. XXIV, No. 625, 1926, pp. 768–772.

[5] King, I., “An Inquiry into Certain Aspects of the Study Habits of University Students,” School and Society, Vol. II, No. 49, 1915, pp. 824–828.

[6] Hayes, R., Kotovsky, K., Kozbelt, A., and A. Weissman, “Where Does Students’ Time Go?,” Center for Innovation in Learning at Carnegie Mellon, Research Brief, Vol. 1, No. 2, 1999, pp. 1–4.

[7] Bailey, M., Floersheim, R.B., and S.J. Ressler, “Course Assessment Plan: A Tool for Integrated Curriculum Management,” Journal of Engineering Education, Vol. 91, No. 4, 2002, pp. 425–434.

[8] Taraban, R., Anderson, E.E., Sharma, M.P., and M.W. Hayes, “Monitoring Students’ Study Behaviors in Thermodynamics,” Proceedings of the 2002 American Society for Engineering Education Annual Conference and Exposition, Montreal, Canada.

[9] Anderson, E.E., Sharma, M.P., and R. Taraban, “Application of Active Learning Techniques to Computer-Based Instruction of Introductory Thermodynamics,” Proceedings of the 2002 American Society for Engineering Education Annual Conference and Exposition, Montreal, Canada.

[10] Cengel, Y.A., and Boles, M.A., Thermodynamics: An EngineeringApproach, 4th Ed., McGraw-Hill: Boston, MA, 2001.

[11] Anderson, E.E., Taraban, R., and Sharma, M.P., “Implementing and Assessing Computer-Based Active Learning Materials in Introductory Thermodynamics,” International Journal of Engineering Education, in press. Also in on-line version at <http://www.ijee.dit.ie/OnlinePapers/Interactive_Papers.html>.

[12] Conover, W.J., Practical Nonparametric Statistics (3rd Ed.), John Wiley: New York, 1999.

[13] SPSS Inc., SPSS-X User’s Guide, SPSS Inc.: Chicago, IL, 1988.

[14] Pauk, W., How To Study In College, Houghton Mifflin: Boston, MA, 1962.

[15] Thomas, J.W., Bol, L., Warkentin, R.A., Wilson, M., Strage, A., and W.D. Rohwer, Jr., “Interrelationships among Students’ Study Abilities, Self-Concept of Academic Ability, and Achievement as a Function of Characteristics of High-School Biology Courses,” Applied Cognitive Psychology, Vol. 7, 1993, pp. 499–532.

[16] Taraban, R., Rynearson, K., and K.A. Stalcup, “Time as a Variable in Learning on the World-Wide Web,” Behavior Research Methods, Instruments, & Computers, Vol. 33, No. 2, 2001, pp. 217–225.

[17] Felder, R.M., and R. Brent, “Designing and Teaching Courses to Satisfy the ABET Engineering Criteria,” Journal of Engineering Education, Vol. 92, No. 1, 2003, pp. 7–25.

AUTHORS’ BIOGRAPHIES

Roman Taraban is Associate Professor in the Department of Psychology at Texas Tech University. He received his Ph.D. in cognitive psychology from Carnegie Mellon University. His interests are in how undergraduate students learn, and especially, how they draw meaningful connections in traditional college content materials (e.g., textbooks, lectures, multi-media).

Address: Department of Psychology, Mail Stop 2051, Texas Tech University, Lubbock, TX, 79409; telephone: 806-742-3711 ext. 247; fax: 806-742-0818; e-mail: [email protected].

Matthew W. Hayes is a graduate student in the Experimental Psychology program at Texas Tech University.

Address: Department of Psychology, Mail Stop 2051, Texas Tech University, Lubbock, TX, 79409-2051; telephone: 806-742-3711 ext. 249; fax: 806-742-0818; e-mail: [email protected].

Edward E. Anderson is Professor of Mechanical Engineering at Texas Tech University, and he is also the Director of the University Teaching, Learning, and Technology Center. His responsibilities at the Center are to train and assist faculty throughout the University in applying technology to their teaching. He received his B.S. and M.S. degrees in Mechanical Engineering from Iowa State University and Ph.D. degree from Purdue University.

Address: Department of Mechanical Engineering, Mail Stop 1021, Texas Tech University, Lubbock, TX, 79409-1021; telephone: 806-742-0133; fax: 806-742-0134; e-mail: [email protected].

M. P. Sharma is Professor of Chemical and Petroleum Engineering at the University of Wyoming. He received his Ph.D. degree in Mechanical Engineering from Washington State University. A current area of interest is conducting research on teaching and learning methods, especially on the use of synchronous and asynchronous tools using Web technology.

Address: 1000 E. University Avenue, Department of Chemical and Petroleum Engineering, University of Wyoming, Laramie, WY, 82071; telephone: 307-766-6317; fax: 307-766-6777; e-mail: [email protected].
