
AN ASSESSMENT OF PERFORMANCE BETWEEN ON- AND OFF-CAMPUS STUDENTS IN AN AUTOMATIC IDENTIFICATION AND DATA CAPTURE COURSE

Stephen J. Elliott1, Eric P. Kukula2, & Nathan C. Sickler3

1 Stephen Elliott, Ph.D., Assistant Professor, Biometrics Standards, Performance, and Assurance Laboratory, Department of Industrial Technology, Purdue University, 401 North Grant Street, West Lafayette, IN 47906, USA, [email protected]
2 Eric Kukula, Research Assistant, Biometrics Standards, Performance, and Assurance Laboratory, Department of Industrial Technology, Purdue University, 401 North Grant Street, West Lafayette, IN 47906, USA, [email protected]
3 Nathan Sickler, Research Assistant, Biometrics Standards, Performance, and Assurance Laboratory, Department of Industrial Technology, Purdue University, 401 North Grant Street, West Lafayette, IN 47906, USA, [email protected]

Abstract: Universities have been exploring distance education for years, but typically do not offer program courses that require laboratory exercises. Had they offered such courses, off-campus (distance) students would have been academically disadvantaged by the inability to participate in hands-on laboratory exercises. Therefore, an undergraduate course in Automatic Identification and Data Capture was designed to accommodate distance students and ensure a laboratory experience comparable to that of on-campus students. The increased availability of technological advances (e.g., broadband Internet) provides an opportunity for distance students to gain comparable knowledge. This paper outlines the experiences of three groups of students completing laboratory experiments: on-campus students, undergraduate distance students, and graduate distance students. The results showed that students who interacted with the equipment experienced the same level and extent of learning, whether the interaction with the AIDC equipment occurred on campus or remotely; a lack of interaction, however, resulted in test scores that were statistically different.

Index Terms: curriculum development, automatic identification and data capture, distance education

INTRODUCTION

Distance education is popularly understood to mean the planned learning or educational processes that occur with the instructor and student separated either geographically or in time, requiring special methods for course design, instruction, communication, and administrative tasks [1-2]. Distance education is often used in two ways:
• As a supplement to brick-and-mortar (on-campus) instruction
• As a stand-alone form of instruction

Distance education as a supplement to classroom instruction is common practice in many traditional higher-education institutions. Instructors deliver classroom lectures, but homework assignments and supplemental material may be retrieved online. This approach is feasible because these institutions have sufficient numbers of on-campus computers with Internet access. The residence halls typically accommodate students' personal computers by offering in-room Internet access, while students without personal computers can avail themselves of on-campus computer laboratories.

Distance education as a stand-alone form of instruction is not as common, but its usage has grown tremendously in the past ten years, as evidenced by the flourishing number of online institutions offering degree programs. Many students choose this type of education because full-time jobs, other commitments, or physical limitations prevent their participation in more traditional approaches to instruction. This approach requires the student to have a reliable computer and Internet connection; it also requires a new channel for homework and learning opportunities that is not limited by time or location. This approach to education offers numerous benefits:
• Accommodates different learning styles and schedules
• Uses various educational resources or media (e.g., paper-based, video, audio, online material) as instructional tools
• Allows usage of multiple communication methods (e.g., e-mail, teleconference, video conference, instant messaging)
• Supports self-directed and self-paced learning styles and methods [1-5].

It is common to consider distance education in two distinct categories of instruction: synchronous and asynchronous [2,5]. Synchronous instruction focuses on group work at a distance and requires all students and instructors to interact simultaneously. The advantage of synchronous instruction is that it occurs in real time, giving participants the "feel" of a traditional classroom; questions can be asked and answered immediately, and the class interacts as a group. A disadvantage of synchronous instruction is that it requires everyone to be available at the same time and to have access to the necessary communication and/or visualization infrastructure, which may not be feasible due to technological, financial, or scheduling constraints. Asynchronous instruction, on the other hand, is flexible. It allows students to choose their own instructional time frame and complete learning materials according to their schedules [2,5]. Asynchronous instruction accommodates multiple learning levels and schedules, as this category makes use of video- or audio-taped lectures, e-mail, and Internet-based programs, all of which are time-independent [2,4,5]. Additionally, disciplines that structure courses by supplementing lectures with laboratory exercises offer enhancements to the related lecture material [6].

TRANSFORMATION OF THE LABORATORY INTO A VIRTUAL ENVIRONMENT

General Course Structure

Initially, all of the lecture assignments, laboratory assessments, and exams in IT 345 Automatic Identification and Data Capture (AIDC) were exclusively paper-based. When the university adopted the WebCT Campus Edition™ course software, additional material was developed to take advantage of the new technology. For the fall 2003 semester, the instructors migrated the course to a fully online environment (WebCT Vista™), dividing the course into modules, each of which covers an individual AIDC technology. Course material was available electronically, and tests were graded automatically by the course management system, where applicable. The WebCT Vista™ portal was instrumental in the department's ability to offer this course to distance students; the technology made it possible for distance students to complete and submit material at their convenience. Nevertheless, two main issues remained to be addressed:
• How would the instructor present lecture material?
• How would the instructors deliver the laboratory activities and allow students to complete them?

Adapting the Course
Each of the AIDC technology modules was assessed to see whether any modification would be needed to enable distance students to participate fully in the laboratory activity. The modules included lectures as well as laboratory exercises; both the presentation and the content of the module components had to be scrutinized. A review of the literature showed that there were a number of ways to provide laboratory experiences to distance students, including LabVIEW™, Virtual Network Computing software [7,8], or providing the software on a CD-ROM for students to execute at home. A combination of the latter two strategies was chosen.

Lecture Material
To provide students with the lecture material, and to ensure that both distance and on-campus students received the same content, each on-campus lecture was videotaped, digitized, and posted to a streaming server. Additionally, distance students were provided with a CD that contained the software necessary to remotely connect to designated on-campus host computers, as well as software or videos for specific laboratory exercises (see the discussion below). For a more detailed description of the course material, see [9].

Equipment Requirements
All the laboratory activities were evaluated to determine whether they could be made remotely accessible via the Internet. Each laboratory activity fell into one of three categories of remote accessibility:
• The laboratory activity was equivalent (no differences were detected between the coursework that distance and on-campus students would complete)
• The laboratory activity was not convertible (i.e., students had to physically interact with equipment)
• The laboratory activity was converted with some modifications.

Table 1 shows the results of the assessment of the course’s various laboratory activities.

TABLE I
CONVERTIBILITY OF LABORATORY ACTIVITIES FOR DISTANCE STUDENTS
(each laboratory activity was classified as Equivalent, Not Convertible, or Convertible with Modifications)

IT 345 On Line Pre-Test
POSTNET Bar Code – Practical
Data Density – Post-Test
Linear Bar Code – Post-Test
PDF 417 – Practical / Post-Test
Data Matrix – Practical / Post-Test
Verification – Practical / Post-Test
Verification – Color Post-Test
Configuration Lab – Argox Practical
Configuration Lab – Quadrus Practical
Configuration Lab – Depth-of-Field Practical
iButton® – Practical / Post-Test
Magnetic Stripe – Practical / Post-Test
RFID Evaluation – Practical
Voice Recognition – Post-Test
Hand Recognition – Post-Test
Hand Software – PT2 Assessment
Face Recognition – Cognitec Checkoff
Face Recognition – Post-Test
Fingerprint – Post-Test
Fingerprint – Practical
Iris – Panasonic Authenticam™ Submission
Iris Recognition – Post-Test

TEST POPULATION

The on-campus IT 345 course had 86 students enrolled. The course was divided into eight laboratory sections, each of which included between 10 and 12 students. Two teaching assistants taught four sections each. All on-campus students attended the same lecture period. The group of distance students consisted of 27 individuals, each of whom had access to a two-hour block of computing time for performing the laboratory experiments. Distance students had access to the recording of the lecture that on-campus students received. Graduate and undergraduate distance students were grouped together as distance students.

RESULTS

One-way ANOVA tests were conducted to compare various performance measures of the distance education and on-campus students. Each ANOVA tested for a statistically significant difference between the performance of the two groups at a significance level of α = 0.05.

The performance measures examined included:
• pre-test scores
• total module scores
• module scores for material not requiring laboratory activities (no laboratory required)
• module scores for material supported by laboratory activities that could NOT be modified for distance delivery (laboratory not convertible)
• module scores for material supported by laboratory activities that could be modified for distance delivery (laboratory with distance modification), as categorized in Table 1
• combined average exam scores.
Additionally, only non-zero scores were used in the calculations; zero scores resulting from students missing exams, assignments, or assessments were removed to prevent artificially lowering any group's performance.
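The comparison procedure can be illustrated with a minimal sketch. Python with SciPy is assumed here purely for illustration (the paper does not name the statistics package used), and the score lists in the usage comment are placeholders, not the study's data.

# Illustrative sketch, assuming Python + SciPy, of the per-measure comparison.
from scipy import stats

ALPHA = 0.05  # significance level used throughout the analysis

def compare_groups(on_campus_scores, distance_scores):
    """One-way ANOVA on two groups after dropping zero scores."""
    # Zero scores from missed exams, assignments, or assessments are removed
    # so they do not artificially lower a group's performance.
    on_campus = [s for s in on_campus_scores if s > 0]
    distance = [s for s in distance_scores if s > 0]

    f_stat, p_value = stats.f_oneway(on_campus, distance)
    return f_stat, p_value, p_value < ALPHA

# Hypothetical usage with made-up scores:
# f, p, significant = compare_groups([87, 91, 0, 78], [80, 85, 72, 0])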

Pre-Test Scores
An analysis of each group's pre-test scores was performed to determine whether any statistically significant difference existed between the groups' starting knowledge of the course material. The on-campus students had a mean score of 735.3 and a median score of 736 (approximately 41 percent for both mean and median). The distance students had a mean score of 714.6 and a median score of 720 (approximately 40 percent for both mean and median). A comparison of the means and medians indicated that outliers were not influencing the mean score for either group. Furthermore, the result of the ANOVA test indicated that no statistically significant difference existed in the starting knowledge of the distance and on-campus students (p = 0.29). This was important to establish, as all other comparisons were based on the assumption that the groups started out with similar knowledge. Had the groups not started with similar levels of knowledge, the results of further comparisons would have been difficult to interpret.
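The outlier check above can be expressed as a tiny sketch (illustrative Python; the tolerance value is an arbitrary assumption): if a group's mean and median are close, extreme scores are not distorting the mean.

from statistics import mean, median

def mean_median_agree(scores, tolerance=5.0):
    """Return True if the mean and median agree within the given tolerance."""
    return abs(mean(scores) - median(scores)) <= tolerance

# e.g., on-campus pre-test: mean 735.3 vs. median 736 (difference < 1 point),
# so the mean is not being pulled by outliers.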

Module Score Analysis
The total module score consisted of all students' post-test, assignment, and practical scores for each module, which were then compared by group. The ANOVA test result indicated that a statistically significant difference existed between the distance education and on-campus students: the on-campus students had an average score of approximately 87 percent, compared to 80 percent for the distance students, with a p value of 0.001. However, the r² value was only 10.10 percent, indicating that group membership explained little of the variation in total module scores. Therefore, it was necessary to determine where the differences were occurring.
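The reported r² can be read as the share of total variation in module scores explained by group membership (SS_between / SS_total in a one-way ANOVA, often called eta-squared). A short sketch with placeholder inputs, assuming Python:

def anova_r_squared(groups):
    """r^2 = SS_between / SS_total for a one-way ANOVA over the given groups."""
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    ss_total = sum((x - grand_mean) ** 2 for x in all_scores)
    ss_between = sum(
        len(g) * ((sum(g) / len(g)) - grand_mean) ** 2 for g in groups
    )
    return ss_between / ss_total

# Hypothetical usage:
# r2 = anova_r_squared([on_campus_module_scores, distance_module_scores])
# An r2 near 0.10 means group membership explains roughly 10% of the variation.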


Module scores were divided into three types, parallel to the convertibility categories denoted in Table 1. An ANOVA test was first run to compare the module scores between the groups when no laboratory activities were required. The results indicate that no statistical difference exists; it is interesting to note that, on average, the distance students performed better.

The next comparison examined the module scores of the groups when the laboratory activities could not be modified for distance education; in these modules, distance students answered post-tests without conducting the supporting laboratory activities. The results indicate a statistically significant difference between the groups' module scores: the on-campus students scored approximately six percent higher on average, resulting in a p value of 0.036.

The last of the module breakdowns consists of the laboratory activities that were modified for distance education. The ANOVA test results indicate that no statistical difference exists between the groups. The distance students scored only approximately three percent lower on average, which is an improvement over the modules whose laboratory activities could not be modified. However, the resulting p value of 0.056 is borderline at α = 0.05, which makes it difficult to report a decision of no difference.

In assessing the overall performance of the students in the course, it is interesting to note that the on-campus and distance students performed equally well on their exams. The combined average exam score is the average of the three exams administered throughout the semester. The ANOVA test result indicates no statistical difference; in fact, the averages are almost identical, resulting in a p value of 0.922.

CONCLUSIONS

This paper examined whether there were any differences in how, and how well, the material in an Automatic Identification and Data Capture course was learned by two different populations. Results of the module score analysis indicate that a statistically significant difference exists between the distance education and on-campus students. However, upon breaking the modules down by whether they had been modified for distance delivery and comparing further, the results indicate that the difference in overall module scores can be attributed to modules containing laboratory activities that could not be modified for distance education. In the other two cases, modules requiring no laboratory exercises and modules whose laboratory exercises were modified for distance delivery, the groups exhibited results that were not statistically different.

REFERENCES

1. Moore, M. and Kearsley, G. (1996). Distance Education: A Systems View. Belmont: Wadsworth Publishing Company.
2. CDLP (2004). Adult Learning Activities: California Distance Learning Project. Sacramento.
3. Driscoll, M. (1998). Web-Based Training: Using Technology to Design Adult Learning Experiences. San Francisco: Jossey-Bass/Pfeiffer.
4. The Commonwealth of Learning (2000). The Use of Multimedia in Distance Education. Vancouver, BC.
5. Caffarella, R. (2002). Planning Programs for Adult Learners: A Practical Guide for Educators, Trainers, and Staff Developers. San Francisco: John Wiley & Sons.
6. Aktan, B., Bohus, C. A., Crowl, L. A., and Shor, M. H. (1996). Distance learning applied to control engineering laboratories. IEEE Transactions on Education, 39(3), pp. 320-326.
7. Leiner, R. (2002). Tele-experiments via Internet: a new approach for distance education. 11th Mediterranean Electrotechnical Conference, IEEE.
8. Hu, J., Cordel, D., et al. (2004). Virtual Laboratory for IT Security Education. Proceedings of the Conference on Information Systems in E-Business and E-Government (EMISA), Luxembourg.
9. Sickler, N., Kukula, E., et al. (2004). The Development of a Distance Education Class in Automatic Identification and Data Capture at Purdue University. World Conference on Engineering and Technology Education, Santos, Brazil.

©2006 WCCSETE, World Congress on Computer Science, Engineering and Technology Education, March 19-22, 2006, São Paulo, Brazil.