

IEEE XXXX, VOL. XX, NO. XX, XXX 2018 1

Assessing the impact of user interface abstraction on online telecommunications course laboratories

Johann M. Marquez-Barja, Senior Member, IEEE, Nicholas Kaminski, and Luiz DaSilva, Fellow, IEEE

Abstract—In this paper we analyze the impact of user interface abstraction in remote telecommunications laboratories. We employ four interfaces, ranging from point-and-click to traditional in-lab interfaces, for students to interact with networking equipment. We then survey students following the completion of these laboratory exercises to assess the effect of the different user interfaces on the learning experience. Our analysis indicates a sweet spot in the amount of abstraction provided by the interface, with the web-based rich interactive interface obtaining the most positive feedback from students.

Index Terms—remote experimentation, eLearning, remote labs, user interface abstraction

I. INTRODUCTION

Online courses are, increasingly, a vital part of university teaching, due to advances in eLearning technology and the desire to support a diverse population of distance-learning students. At the same time, the experimentation process is of fundamental importance for both undergraduate and graduate engineering education and research: physical experimentation directly underpins engineering education by providing students with the valuable experience of putting theory into practice. In the eLearning context, there is a need for remote laboratories and hands-on experimentation to reinforce the learning process.

To provide remote physical laboratories that can be accessed online via eLearning platforms, several components must come into play, such as testbeds and laboratory equipment able to be remotely manipulated, networking components to provide the communications between the eLearning platform and the testbed facility, and the interactive multimedia tools to interface the students to the remote lab.

There are several initiatives reported in the literature that provide rich, remotely accessible online lab experiences through a powerful set of multimedia resources [1]. These initiatives build user interfaces for students to interact remotely with physical components employed in laboratory experiments. It is key that the abstraction provided by those interfaces not obscure the students' perception of dealing with real facilities and conditions, as opposed to simply a simulated environment. The question we address in this paper

J. M. Marquez-Barja is with the IDLab Research Group, Faculty of Applied Engineering, University of Antwerp–imec, Sint-Pietersvliet 7, 2000 Antwerp, Belgium. e-mail: Johann.Marquez-Barja@{uantwerpen.be,imec.be}

Nicholas Kaminski and Luiz DaSilva are with Trinity College Dublin, CONNECT Research Centre, Westland Row 34, Dublin, Ireland. e-mail: {kaminskn,dasilval}@tcd.ie.

Manuscript received March xxx, xxxx; revised.

is which user interfaces for online laboratories are effective in conveying the learning message without detracting from the real value of hands-on experimentation.

To provide answers to the above question, we have assessed student reactions to online labs that provide students with practical hands-on knowledge of telecommunications networks. In our course, students experiment with wireless communications and networks using three different high-performance testbeds, located in Belgium, Australia, and Ireland. The remote lab materials associated with two of these testbeds provide different levels of interaction and abstraction, allowing students to remotely access the network testbed resources from web interfaces and command line tools. In contrast, students use the third testbed locally, to experiment directly with radio equipment. We assessed students' experiences through anonymous surveys performed at the end of each lab session, revealing the students' preferences and the skills they gained during the course. Moreover, in order to perform a fair comparison among the different types of interfaces, we conducted some of the same experiments using different access interfaces, evaluating students' responses to each version of the same experiment. In this paper, we report on the results of the study described above.

II. RELATED WORK

During the last forty years there has been an evolution in the education-related tools used to enhance educational processes. Zawacki-Richter and Latchem [2] present a solid analysis of the education-related research output over the last four decades, depicting a transformation from computer-based instruction (mid-seventies to mid-eighties) to online learning in a digital age (the last decade), having passed through stand-alone multimedia and networked computers as tools for collaborative learning during the nineties and the early years of the new millennium. Nowadays, laboratories can be grouped into traditional in-lab, virtual, and remote laboratories [3], [4]. Traditional laboratories situate the interaction of students with the lab equipment within the laboratory premises, while virtual and remote laboratories can be accessed, via the internet, from different locations. Virtual laboratories are empowered by simulation software that provides students with a lab experience close to a real lab session [5]. Finally, remote laboratories enable remote access, through software interfaces, to real lab equipment in order to provide a real lab experience from a remote location. In both virtual and remote labs, the software interfaces that


allow students to interact with the virtual or real equipment, respectively, are a key factor for the success of the learning outcomes and the lab experience in general [6]. Therefore, in this paper we focus on measuring the impact of such user interfaces in remote and in-lab laboratories in general, and the level of abstraction of those interfaces in particular.

Diwakar et al. [7] study the trade-offs between technological aspects, such as testbed technology, processing capabilities, and user interfaces, and pedagogical aspects, such as the objectives of the lab and feedback methods, within the control systems domain. The authors analyzed a wide range of technologies for remote experimentation, including MATLAB, LabVIEW, Java-based systems, and virtual labs. Their findings reveal that virtual labs were preferred by engineering students due to the web-based interfaces that those labs provided. Similarly, Lindsay and Good [8] present an analysis of the impact of lab interfaces and their audiovisual feedback on learning outcomes. These authors analyzed two types of laboratories, remote and virtual labs, considering for both web interfaces based on audio and video streaming from the equipment to enrich the students' remote lab experience. Moreover, the authors compare this analysis with their previous work on traditional labs [9]; the comparison reveals students' preference for a rich audiovisual web-based interface.

Tsihouridis et al. [10] evaluate the effectiveness of remote labs in comparison to physical labs, including the interfaces as a factor. The authors applied questionnaires for the assessment, finding that, with regard to conceptual understanding, remote laboratories have a similar impact on students as physical laboratories, and highlighting the importance of the user interfaces in remote labs for providing an experience close to that of a real lab. Similarly, Marques et al. [11] emphasize the feeling of immersion in remote labs as indispensable. Within an engineering laboratories context, the authors have analyzed several works empowered by Virtual Instrument Systems In Reality (VISIR), a remote laboratory for wiring and measuring electronic circuits [12], [13], from teachers' and students' perspectives, considering usefulness (pedagogical value) and usability (technical issues) as factors, in terms of learning achievements.

An interesting study of the impact of software-based interfaces on the design of remote labs is presented by Garcia-Zubia et al. [14], where the authors evaluate the most appropriate software at the client and server side to enable WebLab-Deusto, a remote lab system for high school physics labs. The authors also survey students, assessing their experience with the different versions of the lab, which range from a desktop application to a web application implemented using Ajax. The outcome clearly favors web application labs due to the ease of use provided by the interface.

Jourjon et al. [15] demonstrated the effectiveness of their remote lab and its web-based interface through an evaluation, from the students' point of view, of learning perception and interface appreciation. Overall, the students rated this approach well for understanding general and specific subjects from the lecture. Moreover, Shanab et al. [16] evaluate students' perception of different laboratories, including augmented reality (based on remote access), virtual, and hands-on labs, and their interfaces, concluding, based on surveys, that the virtual and augmented labs were more effective for enhancing students' understanding of the lectures.

As discussed above, our study presents an evaluation of the impact of remote labs, using different interfaces, on students, and of the relationship between the level of abstraction and usability of each type of user interface and the students' satisfaction level. Our work extends beyond existing works, such as those by Lindsay and Good [9], [8], by focusing on the level of abstraction and usability of interactive interfaces. Last, but not least, we take into account and acknowledge the fact that technology-enhanced learning techniques, relying on the use of computing devices such as tablets, laptops, and PCs, already impact students' learning abilities, and the learning process in general [17].

III. METHOD

In this paper we assess the impact on engineering students of the level of abstraction of the interfaces employed to access and interact with network testbeds. We define level of abstraction as the degree to which the interface disassociates the user interaction from the physical testbed. The user interfaces that we have evaluated range from web-based to traditional in-lab physical interfaces, which were used for remote and in-lab experimentation. In this section we describe the features of those user interfaces, the learning context where the labs were deployed and evaluated, and the assessment tools that we used in order to evaluate the impact on students.

A. Learning Context

The learning context is an Electrical and Electronic Engineering Master's level course on wireless networks and communications, which has both theoretical and practical components. In this course students are introduced to practical hands-on networking experimentation through different laboratories, reinforcing the lectures. Students apply the concepts they have studied theoretically through implementation and real experimentation on wireless communications and networks, including software-defined radio, wireless local area networks, and network protocols. Students experiment with wireless network testbeds located in Australia, Belgium, and Ireland, using diverse user interfaces that imply distinct levels of interaction and abstraction.

For this course, we have followed the IEEE/ACM CS2013 joint curriculum for computer science engineering, proposed by the ACM/IEEE-CS Curricula Steering Committee [18], which includes and extends to electrical engineering and other engineering disciplines. The content of the course was designed following the high-level goals of the CS2013 Networking and Communication knowledge area, which are: i) thinking in a networked world, ii) continued/linked advanced study, and iii) principles and practice interaction. We created online educational material structured in mini-courses or modules containing defined portions of the CS2013 Body of Knowledge. That way, modules can be reused by different institutions. Table I presents the syllabus of the course encompassing theory and


Fig. 1. Level of abstraction of the user interfaces used for experimentation

experimentation from a top-down approach [15], as well as the user interface applied to deliver the experimentation. The description of such interfaces can be found in Section III-B.

B. User Interfaces

This study assesses the level of abstraction of the different user interfaces used by the laboratories and its impact on engineering students. We classify these user interfaces as shown in Figure 1.

• Web-based point-and-click interface. A web-based point-and-click user interface allows experimenters to interact with the testbed in a supervised manner, encapsulating a number of options and operations. The interface provides forms where experimenters can set parameters and launch experiments, to later collect the results. The interface translates the input from the web forms into commands for the testbed equipment to launch the experiments. The results, collected as SQL datasets, are then available to be downloaded by the experimenters. Figure 2(a) shows an example of this type of interface.

• Web-based rich interactive interface. The web-based rich interactive interface contains widgets to control and set input parameters, and to display experimentation output in real time through a set of dynamic plots. This interface can be used within eBooks or imported into different Learning Management Systems (LMSs). Also, the interface can be displayed on a wide range of computing devices. This type of interface is shown in Figure 2(b).

• Command line interface. The command line allows students to remotely access the testbed and perform experiments. Experimenters type commands to configure the equipment and to run experiments. In this type of interface there are no widgets that facilitate the control of the input or display the results; the results can either be saved to log files or printed directly to text output, as shown in Figure 2(c).

• Traditional in-lab interface. This approach allows experimenters to interact directly with the equipment, but requires that experimenters be physically located in the laboratory facilities. Figure 2(d) presents the equipment used in the in-lab experimentation.

C. Instruments

To assess student perception of the user interfaces, we asked students to complete an anonymous survey, as a formative assessment activity and as an online assessment of both the satisfaction level and the understanding of students [19], [20], following each laboratory exercise. These surveys, administered via SurveyMonkey1, collected the subjective opinion of the students with regard to the five statements listed below. Each question focused on a particular aspect of the student's laboratory experience. Questions were structured as statements for which students were asked to indicate their agreement on a five-point Likert-type scale [21], consisting of the levels: strongly disagree, moderately disagree, neutral, moderately agree, and strongly agree. Student responses to these questions provide the basis for our examination of the impact of the user interfaces employed in laboratory exercises on student experience.

Survey statements:

1) The lab experiments helped you to understand the concepts taught in the lecture.

2) The interface (web or command-based) reduced the difficulty of the lab.

3) You were always aware that you were performing real experiments using network equipment located in other facilities around the world.

4) The experimentation helped you to self-assess your progress in the course.

5) You would use the testbed facility in the future, if you have access to it.

The analysis of student responses to the above questionnaire is presented in Section IV.

IV. RESULTS

A total of 16 students were enrolled in the course, performing the different labs in two-person teams. The lab duration varied from 2 to 3 hours for a total of 14 sessions, resulting in more than 1300 networking-related experiment executions.

The student responses to the survey questions provide insight into the role of abstraction in experimentation interfaces. Unsurprisingly, we find that abstraction offers a potent means of tempering the difficulty of experimentation by exposing only the most relevant portions of the testbed facility to the students. More interestingly, however, we find that an overabundance of this abstraction quickly degrades the student experience. In fact, in several cases, it is clear that no abstraction at all is vastly superior to excess abstraction. Therefore, while abstraction in experimentation interfaces certainly contributes to positive student experiences, this abstraction must be balanced to maintain the students' connection to the underlying equipment.

We grouped the surveys for each individual laboratory exercise according to the interface employed for the exercises in question, as described in Section III-B. The proportion of responses corresponding to each agreement level from this

1www.surveymonkey.com


TABLE I
WIRELESS NETWORKS AND COMMUNICATION COURSE SYLLABUS

Week | Lecture hours | Lab hours | Experimentation                              | CS2013 Knowledge area              | Interface type
1    | 2             | 2         | TCP/IP networking & HTTP protocol            | Networking and communication       | Web-based point-and-click
2    | 2             | 2         | End-to-end reliable data transfer mechanism  | Networking and communication       | Web-based point-and-click
3    | 2             | 2         | TCP congestion control                       | Networking and communication       | Web-based point-and-click
4    | 2             | 4         | WLAN infrastructure - performance evaluation | Networking and communication       | Command line
5    | 2             | 3         | WLAN ad-hoc - performance evaluation         | Networking and communication       | Command line
6    | 2             | 3         | 802.11 QoS                                   | Networking and communication       | Command line
7    | 2             | 6         | Reinforcement for weeks 4, 5 and 6           | Networking and communication       | Web-based rich interactive
8    | 2             | 4         | WLAN authentication, WEP security            | Information assurance and security | Command line
9    | 2             | 2         | WLAN WPA security                            | Information assurance and security | Command line
10   | 2             | 2         | OFDM signalling                              | Systems fundamentals               | Traditional in-lab
11   | 2             | 2         | Dynamic spectrum access                      | Networking and communication       | Traditional in-lab

Fig. 2. Different lab interfaces evaluated: (a) web-based point-and-click, (b) web-based rich interactive, (c) command line, and (d) traditional in-lab.

grouping, for each question, is displayed in Figure 3. Furthermore, these figures display the overall levels of agreement and disagreement for each question and interface. Together, this visualization of student responses provides an indication of the impact of the user interface on the student experience.
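The grouping and proportion calculation described above can be sketched in a few lines. This is an illustrative sketch only: the interface names and responses below are hypothetical, and the actual survey data is what Figure 3 visualizes.

```python
from collections import Counter

# The five agreement levels of the Likert-type scale from Section III-C
LEVELS = ["strongly disagree", "moderately disagree", "neutral",
          "moderately agree", "strongly agree"]

def agreement_proportions(responses):
    """Group (interface, level) pairs by interface and return, for each
    interface, the proportion of responses at each agreement level."""
    by_interface = {}
    for interface, level in responses:
        by_interface.setdefault(interface, []).append(level)
    return {
        iface: {lvl: Counter(lvls)[lvl] / len(lvls) for lvl in LEVELS}
        for iface, lvls in by_interface.items()
    }

# Hypothetical responses to a single survey question
responses = [
    ("web-based point-and-click", "neutral"),
    ("web-based point-and-click", "moderately agree"),
    ("web-based rich interactive", "strongly agree"),
    ("web-based rich interactive", "strongly agree"),
]
props = agreement_proportions(responses)
print(props["web-based rich interactive"]["strongly agree"])  # 1.0
```

Stacking the per-level proportions for each interface yields exactly the kind of per-question bar chart shown in Figure 3.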

Examining Figure 3(a), we can see that the students generally considered the labs to be a helpful experience with regard to their understanding of the subject matter. This result is, on the whole, in line with prior findings that link experimentation to a furthering of understanding. In fact, only the most abstracted interface, the web-based point-and-click interface, failed to elicit 90% agreement with the first survey question. This marked difference in student responses leads to our finding that too much abstraction has a detrimental effect on student experience. In this particular case, we see that the loosening of the connection between students and equipment degrades the value of the experiment in terms of student understanding of the underlying material.

Figure 3(b) displays students' feedback on the impact of the user interface on the difficulty of the laboratory exercise. More than any other question, the response to this survey question highlights the non-monotonic relationship between interface abstraction and improvement of the student experience. Naturally, the student responses indicate an increasing reduction in laboratory difficulty as abstraction increases, until a sharp corner is reached. The first portion of this evolution, the period of positive correlation between abstraction and difficulty reduction, follows naturally from the purpose of abstraction in interfaces as a means to hide extraneous detail and focus the students on meaningful aspects of the experiment. However, the sharp downturn in responses indicates that there is a limit to the value of hiding details in favor of exposing only the relevant aspects to a student. At some point, the limitation of options inhibits her ability to explore the concept in



Fig. 3. Student survey responses to (a) question 1, (b) question 2, (c) question 3, (d) question 4, and (e) question 5 for each user interface.

her own preferred manner. Here the focus on limited options becomes a hindrance to the learning process.

Figure 3(c) assesses the students' awareness that they were performing real experiments using remote facilities, as opposed to a simulated environment. Overall, the responses reflect that students were consistently aware that remote facilities were in use for the laboratory exercise, with two exceptions. When the web-based point-and-click interface was used, while most students indicated their awareness that experiments were being conducted on real hardware, the portion that indicated the lack of such a connection is especially significant, because students were informed that they were performing experiments


on top of a remote facility, prior to each laboratory exercise. For exercises conducted when students were in-lab, the small uptick in negative results may well be linked to confusion over the question's reference to remote facilities. Once again, an excessive amount of abstraction in the user interface appears to undermine the connection to real events, even when explicitly stated by the instructor, ultimately degrading the value to the student.

Figure 3(d) illustrates the impact of the user interfaces employed during laboratory exercises on the students' ability to self-assess progress during the course. In terms of the influence of the user interface, this question investigates the feedback made available to students that lets them assess their understanding of the topic at hand. With the singular exception of user interface 1, students had a high opinion of the interactivity offered by the user interfaces in support of self-assessment activities. We find that the limitations associated with abstraction beyond some threshold are detrimental to the student's work, in this case the process of self-assessment. Abstraction at this level disconnects the student from the experiment, hiding the details that allow students to perform self-assessment.

Finally, Figure 3(e) shows the eagerness of students to use the test facilities in the future. This question explores students' impressions of the facilities employed in laboratory exercises, as experienced through the lens of each user interface. In this regard, this figure captures the students' overall impression of the user interfaces provided by each facility. Here we note variability that resembles that found in Figure 3(b): student agreement increases with increasing abstraction up to some point, after which we see a steep decline in student interest in the facility. As elucidated throughout the student responses, the detail hiding associated with a high level of abstraction in the user interface also appears to hide the value of a facility from the student perspective.

Figure 4 displays the distribution of student responses to each survey question, grouped according to the user interface employed in the laboratory exercise. For the purposes of these plots, each response was assigned a numerical value reflecting the level of agreement indicated by the student; in this scheme, -2 corresponds to strong disagreement and 2 corresponds to strong agreement, with the other levels of agreement represented as unit steps between these two extremes. The top of each box marks the third quartile, which splits the highest quarter of the responses from the lowest three quarters. The bottom of the box indicates the first quartile, which splits the lowest quarter of the responses from the highest three quarters. The center red line identifies the median response. The upper whisker shows the highest response value no greater than the third quartile plus one and a half times the distance between the first and third quartiles. The lower whisker shows the lowest response value no less than the first quartile minus one and a half times the distance between the first and third quartiles. All responses outside of the whiskers, marked as crosses, are considered outliers. In this way, this set of plots displays the consensus, or lack thereof, of student opinions for each question with respect to each user interface.
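The box-plot construction described above can be made concrete with a short sketch. The response values below are hypothetical (the actual distributions are those shown in Figure 4), and the quartile method is the common "exclusive" interpolation, which may differ slightly from the plotting tool used for the paper.

```python
import statistics

def box_plot_summary(scores):
    """Tukey-style box-plot summary for Likert scores coded from
    -2 (strong disagreement) to +2 (strong agreement)."""
    q1, median, q3 = statistics.quantiles(scores, n=4)  # exclusive method
    iqr = q3 - q1  # distance between first and third quartiles
    lo_fence, hi_fence = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    inside = [s for s in scores if lo_fence <= s <= hi_fence]
    return {
        "first_quartile": q1, "median": median, "third_quartile": q3,
        "lower_whisker": min(inside),  # lowest value within the fences
        "upper_whisker": max(inside),  # highest value within the fences
        "outliers": sorted(s for s in scores if s < lo_fence or s > hi_fence),
    }

# Hypothetical responses from eight students to one question
summary = box_plot_summary([2, 2, 1, 1, 1, 0, 0, -2])
print(summary["median"], summary["outliers"])
```

Each box and whisker pair in Figure 4 corresponds to one such summary, computed per question and per interface.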

Examining Figures 4(a), 4(c), and 4(d), we find that student opinions are generally favorable toward the user interfaces employed for laboratory exercises. Additionally, we see in these figures that students' opinion regarding the web-based point-and-click interface is typically lower than for the other interfaces. Overall, we see that students tended to agree with regard to each of these questions, sometimes remarkably so.

The other two questions exhibited much higher variability, as illustrated in Figures 4(b) and 4(e). In both of these questions, students exhibited a consistently high opinion of the web-based rich interactive interface and a somewhat favorable, although much more broadly spread, opinion of the command line-based interface. Students did not exhibit consensus with regard to the remaining interfaces, more often favoring the in-lab experience over the web-based point-and-click interface. This spread of responses for the extremes of abstraction considered in this study indicates that, while some students could certainly gain value from these edge cases, the level of abstraction must be balanced to provide a uniform educational experience.

Figure 5 displays the mean response of students to each question, grouped according to user interface. This figure applies the same numbering methodology employed for the box plots of Figure 4, and provides a general characterization of each user interface. The first, and most abstracted, user interface, the web-based point-and-click interface, was considered by students to be the worst of the group, with especially poor performance in easing laboratory exercise difficulty and in eliciting interest in further use of the associated remote facility. The second user interface, the web-based rich interactive interface, achieved the best balance of abstraction and was regarded by students as providing the best experience with regard to every question. The third user interface, the command line-based interface, reached the second position only in support for the development of understanding and ease of use, and was relegated to third position by students with regard to the other three questions. The responses to the in-lab interface indicated that low-abstraction user interfaces provide a significant connection to real equipment, ability for self-assessment, and motivation for facility reuse, primarily at the price of ease of use.

V. DISCUSSION

We have analyzed the impact of different types of user interfaces on the students' perception of the remote labs. However, we acknowledge that remote laboratories, and eLearning systems in general, are complex systems, where several learning components and learner factors come into play to successfully deliver the desired message and learning outcomes. These learning components include the curriculum design, the structure of the learning activities, the relevance of the chosen interface technology, the didactical and pedagogical approach, and effective time management [22], while the learner factors include epistemological beliefs, approach to learning, and attitudes towards technology use [23]. Therefore, we calculated the uncertainty coefficient [24] to analyse the relationship between student responses and the user interface used in each laboratory exercise. Conceptually, the uncertainty coefficient indicates the proportion of information in one quantity that



Fig. 4. Box plot of student survey responses in (a) question 1, (b) question 2, (c) question 3, (d) question 4, and (e) question 5 for each user interface.

directly reflects some other quantity. In this case, we use the uncertainty coefficient to quantify the impact that the user interface had on the student experience with regard to the questions in the survey, described in Section III-C. Mathematically, the uncertainty coefficient is calculated as the mutual information between the user interface and the

subsequent student response, normalized by the entropy of the user interface. Equation 1 presents this calculation in terms of the probability of the use of a particular user interface, P_I(i), the probability of a particular student response, P_S(s), the joint probability of a user interface and student response, P_{I,S}(i,s), and the conditional probability of the use of a user interface


Fig. 5. Mean student survey response in each question for each user interface.

Fig. 6. Influence of user interface on responses to each survey question.

given a student response, P_{I|S}(i|s).

U(I|S) = \frac{-\sum_{i,s} P_{I,S}(i,s) \log_2 P_{I|S}(i|s)}{-\sum_{i} P_I(i) \log_2 P_I(i)} \qquad (1)
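The calculation in Equation 1 can be sketched in a few lines from raw (interface, response) observations. This is an illustrative implementation, not the authors' analysis code, and the sample observations are hypothetical:

```python
from collections import Counter
from math import log2

def uncertainty_coefficient(pairs):
    """Equation 1 as printed: the conditional entropy of the interface
    given the student response, normalized by the entropy of the interface,
    estimated from a list of (interface, response) observation pairs."""
    n = len(pairs)
    joint = Counter(pairs)                # counts for P_{I,S}(i, s)
    marg_i = Counter(i for i, _ in pairs) # counts for P_I(i)
    marg_s = Counter(s for _, s in pairs) # counts for P_S(s)

    # Numerator: -sum_{i,s} P_{I,S}(i,s) log2 P_{I|S}(i|s),
    # with P_{I|S}(i|s) = joint count / response marginal count.
    h_i_given_s = -sum((c / n) * log2(c / marg_s[s])
                       for (i, s), c in joint.items())
    # Denominator: H(I) = -sum_i P_I(i) log2 P_I(i)
    h_i = -sum((c / n) * log2(c / n) for c in marg_i.values())
    return h_i_given_s / h_i

# Hypothetical observations for one survey question.
obs = [("web-rich", 5), ("web-rich", 5), ("web-rich", 4),
       ("point-and-click", 2), ("point-and-click", 3), ("point-and-click", 2),
       ("cli", 4), ("cli", 3), ("in-lab", 4), ("in-lab", 5)]
print(round(uncertainty_coefficient(obs), 3))
```

The resulting ratio always lies in [0, 1]: it is 0 when the response fully determines the interface and 1 when interface and response are statistically independent.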

Figure 6 summarizes the findings of the uncertainty coefficient analysis for each question. As is clear from this figure, the user interface is only one of several significant factors that influence the student experience; others may include remote facility capabilities, expected grade, student feelings about the lecturer or course, or student interest in the topic, as suggested by the ACM/IEEE-CS Curricula Steering Committee [18]. Note that the subjective nature of the questions also affects the influence of the user interface, as represented by the uncertainty coefficient, since the student's interpretation of the question, or of the meaning of 'strongly' or 'moderately', also shapes the response selected. In fact, the user interface is one of the only elements fully under the control of the educator; as such, determining the impact of the user interface provides insight into a rare aspect of control over the experience of students.

Our findings about the web-based rich interactive interface reinforce the use of technology such as the FORGEBox framework, produced by the FORGE project2 for developing and deploying remote experimentation on high-performance testbeds [25]. FORGEBox is a component that interconnects and hosts interactive learning content with federated resources across Future Internet Research and Experimentation (FIRE) testbeds3, offering a set of services that provide interactivity with the remote resources through web-based rich interactive interfaces empowered by widgets. Learning Management Systems, eBooks, and any future element that wishes to consume FORGE content can either discover and integrate the web reference points of widgets into their courses or adopt the FORGE lab course descriptions offered by the FORGEBox platform. A running instance of FORGEBox is located at www.forgebox.eu/fb, currently offering 40 course modules with several interactive parts covering topics on advanced networking, wireless networks, and FIRE facilities usage.

VI. CONCLUSIONS

In this paper we analyse the impact of user interface abstraction on the experience of students with regard to educational experimentation. In the course of this analysis, we discuss four interfaces, at different levels of abstraction, employed in the support of laboratory exercises. To support our analysis, we have surveyed students following their completion of these laboratory exercises to determine the impact of the mechanisms used on their education.

We examine trends in student responses, finding that abstraction provides a useful mechanism for enhancing the student experience; however, too much abstraction of the experimental interface may obfuscate the capabilities and operation of the physical testbed facility. Our analysis indicates the existence of a balance point of abstraction for student experimentation that is free from such penalties. Moreover, we find that the web-based rich interactive interface approaches this sweet spot of abstraction in all aspects considered here.

In the future, we intend to gain more insight into students' interactions by developing and applying learning analytics technologies in the web-based rich interface. Widgets will be able to register different actions of the students and collect such data for further analysis, thus improving the quality of the assessment.

ACKNOWLEDGMENTS

This work has received funding from the European Union's Seventh Framework Programme for research, technological development and demonstration under grant agreement no. 610889 (FORGE).

We also acknowledge support from the Science Foundation Ireland under Grants No. 10/IN.1/I3007 and 13/RC/2077 SFI

2 www.ict-forge.eu
3 www.ict-fire.eu


CONNECT, as well as from the University of Antwerp under funds Academisch Krediet - Wetenschappelijke Dienstverlening AK180029.

REFERENCES

[1] J. M. Marquez-Barja, G. Jourjon, A. Mikroyannidis, C. Tranoris, J. Domingue, and L. A. DaSilva, "FORGE: Enhancing eLearning and research in ICT through remote experimentation," in IEEE Global Engineering Education Conference (EDUCON14), Apr. 2014, pp. 1157–1163. [Online]. Available: http://dx.doi.org/10.1109/educon.2014.7096835

[2] O. Zawacki-Richter and C. Latchem, "Exploring four decades of research in Computers & Education," Computers & Education, vol. 122, pp. 136–152, Jul. 2018.

[3] J. Ma and J. V. Nickerson, "Hands-on, Simulated, and Remote Laboratories: A Comparative Literature Review," ACM Computing Surveys, vol. 38, no. 3, Sep. 2006. [Online]. Available: http://dx.doi.org/10.1145/1132960.1132961

[4] J. M. Marquez-Barja, N. Kaminski, F. Paisana, C. Tranoris, and L. A. DaSilva, "Virtualizing testbed resources to enable remote experimentation in online telecommunications education," in IEEE Global Engineering Education Conference (EDUCON15), Mar. 2015, pp. 836–843. [Online]. Available: http://dx.doi.org/10.1109/educon.2015.7096069

[5] R. Bose, "Virtual labs project: A paradigm shift in internet-based remote experimentation," IEEE Access, vol. 1, pp. 718–725, Jan. 2013.

[6] F. Esquembre, "Facilitating the Creation of Virtual and Remote Laboratories for Science and Engineering Education," International Federation of Automatic Control IFAC-PapersOnLine, vol. 48, no. 29, pp. 49–58, 2015.

[7] A. Diwakar, S. Poojary, R. Rokade, S. Noronha, and K. Moudgalya, "Control systems virtual labs: Pedagogical and technological perspectives," in IEEE International Conference on Control Applications (CCA), Aug. 2013, pp. 483–488. [Online]. Available: http://dx.doi.org/10.1109/cca.2013.6662796

[8] E. Lindsay and M. Good, "The Impact of Audiovisual Feedback on the Learning Outcomes of a Remote and Virtual Laboratory Class," IEEE Transactions on Education, vol. 52, no. 4, pp. 491–502, Nov. 2009. [Online]. Available: http://dx.doi.org/10.1109/te.2008.930510

[9] E. D. Lindsay and M. C. Good, "Effects of laboratory access modes upon learning outcomes," IEEE Transactions on Education, vol. 48, no. 4, pp. 619–631, Nov. 2005. [Online]. Available: http://dx.doi.org/10.1109/te.2005.852591

[10] C. Tsihouridis, D. Vavougios, and G. S. Ioannidis, "The effectiveness of virtual laboratories as a contemporary teaching tool in the teaching of electric circuits in Upper High School as compared to that of real labs," in International Conference on Interactive Collaborative Learning (ICL), 2013, pp. 816–820. [Online]. Available: http://dx.doi.org/10.1109/icl.2013.6644714

[11] M. A. Marques, M. C. Viegas, M. C. Costa-Lobo, A. V. Fidalgo, G. R. Alves, J. S. Rocha, and I. Gustavsson, "How Remote Labs Impact on Course Outcomes: Various Practices Using VISIR," IEEE Transactions on Education, vol. 57, no. 3, pp. 151–159, Aug. 2014. [Online]. Available: http://dx.doi.org/10.1109/te.2013.2284156

[12] M. Tawfik, E. Sancristobal, S. Martin, R. Gil, G. Diaz, A. Colmenar, J. Peire, M. Castro, K. Nilsson, J. Zackrisson, L. Hakansson, and I. Gustavsson, "Virtual Instrument Systems in Reality (VISIR) for Remote Wiring and Measurement of Electronic Circuits on Breadboard," IEEE Transactions on Learning Technologies, vol. 6, no. 1, pp. 60–72, Jan. 2013. [Online]. Available: http://dx.doi.org/10.1109/tlt.2012.20

[13] S. Odeh, J. Alves, G. Alves, I. Gustavsson, M. Anabtawi, L. Arafeh, M. Jazi, and M. Arekat, "A Two-Stage Assessment of the Remote Engineering Lab VISIR at Al-Quds University in Palestine," IEEE Revista Iberoamericana de Tecnologias del Aprendizaje, 2015. [Online]. Available: http://dx.doi.org/10.1109/rita.2015.2452752

[14] J. Garcia-Zubia, P. Orduna, D. Lopez-de-Ipina, and G. R. Alves, "Addressing Software Impact in the Design of Remote Laboratories," IEEE Transactions on Industrial Electronics, vol. 56, no. 12, pp. 4757–4767, Dec. 2009. [Online]. Available: http://dx.doi.org/10.1109/tie.2009.2026368

[15] G. Jourjon, S. Kanhere, and J. Yao, "Impact of an e-Learning Platform on CSE Lectures," in 16th ACM Annual Joint Conference on Innovation and Technology in Computer Science Education, ser. ITiCSE '11, New York, NY, USA, 2011, pp. 83–87. [Online]. Available: http://dx.doi.org/10.1145/1999747.1999773

[16] S. Abu Shanab, S. Odeh, R. Hodrob, and M. Anabtawi, "Augmented reality internet labs versus hands-on and virtual labs: A comparative study," in IEEE International Conference on Interactive Mobile and Computer Aided Learning (IMCL), Nov. 2012, pp. 17–21. [Online]. Available: http://dx.doi.org/10.1109/imcl.2012.6396444

[17] Y.-T. Sung, K.-E. Chang, and T.-C. Liu, "The effects of integrating mobile devices with teaching and learning on students' learning performance: A meta-analysis and research synthesis," Computers & Education, vol. 94, pp. 252–275, Mar. 2016.

[18] Joint Task Force on Computing Curricula, Association for Computing Machinery (ACM) and IEEE Computer Society, Computer Science Curricula 2013: Curriculum Guidelines for Undergraduate Degree Programs in Computer Science. New York, NY, USA: ACM, 2013. [Online]. Available: https://dx.doi.org/10.1145/2534860

[19] M. Graff, "Cognitive Style and Attitudes Towards Using Online Learning and Assessment Methods," Electronic Journal of e-Learning, vol. 1, no. 1, pp. 21–28, 2003. [Online]. Available: http://www.ejel.org/issue/download.html?idArticle=3

[20] D. May, C. Terkowsky, T. R. Ortelt, and A. E. Tekkaya, "The evaluation of remote laboratories: Development and application of a holistic model for the evaluation of online remote laboratories in manufacturing technology education," in 13th International Conference on Remote Engineering and Virtual Instrumentation (REV), Feb. 2016, pp. 133–142. [Online]. Available: http://dx.doi.org/10.1109/rev.2016.7444453

[21] R. Likert, A Technique for the Measurement of Attitudes, Archives of Psychology, vol. 22, no. 140. The Science Press, 1932. [Online]. Available: https://books.google.ie/books?id=9rotAAAAYAAJ

[22] S. McIntyre and N. Mirriahi, "Learning to Teach Online," Aug. 2015. [Online]. Available: http://online.cofa.unsw.edu.au/

[23] J. Lee and H. Choi, "What affects learner's higher-order thinking in technology-enhanced learning environments? The effects of learner factors," Computers & Education, vol. 115, pp. 143–152, Dec. 2017.

[24] T. M. Cover and J. A. Thomas, Elements of Information Theory. John Wiley and Sons, 2012. [Online]. Available: http://dx.doi.org/10.1002/047174882x

[25] G. Jourjon, J. M. Marquez-Barja, T. Rakotoarivelo, A. Mikroyannidis, K. Lampropoulos, S. Denaziz, C. Tranoris, D. Pareit, J. Domingue, L. A. DaSilva, and M. Ott, "FORGE Toolkit: Leveraging Distributed Systems in eLearning Platforms," IEEE Transactions on Emerging Topics in Computing, pp. 1–12, 2016. [Online]. Available: http://dx.doi.org/10.1109/TETC.2015.2511454

Johann M. Marquez-Barja is an Associate Professor at the University of Antwerp - imec, Belgium. Currently, he is with the IDLab research group, which performs applied and fundamental research in the area of communication networks and IoT/distributed systems. Formerly, he was a Research Assistant Professor in the CONNECT Centre for Future Networks and Communications at Trinity College Dublin (TCD), Ireland. He was and is involved in several European research projects, being Co-Principal Investigator for several of them. Currently, he is also the Technical Coordinator of the EU-Brazil FUTEBOL consortium, becoming Principal Investigator for imec within this project. His main research interests are: 5G advanced heterogeneous architectures; programmable, elastic and flexible future wireless networks and their integration with, and impact on, optical networks; provisioning and dynamic resource allocation towards dynamic networks in smart cities; and IoT clustering. He has studied in the USA, Bolivia, Cuba and Spain. He holds a BSc+MSc Systems Engineering degree (Computer Science), graduated with Honours, an MSc in Telematics, an MSc in Computer Architectures, and a PhD in Architecture and Technology of Computer and Network Systems from the Universitat Politecnica de Valencia (Spain), graduated with cum laude honours. Prof. Marquez-Barja serves as Editor and Guest Editor for various international journals, and participates in several Technical Programme and Organizing Committees for worldwide conferences/congresses. He is a member of ACM, and a Senior Member of the IEEE Communications Society as well as the IEEE Education Society.


Nicholas Kaminski is currently a Research Fellow at the CONNECT (formerly CTVR) Telecommunications Research Centre at Trinity College Dublin (Ireland). He conducts research focused on extending the bounds of wireless technology by deploying targeted intelligence to act in harmony with flexible radio systems. He advances distributed radio intelligence through experimentation-based research and is co-principal investigator on several European projects that create and develop experimentation platforms and services for an open infrastructure. Dr. Kaminski received his master's degree in Electrical Engineering in 2012 and his Ph.D. in 2014 from Virginia Tech, USA. During this time he was funded as a Bradley Fellow at the Bradley Department of Electrical and Computer Engineering at Virginia Tech. He is a co-inventor on a preliminary patent application for his work extending the utility of cognitive radios.

Luiz A. DaSilva is the Professor of Telecommunications with Trinity College Dublin. His research focuses on distributed and adaptive resource management in wireless networks, in particular wireless resource sharing, dynamic spectrum access, and the application of game theory to wireless networks. Prof. DaSilva is currently a principal investigator on research projects funded by the National Science Foundation in the United States, the Science Foundation Ireland, and the European Commission under Horizon 2020. He is the Director of CONNECT, the Telecommunications Research Centre in Ireland.