
Developing and Creating Interactive e-Tutorials to support Blended Learning in Selected Modules in the School of Information and Library Studies

University College Dublin

This capstone report is submitted in partial fulfilment of the requirements for the degree of Master in Library and Information Studies.

Dr. Crystal Fulton & Dr. Claire McGuinness

August 2014

Adrian Dunne, Robert Fagan, Fiona Farrelly, Jennifer Finnerty, Chelsea Holland, and Mark McLoughlin


ACKNOWLEDGEMENTS

We would like to acknowledge the guidance of our supervisors, Dr. Crystal Fulton and Dr. Claire McGuinness. Their considered input and assistance at every stage successfully steered us through this process; we are extremely grateful for this.

We would also like to thank the SILS administrative staff, Claire Nolan and Lisa Gaffney, for their attentive assistance during the project and the academic year. We appreciate the collective support and attention received from SILS academic staff during the year.

We recognise the time and effort of our participants, who graciously agreed to take part in our usability testing. Without their input, the e-Tutorials would not have reached their current standard.

Finally, we would like to thank our families for their constant support and encouragement. Their patience during this process was unending, and strengthened our persistence.


TABLE OF CONTENTS

List of Tables
List of Figures
Abstract

1. Introduction
2. Literature Review
   2.1 Introduction to e-Learning
      2.1.1 Pedagogy.
      2.1.2 Learning behaviours in higher education.
      2.1.3 Current theory on e-learning.
   2.2 The Use of e-Tutorials
      2.2.1 Online versus face-to-face learning.
      2.2.2 Distance learning.
      2.2.3 Using e-tutorials in UCD.
      2.2.4 e-Learning and assessment.
   2.3 e-Tutorial Creation and Design
      2.3.1 Engaging students in e-learning.
      2.3.2 e-Tutorial presentation.
      2.3.3 Bowles-Terry et al.'s best practices for e-Tutorial creation.
      2.3.4 e-Tutorial design and accessibility.
   2.4 Summary of e-Tutorial Design and Creation Best Practices
      2.4.1 Content.
      2.4.2 Visual.
      2.4.3 Audio.
      2.4.4 Pace.
      2.4.5 Engagement.
      2.4.6 Accessibility.
3. Method
   3.1 Research Design
      3.1.1 Competitive Product Survey: Background and Justification.
      3.1.2 Think aloud.
      3.1.3 Eye-tracker.
      3.1.4 Semi-structured interview.
   3.2 Ethics
   3.3 Selection of Participants
   3.4 Participant Profile
   3.5 Data Analysis Procedure
4. Results
   4.1 e-Tutorials: Created and Redeveloped
      4.1.1 Created.
      4.1.2 Redeveloped.
   4.2 Key Results from Usability Testing
      4.2.1 Previous experience of e-tutorials.
      4.2.2 Layout.
      4.2.3 Visual.
      4.2.4 Audio.
      4.2.5 Quiz.
      4.2.6 Content.
      4.2.7 Eye-tracker.
5. Discussion
   5.1 Competitive Product Survey
   5.2 e-Tutorial Development
   5.3 Usability Testing
   5.4 Interactive Assessment
   5.5 Textual, Visual, and Auditory Elements
      5.5.1 Textual.
      5.5.2 Visual.
      5.5.3 Auditory.
   5.6 Eye-Tracker Analysis
6. Conclusion
7. Recommendations
8. References
9. Appendices
   Appendix A Human Subjects Exemption from Full Ethical Review Form
   Appendix B UCD Insurance Policy
   Appendix C Competitive Product Survey
   Appendix D Wireframes
   Appendix E Letter of Information
   Appendix F Consent Form
   Appendix G Interview Protocol
   Appendix H Interview Questions
   Appendix I Group Reflection

LIST OF TABLES

Table 1 Summary of Competitive Product Survey findings
Table 2 Best Practice Guide for Creating an e-Tutorial

LIST OF FIGURES

Figure 1 Gender of Participants
Figure 2 Age Range of Participants
Figure 3 Nationality of Participants
Figure 4 Native Language of Participants
Figure 5 Digital Footprint and Online Management
Figure 6 ProQuest Flow
Figure 7 How to Find an Article
Figure 8 Evaluating Digital Information
Figure 9 Previous Use of e-Tutorials
Figure 10 e-Tutorial Layout
Figure 11 e-Tutorial Layout Breakdown
Figure 12 e-Tutorial Colour Scheme
Figure 13 e-Tutorial Colour Scheme Positive Comments Breakdown
Figure 14 Visual Elements Size
Figure 15 Visual Elements Size Breakdown
Figure 16 Audio Clarity
Figure 17 Audio Clarity Breakdown
Figure 18 Narration Pace
Figure 19 Interactive Quiz Questions
Figure 20 Participant Time Focused on Screen
Figure 21 Evaluating Digital Information e-Tutorial Iteration Process
Figure 22 Digital Information and Online Reputation Management
Figure 23 Evaluating Digital Information
Figure 24 Example of Eye-Tracker Data


Abstract

This project focused on e-Tutorial development, using a mixed methods research approach to improve two existing e-Tutorials ("How to find an Article" and "Evaluating Digital Information") and create two new e-Tutorials ("Digital Footprint and Online Reputation Management" and "ProQuest Flow") for the School of Information and Library Studies (SILS) in University College Dublin (UCD). The process also culminated in the formation of a best practice guide for future e-Tutorial creation in SILS. The literature review explored the role of e-Learning and the implementation of digital learning objects in third level education. The competitive product survey, a form of benchmarking rather than a traditional survey, established a knowledge base on the most effective e-Tutorial design currently being used by educational institutions. Interviews conducted with SILS administrative staff focused on the practical aspect of incorporating SILS branding within the e-Tutorials. Once the four e-Tutorials were complete, usability testing was conducted with the voluntary participation of a total of 20 SILS students, i.e. each e-Tutorial was tested by a different set of 5 students. Participants were asked to complete one randomly selected e-Tutorial each while voicing any related opinions or thoughts. Additionally, eye-tracking software recorded the eye movements of participants as they watched the e-Tutorial. The quantitative data recorded by the eye-tracker were used to verify and enhance the information gathered during the think alouds and semi-structured interviews. The information gathered in the usability testing was combined to finalise the structure of the e-Tutorials in the final stages of the project. The data collected throughout the project highlighted several wider implications for the field of e-Learning, such as the process of e-Tutorial creation and accessible e-Tutorial design.


1. Introduction

The main goal of this project was to create interactive e-Tutorials for implementation as part of the blended learning format of selected modules in the School of Information and Library Studies (SILS). The stakeholders (SILS) have been using the blended learning format in selected modules for quite some time. Concannon, Flynn, and Campbell (2005, p. 502) describe blended learning as a combination of face-to-face lectures and tutorials with web-based course content. Information portrayed in digital learning objects often needs to be updated as terminology and theories evolve over time; the project's stakeholders had reached this point with their existing e-Tutorials.

In order to produce efficient, up-to-date e-Tutorials, the best ways to research e-Tutorial creation, to create e-Tutorials using Articulate software, to perform usability testing, and to analyse the data collected from that testing had to be found. Before the project could officially begin, the e-Tutorials that needed to be created and those to be redeveloped had to be established. The research design was formulated at the beginning of the project because, as Kanuka and Kelland (2008, p. 61) have written, failing to establish goals results in a lack of direction in the research and a failure to provide meaningful results.

In consultation with our project supervisors, Dr. Crystal Fulton and Dr. Claire McGuinness, it was established that the general layout and design of "How to find an Article" and "Evaluating Digital Information" had to be revised and updated, in addition to creating two new e-Tutorials, "Digital Footprint and Online Reputation Management" and "ProQuest Flow". The "Digital Footprint and Online Reputation Management" e-Tutorial was chosen because it will be incorporated into first year SILS modules and is likely to be the first exposure that undergraduate students have to the existence of digital footprints. The "ProQuest Flow" e-Tutorial was chosen because Flow is a new, free bibliographic tool, developed by ProQuest, which will enable students to improve their organisation, research, and citation skills for SILS programmes. "How to find an Article" is an essential guide for students in both undergraduate and postgraduate programmes, as it assists students in finding articles efficiently. Finally, students need to be taught how to seek and identify information of value and quality, because using credible references is an essential aspect of academic assessments; the "Evaluating Digital Information" e-Tutorial was therefore chosen. The blended learning format of certain modules in the School of Information and Library Studies is a pivotal element of its standing as an iSchool. Wang, Shannon, and Ross (2013, p. 302), for example, have recommended that instructors design their courses in such a way as to promote self-regulated learning behaviour.

Before creation and development could begin, it was important to establish the industry standards that would have to be met in creating the e-Tutorials. A competitive product survey was conducted (see Appendix C) as a method of benchmarking, rather than a traditional survey, to review six of the twelve standards of quality that consumers recognise, i.e. performance, quick response, features, reliability, durability, and aesthetics (Shah and Kleiner, 2011). Each of the five institutions chosen was reviewed against these standards. The competitive product survey determined that aspects such as slide layout, pace, and usability were key elements of e-Tutorial design to be considered. The survey was an important step because, as Walleck, O'Halloran, and Leader (1991) have argued, the early design phases of product creation are the most critical in influencing a product's success.


The literature review explored the wider issues surrounding e-Tutorial implementation, such as pedagogy, learning behaviours, and online versus face-to-face learning. Other factors included distance learning and the use of e-Tutorials in UCD. It was important to understand the context for incorporating digital learning objects in course curricula because it reinforced the point that students learn in various ways and have different skills and abilities in terms of learning. Biggs and Tang (2011) have argued that it is the responsibility of the teacher to accommodate the learning needs of all their students. Accessibility was a key concern in designing the e-Tutorials, i.e. accommodating students with varying abilities and those with learning difficulties or disabilities. One of the methods chosen to address this was giving the student control over the pace of the tutorial; the navigation bar allows the student to pause, replay, and revisit slides. Text versions of the e-Tutorials are also available for students with hearing difficulties. Following the competitive product survey and literature review, the content for each tutorial was determined, then designed using wireframes (see Appendix D) depicting the presentation of content in each of the individual e-Tutorial slides.

The mixed methods research approach to usability testing included think alouds, eye-tracking, and semi-structured interviews. The advantages of using a mixed methods approach were considered; it allows triangulation and validation, and gives greater confidence in results (Brannen, 1992, p. 63). Each tutorial was randomly selected for testing by 5 students; a total of 20 students were recruited. Qualitative data from the think alouds and semi-structured interviews were triangulated with quantitative eye-tracker data. For example, students liked the new layout of the quiz questions, the colours used in the slides, and the navigation controls. Finally, a best practice guide for future e-Tutorial creation in SILS was developed, drawing on the research, design, and testing phases of the project. Following usability testing, for example, we came to agree with Tobii Technology (2009) in preferring retrospective think alouds, due to the difficulties participants encountered with thinking aloud while trying to listen to the e-Tutorials.


2. Literature Review

In researching current best practices in e-Tutorial development, it was important to examine theories on blended learning and whether online learning affects student results. While some students have negative feelings towards e-Learning, claiming it instigates a feeling of depersonalisation, others regard it as a freedom from face-to-face learning (Salmon, 2004, p. 38). DiRienzo and Lilly (2014), as well as Margolis, Grediagin, Koenig, and Sanders (2009), found that students felt course delivery methods had no significant effect on their learning. Reviewing the discussions on e-Learning helped establish that not all students learn in the same way, and that this should be planned for within the e-Tutorials. Similarly, it was essential to determine the best practices employed by other institutions developing e-Tutorials. The successful practices of others informed and improved the e-Tutorial creation in this project. For example, O'Toole and Keating (2011, p. 30) have argued that students should interact as much as possible with e-Learning tools, through quizzes, activities, or decision-making scenarios, and that slides should employ textual, auditory, and visual elements. Laurillard (2002) and Westera (2004) both agree with this argument.

2.1 Introduction to e-Learning

2.1.1 Pedagogy.

Theorists have debated whether pedagogy, i.e. the theory of teaching, can actually assist lecturers in teaching (Rosch and Anthony, 2012; Canning, 2007). In the case of designing e-Learning programmes, O'Toole and Keating (2011, p. 32) have stated that it is hard to falter if the emphasis is placed on the art of teaching rather than the technology. In creating the e-Tutorials for this project, we focused foremost on the pedagogy, particularly student-centred learning. Marzano (as cited in Rosch and Anthony, 2012, p. 38) noted three areas in which pedagogy can be measured as successful, i.e. through effective instructional strategies, management techniques used in classrooms, and the design of course programmes. However, some authors have written that there is no substantial evidence to suggest that strict adherence to, and conscious knowledge of, such pedagogy will result in better teaching (Yorke, 2000; Canning, 2007). In working closely with lecturers while designing the e-Tutorials, we sought to employ effective instructional strategies while ensuring the tutorials would fit into and enhance the selected modules.

A significant factor in the adoption of new educational policies, and in pedagogical debates, is the progressive inclusion of online learning within higher education planning and development. Much has been written on the use of collaborative tools for the enhancement of student learning (DePietro, 2012; O'Toole and Keating, 2011). DePietro (2012), for example, examined the effectiveness of using Twitter and Wikis separately in two different modules by focusing on how students produced responses. The study resulted in mixed experiences for the students. DePietro (2012) argued that Web 2.0 would lead to a new way of learning and teaching, i.e. Education 2.0. Whether this is possible has been debated with no definitive outcome (Bhati, Mercer, Rankin, and Thomas, 2010; Cheawjindakarn, Suwannatthachote, and Theeraroungchaisri, 2012; Kanuka and Kelland, 2008). In a review of the literature on distance learning, Cheawjindakarn et al. (2012, p. 62) identified several factors critical to the successful adoption of online learning tools, including institutional management, learning environment, instructional design, services support, and course evaluation. Bhati, Mercer, Rankin, and Thomas (2010) examined the use of mobile devices, learning management systems, and the virtual reality program Second Life for learning. In their opinion, the use of these tools in teaching would become much easier if and when educational institutions actively implemented policies to support it (Bhati, Mercer, Rankin, and Thomas, 2010). However, Kanuka and Kelland (2008, p. 57) have argued, based on the opinions of students who took part in their study, that we do not yet have a good enough understanding of how students and teachers are working and communicating online, or sufficient knowledge of the effects of e-Learning on teaching and learning.

The fast rate of technological change has been highlighted as a barrier to adopting online and mobile tools for education. Bhati, Mercer, Rankin, and Thomas (2010) highlighted some of these barriers: teacher-induced (such as hesitancy or fear of technology); IT support and infrastructure; management-based; and technological (i.e. the fast rate of technological change). The stakeholders (SILS) have embraced the adoption of e-Learning tools, such as e-Tutorials, and have supported their creation in this project. However, Kanuka and Kelland (2008) offered a more negative view of e-Learning adoption: in what they describe as a thorough review of the literature, they could not find a consistent or reliable body of knowledge directly attributing recorded benefits to the e-Learning technology itself. The computer is only a medium and, as such, it is limited (Laurillard, 2002, p. 134). The style of interface being used can limit how students express themselves (Laurillard, 2002, p. 135). For example, Coopman (2009) evaluated Blackboard 8.0 and found that replies to a thread in the discussion board were not visible in the default setting, only when the 'tree view' and 'expand all' options were chosen. Students might find it difficult to locate replies, or this might lead students to believe there are no replies to be viewed at all. Westera (2004, p. 504) concluded that educational institutions should preserve pedagogical methods while adopting a supplemental e-Learning strategy.

2.1.2 Learning behaviours in higher education.

Teachers have always had to adapt their teaching strategies to accommodate the varying levels of ability in their students (Biggs and Tang, 2011, p. 4). This idea must be taken into account to allow sufficient preparation of supporting materials for students; in the case of this project, e-Tutorials. The shift in emphasis from the traditional, didactic teaching approach to a more student-centred, active approach in teaching practices (Biggs and Tang, 2011, p. 9; Laurillard, 2002) must also be considered in this project. Yorke and Knight (2004, p. 30) have written about the implications of self-theorising, i.e. that the beliefs held by teachers about their students are likely to affect those students' learning. Students' positive self-belief leads to their increased motivation to learn (Biggs and Tang, 2011; Concannon, Flynn, and Campbell, 2005). The e-Tutorials created for this project should therefore increase students' motivation to learn by creating a positive learning environment.

According to Bowden (2013), student retention in universities has been linked to emotional bonds to the university and to peers, and to a sense of belonging. These factors are largely derived from student-to-student, student-institution, and student-teacher interactions (Bowden, 2013). It is believed that these forms of interaction build confidence in students and increase their engagement with the course (Jagannathan and Blair, 2013, p. 5; Robinson and Bawden, 2002, p. 52). The use of e-Tutorials to enhance course programmes was therefore considered, as they could increase student engagement and, subsequently, student interaction, since discussions could arise amongst students regarding the material conveyed. Course satisfaction has been a keen area of research for institutions; one example is Wang, Shannon, and Ross (2013) of Auburn University's Department of Educational Foundations, Leadership, and Technology, who investigated levels of course satisfaction in relation to student motivation and technology efficacy. The study found that motivation positively affected course satisfaction (Wang, Shannon, and Ross, 2013, p. 315).

However, it should not be forgotten that a significant amount of the responsibility to learn lies with the students themselves (Deepwell and Malik, 2008, p. 6). According to Biggs and Tang (2011, p. 34), "there is no such thing as an unmotivated student: all students not in a coma want to do something". In order for a student to want to learn something, the material must have some value to the student, and the student must "expect success" (Biggs and Tang, 2011, p. 35). The expectation of success has been connected to motivation and self-belief, but also to the size of the workload assigned by the lecturer. Kember and Leung (2006) noticed that a high perceived workload can be linked to the application of a surface approach to learning. Kember and Leung (2006, p. 187) developed this theory and collated seven elements which assist in achieving a workload that students perceive as acceptable, including engaging with students and stimulating their interest in the content. e-Tutorials, for example, should be deliberately short and require frequent interaction to engage the user (O'Toole and Keating, 2011, p. 30-31).


2.1.3 Current theory on e-learning.

In order to produce effective e-Tutorials, it is perhaps more important to examine the adoption of new technology in education from the perspective of the students who must learn to interact with it. Based at the University of Limerick, Concannon, Flynn, and Campbell (2005) used focus groups and questionnaires to assess students' feelings towards blended learning and the integration of new learning technologies into teaching. A significant issue for the students was that of support (Concannon, Flynn, and Campbell, 2005, p. 510-511): technical support, encouragement from peers, and "perceived" lecturer support. Chatpakkarattana and Khlaisang (2012) advocated the use of learner support systems to improve students' learning abilities. In other words, students are comfortable with blended learning strategies if they feel supported both academically and technically. Furthermore, students view e-Learning as a valuable reinforcement of traditional methods (Concannon, Flynn, and Campbell, 2005).

Students viewed the availability of academic materials, such as lecture notes and e-Tutorials, on the web positively in Concannon, Flynn, and Campbell's report (2005, p. 43). However, Stewart, Shifter, and Markaridian Selverian (2010) revealed a less positive attitude on the part of the students. Opinions of students from a large university in the southern U.S.A., gathered via telephone survey, revealed that 41% of the students surveyed would skip classes if the materials were available online (Stewart, Shifter, and Markaridian Selverian, 2010, p. 40). Particularly in undergraduate courses there will always be a certain amount of absenteeism amongst students (Concannon, Flynn, and Campbell, 2005, p. 43). Whether or not absenteeism increases with the adoption of enhanced e-Learning tools, such as interactive e-Tutorials, has not yet been proven. Babb and Ross (2009), however, studied "The Timing of Online Lecture Slide Availability and its Effect on Attendance, Participation, and Exam Performance" by conducting a survey of 175 social science students in four different classes. Babb and Ross (2009) found that attendance is negatively affected when lecture notes are posted after the class: when the slides were posted before the lecture, approximately 77% of the students attended the classes, whereas students attended approximately 60% of classes when the lecture slides were posted after class.

Another problem noted in the literature is the distraction of socialising which e-Learning allows (Salmon, 2004, p. 36). Salmon (2004, p. 36) has written that "online learning offers the affordance of online socializing and networking"; affordance here means that the technology enables or creates the opportunity, that is, it has an inherent social component. However, Kanuka and Kelland (2008, p. 54) have refuted this idea, arguing that participants in their study said they did not need social interaction online; they did not interact in the classroom and probably would not interact online either.

2.2 The Use of e-Tutorials

2.2.1 Online versus face-to-face learning.

In the literature, the main difference identified between online and face-to-face learning is that of human contact and the medium of discussion. Ter-Stepanian (2012) has written on her experiences teaching the same art history class to different students online and in the classroom. In her comparison of face-to-face and online learning, Ter-Stepanian (2012, p. 43-46) has strongly supported online learning as the more effective learning experience, since the online discussion forum does not end with the class session but continues throughout the week, whereas face-to-face discussions were limited to the scheduled class time. The difference in atmosphere has also been noted by Ter-Stepanian (2012, p. 47), as she tried to establish a friendly, open environment in both her online and face-to-face classes. One way to establish whether there is a positive atmosphere in the face-to-face or online setting is to note the number of questions asked by students. Horspool and Lange's study (2012, p. 82) reported that students felt marginally higher levels of comfort in the online course compared to the same course studied face-to-face. In the case of incorporating e-Tutorials in blended learning, course satisfaction may be established through end-of-term feedback questionnaires.

Classroom culture was used by Hauser, Paul, Bradley, and Jeffrey (2012, p. 145) as a substitute for transactional distance in studying the effects of computer self-efficacy and anxiety on learning online versus learning face-to-face. Transactional distance is a philosophical theory, put forward by Michael Moore in the early 1990s, which emphasises the psychological separation between student and teacher rather than the geographical distance associated with distance learning (Reyes, 2013). Moore argued that pedagogy has the most significant impact on learning, rather than physical distance (Hauser, Paul, Bradley, and Jeffrey, 2012, p. 144). Moore further asserted that the psychological connection between instructor and student was influenced by three pedagogical components: structure, dialogue, and autonomy (Reyes, 2013, p. 44). Gorsky and Caspi (as cited in Reyes, 2013) put forward three reasons why Moore's theory needed to be tested: many researchers viewed it as a framework for analysing modes of distance learning systems; researchers desired to reduce transactional distance in distance learning courses; and some researchers were already teaching in higher education courses. One pair of researchers who studied the theory, Chen and Willits (as cited in Reyes, 2013), concluded that transactional distance theory was a philosophical approach rather than a valid scientific theory. While transactional distance is not a widely accepted theory, it has influenced the research of many academics, and the construction of the e-Tutorials in this project. Learning autonomy should therefore be supported by providing clear instructions in the e-Tutorials.

There have been numerous studies in a variety of disciplines comparing online and face-to-face learning. Several difficulties have been highlighted in the research design of these studies. Most significant is the number of variables to be accounted for in the research. DiRienzo and Lilly (2014, p. 4-5) discussed the time spent by students learning, the effectiveness of teachers (which is difficult to quantify), and working memory. While all factors must be considered, only those that can be recorded adequately will lead to verifiable research. Horspool and Lange (2012, p. 84) and others sought to control as many variables as possible, for example by comparing online and face-to-face courses taught by the same instructor and using the same textbook and materials (Margolis, Grediagin, Koenig, and Sanders, 2009; Steiner and Hyman, 2010; Tanyel and Griffin, 2014; Ter-Stepanian, 2012). Other difficulties with comparison studies of online and face-to-face learning concern the participant sample size and participant characteristics. Tanyel and Griffin's (2014, p. 4) ten-year study noticed significant increases in enrolments in online courses, which may have skewed their data. In addition, the students in the online sample were generally expected to be older, with higher GPAs, than those in the same face-to-face course (Tanyel and Griffin, 2014, p. 4). These factors may also have skewed the data collected, as the students in the online course were expected to perform better than those in the face-to-face course with lower GPAs. The sample size in the study by Hauser et al. (2012), for example, was considerably smaller for the online course, with 35 students, than for the face-to-face course, with 205 students.

The majority of the studies reviewed used student reactions and opinions to measure whether online or face-to-face learning is the better medium (Horspool and Lange, 2012). The studies by Hauser et al. (2012) and Steiner and Hyman (2010) revealed that offering alternative delivery systems for the same course helped improve retention rates and therefore student satisfaction. DiRienzo and Lilly (2014), as well as Margolis et al. (2009), found that students felt the delivery method made no significant difference to their learning. Conversely, Pena and Yeung's (2010) study of a blended learning Spanish language course found that those who favoured face-to-face learning generally disliked online learning, and vice versa.

Steiner and Hyman (2010) reviewed the grades received by students in both online and face-to-face courses and determined that the grades were relatively equal. Tanyel and Griffin's (2014) longitudinal study reported an increase in online student results. Reigle (as cited in Keramidas, 2012) found that online students expected better results than face-to-face students. However, Keramidas (2012, p. 31) discovered that the average results of students were similar in both online and face-to-face courses, but that the online students needed almost twice as many resubmissions for assignments. Each study examined a different discipline and set of students, but the overall consensus seems to be that online learning does not affect student results for the worse, and in some cases it may improve students' engagement with the course.

2.2.2 Distance learning.

Distance learning, as defined by Merriam-Webster (2014), is "education that takes place via electronic media linking instructors and students who are not together in a classroom". A primary motive for offering distance learning is to provide educational opportunities for those unable, restricted, or unwilling to attend courses traditionally held on a campus (Banas and Emory, 1998; Barreau, 2000, p. 79). Additionally, students have increasingly begun to insist upon a shift towards more distance learning courses within universities (Amirault, 2012, p. 255; Stella and Gnanam, 2004). Taking this into consideration, some authors believed that current policies regarding education should be reviewed and updated to include the diverse infrastructure requirements of distance learning (Banas and Emory, 1998; Stella and Gnanam, 2004).

Distance learning courses have enabled institutions to meet students' learning needs without additional investment in facilities (Latham and Smith, 2003, p. 120). Furthermore, the convenience and flexibility of distance learning has allowed students to focus on the learning itself, as they do not have to worry about travelling or falling behind in the course, because they can learn at their own pace (Barreau, 2000, p. 80). However, some students may not have realised that flexibility does not mean twenty-four hour access to faculty and staff (Banas and Emory, 1998). Clarification of, and focus on, major aspects should be provided for students taking an e-Learning course, especially considering the limited availability of lecturers and their response times (Magnussen, 2008, p. 83). One argument against e-Learning suggested by the students themselves was that they wanted the traditional university experience (Bowden, 2013; Kanuka and Kelland, 2008).

Distance learning courses designed with student collaboration in mind have often increased student interactivity, which has further enabled a sense of support and connection between the various students enrolled in the course, ultimately improving student motivation (Pymm and Hay, 2014, p. 136). Conversely, some students may experience a sense of isolation (Barreau, 2000, p. 84). Deciding how to create and introduce asynchronous and synchronous learning tools to avoid student isolation and improve collaboration can be difficult (Banas and Emory, 1998; Pymm and Hay, 2014, p. 135). For example, Wang (2007, p. 302) noted that a study involving cultural groups and distance learning identified that asynchronous tools (email, discussion boards, blogging, etc.) were preferred to synchronous tools (conference calling, videoconferencing, and instant messaging) by all groups. The synchronous tools suffered from technical difficulties and poor audio quality, and organising meeting times proved difficult (Wang, 2007, p. 303). Barreau (2000) and others have recognised that technological issues are a constant threat to distance learning courses, yet they continue to advocate for distance learning (Amirault, 2012, p. 254; Banas and Emory, 1998; Barreau, 2000).

Virtual classrooms allow increased interaction by students, as they are not "bound by time or place" and can continue as long as desired (Barreau, 2000, p. 81). Students, however, need time to become comfortable with the various technologies involved in the distance learning course they are taking (Barreau, 2000, p. 90-91). Barreau (2000) has written that email is a primary and preferred form of communication. However, there are human issues surrounding electronic communication, such as the option of delaying or ignoring responses in a way that would not be possible if the communication took place in person (Barreau, 2000, p. 85).

2.2.3 Using e-tutorials in UCD.

As Munck and McConnell (2009, p. 35) have pointed out, every university is required to have a strategic plan under the Universities Act of 1997. This means that all universities must plan for the future in terms of campus, fiscal (Westera, 2004), and instructional development (UCD Strategy Group, 2010). Pressure for change and increased adoption of ICT derives from student demand, changes in the education market, and the innovation of education technology (Concannon, Flynn, and Campbell, 2005, p. 502). In promoting SILS as an iSchool, branding is an important concept to be included in the design of e-Tutorials and other e-Learning tools. Educational innovation is described as an opportunity for institutions to explore and create new flexible learning and delivery systems (Munck and McConnell, 2009; Schworm and Gruber, 2012).

Due to significant increases in the use of learning content via mobile devices (UCD Strategy Group, 2010, p. 1), UCD has incorporated enhancements of e-Learning environments into its IT strategy (UCD IT Services, 2008), with the intention of instigating greater interest in students to learn and engage with their courses. Certain software applications, such as e-Tutorials, are included in this e-Learning environment (UCD IT Services, 2008, p. 8). However, some view e-Learning somewhat negatively in terms of student engagement. Kopp, Matteucci, and Tomasetto (2012, p. 18) viewed learning in virtual environments as, overall, more demanding on the student. Writing about Oxford's legendary face-to-face tutorial system, Morgan (2013) argued that lecture settings induce passivity in students, while individualised one-on-one tutorials allow the student to actively construct ideas and engage with the course material. UCD does not operate a campus- or school-wide one-to-one tutorial system; instead, e-Tutorials are used within the university's schools to support module and course content in conjunction with face-to-face seminars and group tutorials. UCD has encouraged the development of e-Learning within its schools (UCD Strategy Group, 2010, p. 3) as part of traditional programmes, citing more interaction and contact with students as desirable.

UCD recognises the benefits of e-Learning in generating income, reducing costs, and as a marketing channel (UCD Strategy Group, 2010, p. 1), while Armellini and Aiyegbayo (2010) also promoted e-tivities as "low-cost, reusable, customisable and scalable" (p. 922). However, the world of e-Learning is competitive, and some businesses have recognised the potential for sourcing free training material online (Fisher, 2013). Social media may in fact challenge the role of traditional institutions (Brown, 2010, p. 6-8). However, human interaction is still an integral part of successful learning practice, according to Caroline Taylor and Professor Barbara Allan (as cited in Fisher, 2013). Professor Barbara Allan (as cited in Fisher, 2013) has claimed that surveys consistently show learners appreciate meeting their instructors face-to-face, in class or in consultation.

2.2.4 e-Learning and assessment.

Yorke (2003, p. 494) has written that assessment is one of the least developed aspects of course curricula. A selection of the literature featuring online assessment has recognised the importance of distinguishing between formative and summative assessment (Boud, 2000; Nicol, 2007; Nicol and Macfarlane-Dick, 2006; Yorke, 2003). Of this selection, Boud (2000, p. 155) stipulated that the two forms of assessment are inextricably linked and almost impossible to separate in practice. Assessment essentially has two main objectives: firstly, as a means to obtain qualifications, i.e. summative assessment; and secondly, to assist in the learning process, i.e. formative assessment (Boud, 2000, p. 155). In terms of pedagogy, it has been recommended that a learner-centred approach be taken towards teaching (Biggs and Tang, 2011; Laurillard, 2002). In their research on the Carpe Diem e-tivity programme, Armellini and Aiyegbayo (2010, p. 933) noted that there has been a shift towards a task-based, learner-centred approach to teaching and assessment. For example, Arthur (2006) studied the use of user-generated multiple choice questions (MCQs) as a form of assessment in order to encourage students to engage with material and meet learning objectives. However, a significant point was raised by Armellini and Aiyegbayo (2010): some educators believe that without an element of assessment incorporated into the completion of an e-tivity, the students will not do it (Armellini and Aiyegbayo, 2010, p. 934). Armellini and Aiyegbayo (2010, p. 934), however, believed that if the student understands the benefit of completing the e-tivity, the student will engage with it.

In deciding the form of assessment to include in the e-Tutorials, the advantages and disadvantages of using MCQs had to be deliberated. There are several benefits of the MCQ outlined by Fellenz (2004, p. 704), including ease of marking, time saved while marking, and indisputable grades. Another incentive for using MCQs is that the results will reveal problem areas in the course material, for which the lecturer can adjust the teaching strategy (Yorke, 2003, p. 482). Nicol (2007, p. 54), however, highlighted some of the limitations inherent in MCQs: the encouragement of memorisation rather than understanding or independent thought; the rigidity of the feedback provided; and the fact that the reasons for using MCQs most often lie with time-saving benefits rather than a desire for effective learning. Furthermore, Nicol (2007, p. 60) has critiqued Fellenz's study on the basis that it focuses on the instructor's perspective rather than a learner-centred approach to assessment. Similarly, Scouller (1998) connected the surface learning approach, i.e. memorisation and recall, with the successful completion of MCQs. In fact, Scouller (1998, p. 470) has stated that the employment of the deep learning approach, i.e. a focus on meaning and understanding, may prove to be a disadvantage in completing MCQs. There are, however, academics who still advocate the use of MCQs (Arthur, 2006; Douglas, Wilson, and Ennis, 2012). Douglas, Wilson, and Ennis (2012) argued that MCQs work best in conjunction with other assessment methods. In addition, their study found most students had positive attitudes towards MCQ tests (Douglas, Wilson, and Ennis, 2012, p. 116); the students were able to recognise their own capabilities, a process which in turn engenders self-directed learning.
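These trade-offs can be made concrete with a small sketch. The model below is purely illustrative: the type names and grading logic are our assumptions, not the internal format used by Articulate or by the SILS e-Tutorials. It shows how per-option feedback (addressing Nicol's criticism that MCQ feedback is rigid) and an optional justification prompt can be attached to a question.

```typescript
// Hypothetical MCQ model; names and structure are illustrative assumptions.
interface Option {
  text: string;
  correct: boolean;
  feedback: string; // per-option feedback, rather than a single rigid response
}

interface MCQ {
  prompt: string;
  options: Option[];
  askJustification: boolean; // justifying an answer forces reflection (Nicol, 2007)
}

// Formative grading: return feedback alongside the mark.
function grade(q: MCQ, optionIndex: number, justification?: string) {
  const chosen = q.options[optionIndex];
  let feedback = chosen.feedback;
  if (q.askJustification && !justification) {
    feedback += " Please also explain why you chose this answer.";
  }
  return { correct: chosen.correct, feedback };
}

// Example in the spirit of the "Evaluating Digital Information" e-Tutorial.
const q: MCQ = {
  prompt: "Which source is most appropriate for an academic assignment?",
  options: [
    { text: "An anonymous blog post", correct: false,
      feedback: "Authorship and review of a blog post cannot be verified." },
    { text: "A peer-reviewed journal article", correct: true,
      feedback: "Correct: peer review is a strong indicator of quality." },
  ],
  askJustification: true,
};

console.log(grade(q, 1, "It has been through peer review."));
```

A structure of this kind keeps the marking benefits Fellenz identifies while softening the rigidity Nicol criticises, since each distractor carries its own explanation.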

Feedback from educators to students can empower self-monitored learning (Nicol and Macfarlane-Dick, 2006, p. 199). Nicol and Macfarlane-Dick (2006) defined good feedback practice as "anything that might strengthen the student's capacity to self-regulate their own performance" (p. 205). In discussing the development process of an assessment-centred e-Learning system, Wang (2014, p. 200) discovered that personalised e-Learning materials and annotation have a positive effect on student learning, as they give the student the opportunity to focus on areas they have not fully understood. Yorke (2003, p. 488) has noted the importance of student reaction to feedback, i.e. feedback is only useful if the student learns from and responds to it. Yorke (2003, p. 489) has also highlighted the dangers of over-dependence on feedback, in which the student relies fully on the teacher for direction in their study.

De Vries et al. (2005) have recognised that plagiarism, i.e. copying another student's answers, is particularly problematic in e-Learning, and a significant effort must be employed to deter its practice. Students will be allowed to repeat the e-Tutorials and quizzes; the rate of plagiarism will therefore be impossible to determine. Bennett tested a sample of 249 students in a post-1992 London university (as cited in De Vries et al., 2005, p. 221) and found the detection rate of online plagiarism was 1.5%, while 20% of tutors ignored obvious plagiarism due to the hassle involved in reporting it. It must be noted, however, that these figures are more than ten years old. De Vries et al. (2005, p. 226) have suggested content, process, and product as avenues of support in reducing the amount of overlooked plagiarism. Nicol (2007, p. 57) has described an effective method of reducing plagiarism in MCQs: by asking students to justify their answers, the student is required to think about the answers he or she has given. Another preventative measure suggested is that of surveillance, such as the tracking of log-in times and dates (Land and Bayne, 2005, p. 165), a service which Blackboard currently offers. However, Land and Bayne (2005, p. 173) have noted that the U.K. Data Protection Act of 1998 prevents the use of such online surveillance technologies in the disciplinary actions associated with plagiarism.

2.3 e-Tutorial Creation and Design

It was important to establish the best practices employed by institutions developing e-Tutorials. The successful practices of others would be taken into account in this review of the literature and also through the competitive product survey, a functional form of benchmarking. In addition, we would develop our own best practices during the research, design, and testing phases of the project. Articulate, a programme for creating online and mobile learning content using e-Learning authoring tools (Articulate website, 2014), was chosen to develop the e-Tutorials. Two significant limitations of Articulate were discovered during the development process: first, it was not possible to split the e-Tutorials into segments in order to include quiz questions throughout; second, only a limited number of slide animations could be carried over from PowerPoint. The Advanced Distributed Learning Initiative, supervised by the United States Department of Defense, created SCORM (Sharable Content Object Reference Model), a collection of specifications and standards that ensures compatibility of an e-Tutorial with a Learning Management System (LMS), i.e. Blackboard (Ehlers and Pawlowski, 2006). SCORM also allows e-Tutorials to be easily transferred to other LMSs, such as Litmos (Ehlers and Pawlowski, 2006). The compatibility of the e-Tutorials with Blackboard, the LMS used by SILS, had to be checked for successful implementation and roll-out.
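Articulate generates the SCORM runtime layer automatically when it packages an e-Tutorial, so none of this had to be hand-written for the project. Purely for context, the sketch below illustrates the standard conversation a SCORM 1.2 package has with an LMS such as Blackboard; the pass mark and reporting logic are assumptions for the example, not details of the SILS e-Tutorials.

```typescript
// Illustrative sketch of the SCORM 1.2 runtime exchange between a packaged
// e-Tutorial and an LMS. Authoring tools such as Articulate generate this
// layer automatically; this is not the project's own code.

// The subset of the SCORM 1.2 API relevant here (defined by the standard).
interface Scorm12API {
  LMSInitialize(arg: ""): string; // returns "true" or "false"
  LMSSetValue(element: string, value: string): string;
  LMSGetValue(element: string): string;
  LMSCommit(arg: ""): string;
  LMSFinish(arg: ""): string;
}

// SCORM content locates the API object the LMS exposes on an ancestor frame.
function findAPI(win: Window): Scorm12API | null {
  let w: Window = win;
  while (!(w as any).API && w.parent !== w) {
    w = w.parent;
  }
  return (w as any).API ?? null;
}

// Report a quiz result and completion status to the LMS.
function reportResult(scorePercent: number): void {
  const api = findAPI(window);
  if (api === null) return; // e.g. previewing the tutorial outside an LMS
  api.LMSInitialize("");
  api.LMSSetValue("cmi.core.score.raw", String(scorePercent));
  // The 80% pass mark is an arbitrary example, not a SILS policy.
  api.LMSSetValue("cmi.core.lesson_status", scorePercent >= 80 ? "passed" : "failed");
  api.LMSCommit("");
  api.LMSFinish("");
}
```

Because both Blackboard and Litmos implement this same runtime contract, a SCORM-conformant package can move between them without change, which is the portability benefit noted above.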

The following section reviews problems encountered by researchers and their solutions.

2.3.1 Engaging students in e-learning.

Certain studies have suggested that increasing student engagement, in areas such as problem solving and critical thinking, can improve students' learning outcomes, based upon activity theory (Liaw, 2008, p. 869). Activity theory refers to the changes and contradictions occurring during tasks that can be identified by studying human activities as developmental processes, according to Mwanza and Engestrom (2005, p. 457). When students discover information by themselves and control the pace of their learning, they become more accustomed to interactivity, and self-directed learning increases (Liaw, 2008, p. 869). Liaw (2008, p. 869) argued that the incorporation of graphics, videos, and other media would assist in this and would encourage a wider variety of students to engage with the technology.

Jagannathan and Blair (2013, p. 3) viewed student engagement in educational activities as the degree to which a student feels included in or connected to the activities. When educational activities are directly related to the learning objectives of the course, as stated by Jagannathan and Blair (2013, p. 3), students display increased engagement with those activities. By providing a variety of learning media, Robinson and Bawden (2002, p. 52-53) hoped that the needs and learning styles of the majority of students would be met. O'Toole and Keating (2011, p. 30) believed students should interact as much as possible with the e-Tutorial, whether through quizzes, activities, or decision-making scenarios, and that slides should contain textual, auditory, and visual elements. Moreover, dependency on overwhelming amounts of textbook-based knowledge to summarise the main concepts in a course has been condemned by Magnussen (2008, p. 83) as a hindrance to learning.

2.3.2 e-Tutorial presentation.

It has been observed that e-Tutorials which are easy to use have led to an increasing number of students taking part in them (Islam, 2013, p. 397). Furthermore, students prefer distance learning courses that are well organised, simple to use, and maintain their attention through dynamic graphics (Seiver and Troja, 2014, p. 91). Islam (2013, p. 397) has suggested that instructors should be involved in the construction of the e-Tutorials, so that the content remains beneficial to the student and up-to-date. The concept of managing content "incorporates the editorial process of gathering, creating new, or selecting suitable educational materials from existing resources for web delivery", according to Mwanza and Engestrom (2005, p. 458). In order to implement this process successfully, Mwanza and Engestrom (2005, p. 458) have suggested the establishment of guidelines for gathering and selecting information. Liaw (2008, p. 866) has put forward three factors for designing effective e-Learning systems: learner characteristics, instructional structure, and interaction. Additionally, Pituch and Lee (2006, p. 225) have highlighted functionality, interactivity, and response time as three system characteristics to be considered.

There is a debate surrounding the effectiveness of video versus text, as some students prefer to watch and listen to videos while others prefer to read (Bowles-Terry et al., 2010, p. 24). The choice of media may depend on the task at hand. Students completing more complex tasks may want text instructions for easy referral, while students completing less complex tasks might prefer to watch a short video (Bowles-Terry et al., 2010, p. 24). Bowles-Terry et al. (2010, p. 24) believe it is essential to provide multiple formats, including video and text, so that students with various learning styles and technological capabilities may choose the best format for themselves.

2.3.3 Bowles-Terry et al.'s best practices for e-Tutorial creation.

Through a study conducted at the University of Illinois, Bowles-Terry et al. (2010) verified four best practices to be followed when creating e-Tutorials:

1. The pace of narration in a video tutorial should be slightly slower than a regular conversational tone, and captions should be included for accessibility.

2. Video tutorials should be short, lasting from thirty seconds to one minute in length, with the most important and desirable content at the beginning. The content may be discussed afterwards.

3. Students generally view video tutorials for information and instruction, not for entertainment. While music may capture the students' attention at the beginning, it should not overwhelm the content. Brief opening and closing music, in addition to professional and appealing graphics, will help maintain the attention of students.

4. Students should be able to view the video or tutorial information in a variety of formats to accommodate students with disabilities and learning difficulties. The language in the video should therefore be simple and straightforward.

2.3.4 e-Tutorial design and accessibility.

Students with and without learning difficulties must learn to adapt to the academic demands of higher education. Heiman (2006, p. 55) has described a learning style as the way in which a student concentrates, processes, internalises, and recalls new information. Some students internally regulate their learning by specifying their own learning goals without the need for instructions (Heiman, 2006, p. 56). But Heiman (2006, p. 56) has pointed out that other students need external regulation or support with orienting, planning, monitoring, and evaluating their own performance, and rely heavily on the teacher's instructions. Furthermore, Heiman (2006, p. 56) listed various self-regulating strategies that may be useful to students with a disability, such as setting goals, monitoring performance, managing time efficiently, using self-evaluation methods, attributing causation to success or failure, and adapting new methods for future use. As Magnussen (2008, p. 85) has stated, students cannot learn or become actively engaged within a frustrating environment.

In May 2011, the U.S. Department of Education's Office for Civil Rights (OCR) clarified the specific legal requirements relating to digital resources: "equal opportunity, equal treatment, and the obligation to make accommodations or modifications to avoid disability-based discrimination" (Hashey and Stahl, 2014, p. 72). These guidelines and standards have helped clarify and structure what it means for content to be accessible on the Web. Smith and Basham (2014, p. 72) named three primary guidelines and standards currently being used in the U.S.: Web Content Accessibility Guidelines (WCAG) 2.0, Section 508 of the Rehabilitation Act, and EPUB 3 (Electronic Publication 3). Guidelines regarding accessibility would greatly help decrease and resolve the issues currently surrounding inaccessible websites, e-Learning tools, and technology (Fichten et al., 2009, p. 253). For example, Gornitsky (2011) has written about the e-Learning platform Blackboard Learn 9.1, which greatly improved accessibility for students with visual impairments through several avenues: keyboard navigation, dynamic content awareness, test structures, accessible multimedia controls, uploading multiple file types, and improved form interaction (Gornitsky, 2011, p. 49). One of the ways to make the e-Tutorials accessible to students, with and without learning difficulties, would be to use a limited colour palette. This would minimise distractions for the students viewing the e-Tutorials as well as accommodating those with colour blindness.
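One of the standards named above, WCAG 2.0, makes colour choices directly testable: it defines a contrast ratio between text and background, computed from each colour's relative luminance. The sketch below implements the published WCAG 2.0 formula; the sample colours are placeholders for illustration, not the actual SILS palette.

```typescript
// WCAG 2.0 contrast check: relative luminance and contrast ratio, as defined
// in the guidelines. The sample colours are placeholders, not the SILS palette.

// Linearise an sRGB channel (0-255) per the WCAG 2.0 definition.
function linearise(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of an [r, g, b] colour.
function luminance([r, g, b]: [number, number, number]): number {
  return 0.2126 * linearise(r) + 0.7152 * linearise(g) + 0.0722 * linearise(b);
}

// Contrast ratio between two colours: (L1 + 0.05) / (L2 + 0.05), lighter first.
function contrastRatio(a: [number, number, number],
                       b: [number, number, number]): number {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// WCAG 2.0 level AA requires at least 4.5:1 for normal-sized text.
const slateText: [number, number, number] = [51, 51, 51];       // placeholder
const paleBackground: [number, number, number] = [245, 245, 245]; // placeholder
console.log(contrastRatio(slateText, paleBackground) >= 4.5); // true
```

A check of this kind can be run over a candidate palette before any slides are built, which fits naturally with the wireframing stage described earlier.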

Brunvand and Abadeh (2010, p. 305) have highlighted the difficulties all

students face in locating relevant and accurate content through websites and

have suggested that the instructor pre-select and screen websites for the

Page 35: Developing and Creating Interactive e-Tutorials

27

accuracy of content, reading level, page layout and ease of navigation.

Distance education allows for content to be presented in various ways, with

the possibility of combining audio, video, text, etc. This provides students with

varied abilities and disabilities improved accessibility and greater learning

opportunities (Hashey and Stahl, 2014, p. 71; Heiman, 2006, p. 57). For

example, Heiman (2006, p. 60) has determined that students with learning disabilities preferred step-wise processing, comprising memorisation, and

had a greater need for self-regulating strategies including controlling learning

processes, self-orientation, planning, monitoring, and continuous evaluation of

learning processes.

Seale and Cooper (2010, p. 1111) contended that in order to successfully

facilitate learning, students must be able to successfully interact with the

technology, faculty, peers, and learning materials. Fichten et al. (2009, p. 241)

have described the myriad of adaptive technologies which make it possible for

students with learning difficulties or disabilities to take part in activities that

were previously inaccessible, e.g. text-to-speech technology, chat

programmes, dictation software, and magnifying software.

Generally, video programmes created by the instructor are considered

useful and efficient by Mechling (2005, p. 32) as they “present concepts in a

systematic way, with repetition, in a relatively simple, non-distracting format

that focuses the learner”. Mechling (2005, p. 32) reported that watching a video combining sight, sound, and contextual information increased motivation among those with learning difficulties. While such videos take time to develop and create, they can be reused until the

content has to be changed or updated (Mechling, 2005, p. 32).


2.4 Summary of e-Tutorial Design and Creation Best Practices

The following is a summary of the key considerations for creating and

designing e-Tutorials identified in the literature review. The competitive product survey conducted for this project (see Appendix C) highlighted further current standards of quality and best practices in e-Tutorial design being used by educational institutions, which will be discussed later.

2.4.1 Content.

e-Tutorial slides should not be overwhelmed with text, as Magnussen (2008, p. 83) has condemned such practice as an obstacle to learning.

Content should be presented in a straightforward and systematic

way to assist students with and without learning difficulties as

Heiman (2006, p. 60) and Mechling (2005, p. 32) have suggested.

2.4.2 Visual.

The use of graphics, videos, and other media will encourage self-

directed learning in students and engagement with the technology

and course content, as argued by Liaw (2008, p. 869), and O’Toole

and Keating (2011, p. 30).

Videos embedded in the e-Tutorials should be short, i.e. thirty

seconds to one minute in length, with the most pertinent and

desirable content in the beginning. As Bowles-Terry et al. (2010) have argued, the content may be discussed afterwards.

2.4.3 Audio.

Bowles-Terry et al. (2010) suggest that the pace of narration in

video tutorials should be slightly slower than a regular

conversational tone.


2.4.4 Pace.

By providing navigation controls in the e-Tutorials students can

control the pace of learning and discover more information

themselves (Liaw, 2008, p. 869).

2.4.5 Engagement.

Multiple choice questions (MCQs), activities, and decision-making

scenarios should be used as a form of assessment, and as a

method of encouraging students to engage with the material, as

suggested by Arthur (2006), and O’Toole and Keating (2011, p. 30).

2.4.6 Accessibility.

Functionality, interactivity, and response times are three

characteristics to be considered (Pituch and Lee, 2006, p. 225) in

order to make e-Tutorials simple to use (Seiver and Troja, 2014,

p.19).

An alternative text version of the e-Tutorials should be provided for

students with varying learning styles and abilities as proposed by

Bowles-Terry et al. (2010, p. 24).

Bowles-Terry et al. (2010) have suggested that captions should be

included in the slides for improved accessibility.

A limited colour palette should be used to decrease distractions and accommodate students with colour blindness; a minimal contrast check along these lines is sketched below.
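To make the colour-palette and accessibility guidelines concrete, the following is a minimal sketch in Python (not part of the project toolchain; the slide colours shown are assumed for illustration) of how a foreground/background pairing could be checked against the WCAG 2.0 contrast-ratio formula before being used in a slide:

    def relative_luminance(rgb):
        # Relative luminance per WCAG 2.0, for 8-bit sRGB channels.
        def channel(c):
            c = c / 255.0
            return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
        r, g, b = (channel(c) for c in rgb)
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    def contrast_ratio(fg, bg):
        # WCAG 2.0 contrast ratio: (L1 + 0.05) / (L2 + 0.05), lighter colour first.
        l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
        return (l1 + 0.05) / (l2 + 0.05)

    # White text on a blue background (the exact blue used is assumed here).
    white, blue = (255, 255, 255), (0, 84, 159)
    ratio = contrast_ratio(white, blue)
    print("%.2f:1 - %s" % (ratio, "passes" if ratio >= 4.5 else "fails"))
    # 4.5:1 is the WCAG 2.0 AA minimum for normal body text.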


3. Method

The following section describes the method used to research and test the

two newly created (“Digital Footprint and Online Reputation Management”,

and “ProQuest Flow”), and two redeveloped e-Tutorials (“How to find an

Article”, and “Evaluating Digital Information”). The competitive product survey

is explained in conjunction with the reasons it was necessary to conduct

preliminary research to enhance the development of our e-Tutorials. In the

next section we justify our choice of mixed methods research design for the

usability testing, i.e. the combination of think aloud, eye-tracker, and follow-up

semi-structured interviews. We further explain how we collected and recorded

the data during testing. The data and the method of their collection are also discussed in terms of reliability and confidence for their subsequent use in revising the e-Tutorials.

The next section considers the ethics requirements and guidelines set out by the UCD Office of Research Ethics and describes the process of obtaining ethics approval. Following this, we

describe the method of sampling used to gather the participants, and our

reasoning behind this. The sample population profile is presented in terms of

age, gender, nationality etc.

3.1 Research Design

It is generally agreed that the motivation for choosing a mixed methods

research (MMR) approach must lie in the belief that it serves to enhance the quality of the study (Brannen, 1992; Bryman, 2001; Creswell, 2009; Fidel, 2008). An advantage of MMR is that it allows triangulation, the synchronisation of findings, and a superior degree of confidence in results

(Brannen, 1992, p. 63). This study aimed to utilise the triangulation of


qualitative data from think alouds and semi-structured interviews (audio recordings) with quantitative eye-tracker data (visual recordings). This method was chosen in order to validate results and to gather information that may have been missed in the think alouds. Creswell (2009, p. 15) has termed this form of research ‘concurrent mixed methods’, while Fidel (2008, p. 266) refers to it as ‘methods triangulation’, as the think aloud took place at the same

time as the eye-tracker session. This will be expanded upon within the

research design in the following sections.

MMR is not, however, a universal strategy (Bryman, 2001, p. 456). In this project, typical think aloud usability testing would not have been sufficient on its own, since the opinions of testers, while valuable, are subjective. We believed the quantitative analysis of eye-tracker data would resolve this.

Burke Johnson et al. (2007, p. 115) have listed several advantages of MMR that apply to this project, such as confidence in results, thicker and

richer data, and uncovering contradictions in the data gathered or theories. In

establishing our research design, the uncovering of contradictions or

inconsistencies in opinions given by respondents might be achieved through

the quantitative analysis of eye-tracker data.

3.1.1 Competitive Product Survey: Background and Justification.

While deciding which e-Tutorials to create and redevelop, a competitive

product survey was conducted to establish current best practices in e-Tutorial

creation. The competitive product survey is a form of benchmarking or

competitor analysis used by businesses in which market commonality is

identified, i.e. “the degree to which a given competitor overlaps with the focal

firm in terms of customer needs served” (Bergen and Peteraf, 2002).

According to Attiany (2014) benchmarking is “a systematic method by which


organisations can measure themselves against the best industry practices”

and “promote superior performance” (p. 41), a definition with which Shah and Kleiner (2011) agree. Many industries and firms use this form of analysis,

such as the Irish Food Board (Bord Bia), and Tobii Technology.

Thedesignexchange.org (n.d.) has described a competitive product survey as

a means to “collect, compare, and conduct evaluations of the product's

competition”.

The competitive product survey examined five well-known educational institutions that produce e-Tutorials: Codecademy, Duolingo, Khan Academy,

Monash University Library, and Universal Class. Codecademy was chosen

because any internet user can join this site, regardless of age or background,

which means a wide audience is catered for with various learning styles and

abilities. Duolingo’s highly interactive tutorial format was an important aspect to review, in order to establish the standards our e-Tutorials would have to reach.

Khan Academy teaches a wide variety of subjects, from history to cosmology. Its tutorials are designed for post-primary students and university students. Khan Academy was chosen due to its audience level and pedagogical

structure. Monash University Library e-Tutorials were examined due to their

slight similarity in content with the tutorials to be created in this project. And

finally, Universal Class was examined because of their tutorials’ well-defined

course goals and structure. We agreed with Shah and Kleiner (2011) that it is

important to establish the dimensions of quality relevant to our e-Tutorials.

Though Shah and Kleiner (2011) name twelve dimensions, we regarded six

as relevant to e-Tutorial creation, i.e. performance, quick response, features,

reliability, durability, and aesthetics. In order to meet these dimensions of


quality, the competitive product survey focussed on aspects of layout and

design, the presentation of information, and general usability.

It was important to conduct the competitive product survey to establish the

standards of design and usability to be reached in our e-Tutorials for several

reasons. First, as Walleck, O’Halloran, and Leader (1991, p. 6) have stated,

understanding one’s competitors’ strengths and weaknesses can lead to a

more effective formation of strategy, i.e. by understanding the positive aspects

as well as the faults of other institutions’ e-Tutorials we would be able to avoid

making the same mistakes. Secondly, companies must make predictions about their products before they have been released. Jain (2007, p. 28) has used

the example of forecasting to determine the correct quantity of a product to

produce. In the case of this project, we had to establish the criteria by which

students would judge our e-Tutorials, such as content, audio/visual elements,

and pace.

Thirdly, efficient execution in the early design phases can reduce costs

(Walleck, O’Halloran, and Leader, 1991). This may be useful for SILS in

budgeting for e-Learning tools in the future. Connected to this is the

advantage of time-saving. We anticipated that knowledge of the market

leaders’ tutorials would speed up the process of e-Tutorial creation, since we

would know what areas of design and development to spend the most time

on. Finally, Attiany (2014, p. 49) conducted a field study of industrial

companies listed in the Amman Stock Exchange, in which it was concluded

that the process of benchmarking assists business performance by retrieving

external knowledge and applying it to internal practices. Walleck et al. (1991,

p. 5) conceived three main reasons companies shy away from benchmarking:


a belief in the superiority of invention over imitation; moral or legal obstacles;

and a strong belief in a company’s uniqueness.

3.1.2 Think aloud.

The research design refers to the usability testing of the e-Tutorials. It was decided usability testing would consist of concurrent think aloud while participants completed an e-Tutorial. Each participant would complete a single randomly selected e-Tutorial, i.e. 5 participants per tutorial across 4 tutorials, a total of 20 participants (a sketch of this balanced allocation is given below). Think alouds would be audio recorded with the

participants’ permission. Concurrent think aloud (CTA) has been described as

the method in which participants verbalise their thoughts and opinions while

completing a specific task (Elling, Lentz, and de Jong, 2012, p. 206; Tobii

Technology, September 2009, p. 3). Think alouds were chosen due to the

need to gain an “insight into the cognitive processes and the obstacles

participants experience” as Elling et al. (2012, p. 206) have stated. In Van

Den Haak, De Jong, and Schellens’s (2003) usability testing of a library catalogue, the think alouds did not reduce the speed at which participants completed the task (p. 346); the authors nevertheless concluded that think alouds had a negative effect on task performance (Van Den Haak et al., 2003, p. 339).
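As an illustration of the allocation scheme described above, the following minimal Python sketch (the participant codes are placeholders, not the actual de-identified codes used in the study) generates a balanced random assignment of one e-Tutorial to each participant:

    import random

    TUTORIALS = ["Digital Footprint and Online Reputation Management",
                 "ProQuest Flow",
                 "How to find an Article",
                 "Evaluating Digital Information"]
    PER_TUTORIAL = 5  # 4 tutorials x 5 participants = 20 participants

    def assign(participants, seed=None):
        # Each participant receives exactly one randomly chosen tutorial,
        # with the groups kept balanced at PER_TUTORIAL participants each.
        assert len(participants) == len(TUTORIALS) * PER_TUTORIAL
        slots = [t for t in TUTORIALS for _ in range(PER_TUTORIAL)]
        random.Random(seed).shuffle(slots)
        return dict(zip(participants, slots))

    # Placeholder participant codes P01..P20.
    assignment = assign(["P%02d" % i for i in range(1, 21)])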

The literature has highlighted several disadvantages inherent in the use of

think alouds. Elling et al. (2012) found that some participants fell silent during

CTA when experiencing “cognitive-processing difficulties” (p. 207). Tobii Technology (September, 2009, p. 3) have written that participants sometimes look away from the task while trying to think or verbalise their opinions or difficulties; however, they believe it is not a problem if participants remain silent while reading or scanning the task, since they are formulating an opinion while doing so. Although Tobii Technology (September, 2009) favours retrospective think aloud (RTA), in which the participant completes the task and is afterwards presented with a replay of it during the interview and asked to comment on it, Van Den Haak et al. (2003, p. 350) have determined that think alouds and RTA can be considered equivalent while still being different methods of evaluation.

Transcripts were created from the think aloud audio files. Participant

information was removed from the audio files and transcripts were de-

identified in compliance with UCD Office of Research Ethics regulations.

These transcripts were read repeatedly to enable inductive logic. The

transcriptions highlighted the strengths and weaknesses of the new and revised e-Tutorials in the opinion of participants. Think alouds provided commentary relating to the layout and the content of the e-Tutorials. This collection and analysis of data focussed the improvement of the e-Tutorials on the weak points identified. The process also assisted in the creation of a best practice

guide for future e-Tutorial creation in SILS.

3.1.3 Eye-tracker.

While the participant was thinking aloud during the e-Tutorial, eye-tracking software recorded their eye-movements; such software tracks the eye-movements of a user sitting in front of the computer screen. According

to Just and Carpenter (1976, as cited in Cooke, 2005) the eye-tracking

methodology is based on their “eye-mind” hypothesis in which “the location of

a person’s gaze directly corresponds to the most immediate thought in the

person’s mind” (p. 458). The software must first be calibrated in order to track

the eye-movements of the participant. To do this the participant is asked to


hold their gaze on the centre of the target displayed on the computer screen

(Reingold, 2014, p. 639). The model used in this project was the Tobii T60, which allows some comfort for the user in that they do not have to wear any

apparatus or sit extremely still, an advantage noted by Cooke (2005, p. 457).

The primary reason the eye-tracker was chosen was that the qualitative data gathered in the think alouds and follow-up semi-structured interviews could be triangulated with its quantitative output. This triangulation would increase the

robustness of the project, by allowing the data collection methods to examine

the problem from different perspectives. As Tobii Technology (September,

2009, p. 3) has written, the eye-tracker collects objective numerical data along

with visual representations that can reveal additional information about

participant behaviour and the interface being tested. If the participant is

unable to verbalise certain thoughts, the eye-tracker will record the area being

fixated on, thus highlighting problem areas (Tobii Technology, September,

2009). Cooke (2005, p. 461) argued that eye-tracking software is a more reliable way to record user behaviour, given the limitations of the think aloud protocol described above. It may also reveal the reasons behind difficulties experienced by users who did not vocalise them (Cooke, 2005, p. 458).

All software, however, has its disadvantages, one of which has been highlighted by Reingold (2014, p. 639): calibration is not always perfect, as the eye is never completely still, and minor movements called microsaccades, drift, and tremor occur. In addition, Elling et al. (2012, p. 208) found

during their study that sometimes participants vocalised thoughts concerning


one part of the screen while looking somewhere else. This lack of

synchronicity may present difficulties in the data analysis phase of a project.

Eye-tracking software can be used for purposes other than usability testing, such as deciphering difficulties experienced by students in problem-solving (Tsai et al., 2012, p. 384). Eye-tracking is also used to test video games; Arieli, Ben-Ami, and Rubinstein (2011, p. 69) have found that eye-tracking is superior to mouse-recording software, such as MouseLab, in that natural and unconscious eye-movements are recorded.

3.1.4 Semi-structured interview.

Following the testing of the e-Tutorial with think aloud and eye-tracker,

participants were asked follow-up questions in the form of a brief, ten-minute,

semi-structured interview. Semi-structured interviewing is a method of inquiry

used to determine the interviewee’s “interpretation of phenomena” which they

believe to be important or relevant to the topic (Brannen, 1992, p. 72). While a

set of technical proficiency questions and general questions (see Appendix E)

had been established, the semi-structured interview format allowed some

flexibility. Using this method, more time could be spent on issues specifically

raised by each participant rather than a general line of inquiry. The semi-structured interview would enhance the collection of data, as the participants’ opinions would be the focus of attention while the quantitative data gathered by the eye-tracker was used to highlight or confirm certain issues (Bryman,

2001, p. 451).

Semi-structured interviewing can use both open-ended and closed-ended

questions but primarily utilises open-ended questions which result in

unpredictable answers. Bryman (2001, p. 145) listed some advantages of

open-ended questions: interviewees can use their own terminology;


responses may be unusual and highlight otherwise unknown issues; and it

allows the interviewer to explore issues in-depth. Similarly, the results of

Gibson’s (1998) study comparing semi-structured and unstructured

interviewing in the case of out-patient care revealed that both methods gave

the patient the opportunity to talk and express opinions which would not occur

with surveys or structured interviews. However, Denzin and Lincoln (1994) have discussed the advantages of structured interviewing. Because all questions are the same for each participant, with limited possible responses, data analysis is efficient. In addition, the fixed question schedule means that inexperienced interviewers would be able to conduct them (Denzin and Lincoln, 1994). Structured interviewing, however, was too formulaic for our testing, and did not allow enough flexibility in questioning the participants or in the participants’ responses.

3.2 Ethics

The sample population we recruited for usability testing of the e-Tutorials consisted of UCD SILS students. This group could be considered vulnerable as they

are students, but because usability testing was peer-to-peer there was no

power differential to make participants vulnerable. Permission was sought

from the head of the School of Information and Library Studies to conduct

usability testing with SILS students. The testing posed no risk above that of everyday life; no sensitive topics were addressed, and no invasive or physically/mentally stressful procedures were used.

A letter of information was written and given to participants (see Appendix

E). This letter outlined the usability testing procedure and what participants

were required to do. The aim of the document was to provide clarity for

participants so that they would not feel surprised and uncomfortable while


conducting the usability testing. The document also informed participants of

the value of their participation to the project. Finally, the document outlined the measures that would be taken to ensure participant anonymity, i.e. interview

recordings were stored on an encrypted storage device for the duration of the

study and will be destroyed at the end of the study, and interview transcripts

were de-identified. Participants were asked to read the document before

testing. Once they had read the document and confirmed that they fully

understood it, they were asked to sign a consent form (see Appendix F).

Usability testing took place on the UCD campus. The think aloud and eye-tracking portions of testing were conducted in the Computer Science Laboratory, while the interviews were conducted in Room 106 of the School of Information and Library Studies. This location was

chosen because it is a public setting familiar to participants, which ensured a

sense of security and comfort. Since the interviews were peer-to-peer and

conducted in a public setting, it reduced the risk of participants feeling

intimidated.

The reference number for the project’s ethics exemption is HS-E-14-38-

Dunne-Fulton (See Appendix A and B).

3.3 Selection of Participants

The aim of sampling, as stated by Sapsford and Jupp (1996, p.26), is not

only to save time and effort, but to obtain consistent and unbiased estimates

of the population regarding the topic being researched. An essential step in sampling is defining the population clearly and accurately. Once this has been accomplished, the type of sampling needs to be determined. The chosen form of sampling was non-probabilistic snowball sampling (Sapsford and Jupp, 1996, p. 29; Acharya et al., 2013, pp. 330-332).


Considering that this project relates directly to University College Dublin students, snowball sampling was chosen in the hope that each initial participant in the study would refer us to another student, and so on. This would enable a variety of UCD SILS students to participate and offer their honest opinions regarding the new and revised e-Tutorials. Moreover, snowball sampling allows for the “studying of characteristics of individuals in the population” (Handcock and Gile, 2011, p. 369). Additionally, it must be noted that the results of this study, drawn from 20 voluntary participants, are applicable only to the sample population of UCD SILS students and, therefore, may not be extrapolated to the population at large.
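The referral logic of snowball sampling can be sketched as a simple breadth-first walk over referral chains. The Python sketch below is illustrative only (all participant codes and chains are hypothetical) and was not part of the actual recruitment, which was conducted in person:

    from collections import deque

    def snowball(seeds, referrals, target):
        # Recruit breadth-first through referral chains until the target
        # sample size is reached or the chains are exhausted.
        recruited, queue = [], deque(seeds)
        while queue and len(recruited) < target:
            person = queue.popleft()
            if person not in recruited:
                recruited.append(person)
                queue.extend(referrals.get(person, []))
        return recruited

    # Hypothetical chains: two seed students each refer classmates, and so on.
    chains = {"S1": ["S3", "S4"], "S2": ["S5"], "S4": ["S6", "S7"]}
    print(snowball(["S1", "S2"], chains, target=5))
    # -> ['S1', 'S2', 'S3', 'S4', 'S5']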

3.4 Participant Profile

The usability testing took place from the 11th to the 25th of July 2014.

Twenty students were recruited and tested over this period via snowball

sampling. The participants that contributed to this study were University

College Dublin students; specifically students from within the School of

Information and Library Studies (SILS) department. Each participant completed one randomly selected e-Tutorial out of the four available, five students per e-Tutorial, for a total of 20 participants.

Of these 20 voluntary participants, 70% were female and 30% were male (Figure 1), with 40% of the participants between the ages of 18 and 24, 35% between the ages of 25 and 29, and 25% over the age of 30 (Figure 2).


Figure 1. Gender of Participants

Figure 2. Age Range of Participants


Additionally, 60% of the participants were from Ireland, while 5% were from another European country and 35% were from elsewhere in the world (Figure 3).

Figure 3. Nationality of Participants

Furthermore, 85% of participants were native English speakers, while the remaining 15% were not (Figure 4).

Figure 4. Native Language of Participants


3.5 Data Analysis Procedure

Audio recordings were transcribed and de-identified. The transcriptions

were read repeatedly to enable inductive logic. These transcriptions were coded under the following headings: audio, visual, and textual content. These

headings were informed by the literature review and the competitive product

survey.
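As a minimal sketch of this coding step (Python; the keyword lists are invented for illustration, and the actual coding was a manual, inductive reading of the transcripts rather than an automated process), transcript lines could be tagged under the three headings as follows:

    # Illustrative keyword lists only; the real coding was done by hand.
    CODES = {
        "audio": ["narration", "voice", "pace", "sound", "clear"],
        "visual": ["colour", "video", "graphic", "layout", "screen"],
        "textual content": ["text", "wording", "question", "slide"],
    }

    def code_line(line):
        # Return the coding headings whose keywords appear in a transcript line.
        lowered = line.lower()
        return {heading for heading, words in CODES.items()
                if any(word in lowered for word in words)}

    for line in ["The text was easily read and I thought the layout was grand.",
                 "I could understand every word she said. It was clear."]:
        print(sorted(code_line(line)), "-", line)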


4. Results

In presenting the results of our research, we begin by briefly introducing

the e-Tutorials that were created and those that were redeveloped for the

stakeholders, SILS. Next, the opinions of the 20 participants on the e-Tutorials

previously incorporated in SILS modules are outlined. The results of the

competitive product survey are presented in the second section, the areas of concern being performance, quick response, features, reliability, durability, and aesthetics. In the following section, the results of the usability testing are revealed.

4.1 e-Tutorials: Created and Redeveloped

4.1.1 Created.

Two e-Tutorials were created during this project for implementation in

SILS modules. As these e-Tutorials are new they have not been assigned to

modules yet but are expected to be incorporated into undergraduate and

postgraduate modules. The first e-Tutorial to be created was “Digital Footprint

and Online Reputation Management”. This e-Tutorial was chosen because it

is likely that this will be the students’ first encounter with the concept of a digital footprint. It is important for students to understand what effects their actions online have on their digital footprint. In addition, new students may not be

aware of some simple methods of protecting their online reputation, such as

switching to incognito mode in their chosen web browser.


Figure 5. Digital Footprint and Online Reputation Management

The second e-Tutorial chosen for creation was that of “ProQuest Flow”.

Flow is a new, free bibliographic management tool, developed by ProQuest,

which will enable students to improve their organisation, research, and

citation skills for SILS programmes. The tutorial explores the basic and

advanced features of the bibliographic management tool, including the option

to share references. The “ProQuest Flow” e-Tutorial will be embedded in

Masters level modules and has been designed with postgraduate students in mind, though it may also be recommended for incorporation in some undergraduate programmes. The tutorial will assist students in using Flow to

complete their assignments with correct citations.


Figure 6. ProQuest Flow

4.1.2 Redeveloped.

The first e-Tutorial that needed to be redeveloped was “How to find an

Article”. The tutorial was previously implemented in the IS20010: Advanced

Information Skills module. It is essential for undergraduate students who have

not encountered academic databases before and for those who have not

searched for peer-reviewed articles previously. The tutorial explains how to

search for citations and abstracts using the online database LISA. It further

explains how to find an electronic copy and a print copy of an article.


Figure 7. How to find an Article

The second e-Tutorial to be redeveloped was “Evaluating Digital

Information”. It is important to teach new students how to seek and identify

information of value and quality. Using credible references is an essential

aspect of academic assessments. The e-Tutorial was previously embedded in the undergraduate IS10050: Digital Judgement module. In “Evaluating Digital

Information”, students learn by viewing examples that demonstrate how to

decide whether an article is credible or reliable. For example, articles from

TheJournal.ie are used.


Figure 8. Evaluating Digital Information

4.2 Key Results from Usability Testing

(Please note: All values given are based on a population sample size of 20

participants (n=20), unless stated otherwise.)

4.2.1 Previous experience of e-tutorials.

All of the participants had experience of using previous SILS e-Tutorials

while enrolled in SILS courses. It was also important to know whether

participants had any previous experience with e-Tutorials outside of UCD. We

found that 5 participants (25%) in the sample had used e-Tutorials outside of

UCD.


Figure 9. Previous use of e-Tutorials (other than in SILS modules): Yes 25%, No 60%, Not Asked 15%

4.2.2 Layout.

Results showed that the new layout of the e-Tutorials received

positive comments. One participant commented, “The text was easily

read and I thought the layout and looks were grand”. 65% of the

participants remarked, during the usability testing, that the layout was

very consistent; many of these believed that it was an improvement on

the previous e-Tutorials.

Figure 10. e-Tutorial Layout: Positive Comments 65%, Basic Layout 5%, No Comment 30%


The following is a breakdown of the positive comments made on each e-

Tutorial.

Figure 11. e-Tutorial Layout breakdown

4.2.3 Visual.

The results on the visual elements of the e-Tutorials have been broken

into two sections. We first looked at the colour scheme and graphics of the e-

Tutorial, and then at the embedded videos. 70% of participants commented positively on the colour scheme of the slides. Quite a few

remarked on the blue background, noting that it kept them engaged

throughout the e-Tutorial.

“The white against the blue kind of stood out and they’re large

enough for me to read.”

(Participant)

They also commented that the white text on the blue background made it

easier for them to read the text on the screen. However, one participant found that the bright colours combined with the interactive shapes in the “Digital Footprint and Online Reputation Management” e-Tutorial were distracting.

Figure 12. e-Tutorial Colour Scheme: Positive 70%, Negative 5%, No Comment 25%

The following is a breakdown of the positive comments made by participants

on each e-Tutorial.

Figure 13. e-Tutorial Colour Scheme Positive Comments Breakdown (by e-Tutorial: Flow, How to Find an Article, Evaluating Digital Information, Digital Footprint)

50% of participants agreed that it was an improvement to have the videos embedded into the e-Tutorial, and that not doing so could disrupt the flow of the learning experience. However, 70% of the participants commented that they would like to see the videos and pictures enlarged, and many opined that being able to maximise the screen would result in a more suitable size.

Figure 14. Visual Element Size: Too Small 70%, No Comment 30%

The following is a breakdown of the comments made by participants on each

e-Tutorial.

Figure 15. Visual Elements Breakdown (by e-Tutorial: Flow, How to Find an Article, Evaluating Digital Information, Digital Footprint)


4.2.4 Audio.

The results concerning the auditory aspect of the e-Tutorials were

divided into two sections: clarity of audio, and pace of the narration.

90% of the participants found the audio clarity to be of a high quality,

and an improvement on previous e-Tutorials. As one participant

mentioned, “I could understand every word she said. It was clear.”

Comments referring to a suitable tone throughout and consistent audio

levels were noted more than once during the think aloud portion of

usability testing.

Figure 16. Audio Clarity: Positive 90%, No Comment 10%

The following is a breakdown of the positive comments made by participants

on each e-Tutorial.


Figure 17. Audio Clarity Breakdown (by e-Tutorial: Flow, How to Find an Article, Evaluating Digital Information, Digital Footprint)

In the case of the pace of narration, the results yielded a mixed review.

The majority of participants felt that the pace of the narration fell between a

‘well paced’ to an ‘acceptable’ standard.

“I found the audio was perfectly synced with the video and well paced

throughout.”

(Participant)

However, some of the participants who commented that the pace was either acceptable or slow were keen to highlight that this may be due to the fact that they already knew the content, and that the pace might be better for

students approaching the topic for the first time. Overall, 10% of the

participants thought that the pace was good throughout the e-Tutorial, but that

there were inconsistencies in a small selection of slides.


Figure 18. Narration Pace

4.2.5 Quiz.

In the quiz section of the e-Tutorials, we introduced an interactive element

to the questions. Rather than selecting the right answer from a list of options,

the interactive questions required students to select the correct area in the

displayed screenshot. Most of the postgraduate students had never seen

questions presented this way in the previous e-Tutorials. Out of the sample

group, 6 participants (30%) had trouble completing the question. All of these

mentioned that the instructions should be phrased more clearly to explain how

to complete the interactive questions. Overall, 70% of the participants thought

these questions were useful in aiding the learning process of the content. As

one participant remarked, “by doing the interactive questions it is almost

proving that you know you have just learned the lesson”. They also said that getting these types of questions right gave them a feeling of confidence; it reaffirmed that they understood what they had just learned in the e-Tutorial.
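The mechanics of these interactive questions amount to a simple hit-test: the student’s click is checked against the region of the screenshot marking the correct answer. The sketch below (Python; the coordinates and button name are invented, and the actual e-Tutorials implemented this through the authoring tool’s built-in hotspot feature) illustrates the idea:

    from dataclasses import dataclass

    @dataclass
    class Hotspot:
        # Axis-aligned rectangle marking the correct region of a screenshot.
        left: int
        top: int
        width: int
        height: int

        def contains(self, x, y):
            return (self.left <= x <= self.left + self.width
                    and self.top <= y <= self.top + self.height)

    # Hypothetical hotspot over a "Search" button in a database screenshot.
    search_button = Hotspot(left=412, top=96, width=80, height=28)
    print(search_button.contains(450, 110))  # True: click lands on the button
    print(search_button.contains(10, 10))    # False: click misses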


Figure 19. Interactive Quiz Questions: Positive 70%, No Comment 30%

4.2.6 Content.

Three separate areas of concern relating to e-Tutorial content were

highlighted in the results. They were: grammatical errors; step-by-step

processing; and use of knowledge gained on completion of the e-Tutorial.

Comments were received from seven participants (35%) regarding the grammatical errors they had noticed while viewing their respective e-Tutorials. The majority of these were minor errors, such as capital letters in the wrong places, or question marks omitted from the ends of questions. In

addition, some participants remarked that several of the quiz questions would

have to be rephrased for them to be clearly understood.

25% of the participants liked the step-by-step examples included in the e-Tutorials, and believed they would find these helpful when trying to remember

the steps for real-life situations.

“I thought it was good the way you blended different types of learning

throughout.”

(Participant)


The content of two e-Tutorials, “ProQuest Flow” and “How to find an Article”, describes fixed processes for accomplishing certain tasks. The

participants were asked if they would feel confident carrying out the tasks

covered in the respective tutorials upon their completion. Out of the ten

participants (n=10) asked, all 10 (100%) said that they would feel confident

carrying out the tasks.

4.2.7 Eye-tracker.

Results from the eye-tracker data showed that, on average, the participants were focused on the screen for 76% of the time.

Figure 20. Participant Time Focused on Screen

The eye-tracker data also revealed that the use of repetition had an adverse effect on participants’ concentration: their engagement levels dropped on average by 33%.
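A figure such as the 76% average could be derived from exported gaze samples along the following lines; this is a minimal Python sketch assuming a per-sample on-screen validity flag, not the actual Tobii T60 export format or field names:

    def on_screen_proportion(samples):
        # Fraction of gaze samples in which the participant's gaze was
        # recorded on the screen; 'on_screen' is an assumed field name.
        if not samples:
            return 0.0
        return sum(1 for s in samples if s["on_screen"]) / len(samples)

    # Toy data: 3 of 4 samples on screen -> 75% (cf. the 76% average reported).
    samples = [{"on_screen": True}, {"on_screen": True},
               {"on_screen": False}, {"on_screen": True}]
    print("%.0f%%" % (100 * on_screen_proportion(samples)))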


5. Discussion

In this study two existing e-Tutorials, “How to find an Article” and

“Evaluating Digital Information” were revamped. The “How to find an Article”

e-Tutorial from the Advanced Information Skills module teaches students how

to use a database, how to find an electronic copy of an article, and how to find

a hardcopy of an article. The “Evaluating Digital Information” e-Tutorial from

the Digital Judgement module, discusses why we evaluate digital sources, the

21st Century’s Information Fluency Model of Evaluation, and different ways to

evaluate information from social media sites.

The “ProQuest Flow” and “Digital Footprint and Online Reputation

Management” e-Tutorials were created from scratch. The “ProQuest Flow” e-

Tutorial illustrates how to set up a Flow account. It also demonstrates the

basic features and some of the more advanced features of Flow. Finally, the

“Digital Footprint and Online Reputation Management” e-Tutorial explains

what a digital footprint is, what a filter bubble is, and how to manage your

digital footprint.

This chapter begins by discussing participant views of previous e-

Tutorials. It then outlines the findings of our Competitive Product Survey. The

e-tutorial development and usability testing findings are presented in the third

and fourth sections respectively. The findings relating to our interactive assessments are then discussed. The sixth section highlights technical and

aesthetic features relating to textual, auditory and visual elements that

emerged in the research. This section concludes with the eye-tracker

analysis.


5.1 Competitive Product Survey

The competitive product survey looked at five reputable institutions that

produce e-Tutorials: Codecademy, Duolingo, Khan Academy, Monash

University Library, and Universal Class. These e-Tutorials were analysed

using six dimensions of quality identified by Shah and Kleiner (2011), i.e.

performance, quick response, features, reliability, durability, and aesthetics.

The performance dimension assessed how well the following dimensions

were blended together. The e-Tutorials’ response times were compared. In this study the e-Tutorial features analysed were ease of navigation and interactive quizzes. The e-Tutorials’ reliability and durability depended on the

accuracy and timeliness of the information presented. Finally, the aesthetic

dimension related to the visual attractiveness of each e-Tutorial.

Some of the key findings include Codecademy’s step-by-step guide, which made navigation easy and enhanced usability. Duolingo featured a Progress Tracker. The strong audio/visual component in Khan Academy

made these e-Tutorials very appealing to users. The test questions in Monash

University Library’s tutorials promoted user engagement. Finally, Universal

Class was a prime example of consistent branding within e-Tutorials. The

competitive product survey also found that the inclusion of bright and colourful

images, along with auditory, textual and visual elements and a consistent

layout enhance the user experience. Finally, the analysis found that pace of delivery impacted on users’ ability to absorb content. The findings of our

competitive product survey are summarized in Table 1.


Table 1. Summary of Competitive Product Survey Findings: six dimensions of quality (performance, quick response, features, reliability, durability, and aesthetics) assessed across the five institutions (Codecademy, Duolingo, Khan Academy, Monash University Library, and Universal Class).

5.2 e-Tutorial Development

Armed with this external knowledge, we sought to apply it to the e-Tutorial

development (Attiany, 2014, p.49). The two module instructors were heavily

involved in the “editorial process of gathering, creating new, and selecting

suitable educational material” for inclusion in the e-Tutorials (Mwanza and

Engestrom, 2005, p. 458). Their involvement helped to ensure that the

content remained beneficial to the student and up-to-date (Islam, 2013, p.

397). Each of the four e-Tutorials underwent a number of iterations until they

provided the optimum user experience and satisfied the dimensions of quality

identified in the competitive product survey. The “How to find an Article” and

“ProQuest Flow” e-Tutorials underwent three iterations. In contrast,


“Evaluating Digital Information” underwent five iterations, while the “Digital Footprint and Online Reputation Management” e-Tutorial was brought through six. A greater number of iterations was necessary for these latter two because one third of each of those e-Tutorials was video content. Although the video content took longer to develop and create, it can be reused until the content has to be changed or updated (Mechling, 2005, p. 32). Figure 21

illustrates the iteration process of the “Evaluating Digital Information” e-

Tutorial. Once the final phases of the iteration process were complete, we

proceeded to the usability testing phase of the project.

Figure 21. Evaluating Digital Information e-Tutorial Iteration Process

5.3 Usability Testing

The participants’ prior experience of e-Tutorials meant that they had time

to become comfortable with the various technologies involved in e-Tutorials

(Barreau, 2000, p. 90-91). An analysis of the data revealed that all

participants had a positive response to the e-Tutorial length. This supports

O’Toole and Keating’s (2011) finding that e-Tutorials should be short and

require lots of interaction to engage the user (pp. 30-31). The participants found the pause and skip features easy to use (Seiver and Troja, 2014, p. 91; Islam, 2013, p. 397), allowing them the flexibility to focus on the learning itself, as they were not “bound by time or place” (Barreau, 2000, pp. 80-81). These

features along with the interactive quizzes and decision-making scenarios


increased participant interaction (O’Toole and Keating, 2011, p.30) and

personalized the experience by allowing students to focus on particular areas

(Wang, 2014, p.200).

Participants found the content of the new e-Tutorials, “Digital Footprint and Online Reputation Management” and “ProQuest Flow”, relevant and beneficial to the learning objectives of the course. In line with Jagannathan and Blair (2013) and Biggs and Tang (2011), they were therefore more engaged and motivated to learn. In each of the four e-Tutorials the findings

supported Liaw (2008) and Robinson and Bawden (2002) in that participants

found the incorporation of a variety of learning media, such as graphics,

videos and audio appealing.

5.4 Interactive Assessment

Armellini and Aiyegbayo (2010, p. 934) found that assessments help to ensure participants complete e-tivities; interactive quizzes were therefore integrated into the e-Tutorials. The quizzes incorporated multiple assessment

methods to enhance the effectiveness of the MCQs (Douglas, Wilson, and

Ennis, 2012). In addition to MCQs, the questions included interactive

visualisations and practical applications. The interactive quizzes were both

formative and summative assessments as they aided participants in their

learning while also recognising their achievement if they successfully

answered all ten questions (Boud, 2000, p. 155). All of the participants had a

positive attitude towards the interactive quizzes as they felt empowered to

self-direct their own learning (Douglas, Wilson, and Ennis, 2012, p. 116).

However, since the questions were drawn from an instructor’s perspective,

some participants highlighted questions where the phraseology was


ambiguous or where the answer had not been explicit within the e-Tutorial

(Nicol, 2007).

Two of the participants questioned the long term learning benefits of

completing a quiz immediately following an e-Tutorial. This supports Scouller (1998), who drew parallels between a surface learning approach, i.e. memorisation and recall, and the successful completion of MCQs.

Nonetheless, since the interactive quizzes provided direct feedback upon

completion participants were empowered to self-regulate their own

performance (Nicol and Macfarlane-Dick, 2006, p. 205). Less than half of the

participants availed of this opportunity to revisit the material and increase their

learning, reinforcing Yorke’s (2003) finding that feedback is only as good as

the student’s response to it. It is worth noting that, for the purposes of testing, the participants were not required to achieve full marks; however, SCORM allows module instructors to make successful completion of the quiz mandatory for completion of the e-Tutorial.

5.5 Textual, Visual, and Auditory Elements

In accordance with the legal requirements set forth by the U.S.

Department of Education’s Office for Civil Rights (OCR), specific

“accommodations and modifications” were incorporated into the design of the

e-Tutorials to “avoid disability based discrimination” (Hashey and Stahl, 2014,

p. 72). These accommodations included content being presented through

textual, auditory and visual elements to provide greater accessibility and

learning opportunities (Hashey and Stahl, 2014, p. 71; Heiman, 2006, p. 57).

Full keyboard navigation controls were also integrated to ensure students

were continually in control of the learning process (Gornitsky, 2011; Heiman,

2006) and could successfully interact with the learning material (Seale and


Cooper, 2010). Other accommodations such as use of a limited colour palette,

pace of narration and step-by-step integrated demonstrations, are discussed

under their relevant headings.

Figure 22. Digital Footprint and Online Reputation Management


5.5.1 Textual.

Figure 23. Evaluating Digital Information

A limited colour palette was used to accommodate students with visual difficulties; nevertheless, 95% of those tested found the colours used in the interface appealing and easy to read. The user experience was further

enhanced by dynamic graphics and text animation (Seiver and Troja, 2014, p.

91). One participant thought red may have been more effective; however, red had been deliberately omitted from the design, in keeping with the findings that people with visual difficulties find it difficult to distinguish between red tones. A full script of the narration was available at the click of a button to successfully facilitate learning for participants with auditory difficulties (Seale and Cooper, 2010, p. 1111). Although none of the participants availed of this option, all comments relating to core text in the e-Tutorials were positive.


5.5.2 Visual.

Each of the four e-Tutorials integrated video demonstrations, while the

“ProQuest Flow” e-Tutorial also incorporated static demonstrations. Although

some issues arose regarding the size and duration of the demonstrations, all of the participants welcomed their inclusion, supporting Bowles-Terry et al.’s (2010) finding that some students prefer to watch and listen to videos rather than read (p. 24). The data analysis revealed that all participants found the embedded videos preferable to the previous pop-out model. The participants also supported Heiman’s (2006) findings that students, particularly those with learning disabilities, prefer the ‘stepwise processing’ of the integrated demonstrations.

The videos did not conform to Bowles-Terry et al.’s (2010) recommendation that integrated videos should be no longer than one minute in length; consequently, all of those testing the “Digital Footprint and Online Reputation Management” e-Tutorial found the videos too long, a finding corroborated by the eye-tracker data. The high technical proficiency levels and prior experience of e-Tutorials were evident when all of the “Digital Footprint and Online Reputation Management”

participants found the video detailing how to use a digital footprint calculator

repetitive. In a similar way, two of the “How to find an Article” participants

commented on the need to sign into the library prior to opening the database.

These findings illustrate that a participant’s prior knowledge and experience

impacts on the necessary content and may in some instances negate the

need for repetition as recommended by Mechling (2005).

5.5.3 Auditory.

In keeping with best practice, the pace of the narration was marginally slower than a regular conversational tone (Bowles-Terry et al., 2010, p. 24).


Although one participant found the audio level and speed of narration fluctuated slightly during the “Evaluating Digital Information” e-Tutorial, the

pace of narration was well received by all other participants. The analysis

found that the multiple iterations prior to testing ensured “poor audio quality”

did not impact on the project’s findings (Wang, 2007, p. 303).

5.6 Eye-Tracker Analysis

An analysis of the eye-tracker data and think aloud transcripts supported Cooke’s (2005, p. 461) and Elling et al.’s (2012, p. 218) findings that the qualitative and quantitative sources complement each other. We were able to crosscheck the eye-tracker’s observational data with the verbalised think aloud data. Although an analysis of the eye-tracker data revealed the microsaccades identified by Reingold (2014, p. 639), these did not compromise the data, as the e-Tutorials did not require participants to focus on minute details.

Figure 24. Example of Eye-Tracker Data


6. Conclusion

A student-centred approach was taken in the creation, design, and

usability of the e-Tutorials in this project. This was established as the appropriate stance in the literature review; O’Toole and Keating (2011) argued that a focus on pedagogy leads to successful e-Tutorial design. Functional benchmarking was conducted through the competitive product survey to establish the best practices used by leading educational institutions. The

knowledge gained from these institutions was considered invaluable to the

design process because creating the best possible e-Tutorials for students

was the primary goal in this project. In addition, the data gathered in the

usability testing was accepted as vital to the improvement of the e-Tutorials.

For future successful e-Tutorial creation in SILS, the input and opinions of students, i.e. potential e-Tutorial users, must be given their deserved weight and attention.

The timeframe allocated to the creation and usability testing phases was

short compared to industry standards. However, the speed of improvements

in the design iteration processes compensated for this. Final improvements

and adjustments were made to the e-Tutorials following usability testing.

Although a second round of usability testing was not possible before implementation in SILS modules, due to time constraints, the strict adherence to the opinions of students is expected to be sufficient. The final step to

complete before students can view the e-Tutorials in the relevant modules is

to upload SCORM compatible versions to Blackboard. This process is

scheduled for Autumn 2014 pending approval from the module coordinators.


7. Recommendations

Upon completion of the results and discussion sections, a best practice guide for creating an e-Tutorial (Table 2) was drawn from this study’s findings, supported by the literature review. Where our findings were incongruent with the literature review, every effort was made to find out why. This best practice guide has been specifically tailored to meet the SILS stakeholders’ needs; however, the core concepts are universal to all e-Tutorial development. The best practice guide has been presented in eight

consecutive stages of creation.

The following is a list of recommendations for SILS:

The best practice guide for creating an e-Tutorial (Table 2)

enhances the existing UCD e-Learning strategy and provides a

roadmap for its next evolution. It should therefore be adopted

across all faculties developing e-Tutorials to supplement existing

course content.

An initial capital investment of approximately €1,000 to procure an additional computer is necessary to maximise the productivity of e-Tutorial development. Further investment may be sought pending research into

available e-Tutorial development tools.

In order to maximize the return on this investment, e-Tutorials

should be presented to industry experts annually for feedback.

These presentations also serve as opportunities to increase SILS

visibility and to expand the e-Tutorial market; particularly in external

libraries.


E-Tutorial Strategy

E-Tutorials supplement traditional teaching and afford students

opportunities to revisit core concepts

E-Tutorials created by SILS include SILS branding to distinguish them from

other e-Tutorials in UCD

Support SILS students both academically and technically with regard to e-

Tutorials

E-Tutorial Development

Seek approval from UCD Ethics Board for user testing

Conduct a competitive product survey to review best practice in e-Tutorial

development

Choose e-Tutorial development tools compatible with SCORM, e.g. Articulate, and explore their capabilities

Analyse e-Tutorial compatibility with the medium through which they will be delivered

Conduct multiple iteration cycles until e-Tutorial reaches maximum maturity

E-Tutorial Design

Design and test content in consultation with SILS experts

Optimum duration for e-Tutorial is no more than 15 minutes

Define and communicate learning outcomes from the outset of the e-

Tutorial

Auditory, Visual and Textual Elements

Incorporate findings from competitive product survey in content design

Ensure e-Tutorials adhere to highest accessibility standards for students

with learning difficulties and disabilities

Vary content delivery to cater for multiple learning styles


Optimum duration for integrated demonstrations is no more than one minute

Record clear narration paced marginally slower than a regular

conversational tone using existing SILS equipment

Interactive Assessments

Incorporate multiple question formats i.e. MCQs, hotspots and practical

applications

Ensure questions are explicit and reflect learning outcomes of e-Tutorial

Maximise user engagement by concealing assessment scores until

completion

Usability Testing

Create semi-structured interview protocol, letter of information and consent

form

Test a minimum of 5 participants per e-Tutorial

Conduct retrospective think aloud using eye-tracker data

Conduct a brief semi-structured interview to expand on issues raised in

think aloud

Final Iteration

Analyse data from eye-tracker, think aloud and semi-structured interviews

Make necessary amendments to e-Tutorial

Publish e-Tutorial on SILS network

Annual Review

Conduct online survey of relevant SILS students assessing e-Tutorial

Consult with SILS experts to ensure e-Tutorial content is still relevant and

complements core learning outcomes of SILS

Table 2. Best Practice Guide for creating an e-Tutorial
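To make the "conceal scores until completion" recommendation concrete, the sketch below shows one way to record answers while deferring the score until every question has been attempted. This is illustrative only (it is not the implementation used in the project's authoring tool); the class and question identifiers are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Quiz:
    answer_key: dict          # question id -> correct option, e.g. {"q1": "b"}
    responses: dict = field(default_factory=dict)

    def answer(self, question_id: str, option: str) -> None:
        # Record the response without revealing whether it is correct.
        self.responses[question_id] = option

    def score(self) -> float:
        # Per the guide, the score is only revealed on completion.
        if set(self.responses) != set(self.answer_key):
            raise RuntimeError("Score concealed until all questions are answered")
        correct = sum(self.responses[q] == a for q, a in self.answer_key.items())
        return correct / len(self.answer_key)

quiz = Quiz({"q1": "b", "q2": "d"})
quiz.answer("q1", "b")
quiz.answer("q2", "c")
print(f"Final score: {quiz.score():.0%}")  # only now is the result shown

Withholding per-question feedback in this way encourages students to attempt the full assessment rather than abandoning it after an early wrong answer.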


8. References

Acharya, A. S., Prakash, A., Saxena, P., & Nigam, A. (2013). Sampling: Why

and how of it? Indian Journal of Medical Specialities, 4(2), 330-333. doi:

10.7713/ijms.2013.0032

Amirault, R. J. (2012). Distance learning in the 21st century university: Key

issues for leaders and faculty. Quarterly Review of Distance Education,

13(4), 253-265.

Armellini, A., & Aiyegbayo, O. (2010). Learning design and assessment with e-

tivities. British Journal of Educational Technology, 41(6), 922-935.

Arthur, N. (2006). Using student-generated assessment items to enhance

teamwork, feedback and the learning process. Synergy: Supporting the

Scholarship of Teaching and Learning at the University of Sydney, 24,

21–23. Retrieved 25th May, 2014, from:

http://www.itl.usyd.edu.au/synergy/article.cfm?articleID=283

Attiany, M. (2014). Competitive advantage through benchmarking: Field study

of industrial companies listed in Amman Stock Exchange. Journal of

Business Studies Quarterly, 5(4), 41-51.

Babb, K. A., & Ross, C. (2009). The timing of online lecture slide availability

and its effect on attendance, participation, and exam performance.

Computers & Education, 52(4), 868-881. doi:

10.1016/j.compedu.2008.12.009

Banas, E., & Emory, F. (1998). History and issues of distance learning. Public

Administration Quarterly, 22, 365-383.

Barreau, D. (2000). Distance learning: Beyond content. Journal of Education

for Library and Information Science, 41, 79-93. doi: 10.2307/40324057


Bhati, N., Mercer, S., Rankin, K., & Thomas, B. (2010). Barriers and facilitators

to the adoption of tools for online pedagogy. International Journal of

Pedagogies & Learning, 5(3), 5-19.

Biggs, J. & Tang C. (2011). Teaching for quality learning at university: What

the student does. (4th ed.) Berkshire, England: Open University Press.

Boud, D. (2000). Sustainable Assessment: rethinking assessment for the

learning society. Studies in Continuing Education, 22(2), 151-167. doi:

10.1080/01580379950177314

Bowden, J. L. (2013). What's in a relationship? Asia Pacific Journal of

Marketing and Logistics, 25(3), 428-451. doi: 10.1108/APJML-07-2012-

0067

Bowles-Terry, M., Hensley, M. K., & Hinchliffe, L. J. (2010). Best practices for

online video tutorials in academic libraries: A study of student

preferences and understanding. Communications in Information

Literacy, 4(1), 17-28.

Brannen, J. (1992). Mixing methods: qualitative and quantitative research.

Brookfield, USA: Avebury.

Brown, S. (2010). From VLEs to learning webs: the implications of Web 2.0 for

learning and teaching. Interactive Learning Environments, 18(1), 1-10.

doi: 10.1080/10494820802158983

Brunvand, S., & Abadeh, H. (2010). Making online learning accessible.

Intervention in School & Clinic, 45(5), 304-311.

Bryman, A. (2001). Social research methods. Oxford: Oxford University Press.

Canning, J. (2007). Pedagogy as a discipline: emergence, sustainability and

professionalisation. Teaching in Higher Education, 12(3), 393-403. doi:

10.1080/13562510701278757


Cheawjindakarn, B., Suwannatthachote, P., & Theeraroungchaisri, A. (2012).

Critical success factors for online distance learning in higher education:

A review of the literature. Creative Education, 3, 61-66. doi:

10.4236/ce.2012.38b14

Concannon, F., Flynn, A., & Campbell, M. (2005). What campus-based

students think about the quality and benefits of e-Learning. British

Journal of Educational Technology, 36(3), 501-512. doi:

10.1111/j.1467-8535.2005.00482.x

Cooke, L. (2005). Eye tracking: How it works and how it relates to usability.

Technical Communication, 52(4), 456-463.

Coopman, S. (2009). A critical examination of Blackboard’s e-Learning

environment. First Monday, 14(6). doi: 10.5210/fm.v14i6.2434

De Vries, F., Kester, L., Sloep, P., van Rosmalen, P., Pannekeet, K., & Koper,

R. (2005). Identification of critical time-consuming student support

activities in e-Learning. ALT-J Research in Learning Technology, 13(3),

219–229. doi: 10.1080/09687760500376488

Deepwell, F., & Malik, S. (2008). On campus, but out of class: An investigation

into students’ experiences of learning technologies in their self-directed

study. ALT-J Research in Learning Technology, 16(1), 5–14. doi:

10.1080/09687760701850166

Denzin, N. K., & Lincoln, Y. S. (eds.) (1994). Handbook of qualitative research.

London: Sage Publications.

DePietro, P. (2012). Transforming Education with New Media: Participatory

Pedagogy, Interactive Learning and Web 2.0. International Journal of

Technology, Knowledge & Society, 8(5), 1-11.


DiRienzo, C., & Lilly, G. (2014). Online versus face-to-face: Does delivery

method matter for undergraduate business school learning? Business

Education & Accreditation, 6(1), 1-11.

Distance Learning (2014). In Merriam-Webster online. Retrieved 30th May,

2014, from: http://www.merriam-webster.com/dictionary/distance%20learning.

Douglas, M., Wilson, J., & Ennis, S. (2012). Multiple-choice question tests: a

convenient, flexible and effective learning tool? A case study.

Innovations in Education and Teaching International, 49(2), 111–

121. doi: 10.1080/14703297.2012.677596

Ehlers, U. D., & Pawlowski, J. M. (2006). Handbook on quality and

standardisation in e-learning. Berlin: Springer.

Elling, S., Lentz, L., & de Jong, M. (2012). Combining concurrent think-aloud

protocols and eye-tracking observations: An analysis of verbalizations

and silences. IEEE Transactions on Professional Communication,

55(3), 206-220. doi: 10.1109/TPC.2012.2206190

Fellenz, M. (2004) Using assessment to support higher level learning: the

multiple choice item development assignment. Assessment and

Evaluation in Higher Education, 29(6), 703-719. doi:

10.1080/0260293042000227245

Fichten, C. S., Ferraro, V., Asuncion, J. V., Chwojka, C., Barile, M., Nguyen,

M. N., & ... Wolforth, J. (2009). Disabilities and e-Learning problems

and solutions: An exploratory study. Journal of Educational Technology

& Society, 12(4), 241-256.

Fisher, L. (2013, Jan 24). Accelerated learning. Marketing Week (Online).

Retrieved 6th June, 2014, from:

http://www.marketingweek.co.uk/analysis/supplements/idm-training-guide-2013/accelerated-learning/4005256.article

Gibson, C. C. (1998). Semi-structured and unstructured interviewing: a

comparison of methodologies in research. Journal of Psychiatric &

Mental Health Nursing, 5(6), 469.

Gornitsky, M. (2011). Distance education: Accessibility for students with

disabilities. Distance Learning, 8(3), 47-53.

Handcock, M. S., & Gile, K. J. (2011). Comment: On the concept of snowball

sampling. Sociological Methodology, 41, 367-XI.

Hashey, A., & Stahl, S. (2014). Making online learning accessible for students

with disabilities. Teaching Exceptional Children, 46(5), 70-78.

Hauser, R., Paul, R., Bradley, J., & Jeffrey, L. (2012). Computer self-efficacy,

anxiety, and learning in online versus face to face medium. Journal of

Information Technology Education, 11, 141-154.

Heiman, T. (2006). Assessing learning styles among students with and without

learning disabilities at a distance-learning university. Learning

Disability Quarterly, 55-63.

Horspool, A., & Lange, C. (2012). Applying the scholarship of teaching and

learning: student perceptions, behaviours and success online and face-

to-face. Assessment & Evaluation in Higher Education, 37(1), 73-88.

doi: 10.1080/02602938.2010.496532

Islam, A.K.M. (2013). Investigating e-Learning system usage outcomes in the

university context. Computers & Education, 69, 387-399.

Jagannathan, U., & Blair, R. (2013). Engage the disengaged. Distance

Learning, 10(4), 1-7.


Kanuka, H., & Kelland, J. (2008). Has e-Learning delivered on its promises?

Expert opinion on the impact of e-Learning in higher education.

Canadian Journal of Higher Education, 38(1), 45-65.

Kember, D., & Leung, D.Y.P. (2006). Characterising a teaching and learning

environment conducive to making demands on students while not

making their workload excessive. Studies in Higher Education, 31(2),

185–198. doi: 10.1080/03075070600572074

Keramidas, C. (2012). Are undergraduate students ready for online learning?

A comparison of online and face-to-face sections of a course. Rural

Special Education Quarterly, 31(4), 25-32.

Kopp, B., Matteucci, M., & Tomasetto, C. (2012). E-Tutorial support for

collaborative online learning: An explorative study on experienced and

inexperienced e-tutors. Computers & Education, 58(1), 12-20. doi:

10.1016/j.compedu.2011.08.019

Land, R., & Bayne, S. (2005). Screen or monitor? Issues of surveillance and disciplinary power in online learning environments. In R. Land & S. Bayne (Eds.), Education in cyberspace (pp. 165-178). London: Routledge Falmer.

Latham, D., & Smith, S. (2003). Practicing what we teach: A descriptive

analysis of library services for distance learning students in ALA-

accredited LIS schools. Journal of Education for Library and Information

Science, 44, 120-136.

Laurillard, D. (2002). Rethinking university teaching: a conversational

framework for the effective use of learning technologies. London:

Routledge Falmer.


Liaw, S. (2008). Investigating students' perceived satisfaction, behavioral

intention, and effectiveness of e-Learning: A case study of the

Blackboard system. Computers & Education, 51(2), 864-873.

Magnussen, L. (2008). Applying the principles of significant learning in the e-

Learning environment. Journal of Nursing Education, 47(2), 82-86.

Margolis, L. M., Grediagin, A., Koenig, C., & Sanders, L. F. (2009).

Effectiveness and acceptance of web-based learning compared to

traditional face-to-face learning for performance nutrition education.

Military Medicine, 174(10), 1095-1099.

Mechling, L. (2005). The effect of instructor-created video programmes to

teach students with disabilities: A literature review. Technology and

Media Division of the Council for Exceptional Children, University of

Oklahoma.

Morgan, J. H. (2013). Re-Inventing the Tutorial in an Internet World: An

Enhancement of an Old English Tradition. Journal of Alternative

Perspectives in the Social Sciences, 5(3), 522-532.

Munck, R., & McConnell, G. (2009). University Strategic Planning and the

Foresight/Futures Approach: An Irish Case Study. Planning For Higher

Education, 38(1), 31-40.

Mwanza, D., & Engeström, Y. (2005). Managing content in e-Learning

environments. British Journal of Educational Technology, 36(3), 453-

463.

Nicol, D. (2007). E-assessment by design: using multiple-choice tests to good

effect. Journal of Further and Higher Education, 31(1), 53–64. doi:

10.1080/03098770601167922


Nicol, D. & Macfarlane-Dick, D. (2006). Formative assessment and self-

regulated learning: a model and seven principles of good feedback

practice. Studies in Higher Education, 34(2), 199–218. doi:

10.1080/03075070600572090

O'Toole, S., & Keating, G. (2011). What makes a good e-Learning

programme? Training & Development in Australia, 38(5), 030-032.

Pena, M., & Yeung, A. (2010). Satisfaction with online learning: does students'

computer competence matter? International Journal of Technology,

Knowledge & Society, 6(5), 97-108.

Pituch, K. A., & Lee, Y. (2006). The influence of system characteristics on e-

Learning use. Computers & Education, 47(2), 222-244.

Pymm, B., & Hay, L. (2014). Using Etherpads as Platforms for Collaborative

Learning in a Distance Education LIS Course. Journal of Education for

Library & Information Science, 55(2), 133-149.

Reingold, E. M. (2014). Eye tracking research and technology: Towards

objective measurement of data quality. Visual Cognition, 22(3), 635-

652. doi: 10.1080/13506285.2013.876481

Reyes, J. (2013). Transactional distance theory: is it here to stay? Distance

Learning, 10(3), 43-50.

Robinson, L., & Bawden, D. (2002). Distance learning and LIS professional

development. Aslib Proceedings, 54, 48-55. doi:

10.1108/00012530210426284

Rosch, D. M., & Anthony, M. D. (2012). Leadership Pedagogy: Putting Theory

to Practice. New Directions for Student Services, 2012(140), 37-51.

doi:10.1002/ss.20030


Salmon, G. (2004). E-moderating: the key to teaching and learning online (3rd

ed.). New York: Routledge Falmer.

Sapsford, R., & Jupp, V. (1996). Data collection and analysis (Rev. ed.).

London: SAGE Publications in association with the Open University.

Schworm, S., & Gruber, H. (2012). e-Learning in universities: Supporting help-

seeking processes by instructional prompts. British Journal of

Educational Technology, 43(2), 272-281. doi: 10.1111/j.1467-

8535.2011.01176.x

Scouller, K. (1998). The influence of assessment method on students’ learning

approaches: Multiple choice question examination versus assignment

essay. Higher Education, 35(4), 453-472.

Seale, J., & Cooper, M. (2010). E-Learning and accessibility: An exploration of

the potential role of generic pedagogical tools. Computers & Education,

54(1), 1107-1116.

Seiver, J., & Troja, A. (2014). Satisfaction and success in online learning as a

function of the needs for affiliation, autonomy, and mastery. Distance

Education, 35(1), 90-105. doi: 10.1080/01587919.2014.891427

Shah, D., & Kleiner, B. H. (2011). Benchmarking for quality. Industrial

Management, 53(2), 22-25.

Smith, S., & Basham, J. (2014). Designing online learning opportunities for

students with disabilities. Teaching Exceptional Children, 46(5), 127-

137.

Steiner, S. D., & Hyman, M. R. (2010). Improving the student experience:

allowing students enrolled in a required course to select online or face-

to-face instruction. Marketing Education Review, 20(1), 29-33. doi:

10.2753/MER1052-8008200105


Stella, A., & Gnanam, A. (2004). Quality assurance in distance education: The

challenges to be addressed. Higher Education, 143-160.

Stewart, C. M., Schifter, C. C., & Markaridian Selverian, M. E. (Eds.) (2010).

Teaching and learning with technology: Beyond constructivism. New

York: Routledge.

Tanyel, F., & Griffin, J. (2014). A ten-year comparison of outcomes and

persistence rates in online versus face-to-face courses. B>Quest, 1-22.

Ter-Stepanian, A. (2012). Online or face to face?: instructional strategies for

improving learning outcomes in e-Learning. International Journal of

Technology, Knowledge & Society, 8(2), 41-50.

Tobii Technology. (2009, September). Guidelines for using the retrospective think aloud protocol with eye tracking. Retrieved 8th July, 2014, from: http://www.tobii.com/Global/Analysis/Training/WhitePapers/RTA_guidelines_eyetracking_tobii_shortpaper.pdf

Tsai, M. J., Hou, H. T., Lai M. A., Liu, W. Y., & Yang, F. Y. (2012). Visual

attention for solving multiple-choice science problem: An eye-tracking

analysis. Computers & Education, 58, 375-385. doi:

10.1016/j.compedu.2011.07.012

UCD IT Services. (2008, December). UCD IT Strategy 2009-2013. Retrieved 19th June, 2014, from: http://www.ucd.ie/t4cms/ucd_it_strategy2009-2013.pdf

UCD Office of Research Ethics. (2010). UCD code of good practice in

research. Retrieved 30th March, 2014, from:

http://www.ucd.ie/t4cms/REC%20Code%20of%20Good%20Practice%20in%20Research%20271010.pdf


UCD Office of Research Ethics. (n.d.) Exemptions from full review. Retrieved

30th March, 2014 from:

http://www.ucd.ie/researchethics/apply/exemptions/

UCD Strategy Group on E-Learning and Digital Media. (2010). Strategy group

on e-Learning and digital media: Final report. Retrieved 19th June,

2014, from:

http://www.ucd.ie/t4cms/UCD%20Task%20Force%20on%20e-Learning%20final%20report.doc.

Walleck, A., O'Halloran, J., & Leader, C. A. (1991). Benchmarking world-class

performance. McKinsey Quarterly, (1), 3-24.

Wang, C., Shannon, D. M., & Ross, M. E. (2013). Students’ characteristics,

self-regulated learning, technology self-efficacy, and course outcomes

in online learning. Distance Education, 34(3), 302-323. doi:

10.1080/01587919.2013.835779

Wang, M. (2007). Designing online courses that effectively engage learners

from diverse cultural backgrounds. British Journal of Educational

Technology, 38(2), 294-311.

Wang, T. (2014). Developing an assessment-centered e-Learning system for

improving student learning effectiveness. Computers & Education, 73,

189-203.

Westera, W. (2004). On strategies of educational innovation: Between

substitution and transformation. Higher Education, 47(4), 501–517.

Yorke, M. (2000). A Cloistered Virtue? Pedagogical Research and Policy in

UK Higher Education. Higher Education Quarterly, 54(2), 106-126.


Yorke, M. (2003). Formative assessment in higher education: moves towards

theory and the enhancement of pedagogic practice. Higher Education,

45(4), 477–501.

Yorke, M., & Knight, P. (2004). Self-theories: some implications for teaching

and learning in higher education. Studies in Higher Education, 29(1),

25-37.


9. APPENDICES

Appendix A Human Subjects Exemption from Full Ethical Review Form

Human Subjects Exemption from Full Ethical Review Form

Including Access to UCD Students & University-Wide Surveys

An Exemption from Full Ethical Review is not an exemption from ethical best

practice and all researchers are obliged to ensure that their research is conducted

according to HREC Guidelines. Depending on the nature of the study described

below your study may require a preliminary review by the HREC Chairs and may

be subject to further clarification.

Please do not alter the format of this form and submit it as a Word document.

Section A: General Information

I apply for Exemption from Full Ethical Review of the research protocol

summarised below, on the basis that (tick the appropriate box for YES):

a) All aspects of the protocol have received ethical approval from an approved body (e.g. hospitals, hospices, prisons, health authorities)

b) The research protocol meets one or more of the criteria for exemption from review as detailed in Section 3 of Further Exploration of the Process of Seeking Ethical Approval for Research (HREC Doc 7)

I am also requesting permission to access UCD Students for one of the

following (tick the appropriate box only if your answer is YES):

c) I am accessing students from one school only and will seek permission from

the

Head of that school

d) I am seeking permission to access UCD Students from more than one school

(accessing students in more than one school will require HREC approval)

e) I am seeking permission to conduct a university-wide survey of UCD students

(if the research is a campus-wide student survey1 and involves students from

two or more schools, then permission to schedule the survey will be sought

from the University Student Survey Board (USSB) on your behalf after this

form has been reviewed by a HREC Chair and/or HREC Committee).

I have also read the following Guidelines (tick the appropriate box for YES):

i. HREC Guidelines and Policies for Ethical Approval of Research Involving

Human Subjects

1 Where the target population comprises students drawn from two or more schools and the survey encompasses university-wide activities or services


ii. The UCD Data Protection Policy

http://www.ucd.ie/dataprotection/policy.html

iii. The Data Protection Guidelines on Research in the health sector, (if

applicable)

This is not applicable to our research

http://www.dataprotection.ie/viewdoc.asp?m=m&fn=/documents/guidance/h

ealth_research.html

For all the latest versions of the HREC Policies and Guidelines please see the

research ethics website: http://www.ucd.ie/researchethics/hrec.html

1. PROJECT DETAILS

a) Project Title: Developing and Creating Interactive e-Tutorials to support Blended Learning in selected Modules in the School of Information & Library Studies

b) Study Start Date: (dd/mm/yy) 01/03/14    Study Completion Date: (dd/mm/yy) 31/08/14

c) Start Date of Data Collection: (dd/mm/yy) 14/04/14    Completion Date of Data Collection: (dd/mm/yy) 22/08/14


NOTE: In no case will approval be given if recruitment and/or data collection

has already begun

2. APPLICANT DETAILS

a) Name of Applicant (please include title if applicable): Adrian Dunne    UCD Student Number (if applicable): 13205610

b) Applicant's position in UCD (please put 'yes' in relevant space): Staff / Postgraduate / Undergraduate

c) Academic/Professional Qualifications: B.A. in Computing

d) Applicant's UCD Contact Details: UCD Telephone: 0861636445    UCD Email: [email protected]

e) Applicant's UCD Address (school etc.): School of Information and Library Studies

f) Name of Supervisor (please include title if applicable): Dr. Crystal Fulton & Dr. Claire McGuinness

g) Supervisor's UCD Contact Details: UCD Telephone: 017167593    UCD Email: [email protected]

h) UCD Investigator(s) and affiliations (name all investigators & co-investigators on project): Robert Fagan, Fiona Farrelly, Jennifer Finnerty, Chelsea Holland, Mark McLoughlin

i) Funding if applicable: N/A

j. EXTERNAL APPLICANTS ONLY (if study is not associated with any UCD staff member or school)

a) External Investigator(s) if applicable: Not Applicable

b) Name of Organization:    Relationship with External Organization:

c) Address of Organization:

d) External Investigator(s) if applicable: Not Applicable

e) Project Title:

f) Start Date of Data Collection: (dd/mm/yy)    Completion Date of Data Collection: (dd/mm/yy)


k) Insurance: Please contact the UCD Safety Office ([email protected]) to ascertain whether you are insured to carry out your research. Please do not assume that you do not require insurance (tick the appropriate box if the answer is YES)

Will this study be covered by UCD Insurance/Indemnity?
Currently seeking UCD insurance coverage

Is there any blood sampling involved in this study?

Are there other medical procedures involved in this study?

You will need to provide proof of insurance cover, which you should do once you have received your research ethics reference number, which can only be issued after you submit this form to [email protected]


Section B: Research Design & Methodology

3. RESEARCH PROPOSAL

a) Methods of data collection (please select the appropriate box and

provide brief details)

i standard educational practices

ii standard educational tests

iii standard personality tests

iv standard psychological tests

v interviews or focus groups

De-identified interviews with

approximately two School of Information and Library Studies lecturers and three UCD

administrative/support staff. These

interviews will help identify strengths and

weaknesses in the e-tutorials currently

available, and highlight new topics for e-

tutorials to support students.

Snowball sampling will be used to obtain

approximately 20 students from the

School of Information and Library

Studies. These students will be a


combination of undergraduate and

postgraduate students. The postgraduate

students will comprise students from

the Master of Library & Information

Studies programme and Master of

Information Systems programme.

The participants will be asked to

participate in a ‘Think Aloud’ exercise in

front of an eye tracking machine to

explore where improvement can be made

to the tutorials.

Then they will be asked to participate in a

20 minute semi-structured interview

pertaining to layout, content and issues

observed during the ‘Think Aloud’

exercise.

A ‘Think Aloud’ exercise is where

participants think aloud as they are

performing a set of specified tasks.

Participants are asked to say whatever

they are looking at, thinking, doing, and

feeling as they go about their task. There

is no risk to participants.


An eye tracker is a device for measuring

eye positions and eye movement. This

will be used to find where participants' focus is while doing the tutorials. It is

completely non-invasive and poses no

risk to participants.

These exercises and interviews will be

conducted in The School of Information

and Library Studies, room 106. This is a

public space located on the UCD

Campus.

All data gathered in these usability tests

will be de-identified. All interviews will be

recorded digitally using a digital voice

recorder and transcribed using Microsoft

Word.

The information gathered in these tests

will be used to finalise the structure of all

tutorials and the content featured in the

two new tutorials.

vi public observations


vii persons in public office

viii using existing data only

We will be using the survey carried out by

Dr. Crystal Fulton and Dr. Claire

McGuinness entitled ‘Evaluating e-

Tutorials’. This was an anonymous online

survey. The results of this survey will

determine the eight e-tutorials we will

edit. We will also combine this data with

the findings from the interviews in order

to deduce how build our new, improved

e-tutorials and the template for future e-

tutorials.

ix surveys/questionnaires

x audio/video recordings

With the permission of the participants, the interviews and 'Think Aloud' exercise will be audio-recorded using a digital recorder.

The ‘Eye-Tracker’ will only record the eye

movement of participants on the screen;

there will be no overall visual recording of

participants themselves during these

tests.


xi Other (please specify)

b) Who are the participants or

informants? (including size and

composition)

Interviews with UCD staff members

including two School of Library and

Information Studies lecturers and

approximately three other educational

professionals i.e. UCD

administrative/support staff.

A sample of twenty students will be

recruited via snowball sampling to

partake in the usability testing of the new

and improved e-tutorials. The sample of

students recruited for usability testing will

be a combination of undergraduate and

postgraduate students from the School of

Library and Information Studies.

c) Where are you recruiting the

participants from?

The students will be a combination of

undergraduate and postgraduate

students from the School of Library and

Information Studies. The postgraduate students will comprise students from the Master of Library & Information Studies and Master of Information Systems programmes.

We will recruit the UCD lecturers from the


School of Library and Information

Studies.

The administrative/support staff will be

recruited from the School of Library and

Information Studies and UCD James

Joyce Library.

i Are participants external to

UCD?

No.

ii Do you have permission to

access these participants?

(provide details of

organization/group and

attached a copy of the

permission if applicable)

Permission to be sought from the Head of

School.

If you are recruiting UCD students please ensure that you complete Section E

below.

d) Aims and Objectives of the study (in brief lay language – no more than

300 words)

The aims and objectives of this study are as follows:

Aims

To explore the role of e-Learning in full and part-time third level education

To identify the strengths and weaknesses of the existing e-tutorials with a

view for improvement.

To perform beta tests of the e-tutorials before roll out.

To research current theory on digital/online learning in order to make


informed decisions about the appropriate implementation of digital learning

objects in higher education.

Objectives

Create a set of relevant, engaging and interactive e-tutorials that may be

rolled out as digital learning objects in SILS in the coming academic year

2014/15.

Create a uniform template for all future tutorials and incorporate the School of Library and Information Studies branding in the e-tutorials.

Supply the School with the relevant documentation and ‘Best Practice

Guide’ for the future creation and modification of e-tutorials.

e) Research Design (in brief lay language – no more than 300 words)

The first step in our research will consist of a detailed review of the existing data

collected by Dr. Crystal Fulton and Dr. Claire McGuinness i.e. the online,

anonymous ‘Evaluating e-Tutorials’ survey. The review of this data will help

determine the strengths and weaknesses of the current e-tutorials. This will inform

the outline for our interviews with lecturers from the School of Library and

Information Studies as well as UCD administrative/support staff. These interviews

will further highlight problems in existing e-tutorials and identify the new e-tutorials

that need to be created.

The interview with the SILS administrative staff member will inform the methods of branding to be incorporated in all e-tutorials.

The interview with the UCD Library staff member will aid in selecting the relevant material to be included in the new e-tutorials.


Snowball sampling will be used to obtain approximately 20 students from the

School of Information and Library Studies.

The participants will be asked to participate in a ‘Think Aloud’ exercise in front of

an eye tracking machine to explore where improvement can be made to the

tutorials.

They will then be asked to participate in a 20 minute semi-structured interview

pertaining to layout, content and issues observed during the ‘Think Aloud’

exercise.

The information gathered in these tests will be used to finalise the structure of all

tutorials and the content featured in the two new tutorials.


Section C: Basis for Exemption

4. RESEARCH PARTICIPANTS: RISK, HARM, SELECTION AND CONSENT

a) Is this research likely to involve any foreseeable risk to

participants, above the level experienced in

everyday life? (if yes, tick the box)

Does this research involve the following: you are advised to read the HREC

Guidelines documents – see HREC Policies & Guidelines (tick the appropriate

box if the answer is YES):

http://www.ucd.ie/researchethics/information_for_researchers/policies_guidelines/


i

Any vulnerable groups? (this includes UCD Students)

The interviews conducted with students will be peer-to-peer,

so there is no power differential which would make the

participants vulnerable.

ii Sensitive topics that may make participants feel uncomfortable? (i.e.

sexual behavior, illegal activities, racial biases, etc.,)

iii Use of drugs?

iv Invasive procedures? (e.g. blood sampling)

v Physical stress/distress, discomfort?

vi Psychological/mental stress/distress?

vii Deception of/or withholding information from subjects?

viii Access to data by individuals or organizations other than the


investigators?

ix Conflict of interest issues?

x Any other ethical dilemma? (if the answer is YES please provide

details below)

5. ETHICAL APPROVAL FROM ANOTHER BODY

a) Has this study received Ethical Approval elsewhere? (e.g. hospital

REC or other external body or for data collected by another

organization for a specific purpose – if yes, please tick the box)

If your answer is No please proceed to Section 6

This is not applicable.

b) Ethical Approval from body other than UCD for this study or

parts of this study (if applicable, please

tick the box)

i Name of Organization

that has approved the

study?

ii Approval Number/Ref


iii Approval Date

Please provide a copy of the approval with this form as a supporting document

b) Provide a brief account of aspects of the study not covered by external

approval

c) Can you confirm that you will seek full ethical approval from UCD

HREC for all non-approved aspects of the

study? (tick the box for YES)

Please note that a grant of exemption from full ethical review will only be granted

by UCD HREC for those aspects of the study that have been approved and are

under the jurisdiction of the approving body

Approval from an approved body (if applicable, please tick the appropriate box

for YES)

i Have all aspects of the study received ethical approval from an

approved body?

ii Does the approving body have jurisdiction over aspects of the

study?


6. USE OF EXISTING DATA

a) If you are using existing data, please explain why this study is exempt

from a full ethical review? ( e.g. data collected by another organization for

a specific purpose )

We will be using the survey carried out by Dr. Crystal Fulton and Dr. Claire

McGuinness entitled ‘Evaluating e-Tutorials’. This was an anonymous online

survey. The results of this survey will determine the eight e-tutorials we will edit.

We will also combine this data with the findings from the interviews in order to

deduce how to build our new, improved e-tutorials and the template for future e-

tutorials.


Section D: Confidentiality and Data Protection

7. DATA COLLECTION DETAILS

a) What arrangements are in place to ensure that the identity of each

participant remains confidential?

Participants’ names will only be collected on consent forms and will not be

associated with any of the data. Transcripts of interviews and usability testing will

be de-identified.

b) Please indicate the form in which the data will be collected (Please

tick the appropriate box and provide short details)

i Anonymous

ii

De-identified

(or anonymised)

Interview transcripts, ‘Think Aloud’, and

‘Eye-Tracking’ sessions will all be de-

identified.

iii Identifiable

iv Potentially identifiable

c) Indicate the form in which the data will be stored and/or accessed

(Please tick the appropriate box and provide short details)

i Anonymous

ii De-identified (or anonymised): Interview transcripts, 'Think Aloud', and 'Eye-Tracking' sessions will be de-identified and will be assigned numerical codes to group material by participant. There will be no key retained to connect the numerical code to a specific participant. All potentially identifying information will be removed.

iii Identifiable

iv Potentially identifiable

d) Describe the measures that will be taken to protect the

confidentiality of the data to be collected

i

Who will have control of the

data generated by the

research?

The data produced by this research will be controlled by the principal investigators.

ii Where will the data be stored or archived? Does this comply with the HREC guidelines?

Data will be stored in password protected

files on an encrypted external hard drive

and will be locked in a secure cabinet in

the School of Library and Information

Studies.

iii In what format will the data

be stored?

The data will be stored in .avi, .docx files,

.eyd files, and .wav audio files. All .wav

files will be transcribed and de-identified

into .docx files. On completion of this task

the .wav files will be destroyed.


iv For how long will the data be

stored?

The data will be securely stored until the

end of the project.

e) Responsibility for data collected in the study (tick the box for YES

where applicable)

i Will the data generated by the research be destroyed?

ii Will the data be destroyed at or before the end of the study?

The data will be securely stored until the end of the project.

iii

Who will be responsible for

destroying the data at the

end of the period indicated in

7 d) iv?

The principal investigator will destroy all

data at the end of the project.

iv Will the data be archived?

v Will the archived data be intended for personal use only?

vi Will the archived data be made available to other researchers?

If yes, please provide details of where

the archive will be held and what

restrictions for use will be put in place

vii

Who will be responsible of

the archive and future use of

data? (please provide a

name)

viii

Do you intend publishing all or part of your data?

We do not intend to publish any data from this study; however, we may publish an article based on the findings.


If yes, please note that some journal editors require assurances (in addition to

ethical approval) that data were collected ethically and that all consents, assents

and other permissions were granted prior to the start of data collection.


If this study does not involve UCD students or university-wide surveys

please proceed to Section F

Section E: Access to UCD Students

Where researchers are hoping to access UCD students in more than one

school, Part 1 must be completed. If your research is a university-wide

student survey, Parts 1 and 2 must be completed. For information on the

process of securing access please see the policy document: Research

Access to UCD Students: A policy for UCD Staff/Students and external

organizations.

Part 1: Request for Permission to Access Students Not Applicable; not a

university-wide project.

1. Accessing Students? Yes No

a) Are you accessing students from more than one

school?

b) Do you wish to conduct a university-wide student

survey?

If your answer to 1(b) is yes, please also complete Part 2 below.

2. Type of Study (interviews, focus groups, electronic or paper-based questionnaires, etc.)

Proposed Start Date: (dd/mm/yy)    Proposed End Date: (dd/mm/yy)

If the study will be repeated, please indicate the frequency (annual, twice-yearly, etc.):

Target students (which schools/colleges):

Any other Comments:

Part 2: University-Wide Student Surveys ONLY Not Applicable.

Please note that if you are seeking permission to conduct a university-wide

survey you will need to complete this section. The Office of Research Ethics

will process this permission on your behalf after your form has been reviewed

and access to UCD students has been granted.

1. Title of Proposed Student Survey

2. Survey Sponsor / Applicant (please include title if applicable):

3. Details of the Proposed Survey

Has this survey been conducted in UCD before? Yes No

If yes, why is an additional survey required?


Section F: Signed Declaration

8. SIGNATURES ARE REQUIRED ONLY POST-REVIEW AND FOLLOWING

SATISFACTORY RESPONSES TO ANY CLARIFICATIONS. Exemption Forms

should be signed by the applicant and supervisor/head of school and the signed forms

should be retained by the school.

I, the undersigned researcher, have read the UCD Guidelines and Policy for

Ethical Approval of Research Involving Human Subjects and Further

Exploration of the Process of Seeking Ethical Approval for Research and

agree to abide by them in conducting this research. I confirm that, based on

my understanding of these guidelines and policy documents, I consider that

this research protocol meets the requirements for exemption from a full

ethical review. I confirm that the information provided on this form is

correct and accurate.

We the undersigned researchers acknowledge or agree with the

University:

(a) It is our sole responsibility and obligation to comply with all domestic

Irish and European legislation and to obtain such statutory consents as

may be necessary;

(b) Not to commence any research until any such consents have been

obtained;

(c) To furnish to the proper officer of UCD a true copy of any consent

obtained;

(d) That neither the University, the Committee, nor individual members of

the Committee

accept any legal obligation (to us or to any third party) in relation to the

processing of this application or to any advice offered in respect of it nor


for the subsequent supervision of the research;

(e) That the research will be conducted in accordance with any approval for

an exemption from full review granted by the Committee and in

conformity with the documentation submitted with this application and

with licence granted under any legislation;

(f) That the undersigned researcher(s) have read the most recent UCD

Research Ethics Committee Guidelines and Policy for Ethical Approval

of Research involving Humans – which are available on the UCD

website (www.ucd.ie/researchethics) and agree to abide by them in

conducting this research;

(g) Confirm that the information provided on this form is correct and

accurate;

(h) In conducting research a researcher has both ethical duties and legal

obligations. Compliance with one set of responsibilities does not

guarantee compliance with the other - what is legally permissible may not

be ethical and vice versa. It is for the researcher to inform himself and

herself as to what ethical duties and legal obligations apply to his or her

research and to comply with these duties and obligations;

(i) It is not acceptable for an applicant to treat the grant of ethical approval

as absolving them from the responsibility of informing themselves of their

legal responsibilities in relation to data protection and of complying with

these;

(j) It must be understood that any ethical approval granted is premised on

the assumption that the research will be carried out within the limits of the

law;

(k) Ethical approval does not constitute any sort of advice or representation


to the applicant that compliance with the requirements, as laid down by

the UCD Human Research Ethics Committee, will be sufficient to comply

with the applicable law in the area.

Signature of Applicant:    Date: 14/04/14

Endorsement of Supervisor (if applicable: students who are being supervised are

required to have their supervisor's knowledge and endorsement of the study).

Supervisors confirm that they have read the above application, and are satisfied that

the study appears to meet all requirements for a grant of ethical approval with

Exemption from full review from UCD HREC.

Signature of Supervisor (or designate):    Date:

Approval of Head of School: Acknowledging exemption for this study and, if

applicable, permission if the research concerns students from one UCD School or

Unit, permission can be given by the relevant Head of School/Unit.


Signature of Head of School or Organization (or designate):    Date:

Signature of Applicant:    Date: 14/04/14

Signature of Applicant:    Date: 14/04/14

Signature of Applicant:    Date: 14/04/14

Signature of Applicant:    Date: 14/04/14

Signature of Applicant:    Date: 14/04/14


Appendix B UCD Insurance Policy


Appendix C Competitive Product Survey

1. Codecademy

What is it?

Codecademy is a website that offers various e-tutorials on learning how to code. Codecademy provides users with the opportunity to partake in interactive e-tutorials, such as HTML/CSS, JavaScript, jQuery, Python, Ruby, PHP and APIs, which teach individuals how to code.

In order to access, and successfully use, the Codecademy e-tutorials, an

individual must create a username and password. This allows Codecademy to record an individual's progress; for instance, which e-tutorials the individual has

completed, which e-tutorials are in progress, which e-tutorials the individual may

be interested in, etc. After an individual has completed the necessary steps and is

officially signed up to Codecademy, the website will automatically bring the

individual to a page containing various e-tutorials available to use.


The e-Tutorials

In reviewing the e-Tutorials, I will use the one on learning how to code

HTML/CSS as an example. The initial page, before the e-tutorial began, provided

an overview of information regarding what to expect in the HTML/CSS e-tutorials.

Codecademy provided a list of the e-tutorials available, as well as a percentage bubble next to each e-tutorial that showed the progress of completion. In

addition, at the very bottom of this webpage, additional links were available if the

user wanted to obtain more information regarding HTML/CSS coding.

Figure 1. Introduction Page 1
Figure 2. Introduction Page 2

Once the e-tutorial was opened, the user was immediately provided with

information regarding HTML, in a pane on the left-hand side, followed by a set of

instructions.


Figure 3. Instruction Page

After following the instructions, if the user is correct in completing the steps asked,

a notification will pop up at the bottom of the page.

Figure 4. Lesson Completion Icon

If the user is incorrect in completing a step within the instructions provided, a

notification stating "Oops, try again" will appear, along with a more detailed

message as to what may be incorrect.

Figure 5. Wrong Answer Pop-up


If a user is unsure of what the instructions are asking, there is a hint button!

Figure 6. Hint Option

In addition, there is a dropdown menu available in which the user may see how

many sections of the e-tutorial have been completed, are in progress, or are

remaining. This dropdown menu also enables the user to select a previously

completed section of the e-tutorial to review it if needed.


Figure 7. Progress Bar

Once all of the steps, or sections, within the e-tutorial have been completed, a

message stating “Congratulations” will appear, and the user will have successfully

completed that e-tutorial.

Figure 8. Completion

Review

Overall, I thought the content contained within the various sections of the HTML/CSS tutorial proved useful in following most of the instructions. However,

I would have liked to see more detail, or more of an explanation, within the

information provided in some of the sections; considering that I found various

pieces of the content confusing or lacking enough information. More likely than

not, those who will partake in these e-tutorials will not be familiar with the content; therefore, it is imperative to make sure there is no confusing terminology and that

there is enough detail throughout. This being said, Codecademy did contain an

HTML Glossary and a CSS Glossary, which proved useful.

Additionally, Codecademy did maintain one format throughout the various

e-tutorials available. This format included the Codecademy logo located at the top


of each section within the e-tutorials. The format used throughout all of the e-

tutorials also made the information and instructions extremely easy to locate.

However, while the information was easily located, the information provided was

all text; there were no images or animations. I did enjoy the fact that, when writing

different sections of the code, the text would become different colours depending

on the coding being used. Nonetheless, there were no aspects able to actively maintain the user's interest throughout the entire e-tutorial; for instance, in

taking the HTML/CSS e-tutorial, I became distracted at certain points and my

focus upon the e-tutorial was lost.

Furthermore, there was one aspect within the Codecademy e-tutorials that

I was thoroughly impressed with. Users were not only able to switch between the

various sections contained within the e-tutorial, from a dropdown menu, without losing any of their previously written code, but if the user happened to exit out of

the e-tutorial, the progress would automatically be saved and could easily be

returned to.

Pros

One template was used throughout all of Codecademy’s e-tutorials.

o Effective branding.

o Effective layout: spaced out, not cluttered.

o Information and instructions easy to locate.

Step-by-step process made it easy to manoeuvre throughout the e-tutorial.

Hints were made available throughout each e-tutorial.

Users could refer back to previous sections throughout the duration of the

e-tutorial.

Users could exit out of the e-tutorial and return to complete it, with the

progress previously made saved.


Cons

Must sign up, through Codecademy, in order to access the various e-

tutorials.

Not enough information, or detail, provided regarding certain sections

within the e-tutorial.

It was challenging to maintain the user’s interest.

o There were no images or animations throughout the entire e-tutorial.

o The e-tutorial took too long to complete.

No audio was present throughout the entire duration of the e-tutorial.
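As an aside, the check / "Oops, try again" / hint interaction described above follows a simple pattern. A minimal Python sketch of that pattern is shown below; the messages and the correctness predicate are hypothetical illustrations, not Codecademy's actual code.

from typing import Callable

def check_step(submission: str, is_correct: Callable[[str], bool], hint: str) -> bool:
    # Validate the learner's submission against the step's rule.
    if is_correct(submission):
        print("Step complete!")   # success notification (cf. Figure 4)
        return True
    print("Oops, try again.")     # error notification (cf. Figure 5)
    print("Hint: " + hint)        # hint, offered on demand in the real site (cf. Figure 6)
    return False

# Hypothetical step: write an <h1> heading.
check_step("<h1>Hello</h1>",
           is_correct=lambda s: s.strip().startswith("<h1>") and s.strip().endswith("</h1>"),
           hint="Wrap your heading text in <h1> ... </h1> tags.")

The value of the pattern is that feedback is immediate and specific to the step, which is part of what makes the step-by-step progression easy to follow.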


2. Duolingo

What is it?

Duolingo is a website that offers e-tutorials on learning how to speak different languages. Duolingo provides users with the opportunity to learn new languages through exciting interactive e-tutorials, such as Spanish, German and French for English speakers and English for Spanish speakers (more languages will be added in due course).

In order to access the e-tutorials and begin learning their new language, users must create an account with Duolingo by setting up a username and password. This allows Duolingo to keep track of and record a person's progress and the stage they are at in learning their chosen language.

Figure 1. Progress Tracker


Figure 2. Lessons Page

The e-tutorials

The e-tutorials offer learning of many different languages with a natural progression curve, making them useful for both the novice and competent user alike.

The first page shows you a skills tree which you can follow along as you progress

through the different stages of the e-tutorials.

Figure 3. Progress Tree


After selecting your chosen language, to understand where you are and what you are doing there are tabs along the top of the page you are logged into. The activity tab allows you to see your skills tree, which displays your progress to date; the immersion tab offers you real articles from the internet to practise your skills and translate. The discussion tab offers you different forums to discuss any problems or advances you are making with your e-tutorials.

Figure 4. Immersion Option

Now we will look at some examples of key features offered by Duolingo. Below is

an example of the quizzing format used by the website; note the high quality of image and colour in use, which creates an engaging vibrancy.


Figure 5. Quiz Example

After selecting the answer you think is correct, you can click the check button; if you are right, a box will appear below to confirm this and you will hear a sound indicating a correct answer.

Figure 6. Quiz Correct Answer

If you select an incorrect answer, a box will appear underneath the answer identifying the solution, accompanied by a sound indicating an incorrect answer.


Figure 7. Quiz Incorrect Answer

Review

The sounds heard on correct and incorrect answers are an excellent addition to these e-tutorials. When you select the image you think is right, a voice reads the word in the chosen language, giving you the chance to hear it and recognise the word again.

The skills tree is an excellent addition to Duolingo, letting you know where you are and see how far you have progressed. It is bright and colourful and gives clear direction and instruction as to where you are and how much farther you have to go until you have completed the entire tutorial.

The addition of selecting an answer and having the voice read the selected picture or phrase back to you is excellent. However, I think there should be clearer guidance and instruction on this feature; unless you play around with the screen, one might not discover it.

The best thing about Duolingo is that if you close the app or window containing the tutorial, you will not have to restart the tutorial from scratch but merely that specific section again. This will entice people to keep going, as they know they are not beginning from scratch each time.


Pros

Ease of manoeuvring around the interface

Bright and colourful

Very effective layout

Instruction can be easily located

High quality audio

Cons

No video content
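The section-level resume behaviour praised above boils down to persisting the learner's position after each completed section. A minimal sketch follows, assuming a simple JSON file store; the file name and format are hypothetical illustrations, not Duolingo's implementation.

import json
from pathlib import Path

PROGRESS_FILE = Path("progress.json")  # hypothetical store

def load_next_section() -> int:
    # Resume from the first uncompleted section; 0 if nothing saved yet.
    if PROGRESS_FILE.exists():
        return json.loads(PROGRESS_FILE.read_text())["next_section"]
    return 0

def complete_section(index: int) -> None:
    # Saving after every section means closing the app costs at most
    # the section currently in progress, never the whole tutorial.
    PROGRESS_FILE.write_text(json.dumps({"next_section": index + 1}))

current = load_next_section()
print("Resuming at section", current)
complete_section(current)

Because the save happens at every section boundary rather than only at tutorial completion, the cost of abandoning a session stays small, which is precisely what makes learners willing to return.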


3. Khan Academy

What is it?

Khan Academy is a non-profit web-based learning tool. It provides online material on a variety of subjects, ranging from maths to history to cosmology. The website provides three primary pedagogical methods of information dissemination.

The e-tutorials

Firstly, they use video tutorials, which are very well done. They feature a good range of graphs, pictures, animation, etc., depending on what the subject is (maths, history, etc.). The videos have very good sound quality and they are lively

and keep the user engaged. I felt we could take elements of this style of video and

incorporate it into our tutorials; as we know from our own usage, our current

tutorials are vapid and in need of colour, animation and excitement. This definitely

struck me as something, which Khan Academy does well.

Secondly, the tutorials avail of simple text-based information.

Figure 1. Text Based Material

This is probably not very relevant to our tutorials as we will be limited with

available space and doing blocks of text would most likely detract from the style of


our tutorials, while offering nothing more than the bullet point approach currently

adopted by SILS e-tutorials. Thus we can probably not take anything of particular

value from this section of Khan Academy.

Lastly, the Khan Academy tutorials use a quiz at the end of each section. The quizzes they use are much like the ones currently in use by SILS. MCQ questions are used, and the only extra function they use is a 'hint' option on each question. I don't really think this is something we would want to copy.

Their quiz works differently in that one has to correctly answer a question to

continue. This is another function which I don’t like as students can just blindly

select options until they get a question right and continue this until the test is

complete; not exactly ideal for learning. The quiz is devoid of graphics or images,

making it rather bland and unattractive aesthetically; it is definitely not a model we

want to perpetuate through our work.

Figure 2. Quiz Screenshot


Review

In conclusion, Khan Academy uses three main methods of teaching information in its e-tutorials. There are various important points which I think we can take and incorporate into our own work. Notably, the quality of the audio and video is of a very high standard and helps the user to remain engaged. Combining this quality of media with intuitive and appropriate graphics and pop-ups brings the e-tutorial to life and makes it a genuinely pleasant experience for the user.

Pros

Very strong audio and visual components

Excellent use of graphics keeps video engaging

Text-based material is well written and serves as a useful complement to the videos

Quiz avails of MCQ style questions

Cons

Text information comes in large chunks and is not appropriate for our e-tutorials

Quiz is vapid and unengaging


4. Monash University Library Tutorials

What is it?

Monash University is a university in Melbourne, Australia. The library at this university makes e-tutorials available to aid its students' learning. The tutorials look specifically at university issues such as proper referencing, research ethics, etc.

The e-tutorials

The first e-tutorial I looked at was "Academic Integrity – Error 404". The opening page provided a brief one-paragraph description of what Error 404 is. The duration of the tutorial was also displayed (5 minutes). This was important because, when viewing the "Demystifying Citing and Referencing" tutorial, the user was not told how long it would last; that tutorial seemed to go on forever.

Using different ways to display information was particularly good, as the user does not become bored with the layout. This was especially true of the Error 404 tutorial, where pop-ups were shown when the cursor rolled over certain areas, as displayed in Figure 1.

Figure 1. Error 404 e-tutorial - ‘pop-ups.’


The user was also notified that this would happen. Clarity is extremely important for the user: any confusion, and they will almost immediately disengage.
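Rollover pop-ups of this kind are simple to implement. The sketch below shows one possible approach in TypeScript using mouse events; the class names ("hotspot", "popup") are hypothetical placeholders, not taken from the Monash tutorials.

```typescript
// Minimal sketch of a rollover pop-up: hovering over a marked area reveals
// an information bubble, as in the Error 404 tutorial. The class names
// are hypothetical.
function enablePopups(): void {
  document.querySelectorAll<HTMLElement>(".hotspot").forEach((area) => {
    const popup = area.querySelector<HTMLElement>(".popup");
    if (popup === null) return;
    popup.style.display = "none"; // hidden until the cursor rolls over

    area.addEventListener("mouseenter", () => {
      popup.style.display = "block";
    });
    area.addEventListener("mouseleave", () => {
      popup.style.display = "none";
    });
  });
}

enablePopups();
```

Content revealed only on rollover is, however, exactly what a plain text version struggles to convey, which relates to the accessibility point discussed next.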

A text version of the e-tutorial is also available. This is useful for users with disabilities or learning difficulties. The text version describes what is on each screen of the e-tutorial, though not in too much detail, and it also provides the text that appears on each screen. However, this is not entirely adequate for the Error 404 screen on which you had to roll over the references to see what was retrieved.

In the "Retweet" tutorial there seemed to be too many options that had to be clicked to retrieve information (Figure 2). A balance should be struck between providing enough information to be effective and demanding a reasonable amount of input from the user.

Figure 2. Retweet e-tutorial.

In the "Collusion" e-tutorial, I noticed that questions were asked in different visual formats and layouts to engage the user. This was effective, as it prevented the user from growing complacent or bored. I also noticed that, at the end of each of the "Academic Integrity" tutorials, a list of the points learned was given. This seems more useful than the redundant slides in the "Demystifying Citing and Referencing" e-tutorial, which outline what will be discussed in each section of the tutorial. These slides waste the user's time and can become quite frustrating.

Figure 3. Demystifying citing and referencing e-tutorial – redundant page.

The tutorial was stated to last twenty minutes and, combined with this, many of the pages were crammed with text. This is an example of a tutorial trying to do too much at once. I could see this tutorial being divided into very short sections, similar to the "Academic Integrity" series.

Finally, in this tutorial, choosing to complete the quizzes opens a new tab, which is annoying for users and could probably be avoided. Some of the questions were not clear or did not provide clear instructions on how to answer them. If the answers were incorrect, the reasons why were not given. On the final page of the tutorial, the user is given the details of departments and individuals who can offer further assistance within the university.

In the "Developing a Search Strategy" tutorial, examples were selected and worked through slowly and comprehensively with the user. The only problem with this tutorial was an error in screen size, which meant it could not be displayed in its entirety.

I do not think I would consider the "Reading" e-tutorial an actual tutorial. Although it is carefully thought out and very clear, it does not seem to engage the user at all. The content is displayed in the same vein as a textbook. Some interaction is required of the user, but not enough to entice them to finish the tutorial. Finally, the user must scroll down some of the pages to view the content (Figure 4).

Figure 4. Reading e-tutorial.

Review

In reviewing several of the e-tutorials made available by Monash University Library, it was clear that the positives outweighed the negatives. The best of these tutorials appeared under the heading "Academic Integrity". This series of tutorials was consistent in its visual style and design. Buttons were often used; clicking a button would produce a small bubble containing more information. Questions were asked throughout to keep users engaged, using a variety of formats: multiple choice, true or false, etc. This was particularly effective in my opinion. If you got an answer wrong, the reason why popped up immediately, so the user knew. The user is not forced to complete the questions; if they wish, they can continue regardless. This is a significant point in terms of whether tutorials will be graded or not.

Pros

In-test questions keep the user involved and engaged.

Length of tutorial is stated prior to the user starting it

Use of pop-up items creates an interactive feel

Good presentation in general

Text version of tutorials available, useful for users with disabilities

Cons

Certain tutorials too long and cramped

Occasional lack of harmony between the various tutorials


5. Universal Class e-Tutorials

What is it?

Universal Class is used by Fingal County Libraries to offer e-Learning to its patrons. There are hundreds of courses on offer, ranging from Accounting and Management to Dog Grooming. The courses comprise assignments, exams, and coursework, and the coursework is a mixture of written text and e-tutorials.

The e-Tutorials

The first thing that stands out about the e-tutorials is their quality and professionalism. As the e-tutorial starts, music plays in the background while the slide shown in Figure 1 comes onto the screen. This slide contains information about the topic of the e-tutorial and also gives website and copyright information. This is a standard slide for every e-tutorial, with the title being the only thing that changes.

Figure 1. Introduction Slide

Once the tutorial starts, the first thing the user notices is the high quality of the audio. It is easy to hear and very clear; there is no audible interference from the recording.


The e-tutorial uses slides like the one in Figure 2, with folder images appearing at the bottom of the slide, to indicate a new topic within the coursework.

Figure 2. Topic Headings

As the information is read aloud to the student, the main quotes and information are presented on screen, as in Figure 3. These are very text-heavy slides, and the narrator reads the content word for word. This can be very boring for the user, as there can be a number of slides, one after another, containing vast amounts of text. The slides can also be hard to read, because the narrator is talking over the user as they try to read the text themselves.


Figure 3. Use of Text

What I did find useful, however, was the use of generic photos and videos (Figure 4) displayed while the narrator was introducing a topic or giving general information about it. This helped because I found I listened more closely to what the narrator was saying, rather than losing concentration by trying to read text and listen at the same time.

Figure 4. Generic Videos and Photos


Review

Overall, the content and structure of the e-tutorial were very professional. Universal Class is very conscious of branding, as you can see from the introductory slides. These slides also appear at the end of the tutorial with the same accompanying music. Located in the top right-hand corner is the Universal Class logo, which is ever present during the e-tutorial. It is very noticeable, but never encroaches on the space of the tutorial itself.

The audio quality is fantastic, and most people would not have any

problems hearing the speaker. Universal Class also provides scripts of the e-tutorials for people who have hearing disabilities.

Pros

High Quality Audio

Good Branding - Intro slide and logo in top right corner

Contact information is made available at the end of the e-tutorial

Scripts provided for users with hearing disabilities

Use of Generic Photos & Videos

Short in length - normally between 5 and 10 minutes in total.

Cons

Use of big chunks of text - use of keywords would have been better.

Transitions between text slides - very hard to read for a few seconds, as all the text blurs together.


Appendix D Wireframes

1. Digital Footprints Wireframes

[Wireframe pages 1-12 omitted: the images are not reproducible in this text version.]

2. Evaluating Digital Information Wireframes

[Wireframe pages 1-20 omitted: the images are not reproducible in this text version.]

3. Flow Wireframes

[Wireframe pages 1-8 omitted: the images are not reproducible in this text version.]

4. How to Find an Article Wireframes

[Wireframe pages 1-6 omitted: the images are not reproducible in this text version.]

Appendix E Letter of Information

Letter of Information for the Research Study:

Developing and Creating Interactive e-Tutorials to support Blended Learning in

selected Modules in the School of Information & Library Studies

(UCD SILS students, all 18+ years)

Dear Participant,

I am a Master's student at University College Dublin, and I am part of a group undertaking a capstone project to create and improve e-Tutorials for modules in the School of Information & Library Studies. We are conducting this research under the supervision of Dr. Crystal Fulton & Dr. Claire McGuinness, both of whom are lecturers in the School of Information & Library Studies at University College Dublin.

E-tutorials serve to complement and enhance teaching within certain modules provided for undergraduate and postgraduate students. There is currently a lack of uniformity in the structure of the e-tutorials used across UCD. Our research will rectify this issue in the School of Information & Library Studies by establishing a template and best-practice guide for creating e-tutorials. On a more practical level, we hope to create two new e-tutorials that may be rolled out as digital learning objects in SILS in the coming academic year, 2014/15.

What is this research about?

The study is being undertaken with the aim of researching, developing, creating, and

testing e-tutorials.

Why is this research being done?

The goal of the research is to obtain feedback from existing students of the School of Information & Library Studies on the modifications made to the structure of the e-Tutorials used in the school. From the participants' comments, we can make further improvements to the e-Tutorials to help provide a better standard of e-Learning.

How will the data be used?

The information gathered in this study will be used in the research results; however, you will not be identified in any ensuing publication of this research.

What will happen if you decide to take part in this research study?

If you volunteer to participate, you will be asked to take part in a usability test looking at either a newly created e-Tutorial or a revised e-Tutorial. The usability test will consist of a think-aloud session, which will involve you completing the tutorial while telling us what you are doing at every step. The usability test will also include the use of 'Eye-Tracker' software, which will monitor your eye movements during the e-tutorial.

Following the usability test, there will be a short semi-structured interview lasting

approximately 20 minutes.

Because we are trying to obtain feedback to improve the learning experience of future School of Information & Library Studies students, we will snowball-sample our participants from existing students who have experience of using the current e-Tutorials. If you are selected and agree to participate, we will arrange to meet in the Computer Science building to undertake the usability test on a specific computer equipped with the 'eye-tracker' software. We will then conduct the semi-structured interview in Room 106 in the School of Information & Library Studies building. The entire process will be audio-recorded.

What are the risks of taking part in this research study?

The risk associated with this study is minimal. You will only be asked about your opinions

on e-tutorials and e-Learning in general. Participation in this study is voluntary. You may

refuse to participate; you may also refuse to answer any questions or withdraw from the

study at any time. Should you feel you need a break during the interview, we can take one at any time.

How will your privacy be protected?

University College Dublin and those conducting this study subscribe to the ethical

conduct of research and to the protection at all times of the interests, comfort, and safety

of participants. All personal information about participants will be kept confidential; this

information will be destroyed after the completion of the study.

Interview recordings will be stored on an encrypted storage device for the duration of the

study and destroyed at the end of the study. Interview transcripts will not name

individuals; any potential references to your identity will be removed as interviews are

transcribed. Participants will not be identified in securely stored data nor in resulting


reports. Data will not be archived; data will be destroyed at the conclusion of the study.

What you tell me will be used only for the purposes of this research project, and you will

not be identified in the final report.

We have received insurance from University College Dublin to conduct this research. If you have any queries or complaints regarding the research or the group's behaviour, please feel free to contact our supervisors: Dr. Crystal Fulton - [email protected] or Dr. Claire McGuinness - [email protected].

What are the benefits of taking part in this research study?

It is hoped that this project will improve the way in which e-Tutorials are used within the School of Information & Library Studies and thus improve the learning experience for future students.

At the end of the study, if you wish, I will send you a summary of the results of this

research. Should you have any questions about the study, please do not hesitate to

contact me. Thank you for your interest in the research.

Sincerely,

Mark McLoughlin

Master's Student

Email: [email protected]

Telephone: +353 86 8631471


Appendix F Consent Form

Consent form for Participants for the Research Study:

Developing and Creating Interactive e-Tutorials to support Blended Learning in

selected Modules in the School of Information & Library Studies

(UCD SILS students, all 18+ years)

The School of Information & Library Studies continuously seeks to explore and

experiment with skills-strengthening digital learning platforms. The use of video-casting,

e-tutorials, collaborative remote learning, and social media effectively engages students

in unique and dynamic learning experiences. E-tutorials are increasingly being used in

conjunction with traditional teaching methods. This study focuses specifically on e-tutorial development, with the aim of improving the e-tutorials to meet the needs of current and future students.

Participation in this study is voluntary and you may withdraw at any time. There are no

known physical or psychological risks associated with this study.

By signing this consent form, I am verifying that, on the date below, I have read and

understood the Letter of Information provided to me by the project researcher, and I

agree to participate in this research project. I consent to the project researcher using the 'Eye Tracker' software to record my actions while I complete the tutorial. I consent to

the audio recording of my interview with the project researcher. I consent to the storage of

this recording for the duration of the study (2 months); I understand that, during this time,

the interview recording and transcription of this recording will be kept on an encrypted

storage device, and that this data will be destroyed after the project concludes. I

understand that my answers to questions will be used only for the purposes of this study

and that I will not be identified in any ensuing publication of the study findings.

Participant: _________________________________

Signed: _________________________________

Researcher: _________________________________

Date: _________________________________

Tick this box if you would like to receive a summary of our research findings:

Email Address: _________________________________

[2 copies signed: 1 returned to participant; 1 returned to researcher]


Appendix G Interview Protocol

Interview Schedule for Digital Footprints Tutorial

To begin, the web browser should display a neutral page.

Hi, ___________. My name is ___________, and I’m going to be walking you

through this usability test session today.

You probably already have a good idea of why we asked you here, but let me go

over it again briefly. We're asking people to go through an e-Tutorial and think aloud as they do so. You can say anything that comes to mind. While you are running through the tutorial, we have software which will track your eye movements on the screen.

When you have completed the e-Tutorial we’d like to conduct a brief informal

interview. In total, the session should take about an hour.

Before we begin, I have some information for you, and I’m going to read it to make

sure that I cover everything.

Read the Information on the consent form.

If you are satisfied with what's there, I'd like you to sign and date this form.

The first thing I want to make clear right away is that we’re testing the e-Tutorial,

not you. You can’t do anything wrong here. In fact, this is probably the one place

today where you don’t have to worry about making mistakes.

As you use the e-Tutorial, I’m going to ask you as much as possible to try to think

out loud: to say what you’re looking at, what you’re trying to do, and what you’re

thinking. This will be a big help to us. Also, please don’t worry that you’re going to

hurt our feelings. We’re doing this to improve the e-Tutorial, so we need to hear

your honest reactions.

If you have any questions as we go along, just ask them. I may not be able to

answer them right away, since we’re interested in how people do when they don’t


have someone sitting next to them to help. But if you still have any questions

when we’re done I’ll try to answer them then. And if you need to take a break at

any point, just let me know.

Do you have any questions so far?

Make notes & expand on any comments that the user makes.

Closing Statement

I’d like to thank you very much for agreeing to participate in our usability testing.

Do you have any questions for me now that we’ve finished?

Thank the participant again and escort them out.


Appendix H Interview Questions

Interview Questions

Technical Proficiency Questions

Q.1 On a scale of 1-5, how would you rate your proficiency with computers?

(Circle which one applies)

1 2 3 4 5

Q.2a Have you ever used e-tutorials, apart from the ones used in UCD modules?

Q.2b If Yes, Where have you used them?

General Interview Questions

Q.3 How do you feel about the style and layout of the e-Tutorials overall?

(Colour scheme/presentation of information on the slides/videos)

Q.4 How did you find the pace of the e-Tutorials?

(Was the information displayed too quickly/slowly?)

Q.5 What do you think of the audio/narration?

(too fast/slow, clear/confusing)

Q.6 Was there anything which stood out while taking the quiz portion of the e-Tutorials?

(presentation of questions/wording of questions/directions given)

Q.7 What did you think of the amount of content in the tutorial?

(Circle which one applies)

Too Little Just Enough Too Much

Q.8 Is there any other content you would like to see included in the tutorial?

Q.9 If there was anything you would change in the e-tutorial, what would that

be?


Digital Footprints Tutorial

Q.10 Having completed the e-Tutorial, do you feel you now have an adequate

understanding of what a digital footprint is?

Q.11 Did you find any of the information given confusing or unclear?

Evaluating Digital Information Tutorial

Q.12 Did the examples help you understand how to evaluate digital information?

Q.13a Did you find the videos helpful?

Q.13b If Yes, how did you find them helpful?

How to find an Article Tutorial

Q.14 How did the use of videos help facilitate the learning experience?

Q.15 After completing the tutorial, how confident do you feel about going to the

library to find an article?

The ProQuest Flow Tutorial

Q.16a Having completed the tutorial, how confident would you be in setting up your own Flow account?

Q.16b If you are not confident, what else should the tutorial include?


Appendix I Group Reflection

In reflecting upon the project, we can identify various things, both positive and negative, that happened during the process. The beginning of the project was a tumultuous time, as the group needed to address some fundamental elements of teamwork. We needed to be clear about the respective roles each of us would play in the group. This turned out to be quite dynamic, as group roles were determined at different stages of the project based not just on skill-sets but also on availability, since we each had to adapt to one another's respective domestic schedules.

Developing a consistent medium for communication was another basic requirement of group work, which we had to settle in the early stages of the project. We found that Google Chat was the best medium for our group, as we were not only able to talk face to face but were also able to use functions of Google Drive such as screen sharing. This meant that we could work collaboratively on various key documents throughout the course of the project. We also used Google Drive for storing and editing our work, which we found very advantageous. Perhaps one thing we would suggest doing differently in future would be to agree on a standardised font and text size when working on documents. The combination of multiple collaborators led to a situation where various sections of a document would be formatted differently; this ended up creating more editing work for the editors in the closing stages of the project.

The appointment of one individual in the group to be our contact person with our client was a good decision. While this person was not necessarily the group leader, they did provide a valuable focal point for us, making client communication as efficient as possible. This high quality of understanding between our team and our client allowed us to comprehend the end result they desired; we are confident that we succeeded in achieving the goals set at the start of the project. Having a single contact person also meant that we had a clear chain of delegation, as he would hear the client's request and tasks would then be divided as needed.

Although we did consistently complete our tasks as they arose, we still think that, were we to do another project like this, we would spend more time creating a comprehensive schedule. This is because, as noted, although work was always completed, there were cases where it was not completed until the last minute, and we ran the risk of not finishing in time.

Getting participants to take part in our study was a challenging aspect of the project, and it meant that, at the time of testing, the group had to be particularly flexible. This was because participants were available at many different times and on different days. In an ideal world we could have simply organised a series of tests one after the other, but in reality it meant days of coming in to test a person at 9 AM and then waiting around until 4 PM for the next participant. Yet we coped rather well with this difficulty and were able to conduct our testing, in full, within the space of just a few days.

We felt that, as the project progressed, we grew stronger as we became more acutely aware of each other's respective strengths. In conclusion, we feel that as a group we performed very well, complementing our respective strong points and offsetting our respective weaknesses. The project drew on all of us to provide our input, be it technological skill, document writing, recruiting test subjects, organisational skill, or delegation.