(2015). LATUX: An iterative workflow for designing, validating, and deploying learning analytics visualizations. Journal of Learning Analytics, 2(3), 9–39. http://dx.doi.org/10.18608/jla.2015.23.3
ISSN 1929-7750 (online). The Journal of Learning Analytics works under a Creative Commons License, Attribution-NonCommercial-NoDerivs 3.0 Unported (CC BY-NC-ND 3.0).

LATUX: An Iterative Workflow for Designing, Validating, and Deploying Learning Analytics Visualizations

Roberto Martinez-Maldonado
Connected Intelligence Centre, University of Technology Sydney, Australia
School of Information Technologies, The University of Sydney, Australia
[email protected]

Abelardo Pardo
School of Electrical & Information Engineering, The University of Sydney, Australia

Negin Mirriahi
School of Education & Learning and Teaching Unit, University of New South Wales, Australia

Kalina Yacef
School of Information Technologies, The University of Sydney, Australia

Judy Kay
School of Information Technologies, The University of Sydney, Australia

Andrew Clayphan
School of Information Technologies, The University of Sydney, Australia

ABSTRACT: Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human–computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, frameworks are now needed that meet the specific demands of the cross-disciplinary space defined by learning analytics. In particular, the LAK community needs a systematic workflow for creating tools that truly underpin the learning experience. In this paper, we present a set of guiding principles and recommendations derived from LATUX, a five-stage workflow to design, validate, and deploy awareness interfaces in technology-enabled learning environments. LATUX is based on well-established design processes for creating, testing, and re-designing user interfaces. We extend existing approaches by integrating the pedagogical requirements needed to guide the design of learning analytics visualizations that can inform pedagogical decisions or intervention strategies. We illustrate LATUX in a case study of a classroom with collaborative activities. Finally, the paper proposes a research agenda to support designers and implementers of learning analytics interfaces.¹

Keywords: Design, HCI, groupware, visualizations, dashboard, awareness, software engineering

¹ An earlier, shorter version of this paper (Martinez-Maldonado et al., 2015) is the foundation for this article, which has been extended in light of feedback and insights from LAK ’15.


1 INTRODUCTION

The field of learning analytics (LA) has emerged in recent years as a multidisciplinary research area with the aim of improving the overall learning experience for instructors and students. Several authors have highlighted the potential impact of this discipline to enhance teaching practice and student learning (Ferguson, 2012b; Siemens, 2012; Siemens & Baker, 2012; Verbert et al., 2013) but, at the same time, there is also a recognition of the importance of establishing a connection with other research fields such as the Learning Sciences (Siemens, 2012), Human–Computer Interaction (HCI) (Verbert et al., 2013), Information Visualization (Chatti, Dyckhoff, Schroeder, & Thüs, 2012), and Computer Graphics (Duval, 2011). This is because the design of tools in this space demands challenging interdisciplinary processes (Balacheff & Lund, 2013) in order to satisfy multiple goals and stakeholder requirements and to leverage the knowledge base of these different disciplines. There is substantial research demonstrating the value of LA (Ferguson, 2012b; Siemens & Baker, 2012), describing the basic requirements for LA tools (Dyckhoff, Zielke, Bültmann, Chatti, & Schroeder, 2012), envisioning the future of this field (Siemens, 2012), and providing general analytical solutions for specific contexts (Dyckhoff et al., 2012). However, little attention has been paid to exploring frameworks that can systematically help designers to develop, evaluate, and deploy effective LA tools. This is a critical time for learning analytics, because there is clear recognition of the need for effective interfaces but there are no standards yet. Our approach aims to facilitate the exploration of potential LA tools and to speed up the creation of validated tools for various purposes.

The area of Software Engineering (SE) provides established methodologies to help designers recognize processes, understand the requirements for interactive systems, and iteratively evaluate these as they are built (Sommerville, 2011). The area of visualization and computer graphics also has established guidelines for designing general visual analytics interfaces (Thomas & Cook, 2006). Similarly, the HCI and User Experience (UX) disciplines have a large array of methods for establishing user needs, and designing and evaluating interfaces (Hartson & Pyla, 2012). However, these methodologies alone are not sufficient for building LA tools because they do not take account of the learning context. Dillenbourg et al. (2011) stated that traditional usability testing (mostly focused on individual or group interactions with computers) is not enough to prepare designs for the complexity of a learning context. The same authors suggested considering "usability at the classroom level." For example, in a classroom, instructors typically have to deal with multiple constraints and contingencies such as time limitations, curriculum alignment, lesson plans, physical space constraints, unexpected events, and tool usage. This complexity also occurs in blended and online learning settings (Prieto, Dlab, Gutiérrez, Abdulwahed, & Balid, 2011), where LA tools are commonly used and potentially have a particularly valuable role. Similarly, Siemens (2012) proposed a holistic view of LA that includes such practical issues, but also aspects related to the data, such as openness, accessibility, and ethics, as well as the particular pedagogical goals and usage context of the LA tools (e.g., detecting at-risk students, supporting self-awareness, or enhancing instructor awareness).


These complexities point to the need for new design methodologies for LA tools, providing a pedagogical underpinning and considering the different actors (e.g., instructors and students), the dimensions of usability in learning contexts (Dillenbourg et al., 2011) (individuals, groups of students, and the classroom), the learning goals, data sources, and the tasks to be accomplished. This paper presents LATUX (Learning Awareness Tools — User eXperience), a workflow for designing and deploying awareness tools for technology-enabled learning settings. By awareness tools, we refer to those that provide a user (instructor) with an enhanced level of awareness of what other actors (students) are doing in the learning space. The workflow extends well-established design approaches for developing user interfaces (UIs) by integrating the pedagogical requirements, collecting the relevant data from students, and evaluating the awareness tools. The evidence supporting the design of the LATUX workflow is illustrated with a case study of the iterative development of visualizations for in-classroom technology-enabled group activities to assist instructors in monitoring student collaboration and progress (Martinez-Maldonado, 2014).

The rest of the paper is organized as follows: The next section provides an overview of current related LA methods, followed by a brief description of the most relevant methods for interface development and evaluation used in SE and HCI. Then, we define the LATUX workflow, with its four principles and five stages. The workflow is illustrated in the final section in terms of the iterations of design, evaluation, and the deployment of visualizations of student group activity in our case study.

2 RELATED WORK

2.1 Awareness Tools: Visual Analytics, Learner Models, and Dashboards

There has been growing interest in the use of visual representations and interaction techniques that exploit human perception to let instructors or institutional decision makers see, explore, and understand a varied range of sources of student data simultaneously (Duval et al., 2015; Kay & Bull, 2015; Ritsos & Roberts, 2014). This area has been called Visual Learning Analytics, making allusion to the more general field of Visual Analytics (Thomas & Cook, 2006), which focuses on the provision of interactive, visual support for assessment, planning, and decision making in general. Nevertheless, for the case of learning and teaching, a number of visual tools have been targeted to help teachers, learners, and other stakeholders gain awareness of, analyze, or reflect on relevant traces of the learning activity collected from the various learning spaces where students interact. These tools include, for example, Open Learner Models (OLMs), mostly targeted to students (Bull & Kay, 2007), Academic Analytics, mainly focused at the institutional level (Campbell, DeBlois, & Oblinger, 2007), scrutable group models (Upton & Kay, 2009), and teachers' dashboards (Verbert et al., 2013). We can consider these visual tools used in learning environments to be different forms of learning analytics interfaces.

The need for a framework to analyze and identify effective visualizations is one of the key aspects that must be addressed to advance the field of learning analytics (Duval et al., 2015).


The main issue is how to identify the real value of visualizations when used in real learning contexts, and their impact on the learning experience. The trade-off to consider is between the relevance of the data that might be presented and the manageable level of complexity for the stakeholders, in the context where they make use of the LA tool. For example, in visualizations provided to instructors to estimate academic risk, Ochoa (2015) identified the need to include a sense of the accuracy or source of uncertainty. These visualizations provide instructors with a nuanced description of the underlying results produced by the algorithms. What is the actual impact of including this information in a visualization? How are the instructors affected? These questions apply generically to other visualizations, such as networks to represent short answers (Bakharia & Dawson, 2015), visualizations of tags previously collected by students in a learning environment (Niemann et al., 2015), networks of aggregated concepts derived from collaborative writing (Hecking & Hoppe, 2015), or interactions captured on an interactive surface (Charleer, Klerkx, & Duval, 2015).

The main challenge for creating such a framework is to make it generic enough to apply to the wide variety of possible visualizations, but still provide meaningful insight into the design process. The consequence of the lack of such formalism is a proliferation of visualizations exploring the limitations of what can be shown, instead of tackling the much-needed goals of relevance, usability, user satisfaction, and overall effectiveness. This paper aims to address this by providing a generic workflow targeted to support the design, validation, and deployment of learning analytics interfaces.

2.2 Learning Analytics Methods for Pedagogical Intervention

Technology-enabled learning environments, whether online or face-to-face in the classroom, have the advantage of capturing copious amounts of data left in students' digital footprints (Duval, 2011; Greller & Drachsler, 2012). When this data is made accessible to instructors, it can provide valuable evidence-based insights into student learning. Using data-mining techniques, the data captured from student interactions can be aggregated and analyzed by instructors to make better-informed pedagogical decisions. For example, data on student use of a Learning Management System (LMS) can reveal patterns of student engagement with certain tools (Beer, Jones, & Clark, 2012; Macfadyen & Dawson, 2010). However, these patterns are best understood by instructors when made available and visualized through user-friendly and intuitive LA tools. The challenge is to offer a low-effort start-up for non-technical users while offering additional functionality for more advanced users (Siemens, 2012).

The target audience for LA tools must be considered during the design process to ensure the insights gleaned from the data become actionable (Olmos & Corrin, 2012). In the case of course-level analytics, visualization tools should highlight the intended learning goals identified by the instructor and whether these have been achieved (Gluga et al., 2013), or even allow instructors to apply the performance indicators appropriate for their learning context (Dyckhoff et al., 2012). The ability to observe whether students are engaging with a learning tool or progressing as intended allows instructors to identify when to intervene in order to support the students.


This is especially challenging in online learning environments (Dyckhoff et al., 2012) and in large classes. For example, Social Networks Adapting Pedagogical Practice (SNAPP) represents student interactions within an online discussion forum to help instructors identify when to consult with students who are not engaging, or when to change the nature of the activity (Lockyer, Heathcote, & Dawson, 2013). Other learning analytics dashboards, such as Purdue University's Course Signals, provide data visualizations (traffic signals) to monitor performance and trigger early intervention when students may be at risk of poor performance or dropping out of a course (Arnold & Pistilli, 2012). While there is no shortage of evidence to support the argument that LA tools to visualize student activity at the course level are needed to inform instructors about their students' learning behaviour, there is limited research on effective workflows for designing such tools, particularly when focusing on data collected from face-to-face learning environments. This paper addresses this gap by proposing a design workflow that combines the rigour of HCI paradigms regarding user experience with a solid pedagogical underpinning.

2.3 Designing User Interfaces

Several methodologies have evolved over many years to facilitate the process of software development and, specifically, user interface design. From a general SE perspective, these methodologies are commonly known as lifecycles (Sommerville, 2011), which, for example, may include waterfall, spiral development, prototyping, and extreme programming approaches. The purpose of the lifecycles in HCI is to help designers develop more efficiently and, in the case of user interfaces, to enhance the user experience. In other words, the lifecycles provide best practices for developers and designers.

The lifecycles in both SE and HCI disciplines have similar structures and activities. These include requirement identification, design, and evaluation. However, the methodologies for designing user interfaces in SE and HCI have been formulated almost independently (Hartson & Pyla, 2012). There are a number of different HCI lifecycles, such as Mayhew's usability lifecycle, the Star lifecycle, the LUCID framework of interaction design, and the Wheel model (Hartson & Pyla, 2012). The latter is a more recent elaboration of the iterative process model for usability engineering that provides a general framework into which designers can fit existing or new techniques or methods to apply "best user experience practices" (Helms, Arthur, Hix, & Hartson, 2006). This is particularly relevant as a foundation for our LATUX workflow. Broadly, most of the lifecycles above have four generic stages (analysis, design, implementation, and evaluation) that should be performed iteratively, with potential for each stage to feed back to refine an earlier one. This iterative process for interface design can lead to a higher-quality user experience.

Adding to the challenge for LA, Bevan (2009) argued for greater use of usability standards, but acknowledged that generic standards may fail to meet all the specialized needs of designers in specific areas (e.g., for LA tool design). A widely accepted standard relevant for developing interfaces for LA is the "Human-centred design for interactive systems" standard — ISO 9241-210 (2009).


Its six principles are as follows: 1) the design should be based upon an explicit understanding of users, tasks, and environments; 2) users should be involved throughout the design; 3) the design should be driven by user-centred evaluation; 4) the process is iterative; 5) the design should address the whole user experience; and 6) the design team should be multidisciplinary in terms of skills and perspectives.

The LATUX workflow is based on this standard and on the evaluation-centred UX model (the Wheel) lifecycle concept (Hartson & Pyla, 2012). Its novelty resides in the fact that it instantiates the user experience lifecycle with specific evaluation stages and requirements for designing awareness tools aimed at improving the learning experience, using data captured from students' activity with learning technologies.

3 THE LATUX WORKFLOW

The LATUX workflow is defined by 1) the principles on which it is grounded, and 2) the evaluation-oriented stages that compose it (see Figure 1). The entire workflow is evaluation-oriented; hence, each stage evaluates different parts of an awareness tool. The workflow has one problem-identification stage and four evaluation stages. These principles and stages are described in the rest of this section.

3.1 LATUX Principles

Based on the LA literature and our first-hand experience, we identified a set of principles to take the workflow beyond generic development lifecycles. These principles are:

Principle 1: The development process in LA has a strong research component. In traditional software design, development is commonly based on iterative requirements elicitation. The classic UX approach involves exploring the design space and gaining feedback iteratively, with careful evaluation on each iteration. By contrast, developing effective awareness tools for learning settings also requires research to explore the new possibilities that learner data can offer to support instructors. As with every field that has been revolutionized by technology, LA tools can have a transformative role in education, in addition to improving its processes. Research needs to be carried out concurrently to understand how these changes can take place. At the current state of the art, there is a prominent gap between research and practice in LA (Siemens, 2012), indicating the importance of this principle in the proposed workflow. This opportunity to close the gap is what Duval (2011) has highlighted as "Data-based Research on Learning."

Principle 2: The impact of authenticity, where the term authentic captures the nature of existing learning contexts without LA tools. In line with recognized UX practice, we need to distill the nature of learning contexts and how they impact the needs of each stakeholder and act as drivers for interface design. This aspect highlights the need to go beyond standard usability. Testing with a small number of users is a general method to evaluate interfaces and user experience. However, for awareness tools, the contingencies of real learning scenarios may affect, and be crucial for, the adoption of technology (Dillenbourg et al., 2011).


As also noted by Rodríguez Triana, Martínez Monés, Asensio Pérez, and Dimitriadis (2014), we need to align the design of the learning activities (tool usage, tasks, pedagogical strategies) and the learning analytics solutions to support instructors more effectively. In our workflow, each evaluation stage aims to increase the level of match between the awareness tool and the demands of authentic learning contexts, while moving towards the deployment of the tool in-the-wild (the classroom).

Principle 3: Holistic design and evaluation. In the same way that the LA field is multidisciplinary and focused on understanding systems as a whole (Siemens & Baker, 2012), the design process to build LA tools must consider the complexity of learning environments. This includes the features of the learning tools, the adopted pedagogical approach, the physical space, the students' social roles, etc. Additionally, the evaluation of the tools should also be holistic. For example, rather than evaluating the effectiveness of just the user interface of the awareness tool itself (in isolation), the evaluation should consider the tool in the context of the learning setting and its further impact on the actual learning-related activities — e.g., student retention, change in learning behaviour (Verbert et al., 2013), or instructor attention (Martinez-Maldonado, 2014). This holistic view translates into a close relationship with the Learning Sciences (Siemens & Baker, 2012).

Principle 4: Evaluation is based on data and its meaningfulness. In addition to the evaluation of the UI, the design of LA tools has an additional dimension for evaluation: making sense of the student activity data, or interpreting it in order to inform pedagogical decisions. A crucial part of the design and evaluation process includes the ways to distill data, present it, and validate it according to the pedagogical elements of learning and instruction (Knight, Buckingham Shum, & Littleton, 2013), including representation, purpose, timing, context, learning goals, teaching goals, epistemic beliefs, and assessment approaches. For example, a specific data visualization may enlighten instructors when they reflect on student progress over the semester, but the same visualization may be useless in the classroom or in other learning contexts. Thus, the design process should not only deal with questions about what data to present and how that data should be presented, but also where and how it will be used. While evaluation in terms of usability and user experience still applies for LA tools, the workflow presented in this paper is not a software development or user experience lifecycle. This process incorporates and draws on the body of knowledge about learning contexts, such as the learning cycle, where the students' learning activity or instruction is the focus of attention. Further, the design and evaluation of LA tools should be connected to both the context of the learning activity and the context of usage.

3.2 LATUX Stages

Figure 1 shows the conceptual representation of the stages featured in our proposed iterative workflow for designing and deploying awareness tools in the classroom.


The problem definition is the first stage of recommended practice in HCI and SE. The figure also shows the four evaluation-oriented stages that constitute the iterations towards deployment. The five stages are described below.

3.2.1 Stage 1 — Problem Identification

Most of the LA literature presented above has explored the different elements of the problem definition (Siemens, 2012; Siemens & Baker, 2012; Verbert et al., 2013). The first stage aims to identify the requirements for LA and user interface design. This process offers an opportunity to identify possible new and radical features that can be offered by the data to address stakeholder needs, even where the stakeholders may not realize this. As noted under the first stage, Problem identification, in Figure 1, the first two questions to consider when identifying the LA-related problem concern the requirements and the unexplored possibilities that could be investigated. This can also lead to the consideration of the multidisciplinary knowledge or expertise needed for the development of the learning analytics solution. For example, basic knowledge about information visualization is key when designing a dashboard; foundations of HCI can also be required to communicate learners' captured interactions to the instructors.

The third question concerns the identification of the stakeholders. Although LA has mostly focused on informing and empowering instructors and learners (Siemens & Baker, 2012), Siemens (2012) identified other relevant stakeholders, including faculties, departments, the whole institution, researchers, corporations, and government funders. From a SE perspective, stakeholder identification is a basic step for successful design (Sommerville, 2011). Hence, the design of LA tools should adopt this, carefully identifying the target stakeholders.

Once the stakeholders have been identified, the available data sources will need to be considered. According to Verbert et al. (2013), to date, research on LA has mostly focused on using student data linked to resource usage, interaction time, number of artefacts created, collaborative interactions, and task results. By contrast, assessing the impact of the data displayed in awareness tools has not been deeply explored. In addition, there are many additional data sources that can offer new insights into student learning (Blikstein, 2013), but these have been under-explored in the LA field. They include gesture sensing, wearable biosensors, head trackers, infrared imaging, depth imaging, and eye tracking. Even manual tracking of activity has not been widely used in LA (Verbert et al., 2013). As a continuation of the data source identification, Verbert et al. (2013) also indicated that the design of LA tools should include an analysis of the feasibility and trade-offs of considering alternative data sensors used to capture more complete student data. Currently, most LA solutions are based on application logs and forms of audio and video capture. The problem definition should include an analysis of the implications of introducing additional sensors in terms of the potential undesirable impact on the learning environment. Other elements of the problem definition addressed by LA research, as discussed below, are the following: stakeholder identification, data sources, types of data logging, features of the learning setting, and design for evaluation.


Alongside identification of the data sources, the learning and teaching approach, or intended pedagogy, should be considered. This is in line with the argument by Verbert et al. (2013) that more research is needed to identify the settings where particular LA approaches work well, and also to identify limitations and alternatives for learning environments where some LA solutions have been unsuccessful. This suggests a strong link between the development of LA tools and the context as a whole, and thus the risk of over-generalizing LA solutions. For example, the learning environment, as well as the details of the learning context and tools deployed — such as the activities and assessments, the students' prior experiences, the technologies used, and the intended learning outcomes — should be considered in order to make sense of the data representing student activity and to make recommendations for other similar learning environments rather than over-generalizations.

Another key aspect to consider in an early stage of the design of awareness tools is the definition of learners' rights, students' data ownership, and the ethical issues that can emerge and be specifically relevant for learning analytics (Ferguson, 2012a; Siemens, 2012). Slade and Prinsloo (2013) stated that it is important to understand both the opportunities and ethical challenges of learning analytics, which are socially and epistemologically shaped by the educational context and purpose of the awareness tools. Institutions, researchers, designers, and/or practitioners must consider privacy and legal issues when defining the protocols for collecting, analyzing, interpreting, and storing student data (Bienkowski, Feng, & Means, 2012). Some early actions can be taken to address possible ethical issues in subsequent stages of the development. For example, policies can be defined for obtaining a student's consent to use their data (Slade & Prinsloo, 2013). Agreeing on a data privacy statement or disclaimer can also be considered. De-identification of student data can also be required for specific purposes (e.g., for sharing data with third-party entities or for conducting research). All these concerns must be addressed within the constraints of national and broader privacy legislation. This broadly means that learners should be able to access their data and be in control of its uses. But the understanding of appropriate use of learning data is still evolving. Broad principles for the ethical management of learning data in MOOCs were established at the Asilomar Convention for Learning Research in Higher Education in 2012.² This aimed to facilitate progress in harnessing the potential power of the data while at the same time respecting learners' rights.
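The paper leaves the concrete de-identification mechanism open. As one illustration of the kind of early action described here, the sketch below pseudonymizes student identifiers in an event log with a keyed hash before the log is shared; the field names, the salt handling, and the pseudonym length are assumptions made for this example, not part of LATUX:

```python
import hashlib
import hmac

# Secret key kept by the institution and never shared with data recipients.
# With a keyed hash (HMAC), a recipient cannot re-derive pseudonyms by
# hashing known student IDs.
SALT = b"institution-secret-key"  # hypothetical; load from secure storage

def pseudonymize(student_id: str) -> str:
    """Map a real student identifier to a stable, non-reversible pseudonym."""
    return hmac.new(SALT, student_id.encode("utf-8"), hashlib.sha256).hexdigest()[:12]

def deidentify(events: list[dict]) -> list[dict]:
    """Replace direct identifiers in an event log before sharing it."""
    clean = []
    for e in events:
        e = dict(e)                      # copy; leave the source log untouched
        e["student"] = pseudonymize(e["student"])
        e.pop("name", None)              # drop direct identifiers entirely
        clean.append(e)
    return clean

# Example: the same student always maps to the same pseudonym,
# so longitudinal analyses remain possible after de-identification.
log = [{"student": "s1234567", "name": "A. Learner", "action": "touch", "t": 12.4}]
print(deidentify(log))
```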

Figure 1. Conceptual representation of the LATUX workflow for designing and deploying awareness tools. Each stage features the guiding questions that designers can consider before or while designing and validating learning analytics visualizations.


Finally, as the last question under Problem identification in Figure 1 shows, it is important to define how to evaluate the effectiveness of the use of LA tools in the target learning setting. This means designing the evaluation of the user experience according to the principles this workflow is grounded upon (e.g., considering the authenticity of the learning setting, meaningfulness, ethical concerns that can emerge in studies in-the-wild, etc.). Different measurement techniques should be used, as perception may not be an accurate way to evaluate the impact of a tool (Hartson & Pyla, 2012). Measuring users' actions and the impact of such actions is also an indirect measure of the effectiveness of the learning analytics tool. Such measures include usability testing, the impact on teaching, learning gains, reduction in misconceptions, improved collaboration, enhanced instructor awareness, reduced load on the teacher, enhanced management of instructor time, etc. In addition, the conditions in which the evaluation will be performed (e.g., in the classroom, with simulated class conditions, simulated data, etc.) must be defined. Further details on this point will be discussed in the following stages of the LATUX workflow.

3.2.2 Stage 2 — Low-Fidelity Prototyping

The idea of prototyping in the early stages of development is to provide a quick and flexible preliminary vision of the proposed system (Hartson & Pyla, 2012). To accomplish this, a prototype must contain some key elements of the system without actually building the system. Low-fidelity prototypes are initial, high-level representations of the intended design, which may be sketched or illustrated using graphical wireframe tools. They are appropriate when the details of the tool have not been defined or are likely to change, so the construction of prototypes should involve little effort. In this stage, we propose the construction of low-fidelity prototypes for the purposes of: 1) testing a wide range of possible visualizations that can be part of the intended awareness tool; 2) testing usability and data sense-making in the early stages of the design; and 3) evaluating and triggering dialogue with stakeholders before actually implementing the LA tool.

² http://asilomar-highered.info/


To achieve these purposes, Figure 1 lists a set of questions that may guide a designer in deciding: 1) how to generate and evaluate low-fidelity prototypes of awareness tools, including the types of visualizations to be generated; 2) the strategies to be used to initially illustrate and generate the visualizations; 3) the data required; and 4) how the evaluation strategy of the low-fidelity prototypes will strive to mimic the use of such visualizations in a real setting.

In LA, low-fidelity prototyping may focus on either the evaluation of the UI (user-interface-oriented prototypes, e.g., a paper-based version of the interface where some visualizations can be drafted with simulated data) or the actual information conveyed in the visualizations (data-oriented prototypes, e.g., visualizations of data captured by the actual learning system or from shared datasets). The first case is closer to a typical HCI evaluation, while data-oriented prototyping is perhaps more relevant (and useful) in LA. We will provide an example of this in the case study described in the next section.

In either case, low-fidelity prototypes are commonly paper-based representations of parts of a system or tool. They are low cost and highly effective in informing the design compared with higher-fidelity versions of the tool (Tohidi, Buxton, Baecker, & Sellen, 2006) (see Table 1, column 1, first row). This is because one can readily create multiple prototypes, each exploring quite different approaches, and so gain insights into the relative strengths and weaknesses of each of them. This also enables participants in the evaluation studies to comment on them in comparative terms, without feeling the discomfort of being negative. They also allow evaluations with multiple users and under controlled lab conditions (e.g., it is easier to evaluate paper-based versions of a dashboard in one-hour interviews with multiple instructors than when the actual system is used).

By contrast, low-fidelity prototypes in LA have many limitations (see Table 1, column 1, row 2). They cannot replicate the connection between what occurs in an environment and the generation of data at that precise instant. In addition, these evaluations completely ignore the effect of interventions (e.g., instructors' actions or provision of feedback) or changes in student behaviour. They also afford very limited understanding of the user experience under authentic learning conditions (see Table 1, column 1, rows 3 and 4). An alternative way to get rapid feedback in the initial stages of design is through the use of Wizard of Oz studies (Dahlbäck, Jönsson, & Ahrenberg, 1993). In this method, an interface prototype looks as though it operates as intended, but actually lacks the backend to drive the full functionality, so a person must provide this.

3.2.3 Stage 3 — Higher-Fidelity Prototyping

The next stage in the workflow is to implement higher-fidelity prototypes by creating at least a partial backend with meaningful data. From an HCI perspective, a high-fidelity prototype is a more detailed and realistic representation of the designed tool than a low-fidelity prototype (Hartson & Pyla, 2012). It may include details of either the appearance or the interaction behaviour needed to evaluate how users interact with the tool or the data. High-fidelity prototypes require more effort and time, but their evaluation is still less expensive: these prototypes are faster to produce and more flexible than developing the final tool.


For LA awareness tools specifically, we discuss the simulation and generation of student data and the conditions of the learning setting. This adds some degree of authenticity to the evaluation, since the simulation may involve real student data and real-time usage. As indicated in Table 1 (first row, column 2), using simulation prototypes allows for the inclusion of a time factor and, therefore, the evaluation of the decision-making process on the fly (e.g., it is possible to analyze what an instructor would decide to do right after looking at the tool). As with low-fidelity prototypes, evaluations with more users can be performed under lab conditions compared with more authentic but costly pilot studies. In addition, some aspects of interaction design can be evaluated (e.g., analyzing how users would explore the data). More complex experiments can be designed to evaluate the user experience with a simulation tool, since subjects can focus on the details of the visualizations while the tasks can be replicated for multiple users in less time. However, as noted in Table 1 (second row, column 2), the impact of student or instructor actions is still ignored. Hence, the simulation of a tool's usage does not reflect what would actually happen in a real environment, where students or instructors can actively respond to the data presented via the awareness tool. Therefore, the effect of the tool on instructor or student actions cannot be tested. As a result, the designer should consider the trade-off between the low degree of authenticity of this type of prototyping and the effort needed to build it. Figure 1 presents some questions that can help the designer evaluate: 1) the ways visualizations can be dynamically generated; 2) how data can be generated and processed; 3) the conditions of the learning environment that can be simulated to evaluate the use of the visualizations within the awareness tool as authentically as possible; 4) how interventions can be simulated (e.g., the teacher speaking with students who may not be engaging with the tool as intended); 5) how such interactions can be supported; and, finally, 6) how the impact of the tool can be evaluated without actually measuring learning or behavioural change.
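The paper does not prescribe an implementation for such simulations. One common pattern, sketched below under assumed data structures (events carrying an offset "t" in seconds from the session start), is to replay a previously captured log at a chosen speed, so an instructor can interact with the prototype dashboard as if the session were live:

```python
import time
from typing import Callable

def replay(events: list[dict], on_event: Callable[[dict], None],
           speedup: float = 1.0) -> None:
    """Feed logged events to a dashboard prototype in simulated real time."""
    start = time.monotonic()
    for event in sorted(events, key=lambda e: e["t"]):
        # Wait until this event's (scaled) time offset has elapsed.
        delay = event["t"] / speedup - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        on_event(event)  # e.g., update a visualization widget

# Example: drive a console "dashboard" from a captured session at 4x speed.
session = [{"t": 1.0, "student": "s1", "action": "touch"},
           {"t": 2.5, "student": "s2", "action": "speech_start"}]
replay(session,
       on_event=lambda e: print(f'{e["t"]:6.1f}s {e["student"]} {e["action"]}'),
       speedup=4.0)
```

Replaying the same log to several instructors makes their on-the-fly decisions directly comparable, which is the kind of controlled lab evaluation this stage targets.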

Table 1. Strengths and limitations of the evaluation-oriented stages of the workflow.


3.2.4 Stage 4 — Pilot Studies: Classroom Use

The fourth proposed stage before deploying an awareness tool in-the-wild is the evaluation of the user experience in pilot studies. A pilot study is commonly conducted under conditions similar to a real deployment but on a smaller scale (e.g., in a limited number of classroom sessions or during a limited period in a course). A pilot study can help prove a concept and observe the live usage of the tool in an authentic context. This is recommended as part of the design workflow to minimize the risk of deploying an LA tool on a large scale while there is still a need for more research and evaluation.

Key new knowledge can be gained from pilot studies, including testing interactivity. This can include the analysis of the impact of interventions and changes of behaviour resulting from using the tool. In pilot studies, it is also possible to test the effect of the tool, and of the unexpected events that may occur, on learning and instruction (see Table 1, column 3, first row). Figure 1 provides five questions that should be considered when designing a pilot study: 1) the types of visualizations that can be generated in real time (e.g., to be used when the teacher wants, whether during the class or externally); 2) how interactivity with the awareness tool would be supported during the pilot; 3) strategies for testing or evaluating the impact of the tool; 4) how contingencies will be addressed; and 5) how teacher interventions or changes in student behaviour will be tested or evaluated.

However, pilot studies are typically more expensive in terms of effort and coordination compared with evaluating low- or high-fidelity prototypes in the lab (Hartson & Pyla, 2012). Added to that, results obtained from pilot studies are not necessarily generalizable. Setting controlled variables can be restrictive, depending on how authentic the learning conditions are (e.g., pilot studies in real schools may impose limitations on the range of items that can be evaluated). In addition, the amount of evidence and the number of subjects can be more limited than in shorter but more numerous evaluations with prototypes under lab conditions (see limitations in Table 1, column 3, second row).

3.2.5 Stage 5 — Validation In-the-Wild: Classroom Use

Evaluating in-the-wild has become a standard method within HCI testing (Brown, Reeves, & Sherwood, 2011). In-the-wild testing consists of field trials of experimental systems with groups of subjects in relatively unconstrained settings outside of the laboratory, at a larger scale and for a longer duration than in pilot studies. For LA research and practice, this can help designers recognize actual deployments as another evaluation stage of user experience. Similar to a pilot study, deployments in-the-wild provide an authentic test-bed to analyze not only usability and user experience, but also the usage and impact of awareness tools in an unpredictable learning environment and within the context of the pedagogical requirements and goals (see Table 1, column 3, first row).

The actual use of the awareness tool can be tested, considering the possible contingencies that may occur. As the authenticity of the setting is higher and the duration of the study can be longer than in a pilot study, the results of the evaluation of the awareness tool may be more generalizable to real-world contexts and use.


In addition, the longer-term usage of an awareness tool can minimize the "novelty effect" inherent in introducing new technology. This effect cannot be easily analyzed in smaller pilot studies or in prototype testing. This aspect is important for designing and testing LA tools, since the ultimate goal of a successful deployment is the sustained usage of the tool for its intended learning or teaching goal.

In addition, testing the tool in-the-wild may require more challenging learning management, ethical, and privacy issues to be addressed, given the larger-scale use of the tools compared with pilot studies (Table 1, column 3, second row). These aspects may influence the design of an awareness tool and inform the two key questions the designer should consider prior to deploying an awareness tool, as noted in Figure 1. Namely, the designer should consider the limitations posed by in-the-wild conditions (e.g., class size producing a larger dataset, technological infrastructure or lack thereof, and the extent of teachers' data literacy for making sense of the visualizations presented) and any particular ethical issues emerging from the evaluation of the tool in-the-wild. We include this consideration at this stage, in addition to the first stage's planning, because the earlier stages work towards a tool that can be used in multiple classes of envisaged settings. When it comes to testing in a particular class and context, the pragmatics of each will need to be considered.

Finally, evaluating LA tools in authentic deployments is the most expensive stage in a design workflow. Such evaluations may also be more restricted, to fit the context. Testing under authentic conditions can also affect the use of the tool and the evaluation. Typically, authentic deployments offer less flexibility, which means that one cannot ensure the systematic testing of a broad coverage of diverse cases. By contrast, a lab setting makes it possible to ask users to work through a carefully designed and comprehensive set of artificial tasks (Table 1, column 4, second row).

3.2.6 Iterative Design–Deployment–Evaluation

Finally, bridging the gap between practice and research calls for communication of what happens in the deployment to inform the possible tuning or re-design of the tools. This invites the definition of iterations of design, either to develop the LA tools in small incremental steps or to improve certain elements of the UI or data processing. Mendiburo, Sulcer, and Hasselbring (2014) highlight the importance of iterative interaction design in generating their LA tool but, other than this example, iterative design in LA has not been explored and reported in depth. The designer may want to respond to some questions once the system, or part of it, has already been deployed, such as: What evaluation stages may be needed in an iteration (e.g., some functionality of the awareness tool may not require all stages of prototype testing or pilot studies)? What is the goal of the current iteration? What knowledge should be gained from a research perspective? How many iterations are required? As shown in Table 1 (rows 3 and 4), each stage within an iteration of the proposed workflow features a trade-off between the authenticity and the degree of control of the learning setting conditions.


4 CASE STUDY

We illustrate our workflow by describing the iterative development of the MTDashboard (Martinez-Maldonado, Dimitriadis, Kay, Yacef, & Edbauer, 2013). This interactive dashboard was designed for a teacher in a multi-tabletop classroom. It aimed to help the teacher manage the multiple devices in the technology-enhanced environment and gain understanding of the students' collaboration and task progress. The dashboard proved effective in enhancing instructor awareness, and in helping instructors decide how to direct the time and attention of students by providing informed feedback, in a number of higher education classroom studies in classes with more than 500 students.

4.1 Stage 1 — Problem Identification

We describe the context of the work in terms of the problem definition as a starting point of our design workflow. The objective of the project was to design, evaluate, and deploy a number of visualizations of student group work in a classroom. The purpose of this was to enhance instructor awareness of students' small-group collaboration and task progress. The group activity was supported either in the lab or in the classroom using interactive tabletops. These shared multi-touch devices enable small groups of students to work on a learning task where they manipulate digital media while communicating face-to-face. Figure 2 (left) shows three students working on such a collaborative activity. The main requirement, from the LA point of view, was to generate an awareness tool (like the one shown in Figure 2, right) in the form of an instructor dashboard to monitor student collaboration and group progress on the learning activity while the students worked in small groups using multiple tabletops.

Data sources and data logging. Three main sources of student data were identified for the project: the physical manipulation of virtual objects on the tabletop, quantitative dimensions of student speech (e.g., presence and duration of utterances), and students' task progress (e.g., measures of the quality of the learning artefacts produced). Regarding the tabletop technology, there has been little research on sensing technology to automatically capture the above types of student data in face-to-face learning settings. Part of the project included the production of the technology to automatically recognize user-differentiated touches on the tabletops and to synchronize those with the detection of audio activity by the learners at the table. The sensing system CollAid was developed to capture datasets of students' activity at an interactive tabletop, providing fine-grained data about learner activity similar to that currently collected in online learning environments. It records the timestamp of each action performed by each student on the multi-touch learning application. CollAid can also determine which student is speaking at any time, so it captures the duration of each student utterance and whether such verbal participation followed another student speaking. More information about how the speech and touch activity can be automatically captured in tabletop-based face-to-face settings can be found in Martinez-Maldonado, Collins, Kay, and Yacef (2011). An analysis of the accuracy of the vision-based system that CollAid uses to identify touches indicated that it was highly accurate for normal use by students; details can be found in Clayphan, Martinez-Maldonado, Ackad, and Kay (2013).
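The paper does not document CollAid's log format; the sketch below is a minimal illustration of the kind of user-differentiated records just described (timestamped touch actions, plus utterances with a duration and a flag for whether they followed another speaker). The field names and the two-second turn-taking gap are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    """One user-differentiated action on the multi-touch application."""
    timestamp: float   # seconds from session start
    student: str       # which learner performed the touch
    action: str        # e.g., "create_concept", "move", "link"

@dataclass
class Utterance:
    """One detected interval of speech by a learner at the table."""
    start: float
    duration: float
    student: str
    follows_other: bool = False  # set True if it directly followed another speaker

def mark_turn_taking(utterances: list[Utterance], gap: float = 2.0) -> None:
    """Flag utterances that follow a different speaker within `gap` seconds."""
    ordered = sorted(utterances, key=lambda u: u.start)
    for prev, cur in zip(ordered, ordered[1:]):
        cur.follows_other = (cur.student != prev.student and
                             cur.start - (prev.start + prev.duration) <= gap)

# Example: two utterances by different students 0.5 s apart count as turn-taking.
talk = [Utterance(0.0, 2.0, "s1"), Utterance(2.5, 1.0, "s2")]
mark_turn_taking(talk)
print(talk[1].follows_other)  # True
```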


Features of the learning setting. For the prototyping stages and pilot studies, the learning activity involved the construction of a concept map to tackle problem-solving and case-based resolution tasks collaboratively. Collaborative concept mapping is representative of a group learning activity that requires students to share their perspectives, visually represent their ideas, and agree on a group solution (Chaka, 2010). For the collaborative tasks at the tabletop, students used the multi-touch concept-map editor CMate (Martinez-Maldonado, Kay, & Yacef, 2010). For the in-the-wild deployment, two additional learning applications were used: 1) ScriptStorm, a brainstorming application (Clayphan, Kay, & Weinberger, 2013); and 2) WellMet, a tool for supporting project management meetings. Additional details about the features of these tools can be found in Martinez-Maldonado, Clayphan, Ackad, and Kay (2014).

Evaluation. In this project, classroom instructors were the main stakeholders and source of requirements for the tool. Therefore, their role was crucial in the evaluations discussed below. Additionally, a number of research goals were set for the project to explore various visual analytics tools and their effectiveness in conveying information about students. Thus, additional sources of requirements were identified by reviewing the relevant research literature.

4.2 Series of Studies

The design process in the case study included the following series of smaller studies, which support Stages 2 to 5 of the LATUX workflow respectively. The first explored a wide range of candidate visualizations based on paper prototypes (Martinez-Maldonado, Kay, & Yacef, 2011). The second consisted of trials in the lab, with teachers analyzing real student data but in simulated classroom sessions (Martinez-Maldonado, Yacef, Kay, & Schwendimann, 2012). The third included two pilot studies in real classrooms conducted in two different semesters (Martinez-Maldonado et al., 2013). The fourth was a longitudinal classroom experience with four instructors delivering two one-semester courses (Martinez-Maldonado et al., 2014).

4.2.1 Stage 2 — Validation of Prototypes

Description of the study: The first study focused on exploring visualizations of small-group face-to-face work that could be generated from the data captured from a tabletop device in a lab setting such as the one shown in Figure 2 (left). The learning task had two stages: first, an individual stage, where each group member created their own concept map; and then a group stage, where they came together to create a shared map using the CMate tabletop tool. The data for the study was collected for 10 participants in three groups, each with three or four participants. Drawing on previous work on social proxies (Erickson et al., 1999), equity of oral participation and decision making (Bachour, Kaplan, & Dillenbourg, 2010), and levels of participation at the tabletop (Harris et al., 2009), four visualizations were generated.


The visualizations were generated using a computer-drawing editor. Details of these visualizations can be found in Martinez-Maldonado, Kay, and Yacef (2011). Two sets of the visualizations were printed on paper (see Figure 2, right). These sets were generated from real data obtained from the video recordings of two selected groups, of three students each, collaborating at the tabletop on the concept-mapping task.

Evaluation: We illustrate a way to evaluate candidate visualizations in the early stages of design using real student data by describing the evaluation of the paper prototypes described above. Both the usability and the meaningfulness of the visualizations were tested using the prototypes. Regarding usability, a first instrument investigated whether all the instructors were able to understand the visualizations. A second instrument then tested a series of hypotheses about whether the set of visualizations revealed key facets of the group collaboration processes to the instructors. These concerned equity and amount of participation, overall collaboration, equity of intellectual contribution, and the creation process. In order to assess whether the visualizations conveyed enough information about group work, we ran evaluation sessions with five skilled instructors, each with experience in orchestrating classroom group work. We provided each instructor with a set of the visualizations generated from the tabletop sessions of two groups, A and B. We also provided the final maps from both the individual and collaborative stages. Then, we invited the instructors to respond to a set of questions about the equity of participation, the quantity of participation, the collaboration, the equity of intellectual contribution of the group members, and the usefulness of the visualizations in depicting the map-creation process in terms of contribution and group work. We asked the instructors to think aloud while inspecting the visualizations, to explain their responses, and to state which visualizations they used or whether the visualizations did not give enough support for making a decision. As a basis for comparison with the information inferred by the instructors from the visualizations, the video recordings of each group's session were analyzed so they could be compared against the instructors' thinking process. The instructors neither watched the videos nor had access to summaries of what actually happened during the group sessions; they had only the information in the low-fidelity prototype visualizations.

Figure 2. Case-study learning environment. Left: a three-member group accomplishing a collaborative learning task in the lab. Right: example of a paper-based prototype for validating visualizations, with each row showing one of the visualizations: 1) the Verbal Participation Radars; 2) the Touch Participation Radars; 3) the Map Contribution Chart; and 4) the Map Evolution Diagram.


Instructors were able to use the visualizations to answer four out of the five questions. This demonstrated that the visualizations were promising in conveying information that could help instructors monitor some aspects of the collaborative situation at the tabletop, even when the facilitator did not have the opportunity to observe the actions of the group in situ. This mimics real class contexts, where the teacher devotes her attention to one group while other groups work in parallel. Results showed that the visualizations provided instructors with key quantitative information about the equity of participation, the quantity of participation, and collaboration. However, they failed to provide information about the equity of intellectual contribution. Importantly, instructors indicated which aspects of the visualizations should be improved, which ones would be valuable for classroom use, and which other aspects of group work they considered important to visualize.

Main lessons learnt: In line with the UX evaluation literature (Tohidi et al., 2006), this inexpensive paper-based prototyping allowed exploration of the initial user experience to drive the design of the awareness tool. For example, the evaluation described above demonstrated how instructors could identify visualizations that they predicted would be useful in class and distinguish them from others that they would prefer to use for post-hoc reflection, outside the demands and time pressure of the actual classroom. This informed the choice of goals in the following studies and the iterative design of the awareness tool in general. Results from this evaluation also helped to refine the design of the visual aspects of the group indicators and to identify what other visualizations would be useful to explore.

3.2.2 Stage 3 — Simulations with real instructors

Description of the study: Building on the results of the evaluation of low-fidelity prototypes described above, in this study a high-fidelity prototype of a dashboard was built. Four instructors, all experienced in using collaborative classroom activities, were involved in the early stages of the dashboard prototype design. The design process consisted of a series of interviews, the design of preliminary versions of the dashboard, and empirical evaluations of both the visualizations in the dashboard and the user interface layout. The result was a dashboard designed to enable instructors to gain awareness of group learning processes, to help them determine whether groups or individual learners needed attention. The interface showed four key facets of group work: 1) symmetry of activity; 2) degree of interaction with others' contributions; 3) level of collaboration; and 4) overall progress of the task. This information was displayed at two levels of detail: 1) the class level, which showed minimal information about multiple groups at the same time (with three visualizations per group); and 2) the detailed group level (based on Shneiderman, Borkowski, Alavi, & Norman, 1998), which allowed instructors to drill down to more specific information about a selected group when required (with five visualizations in a timeline; see Figure 3, right). Eight visualizations were generated and distributed across the two user interfaces. More details of the study can be found in Martinez-Maldonado et al. (2012).
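The paper reports symmetry-of-activity and equity-of-participation indicators without giving their formulas. As a rough illustration only, one plausible way to compute such an indicator from per-student activity counts is a Gini-style index; the formula below is our assumption, not the authors' published method:

```python
def equity_index(counts):
    """Symmetry of activity within a group: 1.0 means perfectly equal
    participation; values near 0 mean one member dominates.
    Computed as 1 minus the Gini coefficient of the activity counts."""
    n, total = len(counts), sum(counts)
    if n < 2 or total == 0:
        return 1.0
    pairwise_diff = sum(abs(a - b) for a in counts for b in counts)
    return 1.0 - pairwise_diff / (2 * n * total)

print(equity_index([40, 38, 42]))  # ~0.98: near-equal touch participation
print(equity_index([95, 3, 2]))    # ~0.38: one student dominates
```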


Evaluation: This study evaluated how instructors used the dashboard to intervene in a group and which visualizations were most effective in conveying key information to inform their decisions. Controlled studies were conducted in a lab with real data simulating classroom sessions. Student data captured from four lab sessions, with university students in groups of three, were used in the simulations. Eight simulated sessions were run with eight instructors (all different from those in the previous evaluation stage) to evaluate the dashboard prototype. The high-fidelity prototype simulated the real-time generation of data for each instructor, as if he or she were concurrently monitoring up to three groups for 30 minutes (as in a classroom situation). The evaluation recreated the classroom orchestration loop documented by Dillenbourg et al. (2011), in which instructors monitor the classroom (at the class level of the dashboard), compare it to some desirable state, and intervene to adapt the teaching (selecting the key visualizations that helped in the decision-making process and drilling down to the detailed group level of the dashboard). If the instructors decided to intervene, they had to wait at least two minutes in the detailed group level, simulating the time taken to talk with that group. Instructors followed this loop throughout the duration of the trials. Data captured from the instructors' use of the dashboard, a questionnaire, and interviews were used to understand the instructors' experience with the tool. Additionally, similar to the evaluation of low-fidelity prototypes described earlier, the video recordings of each group's collaboration were manually analyzed by an external observer.

Evaluation results revealed that the prototype effectively enabled instructors to identify which groups encountered problems during the group work. This indicated that providing our key quantitative indicators of group work was helpful; teachers were able to decide which group needed their attention and to provide timely feedback. The class level of the dashboard provided information from the beginning of the activity and was used as a decision-making tool to help teachers manage their attention and interventions. The detailed group level, which showed chronological information, was considered effective only for assessing task progress after class. Additionally, our participants indicated that the visualizations providing less processed data were more effective for the management of their attention and intervention. By contrast, they saw less value in the visualizations generated using more sophisticated data processing (e.g., a graph showing the overall level of collaboration as detected by a data-mining algorithm).
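The mechanics of the simulator are not detailed in the paper. A minimal sketch of its core idea, replaying previously logged, timestamped events to the dashboard at wall-clock pace with an optional speed-up, could look like the following; all names here are hypothetical:

```python
import time

def replay(events, on_event, speedup=1.0):
    """Feed logged events to a dashboard callback as if they were being
    generated live, preserving the original inter-event timing."""
    events = sorted(events, key=lambda e: e["timestamp"])
    if not events:
        return
    wall_start = time.monotonic()
    log_start = events[0]["timestamp"]
    for event in events:
        due = (event["timestamp"] - log_start) / speedup
        delay = due - (time.monotonic() - wall_start)
        if delay > 0:
            time.sleep(delay)   # wait until this event is "due"
        on_event(event)         # e.g., update the class-level view

# e.g., replay(session_log, dashboard.update, speedup=2.0)
```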

Figure 3. Simulation of a classroom using real student data. Left: a three-member group accomplishing a collaborative learning task in the lab. Right: a high-fidelity prototype of an instructor dashboard to be used by instructors in the lab.


Main lessons learnt: Even though the construction of a programmed simulator is a more expensive task, it provided a more realistic user experience than static paper-based prototypes (Hartson & Pyla, 2012). It meant that we could design a study in which the instructors could experience a role-play: they played a teacher using the visualizations and analyzing student data on the fly, as they would do in a real classroom. This helped them consider whether they would realistically use the awareness tool in the classroom. For example, in this study instructors were able to identify the key features that gave them clues indicating groups with collaboration problems. Additionally, evaluating a prototype with two different levels of detail of student data provided design insights about the kind of information that should or should not be delivered to instructors in class. Providing more detailed information to instructors should be considered for cases where they want to do a careful post-mortem analysis in order to understand specific aspects of the collaboration or the learning processes.

3.2.3 Stage 4 — Pilot studies

Description of the study: The effectiveness of the implementation of the dashboard prototype in the classroom was validated in two consecutive semesters of authentic university-level tutorials. These two pilot studies were conducted in two different weeks for two courses (at a Business school), with a single instructor conducting 22 classroom sessions (14 and 8 sessions, with 236 and 140 students respectively). Similar to the setting shown in Figure 4 (left), the classroom featured four interconnected interactive tabletops, each logging student actions to a central repository. All tabletops were enhanced with our CollAid system to identify the students who touched the interactive surface, based on overhead depth sensors as described above. The instructor was provided with a handheld tablet running a new version of the dashboard prototype (see Figure 4, right). This showed, for each group, two selected real-time visualizations: 1) group task progress (the quality of students' work); and 2) equity of student participation (an indicator of group work). While the design of these visualizations built on lessons from the previous lab studies, we also created new visualizations of the information that the teacher wanted.
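The paper does not publish how "group task progress" was scored. One simple sketch consistent with the description, measuring coverage of teacher-defined target propositions in the group's evolving concept map, follows; the triple representation and the data are our own hypothetical illustration:

```python
def task_progress(group_map, target_map):
    """Fraction of the teacher's expected propositions already present
    in the group's concept map (propositions as normalized triples)."""
    return len(group_map & target_map) / len(target_map)

# Hypothetical propositions: (concept, linking phrase, concept)
target = {("hci", "includes", "usability"),
          ("usability", "is assessed by", "user testing")}
group = {("hci", "includes", "usability")}
print(task_progress(group, target))  # 0.5
```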

Figure 4. Pilot studies (studies in the wild). Left: a high-fidelity prototype of an instructor dashboard. Right: an instructor using the awareness tool in class.


Evaluation: Learning tasks were designed by the teacher to fit into one fifty-minute weekly tutorial class in each course. The class sessions involved small-group discussions of a previously defined case study, and the task required students to represent their group solutions to the posed case visually by building a concept map. During classes, the teacher moved freely among the tabletops, reviewing each group's concept map, identifying groups that needed help, and providing feedback accordingly. To evaluate the impact of the tool in the classroom, we collected information from a number of sources to triangulate evidence. These sources included application logs with user information, snapshots of the evolution of each group's concept-map solution, the actions performed by the instructor on the orchestration tool and in attending to groups, and notes from a second external observer whose task was to assess each small group's work at regular intervals. The evaluation focused on investigating whether the instructor attended to the "lower achieving" groups, based on the information provided on the dashboard, and whether this intervention had some effect on student learning or on their task. Further details of this evaluation process can be found in Martinez-Maldonado et al. (2013) and Martinez-Maldonado (2014).

Main lessons learnt: Results showed that the data presented to the teacher in the classroom could direct their attention, especially when information about the quality of student work was delivered. By contrast, the study also confirmed that the teacher valued indicators of group work and individual participation mostly for post-hoc analysis. This study made evident how important it is to evaluate the tools in a real classroom environment. In the actual class, the instructor had to cope with many constraints (e.g., time limits, answering student questions, aligning with the curriculum, students arriving late, organizing the activity, and orchestrating the whole class). In addition, our provision of the awareness tool added complexity to the already multifaceted instruction activity. This study also showed that an instructor can benefit greatly from having quick access to information that is not normally easy to access in class time; for example, aspects of the quality of students' evolving solutions. The realism of this study made it difficult to test experimental conditions (e.g., alternating visualizations in the awareness tool or comparing use of the tool with not using it at all) due to practical or ethical issues and the teacher's wishes (e.g., she wanted all students to have similar opportunities for learning). Additionally, the evaluation in these pilot studies allowed testing of the impact of the actual use of the awareness tool, going beyond simple usability to pedagogic and learning aspects. This was not possible in the previous two lab studies.

3.2.4 Stage 5 — Classroom use

Description of the study: Finally, the awareness tool that was iteratively built and evaluated in the lab and in pilot studies was deployed in long-term classroom studies. This deployment consisted of a longitudinal class experience running across two full university (semester) courses with four instructors (all different from those in the previous evaluation stages). The classroom experience involved 145 students. The multi-touch classroom ecology was used for the majority of the one-hour weekly tutorials for the two courses. The first was an undergraduate course on Human–Computer Interaction with 105 students enrolled, distributed across six tutorial classes taught by three instructors. The second was a postgraduate course on Pervasive Computing with 40 students enrolled, divided into two tutorial classes taught by two tutors.


In this study, a full version of the awareness tool was provided. Figure 5 shows an example screen. The interface provided functions for the instructor to control the tabletops and vertical displays in the classroom (Figure 5-A and 5-B): for example, to block all the tabletops so the instructor could give instructions to the students, to move all the tabletops to a different phase of the class task, or to send the content of a specific tabletop to be displayed on a classroom vertical "wall" display. It also showed one visualization per group (Figure 5-C) and included the provision of notifications (squares with rounded corners around visualizations; Figure 5-D) that were automatically triggered when misconceptions were detected or when groups achieved at least 50% of the task as the instructor had pre-defined it. The misconceptions were detected by comparing each group's concept map, at runtime, with a list of common misconceptions defined by the teacher at design time. The dashboard reflected changes in student behaviour and the interventions performed by the instructor. For example, if the instructor attended to a group that had a misconception, the dashboard updated at runtime when (and if) the group addressed that misconception.
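As an illustration of the notification logic just described (misconception matching plus the 50% progress trigger), the following sketch assumes misconceptions and target propositions are normalized triples that can be matched by simple set membership; this is our reading of the description, not the published implementation:

```python
def group_notifications(group_map, misconceptions, target_map, threshold=0.5):
    """Compute dashboard notifications for one group: flag any pre-defined
    misconception currently in the map, and raise a progress alert once the
    map covers at least `threshold` of the teacher's target propositions."""
    notes = [f"Misconception detected: {m}" for m in misconceptions
             if m in group_map]
    coverage = len(group_map & target_map) / len(target_map)
    if coverage >= threshold:
        notes.append(f"Group reached {coverage:.0%} of the task")
    return notes

# Re-running this after each logged action lets the dashboard clear a
# misconception notification once the group has addressed it.
```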

Figure 5. One final deployed teacher's dashboard featuring: A) functions to control the tabletops; B) functions to control the vertical "wall" displays; C) small-group visualizations; and D) notifications.


Evaluation: Similar to the previous study, testing the awareness tool in the wild provided understanding of the impact of the tool on learning and instruction. However, a full-semester deployment shed light on additional requirements for a truly useful teacher's dashboard. First, each week's classroom sessions were different, and instructors had to follow a particular script. The notion of a script, and a visualization portraying its enactment in real time, were therefore introduced. The group visualizations had to be generated in this context, matching the stage of the script that the class had reached at any given time. Moreover, it was important for instructors to be able to switch the active learning applications on the tabletops easily. Thus, the visualizations had to convey meaningful indicators of group work for the different types of tasks supported by three different tabletop applications (in our case: concept mapping, brainstorming, and group meetings). Each of these learning applications was used at least twice during the semester in each course. Results showed how the different instructors used the visualizations according to their previous teaching experience, their personal styles, and the particular task each week during the courses. In some weeks the dashboard was more crucial, especially in those tasks where particular teaching goals had to be accomplished (e.g., when equal contributions from each group member were expected or when instructors had to revise knowledge of particular topics). By contrast, for more open tasks such as team meetings, the workload of regulating the class sessions was more distributed among students; hence, the instructors did not have to monitor task progress or group achievement explicitly.

Main lessons learnt: The evaluation of the awareness tool in the wild in this larger study gave insights that support a higher degree of generalization of the results (Brown, Reeves, & Sherwood, 2011). As in any design task, the visualizations in the dashboard are not necessarily final. However, the LATUX processes demonstrated that they are useful and meaningful. This held true not just for the concept-mapping activity but also for other learning activities (brainstorming, and face-to-face meetings coordinating a major multi-week project). Additionally, the in-the-wild study allowed testing of the tool with more instructors than the pilot studies and for a longer time, thus minimizing the novelty effect.

3.2.5 Iterative Design

The design of effective awareness tools, like that of other user interfaces, calls for an iterative process. The purpose of this case study is not to recommend the details of these visualizations and dashboards themselves; for that, there is a nascent but growing body of work (Verbert et al., 2013). Rather, we used it to illustrate the process of designing and refining visual awareness tools that may be effective for a specific context. For example, Figure 6 shows the evolution of two visualizations that were deployed in a classroom and were initially evaluated with prototypes.

Figure 6. Evolution of two visualizations from prototypes to deployment: A) Radars of Participation; and B) Contribution Charts.


Although simple, they provide useful information to the instructor. It can be observed that:

1. There was a tendency towards minimalism. For example, for the "radars of participation" (see the three visualizations shown in Figure 6, top row), the first prototypes had two triangles depicting equity of verbal (blue) and touch (red) participation with the interactive tabletops. In the classroom, given the challenges of capturing clean speech data, the visualization had to be simplified to show just the touch activity data from the tabletops, and students' names rather than symbolic representations (coloured circles). This observation is partly in line with the data–ink ratio principle (Tufte, 1983). This principle promotes avoiding distracting elements in visualization design, which could be observed as the two visualizations evolved in each iteration. The principle also promotes the non-redundant display of data, which could be observed in the evolution of the radars of participation. However, some redundancy remained in the last design of the contribution charts (e.g., the number of links represented both as a number and as a coloured area).

2. Additionally, there was an increased focus on higher-level indicators and less on accountability. For example, for the "contribution charts" (see the three visualizations shown in Figure 6, bottom row), the pie chart used for the prototypes was simplified to indicate the size of the solution (the outer orange circle in the rightmost visualization) and the portion that matched the essential elements expected by the instructor (the inner yellow circle), instead of the individual contributions depicted by the slices of the pie chart.

This project is continuing to explore other visualizations of group activity that can be useful to instructors or students. It is therefore important to maintain an iterative perspective from the beginning of the design process through to the end.

4 CONCLUSIONS AND RESEARCH AGENDA

The design of effective LA tools requires the convergence of methodologies from multiple disciplines, such as software engineering, human–computer interaction, and education. Even though these disciplines provide guidelines and frameworks to guide designers towards effective designs and systems, the LA community may benefit from paradigms that reflect its additional and specific multidisciplinary needs. There is a need for systematic processes to tackle the critical challenge of designing effective interfaces for LA applications. Our work set out to tackle the recognized need for a principled approach to designing the interfaces of LA tools. In this paper, we have reported the tabletop case studies to illustrate how to use the LATUX workflow. The series of case studies also demonstrates that the process resulted in valuable LA interfaces for learning and teaching with interactive tabletops. Figure 1 provided an overview of the five stages of the workflow, with questions for the LA tool designer to consider. The LATUX workflow calls for an iterative process, with five stages that build towards the objective of producing robust tools suitable for large-scale adoption, based on a combination of well-established techniques to improve the user experience while maintaining an explicit connection with the underpinning learning environment. Our LATUX workflow draws on the substantial body of work from the SE and UX disciplines, but also considers the pedagogical requirements and challenges of designing LA tools to support teaching and learning in technology-enabled learning environments. We now discuss the key lessons for using the LATUX workflow.


The reader may be concerned about the amount of effort involved in our series of studies to provide LA tools for tabletop classrooms. However, this is largely due to the significant effort needed to work with largely untried learning technology. We wanted to harness the digital footprints from students' touches on the tabletops and the recorded audio of their conversations. This meant that we tackled the process from a very fundamental starting point: the development of the data-collection technology. For many valuable uses of LA, there are established, well-understood data sources. For such data sources (e.g., data captured from the learning management system or video technologies), we expect that the progress from low-fidelity to high-fidelity prototypes and on to pilot and classroom studies can be made with modest effort. As we assemble more case studies of evaluated LA tools, we can expect to create ever-more successful LA interfaces with fewer iterations.

In our work, a key challenge was to consider the LA goals at the same time as the design of the learning activities. This was valuable, as it demanded careful thought, decision making, and documentation of the learning objectives. Initially, we set out to provide LA tools for visualizing group collaboration. When we moved to the classroom, the topic-level learning outcomes had to be incorporated. In many widely used learning platforms, we still need to build a larger set of case studies for the Problem Identification stage. Notably, it is likely to be valuable to improve the primitives for linking learning outcomes to the available data. We then need case studies that move through the other four stages. The LATUX workflow has a role to play in supporting this exploration. In the long term, we expect that a set of core classes of requirements will emerge. For example, many online learning tools could usefully provide LA tools that show student progress against the set learning objectives. This is similar to our tool showing progress in creating the teacher's target elements in the concept maps, as well as anticipated misconceptions.

We emphasize that our contribution is the process. In addition, the series of case studies discussed represents a valuable foundation for creating tools for teachers to use in classrooms that involve small-group learning activities where data is available about individual contributions and about the progress of the group task. These findings are likely to be useful for other contexts that share the same goal of enabling a teacher to see the progress of multiple small groups and to decide how to direct her attention to the groups that most require it. This may also be useful for online collaborative learning activities with multiple small groups and set class times.

The LATUX workflow has been supported with a series of case studies on the development of an awareness tool for instructors to observe small-group interactions in a classroom. Even though the studies discussed involve rather leading-edge technology (e.g., interactive tabletops), the core lessons learnt and the proposed workflow have broader applicability to both existing and future learning technology.


Further work is needed to validate the application of the workflow in blended and online learning environments, and to explore cases where awareness tools are given to students or other stakeholders. Future work is also needed on aligning LA solutions with the design of the learning activities (including the configuration of tools, tasks, and pedagogical strategies) to support instructors more effectively. We believe that this work is an initial step towards much-needed research to provide methodologies for the design of LA awareness tools. We consider the area of design frameworks a crucial aspect that can contribute to the holistic view of LA, foster its widespread adoption, and improve the overall learning and teaching experience.

ACKNOWLEDGEMENTS

This research was supported by funding from the Faculty of Engineering & Information Technologies, The University of Sydney, under the Faculty Research Cluster Program.

REFERENCES

Arnold, K. E., & Pistilli, M. D. (2012). Course signals at Purdue: Using learning analytics to increase student success. Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK ʼ12), 267–270. http://dx.doi.org/10.1145/2330601.2330666

Bachour, K., Kaplan, F., & Dillenbourg, P. (2010). An interactive table for supporting participation balance in face-to-face collaborative learning. IEEE Transactions on Learning Technologies, 3(3), 203–213. http://dx.doi.org/10.1109/TLT.2010.18

Bakharia, A., & Dawson, S. (2015). Using sentence compression to develop visual analytics for student responses to short question answers. In E. Duval, K. Verbert, J. Klerkx, M. Wolpers, A. Pardo, S. Govaerts, … D. Parra (Eds.), Proceedings of the First International Workshop on Visual Aspects of Learning Analytics (VISLA ʼ15) (pp. 11–13). Poughkeepsie, NY, USA: CEUR-WS.org. Retrieved from http://ceur-ws.org/Vol-1518/paper2.pdf

Balacheff, N., & Lund, K. (2013). Multidisciplinarity vs. multivocality, the case of "learning analytics." Proceedings of the 3rd International Conference on Learning Analytics and Knowledge (LAK ʼ13), 5–13. http://dx.doi.org/10.1145/2460296.2460299

Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. In M. Brown, M. Hartnett, & T. Stewart (Eds.), Proceedings of the 29th Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education (ASCILITE 2012): Future Challenges, Sustainable Futures, 25–28 October 2012, Wellington, New Zealand (pp. 78–87).

Bevan, N. (2009). International standards for usability should be more widely used. Journal of Usability Studies, 4(3), 106–113.

Bienkowski, M., Feng, M., & Means, B. (2012). Enhancing teaching and learning through educational data mining and learning analytics: An issue brief. Washington, DC: US Department of Education, Office of Educational Technology. Retrieved from https://tech.ed.gov/wp-content/uploads/2014/03/edm-la-brief.pdf


Blikstein, P. (2013). Multimodal learning analytics. Proceedings of the 3rd International Conference on Learning Analytics and Knowledge (LAK ʼ13), 102–106. http://dx.doi.org/10.1145/2460296.2460316

Brown, B., Reeves, S., & Sherwood, S. (2011). Into the wild: Challenges and opportunities for field trial methods. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ʼ11), 1657–1666. http://dx.doi.org/10.1145/1978942.1979185

Bull, S., & Kay, J. (2007). Student models that invite the learner in: The SMILI open learner modelling framework. International Journal of Artificial Intelligence in Education, 17(2), 89–120.

Campbell, J. P., DeBlois, P. B., & Oblinger, D. G. (2007). Academic analytics: A new tool for a new era. Educause Review, 42(4), 40. Retrieved from http://er.educause.edu/articles/2007/7/academic-analytics-a-new-tool-for-a-new-era

Chaka, C. (2010). Collaborative learning: Leveraging concept mapping and cognitive flexibility theory. In P. L. Torres & R. d. C. V. Marriott (Eds.), Handbook of research on collaborative learning using concept mapping (pp. 152–170). Birmingham, UK: Idea Group.

Charleer, S., Klerkx, J., & Duval, E. (2015). Exploring inquiry-based learning analytics through interactive surfaces. In E. Duval, K. Verbert, J. Klerkx, M. Wolpers, A. Pardo, S. Govaerts, … D. Parra (Eds.), Proceedings of the First International Workshop on Visual Aspects of Learning Analytics (VISLA ʼ15) (pp. 32–35). Poughkeepsie, NY, USA: CEUR-WS.org. Retrieved from http://ceur-ws.org/Vol-1518/paper6.pdf

Chatti, M. A., Dyckhoff, A. L., Schroeder, U., & Thüs, H. (2012). A reference model for learning analytics. International Journal of Technology Enhanced Learning, 4(5), 318–331. http://dx.doi.org/10.1504/IJTEL.2012.051815

Clayphan, A., Kay, J., & Weinberger, A. (2013). ScriptStorm: Scripting to enhance tabletop brainstorming. Personal and Ubiquitous Computing, 18(6), 1433–1453. http://dx.doi.org/10.1007/s00779-013-0746-z

Clayphan, A., Martinez-Maldonado, R., Ackad, C., & Kay, J. (2013). An approach for designing and evaluating a plug-in vision-based tabletop touch identification system. Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration (OzCHI ʼ13), 467–476. http://dx.doi.org/10.1145/2541016.2541019

Dahlbäck, N., Jönsson, A., & Ahrenberg, L. (1993). Wizard of Oz studies: Why and how. Proceedings of the International Conference on Intelligent User Interfaces (IUI 1993), 193–200. http://dx.doi.org/10.1145/169891.169968

Dillenbourg, P., Zufferey, G., Alavi, H., Jermann, P., Do-Lenh, S., Bonnard, Q., Cuendet, S., & Kaplan, F. (2011). Classroom orchestration: The third circle of usability. In H. Spada, G. Stahl, N. Miyake, & N. Law (Eds.), Connecting computer-supported collaborative learning to policy and practice: CSCL 2011 conference proceedings (Vol. 1, pp. 510–517). Lulu: ISLS.

Duval, E. (2011). Attention please! Learning analytics for visualization and recommendation. Proceedings of the 1st International Conference on Learning Analytics and Knowledge (LAK ʼ11), 9–17. http://dx.doi.org/10.1145/2090116.2090118

Duval, E., Verbert, K., Klerkx, J., Wolpers, M., Pardo, A., Govaerts, S., Gillet, D., … Parra, D. (2015). Proceedings of the First International Workshop on Visual Aspects of Learning Analytics (VISLA ʼ15). Poughkeepsie, NY, USA: CEUR-WS.org. Retrieved from http://ceur-ws.org/Vol-1518/

Dyckhoff, A. L., Zielke, D., Bültmann, M., Chatti, M. A., & Schroeder, U. (2012). Design and implementation of a learning analytics toolkit for teachers. Educational Technology & Society, 15(3), 58–76.

Erickson, T., Smith, D., Kellogg, W., Laff, M., Richards, J., & Bradner, E. (1999). Socially translucent systems: Social proxies, persistent conversation, and the design of "babble." Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ʼ99), 72–79. http://dx.doi.org/10.1145/302979.302997

Ferguson, R. (2012a). Learning analytics: Drivers, developments and challenges. International Journal of Technology Enhanced Learning, 4(5/6), 304–317. http://dx.doi.org/10.1504/IJTEL.2012.051816

Ferguson, R. (2012b). The state of learning analytics in 2012: A review and future challenges (Technical Report KMI-12-01). Knowledge Media Institute, The Open University, UK. Retrieved from http://kmi.open.ac.uk/publications/techreport/kmi-12-01

Gluga, R., Kay, J., Lister, R., Simon, S., Charleston, M., Harland, J., & Teague, D. M. (2013). A conceptual model for reflecting on expected learning vs. demonstrated student performance. In A. Carbone & J. Whalley (Eds.), Proceedings of the 15th Australasian Computing Education Conference (ACE 2013) (pp. 77–86). Adelaide, Australia: Australian Computer Society.

Greller, W., & Drachsler, H. (2012). Translating learning into numbers: A generic framework for learning analytics. Educational Technology & Society, 15(3), 42–57.

Harris, A., Rick, J., Bonnett, V., Yuill, N., Fleck, R., Marshall, P., & Rogers, Y. (2009). Around the table: Are multiple-touch surfaces better than single-touch for children's collaborative interactions? In C. O'Malley, D. Suthers, P. Reimann, & A. Dimitracopolou (Eds.), Proceedings of the 9th International Conference on Computer Supported Collaborative Learning (CSCL 2009) (pp. 335–344). Rhodes, Greece: ISLS.

Hartson, R., & Pyla, P. S. (2012). The UX book: Process and guidelines for ensuring a quality user experience. Elsevier.

Hecking, T., & Hoppe, H. U. (2015). A network based approach for the visualization and analysis of collaboratively edited texts. Proceedings of the First International Workshop on Visual Aspects of Learning Analytics (VISLA ʼ15) (pp. 11–13). Poughkeepsie, NY, USA: CEUR-WS.org. Retrieved from http://ceur-ws.org/Vol-1518/paper4.pdf

Helms, J. W., Arthur, J. D., Hix, D., & Hartson, H. R. (2006). A field study of the wheel: A usability engineering process model. Journal of Systems and Software, 79(6), 841–858. http://dx.doi.org/10.1016/j.jss.2005.08.023

ISO (2009). Ergonomics of human–system interaction – Part 210: Human-centred design for interactive systems (ISO 9241-210:2010; formerly known as 13407). Geneva, Switzerland: International Organization for Standardization.

Kay, J., & Bull, S. (2015). New opportunities with open learner models and visual learning analytics. In C. Conati, N. Heffernan, A. Mitrovic, & F. Verdejo (Eds.), Artificial Intelligence in Education (Lecture Notes in Computer Science) (pp. 666–669). Berlin/Heidelberg: Springer. http://dx.doi.org/10.1007/978-3-319-19773-9_87

Knight, S., Buckingham Shum, S., & Littleton, K. (2013). Epistemology, pedagogy, assessment and learning analytics. Proceedings of the 3rd International Conference on Learning Analytics and Knowledge (LAK ʼ13), 75–84. http://dx.doi.org/10.1145/2460296.2460312

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10), 1439–1459. http://dx.doi.org/10.1177/0002764213479367

Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an "early warning system" for educators: A proof of concept. Computers & Education, 54(2), 588–599. http://dx.doi.org/10.1016/j.compedu.2009.09.008

Martinez-Maldonado, R. (2014). Analysing, visualising and supporting collaborative learning using interactive tabletops (Doctoral dissertation, The University of Sydney). Retrieved from http://sydney.edu.au/engineering/it/~judy/Homec/Pubs/2014_Roberto_Martinez_Thesis_FINAL.pdf

Martinez-Maldonado, R., Clayphan, A., Ackad, C., & Kay, J. (2014). Multi-touch technology in a higher-education classroom: Lessons in-the-wild. Proceedings of the 26th Australian Computer-Human Interaction Conference (OzCHI ʼ14), 189–192. http://dx.doi.org/10.1145/2686612.2686647

Martinez-Maldonado, R., Collins, A., Kay, J., & Yacef, K. (2011). Who did what? Who said that? Collaid: An environment for capturing traces of collaborative learning at the tabletop. Proceedings of the International Conference on Interactive Tabletops and Surfaces (ITS 2011), 172–181. http://dx.doi.org/10.1145/2076354.2076387

Martinez-Maldonado, R., Dimitriadis, Y., Kay, J., Yacef, K., & Edbauer, M.-T. (2013). MTClassroom and MTDashboard: Supporting analysis of teacher attention in an orchestrated multi-tabletop classroom. In N. Rummel, M. Kapur, M. Nathan, & S. Puntambekar (Eds.), Proceedings of the International Conference on Computer Supported Collaborative Learning (CSCL 2013) (Vol. 1, pp. 119–128). Madison, USA: ISLS.

Martinez-Maldonado, R., Kay, J., & Yacef, K. (2010). Collaborative concept mapping at the tabletop. Proceedings of the International Conference on Interactive Tabletops and Surfaces (ITS 2010), 207–210. http://dx.doi.org/10.1145/1936652.1936690

Martinez-Maldonado, R., Kay, J., & Yacef, K. (2011). Visualisations for longitudinal participation, contribution and progress of a collaborative task at the tabletop. In H. Spada, G. Stahl, N. Miyake, & N. Law (Eds.), Connecting computer-supported collaborative learning to policy and practice: CSCL 2011 conference proceedings (Vol. 1, pp. 25–32). Lulu: ISLS.

Martinez-Maldonado, R., Pardo, A., Mirriahi, N., Yacef, K., Kay, J., & Clayphan, A. (2015). The LATUX workflow: Designing and deploying awareness tools in technology-enabled learning settings. Proceedings of the 5th International Conference on Learning Analytics and Knowledge (LAK ʼ15), 1–10. http://dx.doi.org/10.1145/2723576.2723583


Martinez-Maldonado, R., Yacef, K., Kay, J., & Schwendimann, B. (2012). An interactive teacher's dashboard for monitoring multiple groups in a multi-tabletop learning environment. Proceedings of the International Conference on Intelligent Tutoring Systems (ITS 2012), 482–492. http://dx.doi.org/10.1007/978-3-642-30950-2_62

Mendiburo, M., Sulcer, B., & Hasselbring, T. (2014). Interaction design for improved analytics. Proceedings of the 4th International Conference on Learning Analytics and Knowledge (LAK ʼ14), 78–82. http://dx.doi.org/10.1145/2567574.2567628

Niemann, K., Rojas, S. L., Wolpers, M., Scheffel, M., Drachsler, H., & Specht, M. (2015). Getting a grasp on tag collections by visualising tag clusters based on higher-order co-occurrences. In E. Duval, K. Verbert, J. Klerkx, M. Wolpers, A. Pardo, S. Govaerts, … D. Parra (Eds.), Proceedings of the First International Workshop on Visual Aspects of Learning Analytics (VISLA ʼ15) (pp. 14–18). Poughkeepsie, NY, USA: CEUR-WS.org. Retrieved from http://ceur-ws.org/Vol-1518/paper3.pdf

Ochoa, X. (2015). Visualizing uncertainty in the prediction of academic risk. In E. Duval, K. Verbert, J. Klerkx, M. Wolpers, A. Pardo, S. Govaerts, … D. Parra (Eds.), Proceedings of the First International Workshop on Visual Aspects of Learning Analytics (VISLA ʼ15) (pp. 4–10). Poughkeepsie, NY, USA: CEUR-WS.org. Retrieved from http://ceur-ws.org/Vol-1518/paper1.pdf

Olmos, M., & Corrin, L. (2012). Academic analytics in a medical curriculum: Enabling educational excellence. Australasian Journal of Educational Technology, 28(1), 1–15.

Prieto, L. P., Dlab, M. H., Gutiérrez, I., Abdulwahed, M., & Balid, W. (2011). Orchestrating technology enhanced learning: A literature review and a conceptual framework. International Journal of Technology Enhanced Learning, 3(6), 583–598. http://dx.doi.org/10.1504/IJTEL.2011.045449

Ritsos, P. D., & Roberts, J. C. (2014). Towards more visual analytics in learning analytics. In M. Pohl & J. Roberts (Eds.), Proceedings of the EuroVis Workshop on Visual Analytics (EuroVA 2014). Swansea, Wales: The Eurographics Association. http://dx.doi.org/10.2312/eurova.20141147

Rodríguez Triana, M. J., Martínez Monés, A., Asensio Pérez, J. I., & Dimitriadis, Y. (2014). Scripting and monitoring meet each other: Aligning learning analytics and learning design to support teachers in orchestrating CSCL situations. British Journal of Educational Technology, 46(2), 330–343. http://dx.doi.org/10.1111/bjet.12198

Shneiderman, B., Borkowski, E., Alavi, M., & Norman, K. (1998). Emergent patterns of teaching/learning in electronic classrooms. Educational Technology Research and Development, 46(4), 23–42. http://dx.doi.org/10.1007/BF02299671

Siemens, G. (2012). Learning analytics: Envisioning a research discipline and a domain of practice. Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK ʼ12), 4–8. http://dx.doi.org/10.1145/2330601.2330605

Siemens, G., & Baker, R. S. J. d. (2012). Learning analytics and educational data mining: Towards communication and collaboration. Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK ʼ12), 252–254. http://dx.doi.org/10.1145/2330601.2330661

Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510–1529. http://dx.doi.org/10.1177/0002764213479366


Sommerville, I. (2011). Software engineering (9th ed.). USA: Pearson Education, Inc.

Thomas, J. J., & Cook, K. A. (2006). A visual analytics agenda. IEEE Computer Graphics and Applications, 26(1), 10–13. http://dx.doi.org/10.1109/MCG.2006.5

Tohidi, M., Buxton, W., Baecker, R., & Sellen, A. (2006). Getting the right design and the design right. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ʼ06), 1243–1252. http://dx.doi.org/10.1145/1124772.1124960

Tufte, E. R. (1983). The visual display of quantitative information. Cheshire, CT: Graphics Press.

Upton, K., & Kay, J. (2009). Narcissus: Group and individual models to support small group work. In G.-J. Houben, G. McCalla, F. Pianesi, & M. Zancanaro (Eds.), User modeling, adaptation, and personalization (pp. 54–65). Berlin/Heidelberg: Springer. http://dx.doi.org/10.1007/978-3-642-02247-0_8

Verbert, K., Govaerts, S., Duval, E., Santos, J. L., Van Assche, F., Parra, G., & Klerkx, J. (2013). Learning dashboards: An overview and future research opportunities. Personal and Ubiquitous Computing, 18(6), 1499–1514. http://dx.doi.org/10.1007/s00779-013-0751-2