Challenges in Collaborative Immersive Visualization

Wolfgang Büschel, Interactive Media Lab Dresden, Technische Universität Dresden, Dresden, Germany, [email protected]

Georg Eckert, Interactive Media Lab Dresden, Technische Universität Dresden, Dresden, Germany, [email protected]

Raimund Dachselt, Interactive Media Lab Dresden, Cluster of Excellence Physics of Life, Technische Universität Dresden, Dresden, Germany, [email protected]

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).

Copyright held by the owner/author(s).
4th Workshop on Immersive Analytics: Envisioning Future Productivity for Immersive Analytics, April 2020
ACM 978-1-4503-6819-3/20/04.
http://immersiveanalytics.io/

Figure 1: (Left) Example setup in CollARVis, our collaborative AR visualization toolkit. Our work on this toolkit helped us to gain insights into the challenges discussed in this paper. (Right) Two users exploring data collaboratively. The red marker over the person on the right shows his viewing direction and indicates to the other user that hologram sharing is working.

Abstract

In this paper, we discuss challenges in collaborative mixed reality visualization. Despite recent advances in immersive technologies in general and, specifically, Immersive Analytics, systems rarely leave prototype status and are seldom found in real-world usage. Based on our ongoing research in Mixed Reality data visualization and our experiences with our own toolkit for collaborative immersive visualization, CollARVis, we identify important challenges of such systems and examine possible approaches to address them. By doing so, we contribute to the ongoing discussion of how future immersive applications can be made more practical for productive environments.

Author Keywords

Information Visualization; Immersive Analytics; Augmented Reality; Collaboration

CCS Concepts

• Human-centered computing → Information visualization; Mixed / augmented reality;

Introduction & Background

The last few years saw significant advances in the still-young research field of Immersive Analytics (IA) [9]. Commercial AR and VR hardware is available, and Mixed Reality visualization frameworks such as IATK [6] and DXR [12] have been developed, in many cases lowering the barriers to entry into the field.

Figure 2: Some of the visualizations supported by CollARVis: histograms, parallel coordinate plots, and scatter plots.

Accordingly, a number of works have been published in IA and the surrounding areas, including both research on systems/environments (e.g., [5]) and on specific aspects such as visual link routing ([11]), interaction (e.g., [7]), or investigations of individual visualization types (e.g., graphs [3]). At the same time, however, adoption for real-world use cases in professional settings seems to be lagging behind. There are first examples of evaluations in more realistic use cases (e.g., [1, 10]), but clearly a variety of challenges still remain to make IA truly practical. Here, we specifically examine collaborative immersive visualization in Augmented Reality. We first describe our in-house AR visualization toolkit, CollARVis. Then, we discuss challenges that we identified during its design and while applying it to our research. We also point out how these challenges of collaborative AR visualization might be addressed in the future.

A Prototypic AR Visualization Toolkit

We built CollARVis, a prototypic AR visualization toolkit for the Microsoft HoloLens, using the Unity 3D engine. It is based on VinkedViews [8] and uses Microsoft's HoloToolkit¹. CollARVis supports loading arbitrary data tables from CSV files and visualizing the data in a set of different 2D and 3D visualizations that can be placed freely in the environment (see Figure 1). Currently, our toolkit supports scatter plots, histograms, parallel coordinate plots, and line charts, each in 2D and 3D (see Figure 2). We specifically designed CollARVis to be multi-user capable, with one device acting as the server that all other clients connect to. No additional server infrastructure is necessary, allowing for co-located collaborative exploration of data sets in different environments with minimal setup effort (see Figure 1).

¹Microsoft HoloToolkit: https://github.com/microsoft/MixedRealityToolkit-Unity/tree/htk_development

In a typical workflow, the first user takes the server role and all additional users then connect via a session browser. The users are presented with a workbench-like UI where they can load a dataset. After the data has been loaded, the workbench shows all variables in the dataset (see Figure 3). Users create new visualizations by picking a subset of variables and then choosing the desired type of visualization (see Figure 4). Only those visualizations that match the number and type of the selected variables are shown. All users can explore the data collaboratively; no distinction between user roles is made. Once placed, visualizations can be moved and rotated freely (see Figure 6). They can also be temporarily disabled or deleted completely. All changes to the visualizations are synchronized between users in real time.
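The matching step described above can be sketched as follows. This is an illustrative Python sketch, not the actual (Unity/C#) CollARVis code, and the concrete matching rules and type names are our assumptions:

```python
# Illustrative sketch of matching visualization types to a variable selection.
# The concrete rules and type names are assumptions, not the actual
# CollARVis implementation.

def matching_vis_types(selected_kinds):
    """selected_kinds: one entry per selected variable, 'numeric' or 'categorical'."""
    n = len(selected_kinds)
    numeric = selected_kinds.count("numeric")
    options = []
    if n == 1:
        options += ["2D histogram", "3D histogram"]
    if n == 2 and numeric == 2:
        options += ["2D scatter plot", "2D line chart"]
    if n == 3 and numeric == 3:
        options += ["3D scatter plot", "3D line chart"]
    if n >= 2:
        options += ["2D parallel coordinate plot", "3D parallel coordinate plot"]
    return options
```

Under these example rules, a selection of five variables would only offer the parallel coordinate plots, matching the situation shown in Figure 4.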

During the development and use of our toolkit, both in our research projects and in students' projects, we noticed different challenges for actual, practical use. In addition, we carried out more structured hands-on sessions with colleagues from the department to collect additional insights into these challenges. In the following, we discuss these insights and approaches to address them. For some of the challenges, we have initial measures in place in our prototype; others remain for future work.

Challenges

We roughly divide the challenges into three groups: technical challenges related to the software architecture and low-level features, workflow challenges related to different usability aspects, and the challenge of how to evaluate collaborative immersive visualization prototypes.


Technical Challenges

Even with the progress in hardware capabilities and tool support that we have seen in the recent past, technical challenges still remain.

Figure 3: Configuration menu in our toolkit. Users can freely select variables and then choose between all matching visualizations. Here, three variables are selected and highlighted in red, blue, and green.

Figure 4: After choosing five variables (highlighted) and selecting 2D parallel coordinate plot as the type, the new visualization is spawned.

Networking

On a technical level, fast and reliable networking between the devices is one of the fundamental challenges inherent to collaborative systems. In principle, a multitude of solutions exists. However, when specifically looking for network libraries that run both on the HoloLens and within Unity, there are surprisingly few options, most of which are low-level in nature. As such, adding multi-user support to an existing solution is especially challenging. Ideally, networking has to be considered already in the early design phase. In our experience, even then, new functionality will regularly lead to additions to the networking code.

Registration & Hologram sharing

Providing the users with a stable, unified coordinate system is a particular challenge. The HoloLens supports the concept of serializing world anchors, i.e., descriptors of local spatial properties, to allow multiple devices to share one coordinate system. However, in our experience these world anchors lack precision and also quickly become unstable when multiple users dynamically occlude parts of the environment or the environment itself changes. All of these cases regularly manifested during demonstrations, both in-house and at conferences. To position content relative to specific real-world objects, we also used optical marker tracking (typically using Vuforia). Orientation precision, in particular, was often unsatisfactory. Thus, a more practical solution includes an additional manual alignment step. In [4], we used an infrared marker tracking system to track both a HoloLens and a phone and to bring them into a shared coordinate system. However, this approach is not suitable outside of instrumented research labs.
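The core of marker-based registration can be sketched as follows, simplified to 2D poses (a rotation angle plus a translation); the concrete pose values are hypothetical. Two devices observe the same physical marker, and composing one observed pose with the inverse of the other yields the transform that maps one device's coordinates into the other's frame:

```python
# Illustrative sketch of marker-based registration, reduced to 2D for clarity.
# A pose is (angle_radians, tx, ty); the example values are hypothetical.
import math

def apply_pose(pose, point):
    """Rotate and translate a point by a pose."""
    ang, tx, ty = pose
    c, s = math.cos(ang), math.sin(ang)
    x, y = point
    return (tx + c * x - s * y, ty + s * x + c * y)

def compose_pose(a, b):
    """Pose that first applies b, then a."""
    bx, by = apply_pose(a, (b[1], b[2]))
    return (a[0] + b[0], bx, by)

def invert_pose(pose):
    """Inverse pose: undoes the rotation and translation."""
    ang, tx, ty = pose
    c, s = math.cos(ang), math.sin(ang)
    return (-ang, -(c * tx + s * ty), -(-s * tx + c * ty))

# the marker's pose as observed by each device in its own coordinate system
marker_in_a = (math.pi / 2, 1.0, 0.0)  # hypothetical values
marker_in_b = (0.0, 0.0, 2.0)

# transform mapping points from device B's frame into device A's frame
b_to_a = compose_pose(marker_in_a, invert_pose(marker_in_b))
```

The marker's position in B's frame, (0, 2), maps to its position in A's frame, (1, 0). In practice, a manual alignment step as described above can refine such an initial estimate.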

Framework extensibility & performance

To increase general flexibility or to address specific use cases, any practical framework will support some form of extensibility. On the other hand, performance optimizations are often necessary when developing for the HoloLens. A resulting problem, which we observed in current solutions, is that shaders are used to draw data objects, e.g., the glyphs in a scatter plot. While this can lead to significant performance advantages, it also means that any changes to, e.g., selection or highlighting become non-trivial, as shader code needs to be rewritten.

Workflow Challenges

In addition to technical obstacles, there are also workflow issues regarding collaboration in immersive visualization.

Interaction

Interaction for Immersive Analytics has been researched for quite some time, and rightly so, as natural interaction is one of the potential advantages compared to traditional setups [2]. Despite this, current mid-air gestural interfaces as used by the HoloLens (both the original and, although more sophisticated, its successor) are cumbersome and, for many users, quite hard to grasp. A practical solution needs to support both the quick setup of visualizations, including their general configuration, filtering, etc., and fast navigation & selection techniques. Using additional input devices may address this. To this end, we proposed combined spatial/touch input with mobile phones for panning and zooming in 3D data spaces [4]. We believe that relying on well-known, available devices for input would be beneficial for productive use. In addition, these devices could function as additional personal screens during group work. One particular use case is the configuration of visualizations. In light of the strenuous deployment process for the HoloLens, users need to be able to select, spawn, and reconfigure visualizations at runtime. In addition to the immersive workbench approach pointed out above, we also tested mobile phones as configuration interfaces in CollARVis. Offloading this part of the interface to a phone frees up space for the actual visualizations and reduces occlusion for the collaborators. It is also less cumbersome for this type of GUI-centric interaction than mid-air gestures.
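A phone-based pan-and-zoom mapping of this kind can be sketched as follows; the gesture-to-transform mapping and all parameter values are illustrative assumptions, not the design from [4]:

```python
# Illustrative sketch (hypothetical parameter values): mapping touch input on
# a phone to panning and zooming of a 3D data space.

def apply_pinch_zoom(scale, start_dist, current_dist, min_scale=0.1, max_scale=10.0):
    """Scale the data space by the ratio of current to initial finger distance."""
    factor = current_dist / start_dist
    return max(min_scale, min(max_scale, scale * factor))

def apply_drag_pan(offset, drag_dx, drag_dy, pixels_per_meter=1000.0):
    """Translate the data space in its horizontal plane by a one-finger drag."""
    ox, oy, oz = offset
    return (ox + drag_dx / pixels_per_meter, oy, oz + drag_dy / pixels_per_meter)
```

Clamping the scale and fixing the vertical offset during pans are example design choices to keep the data space from drifting out of comfortable reach.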

Figure 5: Visual links between data objects help to locate views and emphasize logical connections among them.

Figure 6: Translation widget on a visualization. Visualizations can be arranged freely in physical space. Using world anchors and persistent configurations, a setup can be restored in place at a later time.

Awareness

One advantage of an Augmented Reality system, compared to Virtual Reality, is better awareness of the surroundings and, for collaborative systems, of the other users. However, awareness is still limited: the eyes are covered by HMDs, spatially distributed content may lead to collaborators not facing each other at all times, and individual data views may erode the concept of a shared context. Thus, we believe that supporting awareness of the other users' actions is an important challenge. In our toolkit, we currently support this in two ways. We use visual links between selected data objects in different visualizations. This helps users by pointing out relations between the individual views and serves as a spatial cue to visualizations outside their field of view (see Figure 5). We also highlight users by drawing a marker over their head (see Figure 1). While this may seem trivial at first, it shows whether a user is connected to the system or merely wearing their HoloLens. It also serves as a real-time indicator of tracking/registration quality and could be extended to show labels or color-coded status information.
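A minimal sketch of such a spatial cue, deciding whether a linked visualization is outside a user's field of view: the reduction to a 2D floor plane, the yaw convention, and the 30-degree FOV value are our simplifying assumptions.

```python
# Illustrative sketch: decide whether a linked visualization lies outside a
# user's horizontal field of view and, if so, in which direction an on-screen
# cue should point. Positions are 2D floor-plane coordinates (x, z); yaw is in
# degrees, clockwise from the +z axis. The 30-degree FOV is an approximation.
import math

def off_screen_cue(head_pos, head_yaw, target_pos, fov_deg=30.0):
    dx = target_pos[0] - head_pos[0]
    dz = target_pos[1] - head_pos[1]
    bearing = math.degrees(math.atan2(dx, dz)) - head_yaw
    bearing = (bearing + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    if abs(bearing) <= fov_deg / 2.0:
        return None  # target is visible, no cue needed
    return "left" if bearing < 0 else "right"
```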

Persistence & Transitions

An aspect that is typically out of scope for research prototypes is enabling smooth transitions between collaborative work and phases of more traditional, individual work. This includes the persistence of workspace configurations needed to come back to the system later on. During the work with CollARVis, we found that the configuration of the workspace, i.e., loading data, spawning visualizations, etc., requires some effort, even if considerable thought is put into designing the UI. We address this by allowing users to save and load workspace configurations. In our current implementation, workspaces are always loaded in addition to the current scene, allowing users to combine pre-configured views at will.
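Such workspace persistence can be sketched as follows; the JSON schema is a hypothetical example, not the actual CollARVis format. The key point is that loading merges the saved views into the current scene instead of replacing it:

```python
# Illustrative sketch of workspace persistence (hypothetical schema): the
# loaded dataset and all placed visualizations are serialized as JSON, and
# loading adds the saved views to the current scene rather than replacing it.
import json

def save_workspace(path, dataset, visualizations):
    """Persist the loaded dataset and all placed visualizations."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump({"dataset": dataset, "visualizations": visualizations}, f)

def load_workspace(path, current_visualizations):
    """Load a saved workspace and merge its views into the current scene."""
    with open(path, encoding="utf-8") as f:
        workspace = json.load(f)
    return current_visualizations + workspace["visualizations"]
```

Storing each visualization's pose alongside its type and variables would, combined with world anchors, allow a setup to be restored in place as described for Figure 6.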

Evaluation Challenges

Finally, we see the evaluation of current and future systems as one of the principal challenges of collaborative immersive visualization. The evaluation of any multi-user system already poses challenges such as acquiring enough participants (especially domain experts), complex multi-device setups, the fusion of log files, etc. For Augmented Reality systems, this is exacerbated by the difficulty of visualizing and analyzing 3D spatial interaction in environments containing both physical and virtual objects. However, if we want to examine the use of physical space, awareness of the collaborators' actions & system state, or user roles, we need tools that support such use cases while preserving the often important relations between the physical environment and the virtual content. Toolkits such as MRAT [10] are a first step in this direction.
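The log-fusion step mentioned above can be sketched as follows; the per-device clock offsets and the log format are illustrative assumptions (in practice, offsets might be estimated via NTP or a shared synchronization event):

```python
# Illustrative sketch: fusing per-device interaction logs into one timeline.
# Each device's estimated clock offset is applied before merging the streams
# by corrected timestamp.
import heapq

def fuse_logs(logs, offsets):
    """logs: {device: [(timestamp, event), ...]}, each list sorted by timestamp;
    offsets: {device: clock offset in seconds}."""
    streams = [
        [(t + offsets[dev], dev, ev) for t, ev in entries]
        for dev, entries in logs.items()
    ]
    return list(heapq.merge(*streams))
```

Because each per-device stream stays sorted after applying a constant offset, a k-way merge yields a globally ordered timeline without re-sorting everything.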

Conclusion

In this paper, we examined typical challenges in the development of collaborative immersive visualizations in Augmented Reality. Building on the experiences with our AR visualization toolkit, CollARVis, we identified technological challenges, challenges for typical workflows, and the challenge of evaluating such systems. We also briefly discussed first steps to address some of these challenges. We hope that these insights can help to nurture the discussion on how we can bring Immersive Analytics into real-world use.


Acknowledgments

This work was funded by the German Research Foundation (DFG, Deutsche Forschungsgemeinschaft) under Germany's Excellence Strategy – EXC-2068 – 390729961 – Cluster of Excellence Physics of Life of TU Dresden, as part of TRR 248 (grant 389792660), and DFG grant CollabWall (DA 1319/11-1).

REFERENCES

[1] Andrea Batch, Andrew Cunningham, Maxime Cordeil, Niklas Elmqvist, Tim Dwyer, Bruce H. Thomas, and Kim Marriott. 2020. There Is No Spoon: Evaluating Performance, Space Use, and Presence with Expert Domain Users in Immersive Analytics. IEEE Transactions on Visualization and Computer Graphics 26, 1 (Jan 2020), 536–546. DOI: http://dx.doi.org/10.1109/TVCG.2019.2934803

[2] Wolfgang Büschel, Jian Chen, Raimund Dachselt, Steven Drucker, Tim Dwyer, Carsten Görg, Tobias Isenberg, Andreas Kerren, Chris North, and Wolfgang Stuerzlinger. 2018. Interaction for Immersive Analytics. Springer International Publishing, Cham, 95–138. DOI: http://dx.doi.org/10.1007/978-3-030-01388-2_4

[3] Wolfgang Büschel, Stefan Vogt, and Raimund Dachselt. 2019. Augmented Reality Graph Visualizations: Investigation of Visual Styles in 3D Node-Link Diagrams. IEEE Computer Graphics and Applications 39, 3 (2019), 29–40. DOI: http://dx.doi.org/10.1109/MCG.2019.2897927

[4] Wolfgang Büschel, Annett Mitschick, Thomas Meyer, and Raimund Dachselt. 2019. Investigating Smartphone-Based Pan and Zoom in 3D Data Spaces in Augmented Reality. In Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '19). Association for Computing Machinery, New York, NY, USA, Article 2, 13 pages. DOI: http://dx.doi.org/10.1145/3338286.3340113

[5] Marco Cavallo, Mishal Dolakia, Matous Havlena, Kenneth Ocheltree, and Mark Podlaseck. 2019. Immersive Insights: A Hybrid Analytics System for Collaborative Exploratory Data Analysis. In 25th ACM Symposium on Virtual Reality Software and Technology (VRST '19). Association for Computing Machinery, New York, NY, USA, Article 9, 12 pages. DOI: http://dx.doi.org/10.1145/3359996.3364242

[6] Maxime Cordeil, Andrew Cunningham, Benjamin Bach, Christophe Hurter, Bruce H. Thomas, Kim Marriott, and Tim Dwyer. 2019. IATK: An Immersive Analytics Toolkit. In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). 200–209. DOI: http://dx.doi.org/10.1109/VR.2019.8797978

[7] Adam Drogemuller, Andrew Cunningham, James Walsh, Bruce H. Thomas, Maxime Cordeil, and William Ross. 2020. Examining Virtual Reality Navigation Techniques for 3D Network Visualisations. Journal of Computer Languages 56 (2020), 100937. DOI: https://doi.org/10.1016/j.cola.2019.100937

[8] Georg Eckert. 2019. Vinked Views: An Augmented Reality App for Immersive Analytics. limbusdev, Dresden, Germany. https://github.com/limbusdev/VinkedViewsMR. Based on commit 26e4a05.


[9] Kim Marriott, Falk Schreiber, Tim Dwyer, Karsten Klein, Nathalie Henry Riche, Takayuki Itoh, Wolfgang Stuerzlinger, and Bruce H. Thomas. 2018. Immersive Analytics. Vol. 11190. Springer. DOI: http://dx.doi.org/10.1007/978-3-030-01388-2

[10] Michael Nebeling, Maximilian Speicher, Xizi Wang, Shwetha Rajaram, Brian D. Hall, Zijian Xie, Alexander Raistrick, Michelle Aebersold, Edward G. Happ, Jiayin Wang, Yanan Sun, Lotus Zhang, Leah E. Ramsier, and Rhea Kulkarni. 2020. MRAT: The Mixed Reality Analytics Toolkit (to appear at CHI 2020).

[11] Arnaud Prouzeau, Antoine Lhuillier, Barrett Ens, Daniel Weiskopf, and Tim Dwyer. 2019. Visual Link Routing in Immersive Visualisation. In International Conference on Interactive Surfaces and Spaces 2019. Daejeon, South Korea. DOI: http://dx.doi.org/10.1145/3343055.3359709

[12] Ronell Sicat, Jiabao Li, Junyoung Choi, Maxime Cordeil, Won-Ki Jeong, Benjamin Bach, and Hanspeter Pfister. 2019. DXR: A Toolkit for Building Immersive Data Visualizations. IEEE Transactions on Visualization and Computer Graphics 25, 1 (Jan 2019), 715–725. DOI: http://dx.doi.org/10.1109/TVCG.2018.2865152