Human-Centered Design for Immersive Interactions Jason Jerald, PhD





Human-Centered Design for Immersive Interactions
Jason Jerald, PhD

1

60+ VR Projects, 30+ Organizations

2

What makes me qualified to give a tutorial on VR interactions? Twenty years of VR experience on over 60 projects with over 30 organizations, ranging from academics to startups to Fortune 500 companies. What I've learned is that every VR project is different. There are few absolute truths when it comes to VR, but theory, processes, and guidelines can be useful.


TheVRBook.net
Win a free book! @TheVRBook #VRWTB

If you want to learn more about human-centered design for VR, check out my book at TheVRBook.net. Twenty years of research summarized into 600 pages. Not necessarily a leisurely Sunday afternoon read, but dramatically simplified: no equations, no code. The focus is on design from a human-centered approach. Topics include human perception, understanding and reducing sickness, environmental design, interaction patterns, and iteration concepts. There are over 600 guidelines for designing VR applications, hundreds of definitions, and over 300 references pointing to more detail. Go to TheVRBook.net to learn more about applying VR design concepts to your application. If you are teaching a VR course, talk with me about getting a free evaluation copy.

3

VR is not New!
1830s to 1990s

Images Courtesy of The VR Book (with permissions from original sources)

I'd like to point out that VR is not new! Depending on how you define it, you could go back to prehistoric times of cavemen drawing on walls. Here are some systems ranging from the 1830s to the 1990s. For example: the Wheatstone stereoscope; the Brewster stereoscope from the 1860s; flight simulators, which were really what got VR going in the 1900s; the Sensorama in the 1960s, which included a stereoscopic display, wide field of view, color, stereo sound, wind, smell, and a motion/vibration platform; and Tom Furness and Ivan Sutherland creating head-tracked, computer-rendered head-mounted displays in the 1960s.

4

The Most Exciting VR Tech?

There has been quite a bit of VR over the years. However, for me there is one technology that is most exciting. Many people are only vaguely aware of it. It is absolutely essential; without it, VR is not possible. And it has been available in various forms for thousands of years.

5

The Most Exciting VR Tech

The Human Brain!
~100 billion neurons
Thousands of synapses per neuron
~1 billion synapses per mm^3
150,000 km of nerve fibers
Six degrees of separation

The human brain! A massive number of neurons, the basic unit of the nervous system. However, the connections between those neurons are what is most impressive. On average, there are thousands of connections (synapses) from a single neuron to other neurons. There are a billion of these synapses in a cubic millimeter of brain matter. 150,000 km of nerve fibers is enough to circle the earth multiple times! A neuron can reach any other neuron, on average, by traveling through only six other neurons. This results in a massively parallel machine. Today's computers cannot compare; some comparisons are made, but that is comparing apples to oranges. The result of this massively parallel machine called a brain, integrated with the body, is a perception-and-action entity that enables us to interact with the world.

6

Perception: The Human Senses
Sight
Hearing
Balance and physical motion
Touch
Proprioception
Smell and taste
Multimodal
Sensory substitutions

Our interactions with the real and virtual worlds start with perception. VR is much more than just graphics, especially when it comes to interacting. Most VR creators focus primarily on visuals. Some do audio, but mostly as an afterthought. Lack of physical touch is one of the greatest challenges of VR. Proprioception is extremely important for VR: the sense of where your body parts are located (e.g., close your eyes and touch your nose). We normally don't consciously think about proprioception, but when it's missing, it's a problem. Smell and taste are important for some applications but are difficult to do well with VR (although research and proofs of concept have shown this to be doable). Bringing the senses together in a coherent manner is important to induce a sense of presence in users and for interactions to be intuitive. Spatial and temporal compliance are most important (e.g., proprioception matches visuals). Multimodal input/output can be challenging but effective.

7

What We Perceive is not Reality

This perception of the real (or virtual) world is not reality or the truth. Our perceptions are an interpretation of reality. We do not perceive objects directly; objects aren't poking into our retinas. We instead perceive the photons bouncing off those objects into our eyes. For VR, we hijack the human senses and replace real-world cues with artificially created cues that the mind interprets as a form of reality. Our sense of reality can be quite distorted, and it differs between people and circumstances. A song can be quite emotional and carry significant meaning for one person while not for another.

8

The Iterative Perceptual Process
With VR, we hijack the human perception-action system
Human Output = VR Input/Tracking
VR Output = Human Input
Image Courtesy of The VR Book (adapted from Goldstein [2007])

Here is a version of Goldstein's iterative perception-action process. This is a model of how distal stimuli out in the world reach our receptors and how we process them. We can replace the real-world distal stimuli with computer-generated stimuli and sense the user's actions in order to update those stimuli. Human output becomes VR input; VR output becomes human input.

9

Factors of Adverse Health Effects

System:
Latency
Calibration
Tracking accuracy & precision
Field of view
Refresh rate
Judder & flicker
Display response & persistence
Vergence/accommodation
Stereoscopic cues
Real-world peripheral vision
Fit/weight/center of mass
Motion platforms
Hygiene
Temperature
Dirty screens

Individual:
History of motion sickness
VR experience
Health
Thinking about sickness
Belief
Gender
Age
Mental model/expectations
Interpupillary distance
Not knowing
Sense of balance
Flicker fusion frequency threshold
Real-world task experience
Migraine history

Application:
Frame rate
Locus of control
Visual acceleration
Physical head motion
Duration
Vection
Binocular-occlusion conflict
Virtual rotation
Gorilla arm
Standing vs. sitting
Rest frames
Binocular disparity
VR entrance & exit
Luminance
Repetitive strain

Many VR experts argue about what causes VR sickness as if there were one single factor. The factors listed here are the ones we know about; many more have been hypothesized. The leading hardware manufacturers are solving many of the system challenges; e.g., latency is not much of a problem anymore for the best HMD systems. But we should still be aware of these factors to select the best hardware and to recognize problems when they occur. Unfortunately, we don't have much control over individual factors, but we should understand our target audience. We can also influence their mental model. The factors that we have the most control over are application factors. These are especially important for navigation schemes, but bad non-navigation interfaces can cause problems as well. For example, put the interface in a comfortable position in the torso reference frame, near the waist. We will talk about some of these throughout the talk.

10

Motion Sickness: The Basics
The biggest problem is motion sickness
Varies greatly from person to person
As users:
Stop at the first sign of sickness
Be well-hydrated and not hungry, not hung over
Watch out for colds and sinus congestion
As creators:
Maintain the frame rate of the display hardware 100% of the time
Make interactions comfortable
Initial signs of discomfort may very well lead to sickness
Minimize accelerations (e.g., roller coaster motions) and virtual rotations
Remind your users not to tough it out
Wipe down equipment periodically


11

VR Hardware Beyond Displays

There is much more to VR beyond displays. Hardware is varied and depends upon project goals. VR hand input is also varied and depends upon project goals; e.g., location-based entertainment can be very different from games in the home.

12

An Example VR Device

VR Apocalypse by & Virtuix

Here is an example of some work we did with Virtuix, as shown on ABC's Shark Tank, that uses an omnidirectional treadmill to navigate through the virtual world. It is not clear if and how much this treadmill technique reduces motion sickness; it is a great area of research that anyone in this audience could study. More interaction patterns and techniques will be presented later in the course.

13

Input Device Classes

No single input device class has all qualities or is the ideal solution for all applications. The most appropriate device depends on the goals and the application/experience. Hybrid and multimodal interaction can be ideal (if designed and integrated well), but it is more expensive and difficult to implement. Currently, tracked hand-held controllers are the most appropriate for most, but not all, VR experiences.

14

The Most Important Input Device??

I've seen some crazy VR input devices. However, there is one type of input that is especially important for VR. Clue: it was shown on previous slides, multiple times. It turns out this input device is important for the real world as well.

15

The Most Important Input Device!
The Human Hands!
A large proportion of the sensory and motor cortex is devoted to the hands
Hand-tracking technology is simply the mediator that brings the hands into VR

16

The human hands! A relatively large part of the human brain is dedicated to input/output of the hands. Hardware simply enables us to bring the hands into the experience and interact with it.


Interaction Patterns
Selection: Hand Selection, Pointing, Image-Plane, Volume-Based
Viewpoint Control: Walking, Steering, 3D Multi-Touch, Automated
Manipulation: Direct Hand, Proxy, 3D Tool
Indirect Control: Widgets & Panels, Non-Spatial
Compound: Pointing Hand, World-in-Miniature, Multimodal

Now let's move on to some common interaction themes that occur across VR applications. I looked at roughly 100 interaction techniques and categorized them into interaction patterns. These are very different from software design patterns; these patterns are from the user's point of view. I organize the patterns into five overarching categories. These 16 patterns can be further broken down into more specific interaction techniques.

17

The First Hand Technique

Trainexus by

The first thing people do when hand tracking becomes available is direct selection and manipulation. Here is what I built back in 2013, just after the Oculus Kickstarter DK1 shipment. Quite a bit can be done just by tracking three points: the head and the two hands. Here, I'm threatening Paul Mlyniec, the President of Digital ArtForms. Since his hands are tracked, he doesn't have to think of where the surrender button is; he just naturally raises his hands to non-verbally say "don't shoot!" This is very similar to how we interact in the real world. Other parts of the body can be moved with inverse kinematics and animations. For example, the feet move with a walking animation when pressing forward with the joystick on a hand-held controller. Inverse kinematics can be extremely compelling if done well and where appropriate.

18

Hand Selection/Manipulation Patterns
Hands with Arms

Cure Fred by

An example of some work we did with Digital ArtForms, Sixense, and the Wake Forest School of Medicine to teach kids about how their brains work. Learning by doing: simply reach out to grab the brain parts. Intersect the hand with the object and then push a button. You can do so much more than you can with a real-world puzzle, e.g., integrating with digital media, smarter interactions, special effects. (Play movie.)

19
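The grab interaction described above (intersect the hand with the object, then push a button) can be sketched in a few lines. This is a minimal illustration, not code from the project; the function name `try_grab` and the bounding-sphere approximation for objects are my own assumptions.

```python
def try_grab(hand_pos, hand_radius, objects, button_pressed):
    """Direct hand selection: the hand must overlap an object's bounding
    sphere AND the grab button must be pressed.

    hand_pos: (x, y, z) of the tracked hand.
    objects: list of (name, center, radius) bounding spheres.
    Returns the name of the grabbed object, or None.
    """
    if not button_pressed:
        return None
    for name, center, radius in objects:
        # Euclidean distance between hand and object centers
        dist = sum((h - c) ** 2 for h, c in zip(hand_pos, center)) ** 0.5
        if dist <= hand_radius + radius:  # spheres overlap -> grab
            return name
    return None
```

In a real engine the overlap test would use the physics system's colliders, but the logic (proximity test gated by a button press) is the same.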

Audio Feedback

Cure Fred by

The Pointing Pattern

Zombie Apocalypse by

Pointing is normally considered to be non-realistic (unless simulating a laser pointer). It enables selection at a distance. Here we have an exocentric view of the world from an egocentric perspective. Pointing enables the user to shoot electricity from his hands to protect a friend in a smaller world (work with Digital ArtForms).

21
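Pointing-based selection at a distance typically reduces to a ray cast from the hand along its forward direction. The following is a minimal sketch with ray/sphere hit tests; the function name and sphere-shaped targets are hypothetical, not taken from the application shown.

```python
def ray_select(origin, direction, targets):
    """Return the name of the nearest target sphere hit by a pointing ray,
    or None if nothing is hit.

    origin: (x, y, z) of the hand; direction: pointing vector (any length).
    targets: list of (name, center, radius).
    """
    mag = sum(c * c for c in direction) ** 0.5
    d = tuple(c / mag for c in direction)  # normalized ray direction
    best_name, best_t = None, float("inf")
    for name, center, radius in targets:
        oc = tuple(cc - o for cc, o in zip(center, origin))
        t = sum(a * b for a, b in zip(oc, d))  # closest approach along ray
        if t < 0:
            continue  # target is behind the hand
        d2 = sum(c * c for c in oc) - t * t  # squared ray-to-center distance
        if d2 <= radius * radius and t < best_t:
            best_name, best_t = name, t
    return best_name
```

Engines normally provide this as a physics raycast; the sketch just makes the geometry explicit.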

Interface in the Hands

Trainexus by

Why do VR developers think user interface elements should go in the head or world reference frame? A simple solution: put UI signifiers/labels in the hands. Simply look at your hands when performing an action until you get good at the task, similar to real reality. The signifiers are simple transparent textures with labels attached to a Razer Hydra. A 3D model matching what the user physically feels would be better. There is no finger tracking, but it is close enough for users to intuitively know what button does what by touch. Enable users to turn the visuals on/off; this is important for both novice and expert users.

22

Realistic vs. Nonrealistic Hands
Hands without Arms

Trainexus by

Zombie Apocalypse by

VR experts argue about whether arms or realistic-looking hands are better. Like many other great VR debates, the answer is: it depends! Arms are most appropriate when realism is important and when it can be assumed the user will always be facing in one direction or the torso is tracked. Arms are not appropriate when reaching farther than the physical length of your arms is important. For example: reaching to pick up items off the floor from a seated position, or reaching out farther into the distance than you could if you were physically constrained by how far your arms can reach. The Go-Go technique is a hand selection technique with a 1:1 mapping within personal space, up to 2/3 of your physical arm length. Beyond 2/3 of your arm length, your reach expands nonlinearly, enabling selection at a distance. Hand selection/manipulation can be quite compelling even when the hands don't look like hands (i.e., 3D cursors). With appropriate spatial and temporal compliance, 3D cursors feel like they are your hands, even though they don't look at all like your hands.

23
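The Go-Go mapping just described can be sketched as follows. One hedge: the original Poupyrev et al. formulation adds a quadratic term beyond the 2/3-arm-length threshold, and that quadratic version is shown here; the gain constant k is an arbitrary example value, not a prescribed one.

```python
def gogo_reach(r_real, arm_length, k=10.0):
    """Map the physical hand distance from the chest to a virtual hand
    distance, per the Go-Go technique.

    Within 2/3 of arm length the mapping is 1:1; beyond that threshold
    the virtual reach grows with a quadratic term (Poupyrev et al.'s
    original formulation), enabling selection at a distance.
    All distances in meters; k is a tunable gain.
    """
    threshold = (2.0 / 3.0) * arm_length
    if r_real <= threshold:
        return r_real  # personal space: one-to-one mapping
    return r_real + k * (r_real - threshold) ** 2  # amplified reach
```

For a 0.6 m arm, the threshold is 0.4 m: a hand at 0.3 m maps to 0.3 m, while a hand at 0.5 m already maps to 0.6 m with k = 10.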

The Image-Plane Pattern
The Head-Crusher Technique
From Pierce et al. [1997]

The image-plane pattern is rarely used, but it can work quite well for selecting objects at a distance. Shown here is the Head-Crusher technique. Think of the world as a 2D plane in front of the user. Hold the hand up and place two fingers of the dominant hand around the object of interest, then push a button on the non-dominant hand or verbally say "select." It requires closing one eye and can result in gorilla arm if used often.

24
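A rough way to implement the Head-Crusher technique is to cast a ray from the dominant eye through the midpoint of the two framing fingers and pick the object whose center lies nearest that ray. This is a simplified sketch under that assumption (names are illustrative); real implementations may instead project candidates onto the 2D image plane and test against the finger silhouette.

```python
def head_crusher_pick(eye, finger_a, finger_b, objects):
    """Image-plane selection: ray from the dominant eye through the
    midpoint of the two framing fingers; return the name of the object
    whose center is closest to that ray.

    eye, finger_a, finger_b: (x, y, z) points.
    objects: list of (name, center) pairs.
    """
    mid = tuple((a + b) / 2.0 for a, b in zip(finger_a, finger_b))
    d = tuple(m - e for m, e in zip(mid, eye))
    length = sum(c * c for c in d) ** 0.5
    d = tuple(c / length for c in d)  # normalized ray direction

    def dist_to_ray(p):
        v = tuple(pc - ec for pc, ec in zip(p, eye))
        t = sum(vc * dc for vc, dc in zip(v, d))  # projection onto ray
        closest = tuple(ec + t * dc for ec, dc in zip(eye, d))
        return sum((pc - cc) ** 2 for pc, cc in zip(p, closest)) ** 0.5

    return min(objects, key=lambda obj: dist_to_ray(obj[1]))[0]
```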

Widgets & Panels Pattern

2D Desktop Integration Technique
From Taylor et al. [2010]

Widgets and panels often take desktop applications or metaphors and bring them into the virtual world. This is often not ideal, but it is easy to do. Hand pointing via ray selection is the most common and most usable. Here I'm bringing arbitrary desktop applications into a CAVE VR system. If you want to do a quick mathematical calculation, you don't want to have to exit the virtual world to bring up a calculator app, and it's not worth writing an immersive 3D calculator app. Instead of exiting the virtual world, bring the existing application into VR. There are many other forms of immersive widgets and panels, for example hand-held panels, panels above the head, finger menus, marking menus, etc.

25

Hand-held Panels
Buttons
Sliders
Dials
Color cubes

Images Courtesy of Sixense & Digital ArtForms

Putting a virtual panel on the hand works surprisingly well. It provides double dexterity, where the non-dominant hand provides a reference frame to work in. The interface is always available; turn it on/off with the click of a physical button. Examples here include traditional GUI elements (buttons, sliders, drop-down menus) as well as more interesting GUI elements (dials, marking menus, a color cube).

26

3D Multitouch Pattern
Appropriate for abstract non-realistic interactions
Content/Data independent
Emergency preparedness
Medical visualization
MakeVR
Solves gorilla arm
Reduces sim sickness

This is similar to 2D multitouch. Think of manipulating the world as an object: grab with both hands to translate, orient, and scale the world. Thinking about manipulating the world as an object results in the user thinking of the world moving instead of moving through the world, resulting in reduced motion sickness. It is appropriate for abstract, non-realistic interactions. It is data independent; we've used it across a variety of applications, including volumetric data. No polygons are required because no intersection computations are required. It solves gorilla arm: a focus study found users could go four hours with no report of arm fatigue. To translate, simply grab the world with one or both hands. To scale/zoom, pinch with two hands instead of two fingers. To rotate, grab the world with both hands and orbit about the midpoint between the hands as if it were a globe. Translation, scale, and rotation can all be done simultaneously with a single gesture. The interaction is direct and immediate; it starts as soon as you push the button, with no need to wait for the system to recognize the motion. The result is the feeling that you are crawling through and stretching space.

27
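The core of the two-handed world-grab can be sketched as deriving a transform from the hand positions at grab time and now: uniform scale from the change in hand separation, and translation from the change in the midpoint. Rotation about the midpoint is omitted here for brevity, and all names are illustrative rather than from any shipped implementation.

```python
def two_hand_transform(l0, r0, l1, r1):
    """Derive the world transform implied by a two-handed grab.

    l0, r0: left/right hand positions when the grab started.
    l1, r1: current left/right hand positions (all (x, y, z) tuples).
    Returns (scale, translation): uniform scale from the change in hand
    separation (the two-handed "pinch"), and translation of the midpoint.
    """
    def mid(a, b):
        return tuple((x + y) / 2.0 for x, y in zip(a, b))

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    scale = dist(l1, r1) / dist(l0, r0)  # hands apart -> zoom in
    m0, m1 = mid(l0, r0), mid(l1, r1)
    translation = tuple(b - a for a, b in zip(m0, m1))
    return scale, translation
```

A full version would also extract a rotation from the change in the hand-to-hand direction and apply the whole transform about the midpoint, so that grabbed points stay pinned under the hands.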

3D Multitouch

Zombie Apocalypse by

3D Multitouch Medical Visualization

iMedic by

An example of 3D multitouch shown at SIGGRAPH 2011. It is non-HMD here, but 3D multitouch actually works better with HMDs, as that is what it was originally designed for. (Play movie.)

29

3D Multitouch

MakeVR by &

The same interface is used with MakeVR (a new company should be formed by VRDC 2016), an immersive CAD modeling program. It makes 3D content creation and world building intuitive and accessible to anyone. It currently works with the Sixense STEM, Razer Hydras, and SpaceGrips. Support is coming for Oculus Touch controllers and HTC Vive controllers.

30

Computer-Aided Design

MakeVR by &

For those not familiar with MakeVR, here is an example video of what is possible.

31

What is Common to All VR Dev?
Iteration!

What is common to all VR development? Iteration is absolutely essential, more so than for creating any other product. If you get it wrong here, you can make users sick or injure them. You will get it wrong on first attempts; what you think will work great often does not.

32

Expert Evaluations

A big part of what we are working on at NextGen Interactions is creating processes for providing better feedback to VR creators for their specific projects. That starts with task analysis to really understand what is trying to be accomplished. That leads to expert guidelines-based evaluations, which are the first steps to making sure the most important items are being done well. That leads to formative usability analysis, using techniques ranging from think-aloud protocols, where the user explains out loud what he is doing and thinking, to observational analysis of breaks in presence. Then those who want more formalized results can conduct formal scientific studies to compare different techniques or conditions.

33

Summary
Widespread access to tracked hand-held controllers will take VR to the next level
But only for applications that are designed well
There are no universal answers
Selecting devices and interaction patterns is project dependent
Understand tradeoffs and where each is appropriate
Iterate. Iterate. Iterate.
Have fun!

Widespread access to tracked hand-held controllers not only provides a large user base, it also provides developers with quality input devices that will lead to innovative interfaces nobody has yet considered. Just because anything is possible doesn't mean everything will work. Viewpoint control especially requires caution due to the risk of sickness and injury. Iteration is absolutely essential for VR. Most importantly, have fun! Otherwise, what is the point of reality, whether virtual reality or real reality?

34

@TheVRBook

15% Discount Code: TheBest16
TheVRBook.net


35

Questions?


36

References
Jerald, J. (2015). The VR Book: Human-Centered Design for Virtual Reality. Morgan & Claypool Publishers and ACM Books.
Gabbard, J. (2014). Usability Engineering of Virtual Environments. In K. S. Hale and K. M. Stanney (Eds.), Handbook of Virtual Environments (2nd ed., pp. 721-747). CRC Press.
Norman, D. (2013). The Design of Everyday Things, Expanded and Revised Edition. Basic Books.
Pierce, J., Forsberg, A., Conway, M., Hong, S., Zeleznik, R., and Mine, M. R. (1997). Image Plane Interaction Techniques in 3D Immersive Environments. In ACM Symposium on Interactive 3D Graphics (pp. 39-44). ACM Press.
Taylor, R., Jerald, J., VanderKnyff, C., Wendt, J., Borland, D., Marshburn, D., Sherman, W., and Whitton, M. (2010). Lessons about Virtual Environment Software Systems from 20 Years of VE Building. Presence: Teleoperators and Virtual Environments, 19(2), 162-178.

Additional Slides

A New Artistic Medium
The screen disappears
The best VR applications are designed from the beginning for VR
Today the technology is being sold
Tomorrow the story and emotion will be sold


39

Adverse Health Effects
Challenges:
Motion sickness
Eye strain
Flicker
Accommodation-vergence conflict
Occlusion-binocular conflict
Aftereffects
Seizures
Physical fatigue
Headset fit
Injury
Hygiene

There are multiple health challenges of VR. For now, we will focus on the worst offender: motion sickness.

40

Reference Frames
Head
Eyes
Torso
Virtual World
Real World
Hands

Image Courtesy of NextGen Interactions

Thinking about and understanding reference frames is important for human-centered interaction design. Many of you computer graphics experts are intimately familiar with reference frames: screen space, object space, world space. Reference frames define the axes that determine translation along x, y, and z as well as rotation in yaw, pitch, and roll, and they define how objects move relative to other objects (e.g., attached to the head, hand, or world). The image here looks complex in 2D but becomes intuitively obvious when immersed. The head reference frame is similar to screen coordinates in 2D development. The blue arrows in the torso reference frame represent the forward direction. It is important for usability and comfort to place your interfaces in the appropriate reference frames. We will discuss these reference frames and how they are best used in more detail. In some cases reference frames are equivalent to each other; e.g., for a fully walkable system (e.g., HTC Vive), the virtual-world and real-world reference frames can be made equivalent.

41

Example Pattern
Walking Pattern: a form of viewpoint control
Specific walking techniques:
real-world walking
redirected walking
walking in place
treadmill walking

Viewpoint control patterns are especially important due to the challenges of motion sickness. An example of a viewpoint control pattern is the walking pattern. There are various ways to walk within VR. Example walking techniques are: real-world walking, such as with the HTC Vive; redirected walking, such as with The Void; walking in place, such as with Stompz; and treadmill walking, such as with the Virtuix Omni.

42

World-in-Miniature Pattern

iMedic by

A WIM is basically a real-time map of the world a user is in. You can hold it in the hand, or have it attached in the world or to your body (torso reference frame). You can reach into the map and move things, which then moves the corresponding objects in the larger world.

43
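The WIM behavior (moving an object in the miniature moves it in the full-scale world) amounts to a change of coordinates between the two frames. A minimal sketch follows, ignoring any rotation between the frames and assuming a uniform map scale; all names are illustrative.

```python
def wim_to_world(p_wim, wim_origin, world_origin, wim_scale):
    """Map a point manipulated inside the world-in-miniature back to
    full-scale world coordinates.

    p_wim: the point's position in the miniature.
    wim_origin: where the miniature's origin sits (same space as p_wim).
    world_origin: the corresponding origin in the full-scale world.
    wim_scale: miniature units per world unit (e.g. 0.01 for a 1:100 map).
    """
    return tuple(wo + (pw - mo) / wim_scale
                 for pw, mo, wo in zip(p_wim, wim_origin, world_origin))
```

When the user drops an object in the miniature, applying this mapping to its position (and the inverse mapping for the other direction) keeps the map and the surrounding world in sync.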

3D Multitouch Medical Visualization

iMedic by

Take on the role of a robot. Crawl through a body with a friend. The green user has carved out the midsection of a body to take a closer look inside; the blue user follows him. Medical teleconsultation: two doctors at remote locations could examine a dataset and point things out that would be difficult to do in a 2D setting.

44

3D Tools Pattern
The Jigs Technique

MakeVR by &

Precision is normally extremely difficult in VR. There are no physical constraints as there are with a mouse on a desk. So instead we add artificial constraints, what we call jigs, shown here as used with MakeVR. Think of carpenter tools that provide the user more precise interaction. E.g., grids can be attached to polygon faces, and then the grid spacing can be adjusted. Jigs are constraint tools, similar to the tools real-world carpenters use to build things. They can constrain objects to a plane or a model's surface, snap objects to grid points that can be spaced as needed, or snap an object to another object.

45
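The grid jigs described above reduce to snapping a manipulated point to the nearest node of a grid with user-adjustable spacing. A minimal sketch follows; the names are illustrative and not MakeVR's API.

```python
def snap_to_grid(p, spacing, grid_origin=(0.0, 0.0, 0.0)):
    """Jig-style constraint: snap a point to the nearest node of a grid.

    p: the manipulated point (x, y, z).
    spacing: grid spacing, adjustable by the user as with grid jigs.
    grid_origin: where the grid is anchored (e.g. on a polygon face).
    """
    return tuple(o + round((c - o) / spacing) * spacing
                 for c, o in zip(p, grid_origin))
```

The same pattern handles the other jigs mentioned: constraining to a plane is a projection onto that plane, and object-to-object snapping replaces the grid node with the nearest snap point on the target object.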