

1

COMS W4172
Introduction 2

Steven Feiner
Department of Computer Science
Columbia University, New York, NY 10027

www.cs.columbia.edu/graphics/courses/csw4172

January 23, 2018

2

Early History

Charles Comeau and James Bryan (Philco), Head-tracked orientation control of remote camera (1961)
Head orientation sensor
Head-worn display
Video from remote camera controlled by head orientation

http://human-factors.arc.nasa.gov/publications/ellis_what_ve.pdf
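The setup is worth restating as a control loop: the head orientation sensor drives the remote camera's pan/tilt, and the camera's video is fed back to the head-worn display. A minimal sketch of that mapping in Python (class and parameter names are hypothetical, not from the original system):

# Sketch of head-orientation-driven remote camera control (hypothetical names).
# The wearer's head yaw/pitch is clamped to the pan/tilt range of the remote
# camera and forwarded as a command; the returned video is shown on the display.

from dataclasses import dataclass

@dataclass
class PanTiltCommand:
    pan_deg: float   # rotation about the vertical axis
    tilt_deg: float  # rotation about the horizontal axis

def head_to_camera(yaw_deg: float, pitch_deg: float,
                   pan_limit: float = 170.0, tilt_limit: float = 60.0) -> PanTiltCommand:
    """Map sensed head orientation to a remote pan/tilt command."""
    clamp = lambda v, lim: max(-lim, min(lim, v))
    return PanTiltCommand(clamp(yaw_deg, pan_limit), clamp(pitch_deg, tilt_limit))

# Example: wearer looks 30 degrees to one side and 10 degrees up.
print(head_to_camera(-30.0, 10.0))  # PanTiltCommand(pan_deg=-30.0, tilt_deg=10.0)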


3

Early History

Ivan Sutherland, Head-tracked VR/AR (1965–70s)
Stereo, see-through head-worn display
Synthesized imagery combined with view of real world

I. Sutherland, A head-mounted three dimensional display, Proc. Fall Joint Comp. Conf., Dec. 9–11, 1968, 757–764. https://doi.org/10.1145/1476589.1476686

https://www.youtube.com/watch?v=Y2AIDHjylMI
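What made Sutherland's display work is the same principle used in today's AR headsets: the tracked head pose defines the view transform for the synthetic imagery, so virtual objects stay registered with the real world seen through the optics. A minimal sketch (plain Python, yaw-only for brevity; function names are hypothetical):

import math

def view_from_head_pose(head_pos, yaw_rad):
    """Build a 4x4 view matrix (world -> eye) from a tracked head pose,
    simplified to position plus yaw about the vertical (y) axis.
    A real tracker supplies the full 6-DOF pose."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    x, y, z = head_pos
    # Inverse of the head-to-world transform: rotate by -yaw, then translate by -position.
    return [
        [c,   0.0, -s,  -(c * x - s * z)],
        [0.0, 1.0, 0.0, -y],
        [s,   0.0,  c,  -(s * x + c * z)],
        [0.0, 0.0, 0.0, 1.0],
    ]

def to_eye(view, point):
    """Transform a world-space point into eye space (homogeneous w = 1)."""
    x, y, z = point
    return tuple(view[i][0] * x + view[i][1] * y + view[i][2] * z + view[i][3] for i in range(3))

# Virtual content at the world origin, seen by a head 2 m back at eye height, turned 90 degrees:
view = view_from_head_pose((0.0, 1.6, 2.0), math.radians(90))
print(to_eye(view, (0.0, 1.6, 0.0)))  # roughly (2, 0, 0): the object now sits off to the viewer's side

A full system recomputes this matrix from the tracker every frame and hands it to the renderer, which is what keeps the synthesized imagery aligned with the real scene.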

4

Early History

Robert Burton, Scene scanning / tracking (1973)
Real-time 3D tracking of multiple LEDs
Laser scanning of scene (H. Fuchs)

R. Burton, Real-Time Measurement of Multiple Three-Dimensional Positions, PhD Diss., U. Utah, 1973
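Tracking multiple LEDs in real time reduces to repeated triangulation: each sensor observation defines a ray toward a briefly lit LED, and the LED's 3D position is estimated as the point closest to all such rays. An illustrative two-ray sketch (not Burton's actual algorithm):

def triangulate(p1, d1, p2, d2):
    """Return the midpoint of the shortest segment between two rays
    p1 + t*d1 and p2 + s*d2 (directions need not be unit length)."""
    sub = lambda a, b: tuple(x - y for x, y in zip(a, b))
    add = lambda a, b: tuple(x + y for x, y in zip(a, b))
    scale = lambda a, k: tuple(x * k for x in a)
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))

    w = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b            # near zero when the rays are (nearly) parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = add(p1, scale(d1, t))       # closest point on ray 1
    q2 = add(p2, scale(d2, s))       # closest point on ray 2
    return scale(add(q1, q2), 0.5)

# Two sensors at (-1,0,0) and (1,0,0), both sighting an LED at (0,0,2):
print(triangulate((-1, 0, 0), (1, 0, 2), (1, 0, 0), (-1, 0, 2)))  # (0.0, 0.0, 2.0)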


5

3D UI Taxonomy

Objects: representational ↔ abstract, hybrid; mapping from task domain to object properties
Space: natural ↔ abstract, hybrid; mapping from task domain to spatial axes
Actions: representational ↔ abstract, hybrid; mapping from task domain to actions
Users: skills, experience, abilities, body, age, sex,…
Collaboration: individual ↔ community, colocated / remote
Tasks: work, learn, play,…
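Read as a checklist, the taxonomy can describe any particular 3D UI: for each dimension, note where the design falls between representational and abstract and how the task domain is mapped. A minimal sketch in Python (the field names and sample entries are illustrative, anticipating the Google Earth example below):

# Sketch: recording where a 3D UI sits in the taxonomy (hypothetical structure).
from dataclasses import dataclass

@dataclass
class Dimension:
    style: str    # "representational", "natural", "abstract", or "hybrid"
    mapping: str  # how the task domain maps onto this dimension

@dataclass
class UIDesign:
    name: str
    objects: Dimension
    space: Dimension
    actions: Dimension

google_earth = UIDesign(
    name="Google Earth",
    objects=Dimension("hybrid", "3D earth and buildings plus abstracted roads, boundaries, icons, labels"),
    space=Dimension("natural", "earth's geometry, with some abstraction for icons and labels"),
    actions=Dimension("hybrid", "pan/zoom/tilt of a virtual camera over the globe"),
)
print(google_earth.space.mapping)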

6

Example: Panama Canal Locks Control
Hybrid objects / natural space

Objects: Abstractions of actual locks, controls
Space: Natural space of canal with some abstraction (esp. for vertical axis)

— , March 15, 1914

Retired in 2007


7

Example: Google Earth
Hybrid objects / natural space

Objects: Representational 3D earth model and buildings, with overlaid abstracted roads, political boundaries, icons, labels,…

Space: Natural space of earth with some abstraction (for icons, labels,…)

8

Example: Zygote Body (was Google Body Browser)

Hybrid objects / natural space

Objects: 3D body model, with some abstraction (labels)
Space: Natural space of body with some abstraction (for labels)

https://zygotebody.com


11

Example: PyMOL http://www.pymol.org

Hybrid objects / hybrid space

Molecular visualization; e.g.:

Objects: Stick model emphasizes bonds, abstracts shape and color
Space: Abstract location and size

12

Example: n-Vision (C. Beshers and S. Feiner, Columbia)
Abstract objects / space

Surfaces represent financial instrument models
Nested coordinate systems represent variables

Feiner & Beshers, UIST 1990
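The nesting idea ("worlds within worlds") is that a function of more than three variables is displayed by embedding an inner 3D plot inside an outer coordinate system; the inner world's position fixes the remaining variables, and moving it changes them. A toy sketch (the model and variable names are invented stand-ins, not n-Vision's actual code):

# Toy sketch of "worlds within worlds": an inner 3-axis plot is embedded in an
# outer coordinate system whose axes fix the remaining variables.

def option_value(spot, strike, rate, volatility, time_to_expiry):
    """Stand-in for a financial-instrument model (not a real pricing formula)."""
    return max(spot - strike, 0) + rate * volatility * time_to_expiry

# Outer world: the inner world's position fixes (rate, volatility, time_to_expiry).
outer_position = {"rate": 0.05, "volatility": 0.2, "time_to_expiry": 1.0}

# Inner world: a surface over (spot, strike) at the values fixed by the outer world.
surface = [
    [option_value(spot, strike, **outer_position) for strike in range(90, 111, 10)]
    for spot in range(90, 111, 10)
]
print(surface)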


24

Example: 3D Games
Play (Indoors)

http://assassinscreed.ubisoft.com

25

Example: 3D Games
Play (Indoors VR)

http://jobsimulatorgame.com


26

Example: 3D Games
Play (Outdoors AR)

Pokémon Go
http://wearables.unisa.edu.au/projects/arquake/
Piekarski & Thomas, ISWC 2000

27

Example: 3D Games
Play (Outdoors AR)

assassinscreed.ubi.com
Pokémon Go


30

Example: Virtual Exposure Therapy
Health

Virtual enactment of problematic situation

www.virtuallybetter.com

31

Example: Tactical Iraqi Language and Culture Training System
https://www.alelo.com/tilts/

Work / learning

Virtual training
3D video game used to teach language and cultural nuances


32

Example: Gunslinger http://ict.usc.edu/prototypes/gunslinger/

Work / learning

Virtual training
Controlled 3D environment used to explore how to deal with difficult situations

33

Reality–Virtuality (R–V) Continuum
P. Milgram, H. Takemura, A. Utsumi, & F. Kishino, 1994

Real environment (RE): Completely real world

Virtual environment (VE): Completely synthetic world

Mixed reality (MR): Real-world and virtual-world objects presented (and experienced) together

Augmented reality (AR): Principally real environment with added computer-generated content

Augmented virtuality (AV): Principally virtual environment with added real content

[Continuum diagram: Real Environment (RE) ↔ Augmented Reality (AR) ↔ Augmented Virtuality (AV) ↔ Virtual Environment (VE); AR and AV together span Mixed Reality (MR)]


34

Reality–Virtuality (R–V) Continuum
P. Milgram, H. Takemura, A. Utsumi, & F. Kishino, 1994

Properties of points in the MR space:
Reality: AR ↔ AV
Immersion: Egocentric ↔ Exocentric
Directness: Primary world objects experienced directly (e.g., optical see-through) ↔ synthesized (e.g., video see-through)


35

Reality–Virtuality (R–V) Taxonomy
P. Milgram, H. Takemura, A. Utsumi, & F. Kishino, 1994

Three axes of the taxonomy:

Extent of World Knowledge (EWK): World unmodeled → World partially modeled (Where / What; Where + What) → World completely modeled

Reproduction Fidelity (RF), from low to high: monoscopic; simple wireframe; visible-surface determination; color; shading, texture, transparency; stereoscopic; global illumination (ray tracing, radiosity); high definition; real-time, HiFi, 3D animation, photorealistic; 3D HDTV

Extent of Presence Metaphor (EPM), from low to high: monitor-based → large screen(s) → head-worn display; monoscopic imaging → multiscopic imaging (multiple points of view; e.g., based on tracked head) → panoramic imaging → surrogate travel (user can move) → real-time imaging (M. Naimark, 1991)

Note: I’ve kept the original historic terminology
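A given system can then be characterized as a point along these three axes. A minimal sketch (the 0-to-1 encoding and sample values are purely illustrative; the original scales are ordinal):

# Sketch: placing systems along Milgram et al.'s three axes
# (hypothetical 0.0-1.0 encoding, for illustration only).
from dataclasses import dataclass

@dataclass
class MRCharacterization:
    ewk: float  # Extent of World Knowledge: 0 = world unmodeled, 1 = completely modeled
    rf: float   # Reproduction Fidelity: 0 = simple wireframe, 1 = real-time photorealistic
    epm: float  # Extent of Presence Metaphor: 0 = monitor-based monoscopic, 1 = head-worn surrogate travel

optical_see_through_ar = MRCharacterization(ewk=0.4, rf=0.5, epm=0.9)
desktop_walkthrough    = MRCharacterization(ewk=1.0, rf=0.7, epm=0.3)
print(optical_see_through_ar, desktop_walkthrough)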


36

Historic Barriers to 3D UIs
“Why has 3D taken so long to catch on compared to 2D?”

1967 SICGRAPH founded
1969 SICGRAPH → SIGGRAPH
1974 First SIGGRAPH conference (600 attendees)
1994 PlayStation launched
2001 Xbox launched
2005 Xbox 360 launched
2006 Wii launched
2010 Kinect launched
2013 PlayStation 4, Xbox One launched
2013 Oculus Rift DK1
2014 Oculus Rift DK2
2016 HoloLens
2016 Oculus Rift, HTC Vive, Sony PlayStation VR

37

Historic Barriers to 3D UIs
“Why is 3D taking so long to catch on compared to 2D?”

Technology:
3D rendering (interactive, shaded graphics): affordable for significant apps only since the late 90s
3D interaction devices: only made sense to build in quantity once 3D graphics was affordable
3D interaction techniques: tradeoff between complexity and familiarity; depends on devices
Head tracking / stereo display: needed for true 3D (vs. fixed view / monoscopic); hard to do well, encumbering
Wide field of view: size/weight/quality/appearance/cost tradeoffs depend on approach
VR- and AR-specific issues: latency crucial when content depends on sensed body pose (see the prediction sketch after this list); displays that can combine real and virtual material: size/weight/appearance, cameras, optics, projectors

Applicability to task
Knowledgeable developers
Resistance to change
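On the latency point: the rendered view always lags the sensed pose, so a classic mitigation is to predict the pose forward by the expected motion-to-photon latency before rendering. A minimal sketch of constant-angular-velocity prediction for yaw alone (real systems predict full orientation, typically with Kalman-style filters):

import math

def predict_yaw(yaw_rad, yaw_rate_rad_s, latency_s):
    """Extrapolate head yaw forward by the expected motion-to-photon latency,
    assuming constant angular velocity over that interval."""
    return yaw_rad + yaw_rate_rad_s * latency_s

# Head turning at 200 deg/s with 20 ms of end-to-end latency:
current = math.radians(30)
predicted = predict_yaw(current, math.radians(200), 0.020)
print(math.degrees(predicted))  # about 34 deg: render for where the head will be, not where it was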


38

Reality Check: “2D is Better than 3D” (J. Nielsen)

“The screen and the mouse are both 2D devices, so we don't get true 3D unless we strap on weird head-gear and buy expensive bats (flying mice)

It is difficult to control a 3D space with the interaction techniques that are currently in common use since they were designed for 2D manipulation (e.g., dragging, scrolling)

Users need to pay attention to the navigation of the 3D view in addition to the navigation of the underlying model: the extra controls for flying, zooming, etc. get in the way of the user's primary task

Poor screen resolution makes it impossible to render remote objects in sufficient detail to be recognizable; any text that is in the background is unreadable

The software needed for 3D is usually non-standard, crash-prone, and requires an extra download (which users don’t want to wait for)”

—Jakob Nielsen, 1998 https://www.nngroup.com/articles/2d-is-better-than-3d/

39

Bad uses of 3D?

“Most abstract information spaces work poorly in 3D because they are non-physical…

…navigation through a hyperspace (such as a website) is often very confusing in 3D, and users frequently get lost. 3D navigation looks very cool in a demo, but that's because you are not flying through the hyperspace yourself …

Avoid virtual reality gimmicks (say, a virtual shopping mall) that emulate the physical world…”

—Jakob Nielsen, 1998 https://www.nngroup.com/articles/2d-is-better-than-3d/


40

Good uses of 3D?

“When you visualize physical objects that need to be understood in their solid form. Examples include:
surgeons planning where to cut a patient: the body is 3D and the location of the tumor has a 3D location that is easier to understand from a 3D model than from a 2D X-ray
mechanical engineers designing a widget that needs to fit into a gadget
chemistry researchers trying to understand the shape of a molecule
planning the layout of a trade-show booth…

…entertainment applications and some educational interfaces can benefit from the fun and engaging nature of 3D,… Note that 3D works for games because the user does not want to accomplish any goals beyond being entertained.…”

—Jakob Nielsen, 1998 https://www.nngroup.com/articles/2d-is-better-than-3d/