Woontack Woo (禹雲澤), Ph.D. KAIST UVR Lab. Joint WS@ [July 30, 2015]

Introduction to UVR Lab 2.0




Director: Woontack Woo

International Collaborators
Tae-Kyun Kim @ Imperial College London
Vincent Lepetit @ TU Graz, Austria
Antonis Argyros @ U. of Crete, Greece
M. Billinghurst @ USA, S. Feiner @ Columbia U., S. Kim @ USC

CTRI AHRC/UVR Lab. Members
2 Post-docs & 7 Researchers
4 Ph.D. & 4+2 MS students
3 Interns
1 Admin Staff
1 Visitor

The Ubiquitous VR research aims at the development of new computing paradigms for "DigiLog Life in Smart Spaces":

UVR is Augmented P.I.E.

Crossing Boundaries for AH in UVR 2.0
Augmented Perception, Intelligence, Experience

UVR

3D Visionfor User Localization

Quantified Self for Personalized

Augmentation

I3 Visualization for Organic UI

3D Interaction for Hand-based Collaboration

Augmented Reality as a New Medium

UVR, Real-Virtual Symbiosis

AR2.0 Augmented Human for UVR

Current UVR Lab Projects

Summary, Q&A

New Media Trends
Smaller, cheaper, faster, smarter, more intimate
Desktop/Laptop → Mobile → Wearable? → What's Next?

1980s

• IBM : Personal Computer

1990s

• MS : Personal Computing

2000s

• Google : Information Sharing over the Internet

2010s

• Apple : Mobile Computing

2020s

• S, N, D ? : new value over IoT/IoE


UI Paradigm Shift in New Media
Type & Click: Desktop, Laptop
Point & Touch: Mobile
Gaze & Gesture: Wearable
What's Next?

http://goo.gl/sFiJAo

Preferred location for wearables

https://youtu.be/EvyfHuKZGXU

VR/AR will be “the next mega tech theme” through 2030.- Gene Munster (Piper Jaffray)

http://goo.gl/XCPkE0

Milgram's Reality-Virtuality Continuum [94]

Azuma's definition of AR [97]:
• combines real and virtual
• is interactive in real time
• is registered in the real 3D world

R. Azuma, "A Survey of Augmented Reality," Presence, Vol. 6, No. 4, Aug. 1997, pp. 355-385.

P. Milgram and A. F. Kishino, "A Taxonomy of Mixed Reality Visual Displays," IEICE Trans. on Information and Systems, E77-D(12), pp. 1321-1329, 1994.

Possible Applications
Enterprise
• Maintenance & Repair (Boeing @ AWE 2015: workers 90% better, 30% faster)
• Medical, training
• Construction, houses, apartments
• Consumer products, furniture, decoration
• Augmented ads, packages, products
Individual
• Entertainment, games
• Infotainment
• Digital arts, fashion
• Digital Heritage

1968 Head-Mounted 3D Display (I. Sutherland @ Utah)
1992 Augmented Reality (T. Caudell @ Boeing)
1994 WearComp (S. Mann), Taxonomy (P. Milgram)
1995 NaviCam (J. Rekimoto @ Sony)
1997 A Survey of AR (R. Azuma), MARS (S. Feiner @ Columbia), Wearable AR (T. Starner)
1998 Tinmith (B. Thomas)
1999 ARToolKit (H. Kato @ HC), WorldBoard (J. Spohrer @ IBM)
2000 AR Quake (B. Thomas @ SA), BARS (S. Julier @ )
2001 mCollaAR (G. Reitmayr), MagicBook (M. Billinghurst), Ubiquitous VR (UVR Lab.)
2003 Human Pacman (A. D. Cheok), iLamp (R. Raskar @ MERL), PDA-AR (D. Wagner @ TU Graz)
2004 Phone-AR (M. Mohring @ BU)
2007 PTAM (G. Klein), Sony "Eye of Judgement", DigiLog Book (UVR Lab.), CAMAR (UVR Lab.)
2008 Wikitude (Mobilizy), DigiLog Miniature (UVR Lab.)
2009 Layar (SPRXmobile), Arhrrrr! (GATECH), SLAM on iPhone (G. Klein), Sony "EyePet"
2012 Trans-Space (KAIST UVR Lab), Sony PS Vita (mAR)

Wearable AR/VR Industries
• Meta
• Google + Magic Leap
• Microsoft HoloLens
• Apple + Metaio
• Intel + Recon
• Facebook + Oculus Rift
• Samsung Gear VR + FOVE
• Sony Morpheus

Components of an AR System
Fusing CG models with the real environment:
Get video from camera → Recognize object of interest → Estimate the position and orientation of the camera → Render the augmented scene → Process the interaction → Update the status

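The per-frame pipeline above can be sketched as a loop. This is a minimal sketch of the data flow only; every component here is a trivial stub standing in for a real vision, rendering, or input module.

```python
# Minimal AR frame loop sketch. All component functions are hypothetical
# stubs that only illustrate how data flows between pipeline stages.

def capture(camera):                 # 1. get video from camera
    return camera["frame"]

def recognize(frame):                # 2. recognize object of interest
    return {"id": "marker", "source": frame}

def estimate_pose(target):           # 3. estimate camera position & orientation
    return {"R": "identity", "t": (0.0, 0.0, 0.0)}

def render(frame, scene, pose):      # 4. render the augmented scene
    return {"frame": frame, "overlay": scene["model"], "pose": pose}

def interact(image):                 # 5. process the interaction
    return ["touch"]

def update(scene, events):           # 6. update the status
    scene["events"] = events
    return scene

def ar_loop(camera, scene):
    """Run one iteration of the six-stage AR pipeline."""
    frame = capture(camera)
    target = recognize(frame)
    pose = estimate_pose(target)
    image = render(frame, scene, pose)
    events = interact(image)
    return update(scene, events)
```

In a real system each stage would run per camera frame, with recognition and pose estimation being the computationally hard parts.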

Required Math: 3D Tracking for AR
3D Geometry, Linear Algebra, Signal/Image Processing, Machine Learning, Robust Numerical Optimization, etc.
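The geometry at the core of 3D tracking is the pinhole projection: a 3D world point is transformed by the camera pose (R, t), multiplied by the intrinsics K, and perspective-divided to pixel coordinates. A minimal numpy sketch; the intrinsic and pose values below are illustrative, not from any particular system.

```python
import numpy as np

def project(X, K, R, t):
    """Project a 3D world point into pixel coordinates.

    X : (3,) world point, K : (3,3) camera intrinsics,
    R : (3,3) rotation, t : (3,) translation (world -> camera).
    """
    Xc = R @ X + t        # transform into camera coordinates
    x = K @ Xc            # apply intrinsics
    return x[:2] / x[2]   # perspective divide -> (u, v)

# Illustrative values: focal length 500 px, principal point (320, 240)
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 2.0])   # camera looks at a point 2 units away

uv = project(np.array([0.0, 0.0, 0.0]), K, R, t)
# a point on the optical axis projects to the principal point (320, 240)
```

Tracking inverts this relation: given observed (u, v) for known 3D points, solve for (R, t), which is where the linear algebra and robust optimization come in.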

Range of Techniques
• 2D Picture Tracking (K. Kim); 3D Vase Tracking (Y. Park); Multiple Object Tracking (W. Baek, Y. Park)
• 3D Object Tracking, ISMAR 2008 (K. Kim)
• Motion Blur in 3D Tracking, ISMAR 2009 (Y. Park)
• Scalable Tracking, VC 2010 (K. Kim)
• Modeling & Tracking, ISMAR 2010 (K. Kim)
• Depth-assisted Tracking, ISMAR 2011 (Y. Park)
• Depth-assisted Detection, ICAT 2011 (W. Lee)

Ubiquitous VR for DigiLog Life (Woo)
• 3D Link between dual (real & virtual) spaces with additional information/content
• CoI (Context-of-Interest) Augmentation, not just sight: sound, haptics, smell, taste, etc.
• Bidirectional Interaction for H2H/O2O communication in dual spaces

[Figure] Spaces: private (3rd skin), social, general. Real-space social networks ↔ LINK (CoI, U-Content) ↔ virtual space. Seamless augmentation between dual spaces: how to link them seamlessly?

From AR toward Ubiquitous VR
Enhance Experience, Engage, Edutainment
UVR @ UVR Lab 2008-09: UVR Simulator, Augmented Room
DigiLog Miniature, Kildong DigiLog Agents: storytelling application integrated with Virtools

UVR with the 'Internet of Everything (Cisco)'
IoE, Internet of Things That Think
Brings together people, process, data, and things to make networked connections more relevant and valuable

[Figure] IoE pillars: People, Process, Data (Information, Knowledge, Wisdom), Things (IoT)
• P2P Collaboration (People to People)
• P2M Analytics (People to Machine)
• M2M (Machine to Machine)
• P2K Analytics (People to Knowledge)

http://goo.gl/Fz2BNp

Metaverse
[Figure] Physical Space → (Sensing) → Measured Space → (Modeling) → Virtual Space → (Augmentation & Interaction, Networking) → back to Physical Space. Axes: time, space (longitude), space (latitude).

Map as a New Platform with Holistic Layers

• CoI-aware
  • Environmental context
  • User context: knowledge, experience, preference
• Just-in-time visualization
• Interaction
• Mash-up authoring
• SNS
• Extension of human perception

User Engagement

Context-awareness

Realistic 3D augmentation

What are the Keys for the UVR Ecosystem?
Augmented Content is the King, Context is the Queen controlling the King, and the User is the God!

• Multicore/GPGPU
• Object recognition/tracking
• Light source estimation
• Physics engine
• AI engine

How to avoid the fate of VR?
"to look for DigiLog, the intersections between the digital world (3D, SNS) and the analog world (CoI)"

Future Human in the UVR era?
AR (Push): extended space-time
AH (Pull): how to extend humans' physical, intellectual, and social abilities?

http://goo.gl/yJPDtC

… After the Singularity? (Ray Kurzweil)

?

From AR to Augmented Human
Augmented Human means Augmented Perception, Intelligence, Experience:
• Enhancing the 5+ senses and perception
• Offering wisdom with QS and Linked Open Data
• Improving spatio-temporal-social ability

How to achieve AH?
Quantified Self: Holistic QS for a Qualified Social Life

Current Projects
• Trans-Space: Hand-based Collaboration with 3D Glasses (supported by KIST, 2010-2019)
• ARtalet 2.0: AR Authoring with 3D Glasses (supported by NRF, 2014-2017)
• K-Culture Time Machine: AR for Historical Seoul Tour (supported by MoCTS, 2014-2017)
• QS from Emotion Mining: Smile Coach on a Smartphone (supported by KAIST, 2014-2015)

Interaction & Collaboration in Trans-Spaces
Goal: Supporting bare-hand interaction and collaboration with virtual objects by high-DoF hand tracking
Needs & Values: Interaction with virtual objects without additional interfaces in AR; remote collaboration in the same Trans-Space
Approach: Building video see-through HMD based Augmented Reality; hand tracking by egocentric RGB-D camera

[Figure] Local user A in Space #1 interacts bare-handed with a virtual/physical Object-of-Interest; A's avatar appears in Space #2 via 2-way networking.

Finger Tracking in Trans Spaces

Y.Jang, S-T.Noh, H.J.Chang, T-K.Kim, W.Woo, "3D Finger CAPE: Clicking Action and Position Estimation under Self-Occlusions in Egocentric Viewpoint," IEEE TVCG, vol. 21, no. 4, 2015. (presented in IEEE VR 2015).

[Video comparison: Ours vs. HSKL vs. FORTH]

Sensor-assisted Hand Tracking
Goal: Egocentric hand & arm pose estimation with visual-inertial sensor fusion
Needs & Values: Assist low frame-rate visual tracking with high frame-rate inertial tracking
Approach: Calibration for motion capture with an on-body sensor network; arm pose estimation and direct (in progress)

[Figure] On-body sensor placement: shoulder, elbow, upper arm (IMU), forearm (IMU), hand (camera + IMU).
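A standard way to assist a low frame-rate visual tracker with high frame-rate inertial data is a complementary filter: dead-reckon from gyro samples between camera frames, then blend toward the visual estimate whenever one arrives. Below is a 1-D orientation sketch for illustration only; it is not necessarily the fusion method used in this project.

```python
def fuse(theta, gyro_rates, dt, visual_theta=None, alpha=0.98):
    """Complementary-filter step for a 1-D orientation estimate.

    theta        : current orientation estimate (rad)
    gyro_rates   : high-rate angular velocity samples since the last call (rad/s)
    dt           : IMU sample period (s)
    visual_theta : low-rate orientation from the camera tracker, or None
    alpha        : weight on the integrated (inertial) estimate
    """
    for w in gyro_rates:
        theta += w * dt   # dead-reckon between camera frames
    if visual_theta is not None:
        # blend toward the drift-free visual measurement when available
        theta = alpha * theta + (1 - alpha) * visual_theta
    return theta
```

The inertial path keeps the estimate responsive at IMU rate; the occasional visual correction bounds the gyro drift.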

Hand Gesture Tracking in Trans-Spaces [video comparison: Ours vs. HSKL vs. FORTH]

Collaboration in Trans-Spaces

S-T. Noh, H-S. Yeo, W. Woo, "An HMD-based Mixed Reality Framework for Avatar-Mediated Remote Collaboration Using Bare-Hand Interaction," ICAT-EGVE 2015, under review.

AiRSculpt: A Wearable AR 3D Sculpting System
Goal: To quickly create and manipulate 3D virtual content directly with bare hands in a real-world setting
Needs & Values: High entrance barrier of 3D graphics tools; need for a bare-hand user interface for HMDs
Approach: Video see-through HMD with RGB-D camera; 3D tracking module and sculpting system

Sung-A Jang, Hyung-il Kim, Woontack Woo, Graham Wakefield, "AiRSculpt: A Wearable Augmented Reality 3D Sculpting System," HCII 2014, Vol. 8530, pp. 130-141, Jun. 2014.

Wearable UI/UX

Jooyeun Ham, Jonggi Hong, Youngkyoon Jang, Woontack Woo, “Smart Glasses’ Augmented Wearable Interface based on Wristband-type Motion-aware Touch Panel,” in Proc. IEEE 3DUI, 2014. (Accepted)

ARtalet 2.0: Augmented Space Authoring
Goal: Geometry-aware interactive AR authoring using a smartphone in a 3D-glasses environment
Needs & Values: An authoring method that lets users easily build an AR world in the in-situ environment and attach 3D virtual content to it
Approach: Obtain 3D image features from an unknown 3D space, analyze its geometry, and interactively align local reference coordinates

Concept Figures

Taejin Ha, Woontack Woo, "Geometry Recognition-based Registration Coordinate Correction for Wearable Augmented Reality Authoring" (in Korean), Korea Computer Graphics Society (KCGS), pp. 57-58, Jul. 2014.

ARtalet 2.0: Subspace Selection
Goal: Remote subspace selection using bare-hand input for invoking a subspace in Augmented Reality space
Needs & Values: Natural space selection under physical boundary conditions; Space-of-Interest as an extended unit of Object-of-Interest
Approach: Definition of the subspace set by a 3-DoF freehand pinch-tip pointer; four progressive selection techniques (RSC / RCO / TSC / TSL)
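As an illustration, the pinch-tip pointer can be thought of as defining a spherical Space-of-Interest, with objects selected by a simple containment test. This is a minimal hypothetical sketch; the actual RSC/RCO/TSC/TSL techniques from the slide are more elaborate progressive-selection methods.

```python
import math

def select_in_subspace(objects, center, radius):
    """Return the names of objects whose positions lie inside a spherical subspace.

    objects : dict mapping object name -> (x, y, z) position
    center  : (x, y, z) pinch-tip pointer position defining the subspace
    radius  : subspace radius
    """
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    return [name for name, pos in objects.items() if dist(pos, center) <= radius]

# Hypothetical scene: the pinch invokes a subspace of radius 2 at the origin
scene = {"vase": (0.0, 0.0, 1.0), "chair": (5.0, 0.0, 0.0)}
selected = select_in_subspace(scene, (0.0, 0.0, 0.0), 2.0)
# only the vase falls inside the invoked subspace
```

Progressive techniques would then refine this initial set rather than select in one shot.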

H.Lee, S.Noh, and W.Woo, “Remote and Progressive Space Selection with Freehand Pinches for Augmented Reality Space," preparing for IEEE TVCG.

[Figure] Egocentric subspace selection leads to the user's view with the invoked subspace.

ARtalet 2.0: Realistic Rendering
[Figure] Analyze the light properties of objects and estimate the real light sources to render virtual objects; a visual mismatch with reality reduces the user's sense of immersion.
Related work / use cases and problems: "Augmented Reality Virtual Fitting Room"

K-Culture Time Machine
Goal: Development of creation and provision technology for time/space-connected cultural contents
Needs & Values: A new fusion of cultural contents that connects the varied cultural contents of different agencies across time and space
Approach: Semantic modeling and analysis of cultural contents; AR/VR visualization of time/space-connected cultural contents

"K-Culture Time Machine: Development of Creation and Provision Technology for Time•Space-connected Cultural Contents," HIMI 2015.

Virtual events

Virtual map

Space-Telling for AR
A space-driven representation methodology reflecting the characteristics of AR technology: an Augmented Reality representational methodology based on mashing up hyperlinks to spatio-temporally related contents from heterogeneous databases, with an authoring tool

[Figure] 3D physical space is indexed by spatial and temporal coordinates (15c, 16c, 17c); hyperlinks connect it to a 2D map, 3D CG model DB, image DB, video DB, cultural heritage info DB, and virtual space.

E. Kim, W. Woo, "Augmented Reality Based Space-telling Framework for Archeological Site Tours" (in Korean), Journal of HCI Korea, 2015. (to be published)

User Localization for Outdoor AR
Goal: Enabling robust camera tracking & localization in outdoor AR environments
Needs & Values: Various conditional changes in the outdoor environment (e.g., non-static objects, lighting conditions, scalability)
Approach: Keyframe-based 3D point registration; real-time camera tracking and localization

Off-line processing: 3D feature point extraction & reconstruction → keyframes (R|t) + sensor info → 3D visual data management
On-line processing: keypoints → user self-localization (camera detection & tracking)

User Localization for Outdoor AR
3D feature point extraction & reconstruction: by drone or manually
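The on-line step of keyframe-based localization can be sketched as nearest-keyframe retrieval: find the stored keyframe whose descriptor best matches the live frame, then seed camera tracking from its registered pose (R|t). A toy numpy sketch; the descriptors and database below are invented for illustration.

```python
import numpy as np

def nearest_keyframe(query_desc, keyframe_descs):
    """Return the index of the stored keyframe closest to the live frame.

    query_desc     : (D,) global descriptor of the live camera frame
    keyframe_descs : (N, D) descriptors of the offline-registered keyframes
    """
    d = np.linalg.norm(keyframe_descs - query_desc, axis=1)  # L2 distances
    return int(np.argmin(d))

# Toy database of 3 keyframes with 4-D descriptors
db = np.array([[1.0, 0.0, 0.0, 0.0],
               [0.0, 1.0, 0.0, 0.0],
               [0.0, 0.0, 1.0, 0.0]])
idx = nearest_keyframe(np.array([0.1, 0.9, 0.0, 0.0]), db)
# the second keyframe matches best; its stored (R|t) would seed tracking
```

A real system would use inverted-index or approximate nearest-neighbor search and then refine the pose by matching the keyframe's registered 3D points to the live image.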

City Experience with AR
Goal: AR experience design for 'Seoul' based on Walter Benjamin's interpretation of the city
Needs & Values: Benjamin's perspective, which analyzed 'Paris' along spatial and temporal dimensions, can expand the layers of urban experience offered through augmented reality
Approach: Three AR UX designs for the urban flâneur: reading through images / reading through traces / reading through experience

Urban EX with AR: characteristics of the Urban Flâneur
• Analyzing everyday life and everyday experience
• Viewing the city as a flâneur
• Reading dialectical images of the city

Location-based Film Experience with AR
Goal: To build an AR video service that provides location-based film experiences in augmented places
Needs & Values: The partial functionality of context-awareness has prevented interactive and intelligent multimedia services
Approach: Using a 5W1H (Who, When, Where, What, How, Why) metadata schema for interpreting the contexts of places, users, and videos
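The 5W1H schema can be represented as one record per place/user/video context. A minimal sketch; the class name and field values below are hypothetical illustrations, not the project's actual schema.

```python
from dataclasses import dataclass, asdict

@dataclass
class Context5W1H:
    """5W1H metadata record for interpreting place, user, and video contexts."""
    who: str
    when: str
    where: str
    what: str
    how: str
    why: str

# Hypothetical example: a film scene anchored to a real place
scene = Context5W1H(who="protagonist", when="1960s", where="Seoul street",
                    what="chase scene", how="handheld camera", why="climax")
record = asdict(scene)  # serializable dict for a metadata store
```

Matching a user's current 5W1H context against stored records is then what triggers the right video at the right place.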

Visualization of PoI for Collaborative AR
Goal: Visualization of others' AI (Attention Information) for enhancing a sense of presence
Needs & Values: A compelling shared experience through the absence of mediation; specification of permissible attention information
Approach: Visualized information guiding the user's attention (transitional view); delineation of the camera's location and direction

Jaein Kim, Woontack Woo, & Chungkon Shi. (2014). Design of an Augmented Reality Viewer for Visualizing Location Information of Objects of Interest (in Korean). Proc. HCI Korea, 90-91.

[Figure] User location

Augmented Organic UI for KCTM
Goal: Development of an augmented UI that presents space-telling and time-machine metadata in real space for tour-guide purposes
Needs & Values: Need to visualize metadata relationships over real space; need information organization and interaction methods suited to the AR context
Approach: Data presentation reflecting real-world conditions such as screen motion, POI recognition, and field of view; interface composition suited to tourists' usage contexts and needs

Concept Figures

Smart Mirror
Goal: Track facial landmarks efficiently under occlusion, brightness changes, and deformations
Needs & Values: Face alignment plays an important role in many applications such as face recognition, facial expression detection, face synthesis, and 3D face modelling
Approach: We propose 1) a new set of facial landmarks and 2) a novel random regression forest designed to achieve real-time face alignment

Landmark re-definition / RF-based landmark detection

Multi-device Interaction (with J. Lee)
Goal: Context-aware interaction in multi-device, multi-user situations
Needs & Values: Diverse smart devices exist; support better, generally applicable interaction across all devices
Approach: Using field theory as a representative model; laying out all devices and people as objects, each with its own field, in an ideal interaction space using their essential and behavioral information

Concept Figures

Mirror Mirror (collaboration with D. Saakes)
Goal: A personal clothing design system for use at home
Needs & Values: Select/design items with hand gestures in front of a mirror; have them fabricated on the spot with a projector
Approach: Combine spatial AR with a mirror to achieve 3D feedback; edit properties of patterns, color, density, and layering with gestures

D. Saakes, H. Yeo, S. Noh, G. Han, W. Woo, "Mirror Mirror:," ACM SIGGRAPH 2015 Studio.

Augmented Reality as a New Medium

UVR, Real-Virtual Symbiosis

AR2.0 Augmented Human for UVR

Current UVR Lab Projects

Summary, Q&A

More Information
Woontack Woo, Ph.D.
FB @wtwoo, Twitter @wtwoo
[email protected]
http://uvr.kaist.ac.kr

11th ISUVR 2016 @ YUST, China, Jun. 29 – Jul. 5, 2016

“The future is already here. It is just not uniformly distributed” by William Gibson (SF writer)