
Page 1: Natural User Interface Microsoft Kinect and Surface Computing

Natural User Interface(NUI) Application Development

by YUVARAJ ILANGOVAN


Page 2: Natural User Interface Microsoft Kinect and Surface Computing

NUI Overview - Agenda


Understanding of Natural User Interface (NUI)

First look of Kinect for Windows sensor

Capabilities of Kinect for Windows Sensor Version 2

Some features of Kinect for Windows SDK

Components of Kinect for Windows SDK v2.0

Business application opportunities using Kinect for Windows

Case study – Kinect Banking POC

Microsoft Surface computing application development

Capabilities of Surface computing

Business application opportunities for Surface Computing

Sample apps on Surface and Kinect

Page 3: Natural User Interface Microsoft Kinect and Surface Computing

Understanding of Natural User Interface

It is a new way of accessing computer applications in which the user operates the application through instinctive, natural human behaviors and actions

An NUI-based application can be accessed in a number of ways depending on requirements. Some NUIs rely on intermediary devices such as touch monitors for interaction, while more advanced NUIs are almost entirely touch-free and are driven by human gestures, voice, facial expressions, and so on

Some examples of Natural User Interfaces:

Gesture Recognition – Tracks user actions and translates those movements into defined instructions (Ex: Microsoft Kinect for Windows)

Speech Recognition – Allows the user to interact with a system through voice commands (Microsoft Speech APIs, Kinect)

Gaze Tracking – Allows the user to interact with a system through eye movements

Touch Screen Interfaces – Direct interaction with digital content (Ex: Microsoft Surface Computing - PixelSense)

Brain-Machine Interfaces – Read real neural signals and use programs to translate those signals into actions (Ex: a paralyzed person operating a computer or a motorized wheelchair)
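The common pattern behind all of these NUI styles is routing a recognized input event to an application action. A minimal sketch of that idea, with invented modality and event names (real NUI frameworks such as the Kinect and Surface SDKs expose far richer event models):

```python
# Illustrative sketch only: a tiny dispatcher that routes NUI input events
# (gesture, voice, gaze) to application actions. All names are hypothetical.

from typing import Callable, Dict, Tuple

class NuiDispatcher:
    """Maps (modality, event) pairs to application callbacks."""

    def __init__(self) -> None:
        self._handlers: Dict[Tuple[str, str], Callable[[], str]] = {}

    def on(self, modality: str, event: str, handler: Callable[[], str]) -> None:
        self._handlers[(modality, event)] = handler

    def dispatch(self, modality: str, event: str) -> str:
        handler = self._handlers.get((modality, event))
        return handler() if handler else "ignored"

dispatcher = NuiDispatcher()
dispatcher.on("gesture", "swipe_left", lambda: "previous page")
dispatcher.on("voice", "open menu", lambda: "menu opened")

print(dispatcher.dispatch("gesture", "swipe_left"))  # previous page
print(dispatcher.dispatch("gaze", "blink"))          # ignored
```

Unrecognized inputs fall through to an "ignored" result, which is how touch-free applications stay robust against stray movements.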

Page 4: Natural User Interface Microsoft Kinect and Surface Computing

First look of Kinect for Windows sensor

Kinect for Windows – Version 1


Kinect for Windows – Version 2

Page 5: Natural User Interface Microsoft Kinect and Surface Computing

Capabilities of Kinect for Windows Sensor Version 2

Human Gesture Tracking

The Kinect sensor can track up to 6 people and 25 body joints per person in front of the sensor, can recognize even small objects, and can view objects in three dimensions

The device has a built-in color camera, an infrared emitter, and a microphone array to sense the movement of objects in front of it and to pick up voices. With 3x higher depth fidelity it can visualize small objects, too

Core capabilities of Kinect v2 – depth sensing, full-color 1080p HD video, and new active infrared (IR) capabilities
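The per-person joint data described above is what gesture logic is built on. A minimal sketch, assuming a hand-made body as a mapping from joint name to 3-D position (the joint names follow the SDK's JointType naming, but the data and function are invented here):

```python
# Illustrative sketch, not the real SDK: Kinect v2 reports up to 6 bodies with
# 25 joints each, every joint carrying a 3-D camera-space position.

JOINTS_PER_BODY = 25  # Kinect v2 skeletal model

def hand_raised(body: dict, hand: str = "HandRight") -> bool:
    """A hand counts as raised when it is above the head (larger Y)."""
    return body[hand][1] > body["Head"][1]

# Hypothetical single tracked body: joint name -> (x, y, z) in meters.
body = {
    "Head":      (0.0, 1.6, 2.0),
    "HandRight": (0.3, 1.9, 1.9),   # above the head -> raised
    "HandLeft":  (-0.3, 1.0, 2.0),
}

print(hand_raised(body, "HandRight"))  # True
print(hand_raised(body, "HandLeft"))   # False
```

Real gesture detection compares joint positions across frames rather than in a single snapshot, but the joint-by-joint comparison shown here is the basic building block.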


Page 6: Natural User Interface Microsoft Kinect and Surface Computing

Capabilities of Kinect for Windows Sensor Version 2 [Contd…]

Speech Recognition

As with gestures, we can control applications using voice commands. The Kinect sensor has a 4-microphone array and can recognize human speech as spoken; we need to configure how the application should respond to each spoken word.
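The "configure how the application responds to a spoken word" step can be sketched as a command table. A real Kinect application would build a speech grammar through the Microsoft Speech APIs; here the recognizer is faked as a plain dictionary lookup with invented phrases and action names:

```python
# Hypothetical phrase -> action table; a real app would register these
# phrases in a speech grammar rather than a dict.
COMMANDS = {
    "show balance": "display_balance_screen",
    "open account": "start_account_workflow",
    "help": "show_help_overlay",
}

def respond_to(spoken_text: str) -> str:
    """Return the configured action for a recognized phrase, if any."""
    return COMMANDS.get(spoken_text.lower().strip(), "no_action")

print(respond_to("Show Balance"))  # display_balance_screen
print(respond_to("sing a song"))   # no_action
```

Normalizing the recognized text before lookup (lowercasing, trimming) keeps the command table small and tolerant of recognizer variations.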

Face Expressions Tracking

The Face Tracking SDK's face tracking engine analyzes input from the Kinect camera, deduces the head pose and facial expressions, and makes that information available to an application in real time. With SDK 2.0, Kinect can track whether eyes are open or closed, nose position, facial expressions, and the direction of the face. For example, this information can be used to render a tracked person's head position and facial expression on an avatar in a game or a communication application, or to drive a natural user interface
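Driving an avatar from face-tracking results can be sketched as a simple translation step. The field names and frame structure below are invented for illustration; they only mirror the properties the deck describes (eyes open/closed, face direction):

```python
# Hypothetical face-frame dict -> avatar rendering hints.
def avatar_state(face_frame: dict) -> dict:
    """Translate tracked face properties into avatar rendering hints."""
    eyes_open = face_frame["left_eye_open"] and face_frame["right_eye_open"]
    yaw = face_frame["yaw_degrees"]
    return {
        "eyes": "open" if eyes_open else "closed",
        "head_turn": "left" if yaw < -10 else "right" if yaw > 10 else "center",
    }

frame = {"left_eye_open": True, "right_eye_open": False, "yaw_degrees": -25.0}
print(avatar_state(frame))  # {'eyes': 'closed', 'head_turn': 'left'}
```

The dead zone around zero yaw keeps the avatar's head steady while the tracked person looks roughly straight ahead.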


Page 7: Natural User Interface Microsoft Kinect and Surface Computing

Some features of Kinect for Windows SDK


Feature – Prospective Usability Details

Data stream processing – Capturing and processing the color image, depth image, infrared, and audio data streams

Enhanced body, hand and joint orientation – Ability to track 6 people and 25 skeletal joints per person; improved understanding of body positioning

Speech Recognition – The SDK enables the sensor to understand human voice modulations and accents in an almost natural way

Near Mode, Tilting Sensor – Tracking the human body at close range (approximately 40 centimeters); the sensor can be tilted through the app between +27 and -27 degrees

Windows Store Support – Kinect-enabled apps can be created for the Windows Store

Unity Pro Support – Beyond gaming, Unity Pro supports cross-platform prototyping

Page 8: Natural User Interface Microsoft Kinect and Surface Computing


Some features of Kinect for Windows SDK [Contd…]

Feature – Prospective Usability Details

Enhanced Tools support – Enhanced recording and playback; custom visual gesture builder (we can create our own gestures)

Improved Facial Tracking – 20 times better resolution compared with the Kinect sensor v1; enables the app to create a mesh of more than 1000 points for more accurate human face representations

Multi app access simultaneously – More than one application can access a single Kinect v2 at the same time

Higher Depth Fidelity – Up to 3x higher depth fidelity

Record and playback gestures – Using Kinect Studio we can record and play back the sensor's data streams; this is really useful for testing and debugging applications
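The record-and-playback workflow is valuable because recorded frames can be replayed through the same handler the live sensor would feed, making gesture logic testable without a person in front of the device. A minimal sketch with an invented recording format (Kinect Studio itself uses its own file format):

```python
# Hypothetical recorded frames: per-frame hand and head heights in meters.
recorded_frames = [
    {"frame": 1, "hand_y": 1.0, "head_y": 1.6},
    {"frame": 2, "hand_y": 1.5, "head_y": 1.6},
    {"frame": 3, "hand_y": 1.8, "head_y": 1.6},  # hand rises above the head
]

def detect_raise(frames):
    """Replay frames and report the first frame where the hand passes the head."""
    for f in frames:
        if f["hand_y"] > f["head_y"]:
            return f["frame"]
    return None

print(detect_raise(recorded_frames))  # 3
```

Because the detector consumes plain frame data, the same function runs unchanged against a live stream or a recording, which is exactly what makes playback useful for debugging.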

Page 9: Natural User Interface Microsoft Kinect and Surface Computing

Components of Kinect for Windows SDK v2.0

High Level Component Representation


Page 10: Natural User Interface Microsoft Kinect and Surface Computing

Illustration of different layers of components:


Components of Kinect for Windows SDK v2.0

[Contd…]

Page 11: Natural User Interface Microsoft Kinect and Surface Computing

Business application opportunities for Kinect for Windows

Microsoft has announced the Kinect sensor as business-ready, so we can develop innovative, real-time, touch-free applications for customers in areas such as:

Digital Signage

Interactive Display

Visual Marketing

Virtual Healthcare

Business sectors:

Retail applications

Health care applications

Financial and Banking apps

Visual Media and Entertainment

Education / Presentations

Automobile industries

3D modeling apps

And more…


Page 12: Natural User Interface Microsoft Kinect and Surface Computing

Confidential & Proprietary. Not for External Circulation

Case study – Kinect Banking POC

Problem

XYZ bank wanted to give a very advanced user experience to its customers. When customers visit the bank for any kind of work, they need to interact manually with representatives to learn about the new products and services offered by the bank. Moreover, this creates waiting queues and a time-consuming process for others. So, the bank wanted to create an innovative, interactive application that lets users learn more about the current details of its banking services.

Specific observations:

Customers need to wait for the discussions

If customers want to know many details about different banking services, they may need to go to other counters, which can be a time-consuming process

What if a small group of people wants to know details about different banking services at the same time?

Page 13: Natural User Interface Microsoft Kinect and Surface Computing


Case study – Kinect Banking POC

Solution

Overall notion:

Avoid manual discussions

Customers should be able to freely explore the details even if they have no idea about the services

Usability should be very intuitive and innovative

What the deliverable will be:

A touch-free, Kinect-enabled banking application

Users will be able to interact with the digital content through basic human gestures

Users will be able to talk to the application

Multiple users will be able to access banking details simultaneously


Page 14: Natural User Interface Microsoft Kinect and Surface Computing

Getting Started – Microsoft Kinect for Windows

Supported operating systems: Windows 7 / 8 (x86 or x64)

Hardware Requirements:

Computer with a dual-core, 2.66-GHz or faster processor

Windows 7–compatible graphics card that supports Microsoft® DirectX® 9.0c capabilities

2 GB of RAM or above

Kinect for Windows sensor—retail edition, which includes special USB/power cabling

Once all of the above is installed, confirm that the sensor is working, then install the latest SDKs as follows:

KinectSDK-v1.7-Setup

KinectDeveloperToolkit-v1.7.0-Setup

Runtime requirement – the sensor must be attached to the computer through USB

Page 15: Natural User Interface Microsoft Kinect and Surface Computing

Surface computing application development


Page 16: Natural User Interface Microsoft Kinect and Surface Computing

Capabilities of Surface computing

Direct interaction

Users can actually “grab” digital information with their hands and interact with content through touch and gesture, without using GUI devices such as a mouse or keyboard

Multi‐touch contact

Surface computing recognizes many points of contact simultaneously: not just one finger, as with a typical touch screen, but up to 50 touches according to Microsoft

Multi‐user experience

The horizontal form factor makes it easy for several people to gather around surface computers together, providing a collaborative, face-to-face computing experience; users can access Surface applications from any of the four sides of the touch screen

Object recognition

Users can place physical objects on the surface to trigger different types of digital responses, including the transfer of digital content
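Multi-touch support like the above amounts to tracking every active contact independently by id, so fifty fingers can move at once without interfering with each other. A minimal sketch with invented event names (the real Surface 2.0 SDK delivers richer touch events through WPF):

```python
# Illustrative contact tracker: keeps the latest position of every active
# touch contact, keyed by contact id. Event names are hypothetical.
class TouchTracker:
    def __init__(self) -> None:
        self.contacts = {}  # contact id -> (x, y) position

    def down_or_move(self, contact_id: int, x: float, y: float) -> None:
        """A contact appeared or moved: record its latest position."""
        self.contacts[contact_id] = (x, y)

    def up(self, contact_id: int) -> None:
        """A contact was lifted: forget it."""
        self.contacts.pop(contact_id, None)

tracker = TouchTracker()
tracker.down_or_move(1, 10.0, 20.0)   # first finger
tracker.down_or_move(2, 200.0, 80.0)  # second finger, at the same time
tracker.up(1)

print(len(tracker.contacts))  # 1
```

Keying contacts by id rather than position is what lets gestures such as two-finger rotation pair the right "down" and "move" events even when fingers cross paths.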


Page 17: Natural User Interface Microsoft Kinect and Surface Computing

Microsoft PixelSense Overview

It is a new Natural User Interface (NUI) based surface computing device manufactured jointly by Samsung and Microsoft, named the Samsung SUR40. The product was renamed from "Microsoft Surface" to "Microsoft PixelSense"

The current cost of the SUR40 is $8,400

The Samsung SUR40 is a 40 in (102 cm) 16:9 LED-backlit LCD display (1920×1080) with an integrated PC and PixelSense technology, which replaces the cameras in the previous product. PixelSense technology enabled Samsung and Microsoft to reduce the thickness of the product from 22 in (56 cm) to 4 in (10 cm)


Page 18: Natural User Interface Microsoft Kinect and Surface Computing

Business application opportunities for Surface Computing

Advertising, Media and Exhibitions – Kiosks for ticket booking or getting product details in shopping malls, using a smart Surface wall or SmartTable

Education, Training Institutions – Group learning and interactive case studies on a SmartSurface

Health Care – X-ray analysis, ultrasound video streaming, and group discussions by doctors on a SmartSurface (http://www.youtube.com/watch?feature=player_embedded&v=afOkWRSIgZ8)

Financial Organizations – Banking, back-office processes, conferences, and data analysis on a SmartSurface

Government – Multiple people can simultaneously access government details such as forms and rules/procedures in government offices

Gaming Industry – Multiple people can play games, including casino games, using a SmartTable or SmartSurface

Engineering & Automobile Design Industries – Component design and internal-parts analysis (present different models; rotate, drag, and zoom the images, etc.)

Music Composing – Sound design, audio mixing

and more…

Page 19: Natural User Interface Microsoft Kinect and Surface Computing

Getting Started – Surface application development

Supported operating systems: Windows 7

A 32-bit or 64-bit edition of one of the following Windows® 7 operating systems:

Windows 7 Home Premium

Windows 7 Professional

Windows 7 Ultimate

Additional Requirements:

Microsoft Visual C#® 2010 Express Edition or Microsoft Visual Studio® 2010

.NET Framework 4.0 (included with Visual C# 2010 and Visual Studio 2010)

Microsoft XNA® Framework Redistributable 4.0

(Optional, recommended) Microsoft Expression Blend® 4 to edit the XAML code that defines your user interface

Once all of the above is installed, download and install the Microsoft® Surface® 2.0 SDK

Runtime requirement – a Surface computing device or a Windows 7 touch-enabled device

Page 20: Natural User Interface Microsoft Kinect and Surface Computing

Sample apps on Kinect and Surface Computing

KinectBanking – A simple banking-products demo: a touch-free application operated by natural PUSH, PRESS, GRIP, RELEASE, and WIPE human gestures

KinectFINDict – A financial-domain learning Kinect-enabled application; the user can learn financial terms through voice commands and human gestures

SurfaceAuto – A Surface multi-touch application; the user can compare multiple cars and their features through natural actions such as drag/drop, swiping, rotating, and flipping, without using GUI devices

SurfaceBingMap – A Surface touch application; the user can view the current activities (details, images, videos) of an organization by touching a specified location on the world map

SurfacePowerWall – A Surface multi-touch application for the retail domain; the user can select products on the huge touch screen and then learn the details of each product
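A WIPE-style gesture like the ones these apps use can be sketched as a threshold on horizontal hand travel across a short window of frames. The threshold value and frame data below are invented for illustration; a production detector would also check speed and vertical drift:

```python
# Illustrative WIPE (swipe) detector over hand x-positions sampled per frame.
def detect_wipe(hand_x_positions, min_distance=0.4):
    """Return 'left' or 'right' if total horizontal travel exceeds the threshold."""
    travel = hand_x_positions[-1] - hand_x_positions[0]
    if travel >= min_distance:
        return "right"
    if travel <= -min_distance:
        return "left"
    return None  # movement too small to count as a wipe

print(detect_wipe([0.5, 0.3, 0.1, 0.0]))  # left
print(detect_wipe([0.0, 0.1, 0.15]))      # None
```

Requiring a minimum travel distance is what separates an intentional wipe from the small hand movements users make while simply standing in front of the sensor.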

Links:

https://www.dropbox.com/s/chyyi3ulsn9a3h6/SynePowerWall.mp4

https://www.dropbox.com/s/pn6fpbmtkz0p3bq/SurfaceCarDemo.mp4

https://www.dropbox.com/s/fkzdq0jqi0v36g0/SurfaceSyneMapDemo.mp4

https://www.dropbox.com/s/ewbgrjbjl8meas6/KinectBankingDemo.mp4


Page 21: Natural User Interface Microsoft Kinect and Surface Computing

Thank You !
