
Interactive Wall (Multi Touch Interactive Surface)


DESCRIPTION

A graduation project at the Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt. Interactive Wall allows users to interact with the computer using hand gestures. The application uses an optical camera to detect and track the hands with image processing techniques, and the desktop is projected on a wall using a projector, giving the user the experience of interacting with the computer freely.


Page 1: Interactive Wall (Multi Touch Interactive Surface)


Supervisors:

Professor Dr. Mohammed Roushdy

Dr. Haythem El-Messiry

T.A. Ahmad Salah

Multi-touch Interactive Surface

Page 2: Interactive Wall (Multi Touch Interactive Surface)


Team Members

Mennat-Allah Mostafa Mohammad, Computer Science

Nada Sherif Abd El Galeel, Computer Science

Rana Mohammad Ali Roshdy, Computer Science

Sarah Ismail Ibrahim, Computer Science

Page 3: Interactive Wall (Multi Touch Interactive Surface)


Agenda

1. Introduction

2. Physical Environment and Framework

3. Project Modules and Applications

4. Challenges

5. Conclusion and Future work

6. Tools and References

Page 4: Interactive Wall (Multi Touch Interactive Surface)

Motivation

A more natural and direct way of Human Computer Interaction (HCI).

Current multi-touch devices are:
- Expensive
- Heavy
- Fragile
- Space-consuming

Page 5: Interactive Wall (Multi Touch Interactive Surface)


Problem Definition

It would be more comfortable, effective, and user friendly if users could interact directly with the display device without any additional hardware, using only their hand gestures.

Our goal is to deliver an interactive surface characterized by low cost, efficiency, and ease of use in real-life applications.

Page 6: Interactive Wall (Multi Touch Interactive Surface)


Overview

Optical camera-projector system

Generic Framework for Human Computer Interaction (HCI) using hand gestures.

Page 7: Interactive Wall (Multi Touch Interactive Surface)


Physical Environment

The physical environment consists of:

1. A projector.

2. A webcam placed over the projector's lens, capturing the projected surface.

Page 8: Interactive Wall (Multi Touch Interactive Surface)

[Diagram: the projected surface measures 1.25 m by 0.8 m]

Page 9: Interactive Wall (Multi Touch Interactive Surface)

Physical Environment

[Diagram: camera and projector facing the surface at a distance of 2.15 m]

Page 10: Interactive Wall (Multi Touch Interactive Surface)

Framework

The framework consists of the following modules:
- Controller
- Configuration Module
- Input Module
- Hand Tracking
- Hand Segmentation
- Hand Gesture Recognition
- Interface

Page 11: Interactive Wall (Multi Touch Interactive Surface)


Controller Module

Page 12: Interactive Wall (Multi Touch Interactive Surface)

Controller Module

[Flowchart: Color Mapping → Detect Corners → Search for the hand at the entry point → Construct the search window → Track the hand → Segmentation → Gesture Recognition → Fire Event]

Page 13: Interactive Wall (Multi Touch Interactive Surface)


Configuration Module

Page 14: Interactive Wall (Multi Touch Interactive Surface)

Color Mapping

Maps the desktop colors to the captured-image colors. A set of colors is projected and captured for the color calibration process.

[Figure: desktop colors and their projected counterparts]

Page 15: Interactive Wall (Multi Touch Interactive Surface)


Corner Detection

The four corners of the projected image are detected automatically using the FAST corner detection algorithm.
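The deck's references point to Rosten and Drummond's FAST detector. As an illustration only, here is a minimal Python/OpenCV sketch (the project itself was built with Visual Studio/OpenCV; Python is used here for brevity, and the threshold and corner-selection heuristic are our assumptions, not the project's actual logic):

    import cv2

    def detect_screen_corners(gray, threshold=40):
        # FAST keypoint detection (Rosten & Drummond); threshold is an assumption.
        fast = cv2.FastFeatureDetector_create(threshold=threshold)
        keypoints = fast.detect(gray, None)
        # Heuristic: keep the four strongest responses as screen-corner
        # candidates. The project's real selection logic is not shown.
        strongest = sorted(keypoints, key=lambda k: k.response, reverse=True)[:4]
        return [k.pt for k in strongest]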

Page 16: Interactive Wall (Multi Touch Interactive Surface)


Input Module

Page 17: Interactive Wall (Multi Touch Interactive Surface)

Geometric Calibration

The captured image is calibrated according to the four calibration points.

[Figure: captured image before and after calibration]
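To make the calibration step concrete, a hedged Python/OpenCV sketch that warps the captured frame so the four detected corners map onto an upright rectangle (the 1024x768 output size and the corner ordering are assumptions):

    import cv2
    import numpy as np

    def calibrate_frame(frame, corners, out_size=(1024, 768)):
        # corners: the four detected screen corners, assumed ordered
        # top-left, top-right, bottom-right, bottom-left.
        w, h = out_size
        src = np.float32(corners)
        dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
        H = cv2.getPerspectiveTransform(src, dst)  # 3x3 homography
        return cv2.warpPerspective(frame, H, (w, h))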

Page 18: Interactive Wall (Multi Touch Interactive Surface)


Hand Tracking Module

Page 19: Interactive Wall (Multi Touch Interactive Surface)


Kalman Filter

The Kalman filter algorithm is essentially a set of recursive equations that implement a predictor-corrector estimator.

Steps: initialization, prediction, correction.
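To make the three steps concrete, a minimal constant-velocity Kalman tracker for the hand centroid (an illustrative Python sketch; the state model and noise values are assumptions, not the project's tuned parameters):

    import numpy as np

    class HandTracker:
        def __init__(self, x0, y0):
            # Initialization: state (x, y, vx, vy) and its covariance.
            self.x = np.array([x0, y0, 0.0, 0.0])
            self.P = np.eye(4) * 10.0
            self.F = np.array([[1, 0, 1, 0],   # constant-velocity model
                               [0, 1, 0, 1],
                               [0, 0, 1, 0],
                               [0, 0, 0, 1]], float)
            self.H = np.array([[1, 0, 0, 0],   # we observe (x, y) only
                               [0, 1, 0, 0]], float)
            self.Q = np.eye(4) * 0.01          # process noise (assumption)
            self.R = np.eye(2) * 1.0           # measurement noise (assumption)

        def predict(self):
            self.x = self.F @ self.x
            self.P = self.F @ self.P @ self.F.T + self.Q
            return self.x[:2]                  # predicted hand position

        def correct(self, zx, zy):
            z = np.array([zx, zy])
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
            self.x = self.x + K @ (z - self.H @ self.x)
            self.P = (np.eye(4) - K @ self.H) @ self.P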

Page 20: Interactive Wall (Multi Touch Interactive Surface)


Hand Segmentation Module

Page 21: Interactive Wall (Multi Touch Interactive Surface)


Skin Color Detection

The projector's light falling on the hand generates varying texture patterns on the hand's surface, which rules out skin-color detection algorithms.

[Figure: captured image and the result of applying skin detection]

Page 22: Interactive Wall (Multi Touch Interactive Surface)


Subtraction using Color Calibration

Convert the colors of the desktop image to those of the captured image, then subtract the captured image from the color-converted desktop image.

Page 23: Interactive Wall (Multi Touch Interactive Surface)

Color Calibration

- Get a training set of colors: each desktop color paired with its corresponding projected color.
- Divide each pair of images into 3x3 regions.
- Calculate the transformation matrix A for each region.
- b = A * x, where b is the calibrated color, A is the transformation matrix, and x is the desktop color.
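As an illustration of fitting A for one region, a least-squares sketch in Python (the least-squares formulation and all names are our assumptions about how A could be computed from the training pairs):

    import numpy as np

    def fit_color_matrix(desktop_colors, captured_colors):
        # desktop_colors, captured_colors: (N, 3) arrays of matched RGB
        # samples for one of the 3x3 regions.
        X = np.asarray(desktop_colors, dtype=float)
        B = np.asarray(captured_colors, dtype=float)
        # Least-squares solve of X @ A.T ~= B, i.e. b = A * x per sample.
        A_T, residuals, rank, sv = np.linalg.lstsq(X, B, rcond=None)
        return A_T.T  # 3x3 transformation matrix A

    def calibrate_color(A, x):
        # b = A * x: map a desktop color x to its expected captured color b.
        return A @ np.asarray(x, dtype=float)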

Page 24: Interactive Wall (Multi Touch Interactive Surface)

Segmentation Results

[Figure: desktop image, captured image, segmented image (global thresholding), and largest-blob extraction]
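A hedged sketch tying the subtraction and thresholding steps together (the threshold value is an assumption, and the desktop image is assumed to be already color-converted as described on the previous slides):

    import cv2

    def segment_hand(desktop_bgr, captured_bgr, thresh=40):
        # Subtract the color-converted desktop image from the captured
        # frame, then apply a global threshold to get a binary hand mask.
        diff = cv2.absdiff(captured_bgr, desktop_bgr)
        gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
        return mask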

Page 25: Interactive Wall (Multi Touch Interactive Surface)

Blob Analysis

A heuristic method based on morphological operations is applied to separate the hand from the arm.

A 60 x 60 bounding box is constructed around the largest blob.

[Figure: original mask, closed mask, and their difference (Original - Closed)]
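A hedged Python/OpenCV sketch of this heuristic (the kernel size and centering the 60 x 60 box on the blob centroid are our assumptions):

    import cv2
    import numpy as np

    def extract_hand(mask, box=60):
        # Morphological closing, then the "Original - Closed" difference
        # shown on the slide.
        kernel = np.ones((15, 15), np.uint8)
        closed = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
        diff = cv2.absdiff(mask, closed)
        # Keep the largest connected blob.
        n, labels, stats, centroids = cv2.connectedComponentsWithStats(diff)
        if n < 2:
            return None  # background only, no blob found
        largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
        cx, cy = centroids[largest]
        # 60 x 60 bounding box around the blob centroid.
        half = box // 2
        return int(cx) - half, int(cy) - half, box, box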

Page 26: Interactive Wall (Multi Touch Interactive Surface)


Hand Gesture Recognition Module

Page 27: Interactive Wall (Multi Touch Interactive Surface)

Hand Gesture Recognition

[Pipeline: Contour Tracing → Contour Resampling → Elliptical Fourier Descriptors (EFD) → EFD database (EFDDB) lookup → Gesture Type]

Page 28: Interactive Wall (Multi Touch Interactive Surface)

Elliptical Fourier Descriptors

Elliptical Fourier descriptors are a parametric representation of closed contours based on harmonically related ellipses.

Any closed contour can be constructed from an infinite set of elliptical Fourier descriptors.
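A hedged Python sketch of computing such descriptors: trace the contour, resample it uniformly by arc length (so the harmonic coefficients reduce to discrete Fourier sums), and keep the first few harmonics. The point and harmonic counts are assumptions, the OpenCV 4 return signature is assumed, and the project's exact formulation is not shown on the slides:

    import cv2
    import numpy as np

    def efd_features(mask, n_points=128, n_harmonics=10):
        # Contour tracing: largest external contour of the hand mask.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_NONE)
        pts = max(contours, key=cv2.contourArea).squeeze().astype(float)
        # Contour resampling: n_points samples equally spaced by arc length.
        closed = np.vstack([pts, pts[:1]])
        seg = np.linalg.norm(np.diff(closed, axis=0), axis=1)
        s = np.concatenate(([0.0], np.cumsum(seg)))  # arc length per point
        t = np.linspace(0.0, s[-1], n_points, endpoint=False)
        x = np.interp(t, s, closed[:, 0])
        y = np.interp(t, s, closed[:, 1])
        # Elliptical Fourier coefficients (a_n, b_n, c_n, d_n) per harmonic.
        k = np.arange(n_points)
        feats = []
        for n in range(1, n_harmonics + 1):
            c = np.cos(2 * np.pi * n * k / n_points)
            sn = np.sin(2 * np.pi * n * k / n_points)
            feats += [2 / n_points * (x @ c), 2 / n_points * (x @ sn),
                      2 / n_points * (y @ c), 2 / n_points * (y @ sn)]
        return np.array(feats)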

Page 29: Interactive Wall (Multi Touch Interactive Surface)

Training

Training set: 60 training images per gesture.

Page 30: Interactive Wall (Multi Touch Interactive Surface)

Testing Results

Gesture     | Testing set 1 (68 images) | Testing set 2 (119 images)
[gesture 1] | 98% (1 misclassified)     | 99% (1 misclassified)
[gesture 2] | 95% (3 misclassified)     | 97% (3 misclassified)
[gesture 3] | 97% (2 misclassified)     | 98% (2 misclassified)
[gesture 4] | 98% (1 misclassified)     | 93% (8 misclassified)
[gesture 5] | 95% (3 misclassified)     | 95% (5 misclassified)

Page 31: Interactive Wall (Multi Touch Interactive Surface)


Interface Module

Page 32: Interactive Wall (Multi Touch Interactive Surface)


Interface Module

A gesture event is fired whenever the framework recognizes a gesture; the event contains the position and the gesture type.

The interface handles the raised events.

The user can map the gestures to events.
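An illustrative sketch of this event flow in Python (all class and method names here are hypothetical; they only mirror the design described above, not the project's actual interface code):

    from dataclasses import dataclass
    from typing import Callable, Dict, Tuple

    @dataclass
    class GestureEvent:
        gesture: str                 # e.g. "click", "drag", "zoom"
        position: Tuple[int, int]    # where the gesture was recognized

    class Interface:
        def __init__(self):
            self.handlers: Dict[str, Callable[[GestureEvent], None]] = {}

        def map_gesture(self, gesture: str,
                        handler: Callable[[GestureEvent], None]):
            # The user maps a gesture type to an application event handler.
            self.handlers[gesture] = handler

        def on_gesture(self, event: GestureEvent):
            # Called by the framework when a gesture event is fired.
            handler = self.handlers.get(event.gesture)
            if handler:
                handler(event)

    # Usage: iface.map_gesture("click", lambda e: print("click at", e.position))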

Page 33: Interactive Wall (Multi Touch Interactive Surface)

Main Gestures

Events (the corresponding hand gesture for each is shown on the slide):

Pointer

Click

Drag

Zoom

Right Click

Page 34: Interactive Wall (Multi Touch Interactive Surface)


Applications

- Puzzle game
- Image Viewer
- Painter

Page 35: Interactive Wall (Multi Touch Interactive Surface)


Demo

Page 36: Interactive Wall (Multi Touch Interactive Surface)

Challenges

- Controller Module: tracking multiple hands
- Gesture Recognition Module: distinguishing similar gestures
- Segmentation Module: dark and complex backgrounds; arm extraction

Page 37: Interactive Wall (Multi Touch Interactive Surface)


Conclusion

Human Computer Interaction is still an open research field.

Image processing can be very powerful if used in the appropriate environment.

Page 38: Interactive Wall (Multi Touch Interactive Surface)


Future Work

- Using the depth (Z) axis in addition to the X and Y axes to determine the hand position.
- Multi-hand and multi-user interaction.
- Using the Interactive Wall with display surfaces other than a projected wall, for example a large screen.
- Recognizing dynamic gestures.

Page 39: Interactive Wall (Multi Touch Interactive Surface)

Tools

Software: Microsoft Visual Studio 2008, MATLAB, OpenCV
Hardware: optical camera, projector

Page 40: Interactive Wall (Multi Touch Interactive Surface)

References

- Edward Rosten, Reid Porter, and Tom Drummond, "Faster and better: a machine learning approach to corner detection", Los Alamos National Laboratory, USA / Cambridge University Engineering Department, UK, October 2008.
- Yongwon Jeong and Richard J. Radke, "Reslicing axially-sampled 3D shapes using elliptic Fourier descriptors", Rensselaer Polytechnic Institute, USA, 2007.
- Louis Patrick Nicoli, "Automatic Target Recognition of Synthetic Aperture Radar Images using Elliptical Fourier Descriptors", Florida Institute of Technology, Melbourne, Florida, August 2007.
- G. Amayeh, G. Bebis, A. Erol, and M. Nicolescu, "A New Approach to Hand-Based Authentication", Computer Vision Laboratory, University of Nevada, Reno, 2007.
- Asanterabi Malima, Erol Özgür, and Müjdat Çetin, "A Fast Algorithm for Vision-Based Hand Gesture Recognition for Robot Control", Sabancı University, Istanbul, Turkey, 2006.

Page 41: Interactive Wall (Multi Touch Interactive Surface)

References

- Greg Welch and Gary Bishop, "An Introduction to the Kalman Filter", Department of Computer Science, University of North Carolina at Chapel Hill, updated July 24, 2006.
- E. Rosten and T. Drummond, "Machine learning for high-speed corner detection", European Conference on Computer Vision, May 2006.
- Rafael C. Gonzalez and Richard E. Woods, "Digital Image Processing", Second Edition, 2006.
- Erik Cuevas, Daniel Zaldivar, and Raul Rojas, "Kalman Filter for Vision Tracking", Freie Universität Berlin, Germany / Universidad de Guadalajara, Mexico, August 2005.
- Jason J. Corso, "Techniques for Vision-Based Human-Computer Interaction", PhD dissertation, The Johns Hopkins University, Baltimore, Maryland, August 2005.

Page 42: Interactive Wall (Multi Touch Interactive Surface)

References

- Marcelo Bernardes Vieira, Luiz Velho, Asla Sá, and Paulo Cezar Carvalho, "A Camera-Projector System for Real-Time 3D Video", Proceedings of CVPR 2005, Instituto de Matemática Pura e Aplicada, Rio de Janeiro, Brazil, 2005.
- Ngon T. Truong, Jae-Gyun Gwag, Yong-Jin Park, and Suk-Ha Lee, "Genetic Diversity of Soybean Pod Shape Based on Elliptical Fourier Descriptors", Seoul National University, Korea / Can Tho University, Vietnam / National Institute of Agricultural Biotechnology, Korea, 2005.
- E. Rosten and T. Drummond, "Fusing Points and Lines for High Performance Tracking", ICCV, 2005.
- Attila Licsár and Tamás Szirányi, "Dynamic Training of Hand Gesture Recognition System", Proceedings of ICPR 2004, University of Veszprém / Hungarian Academy of Sciences, Hungary, 2004.

Page 43: Interactive Wall (Multi Touch Interactive Surface)

References

- Attila Licsár and Tamás Szirányi, Lecture Notes in Computer Science, University of Veszprém / Hungarian Academy of Sciences, Hungary, 2004.
- Stephen Wolf, "Color Correction Matrix for Digital Still and Video Imaging Systems", U.S. Department of Commerce, December 2003.
- Qing Chen, "Evaluation of OCR Algorithms for Images with Different Spatial Resolutions and Noises", School of Information Technology and Engineering, University of Ottawa, Canada, 2003.
- Vladimir Vezhnevets, Vassili Sazonov, and Alla Andreeva, "A Survey on Pixel-Based Skin Color Detection Techniques", Moscow State University, Russia, 2003.
- Yasushi Hamada, Nobutaka Shimada, and Yoshiaki Shirai, "Hand Shape Estimation Using Sequence of Multi-Ocular Images Based on Transition Network", Osaka University, Japan, 2002.
- Dengsheng Zhang and Guojun Lu, "A Comparative Study on Shape Retrieval Using Fourier Descriptors with Different Shape Signatures", Monash University, Australia, 2001.

Page 44: Interactive Wall (Multi Touch Interactive Surface)

References

- Douglas Chai and Abdesselam Bouzerdoum, "A Bayesian Approach to Skin Color Classification in YCbCr Color Space", School of Engineering and Mathematics, Edith Cowan University, Australia, 2000.
- Kenny Teng, Jeremy Ng, and Shirlene Lim, "Computer Vision Based Sign Language Recognition for Numbers".
- Nguyen Dang Binh, Enokida Shuichi, and Toshiaki Ejima, "Real-Time Hand Tracking and Gesture Recognition System", GVIP 05 Conference, Cairo, Egypt, December 2005; Intelligence Media Laboratory, Kyushu Institute of Technology, Japan.
- A. M. Hamad, Fawzia Shaaban, Mona Gabr, Noha Sayed, and Rabab Hussien, "Robot Vision", Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt, 2008.

Page 45: Interactive Wall (Multi Touch Interactive Surface)
