
ISMAR Symposium & Expo 2009 October 19-22 • Orlando, Florida, USA

ISMAR International Symposium on Mixed and Augmented Reality

REALITY DREAMS

MELTING THE BOUNDARIES

SPONSORED BY IEEE COMPUTER SOCIETY VISUALIZATION AND GRAPHICS TECHNICAL COMMITTEE


Mixed and Augmented Reality melts the boundaries between computer-generated 3D graphics and the physical reality of the user’s environment. ISMAR 2009, the International Symposium on Mixed and Augmented Reality, offers a sneak peek into the hottest paradigm shift in modern technology.

Copyright © MCL/UCF

2 ISMAR 2009


An IEEE event, ISMAR (International Symposium on Mixed and Augmented Reality) is the premier international meeting place for the Mixed and Augmented Reality research community. Each year, carefully peer-reviewed conference papers disclose the latest developments in the field to the scientific community for the first time. ISMAR is held in North America, Asia and Europe on a rotating basis. This year Orlando, Florida hosts the most comprehensive ISMAR conference to date. The program covers diverse application domains, technological achievements and scientific discoveries, and features creative expression and innovative design. In addition to gathering the best scientific minds conducting research on Mixed and Augmented Reality, ISMAR 2009 will attract attendees from around the world and foster cross-disciplinary dialog among scientists, those in the Arts, Media and Humanities, those applying the tools of Mixed and Augmented Reality in their industries, and students and others seeking an introduction to this exciting field.

All disciplines, from optical engineering and science, computer science and human factors to media production, product design and artistic expression, play a role in advancing the innovative techniques of Mixed and Augmented Reality. Major application areas include education, entertainment, medicine, the military, design, manufacturing, mobile computing, marketing and media production. An array of technologies and techniques, from Head-Worn Displays (HWDs) to embedded projection displays, is used to provide a multi-sensory and richly layered simulation. The “mixing” can be accomplished by either video see-through or optical see-through devices.

The special expanded program encompasses the breadth and depth of Mixed and Augmented Reality. It surrounds the highly respected “core” Science and Technology program, which includes research paper presentations, posters, and hands-on demonstrations. Other events include comprehensive tutorials, participatory workshops, a tracking contest, stimulating receptions, an awards banquet and other special ISMAR 2009 attendee experiences.

Welcome to ISMAR 2009!

Diamond Donors ______________________________________________________________________________


Mixed and Augmented Reality: “Scary and Wondrous”
Vernor Vinge, Science Fiction Novelist

Perhaps the most interesting question about the future of Mixed and Augmented Reality is the nature of the supporting infrastructure. The battle between big iron and minicomputers dates almost to the earliest days of computers. The debate has morphed again and again through the years. Its current incarnation is cloud computing versus microcontrollers distributed throughout the environment. Clouds have the mindshare right now, but I think the tension between centralized and distributed computing will continue long into the future, each approach complementing the other and each becoming more and more awesomely useful. A straightforward way to provide the latency and specificity that AR needs is to use a huge number of itsy-bitsy computers distributed throughout the physical environment, each one with sensors, self-location knowledge, and near-neighbor wireless access. Ubiquity has always been a catchword of the distributed processing enthusiasts, and each new generation has pushed the idea beyond the horizons of the previous generation. Imagine an environment where most physical objects know where they are, what they are, and can (in principle) network with any other object. With this infrastructure, reality becomes its own database. Multiple consensual virtual environments are possible, each oriented to the needs of its constituency. If we also have open standards, then bottom-up social networks and even bottom-up advertising become possible. Now imagine that in addition to sensors, many of these itsy-bitsy processors are equipped with effectors. Then the physical world becomes much more like a software construct. The possibilities are both scary and wondrous.

Bio Vernor Vinge holds a PhD (Math) from the University of California, San Diego. From 1972 to 2000 he taught in the Department of Math and Computer Sciences at San Diego State University. He has now retired from SDSU in order to write science fiction full-time. In 1982, at a panel for AAAI-82, he proposed that in the near future technology would accelerate the evolution of intelligence, leading to a kind of “singularity” beyond which merely human extrapolation was essentially impossible. In the 1980s and 1990s, he elaborated on this theme, both in his science fiction and nonfiction (http://www-rohan.sdsu.edu/faculty/vinge/misc/singularity.html). Vinge sold his first science-fiction story in 1964. His novella “True Names” (1981) is one of the earliest stories about cyberspace. His most recent novel, Rainbows End (2006), looks at the impact of wearable computing and smart environments on issues of entertainment, privacy, and terror. Vinge has won five Hugo Awards, including three for best novel.


ISMAR 2009

8th International Symposium on Mixed & Augmented Reality

Passport to ISMAR 2009

Mixed & Augmented Reality: “Scary and Wondrous” by Vernor Vinge, Science Fiction Novelist................................ 4

Message from the Symposium General Co-Chairs........................................................................................................................ 7

ISMAR Map Guide...................................................................................................................................................................................... 9

ISMAR 2009 Program at-a-Glance.......................................................................................................................... 10-13

Keynotes........................................................................................................................................................................................ 15, 26-27

Workshops........................................................................................................................................................................................... 18-19

Letter from the Science & Technology Chairs............................................................................................................................... 21

Science & Technology Program Overview & Schedule....................................................................................................... 22-25

Posters................................................................................................................................................................................................... 28-29

Letter from the Arts, Media & Humanities Chairs........................................................................................................................ 31

Arts, Media & Humanities Program Overview & Schedule................................................................................................ 34-37

Tutorials Program Overview & Schedule.................................................................................................................................. 38-42

Exposition Overview & Message from the Chairs................................................................................................................. 44-45

Demonstrations................................................................................................................................................................................. 46-52

Tracking Contest Message................................................................................................................................................................... 53

A Thank You to Our Sponsors.............................................................................................................................................................. 54

Mobile Themes Guide........................................................................................................................................... (See reverse cover)

Mobile Themes Icon (right) indicates ISMAR 2009 mobile-related sessions.



ISMAR 2009 Conference Program designed by Jennifer Reiser, Design Concepts & Realization


Great companies never stop moving forward. Every day your company is out there looking for answers to the age-old question, “What next?” A growing number of companies are discovering the power of advanced wireless to increase their ROI, lower operating expenses, increase customer satisfaction—and more. Qualcomm is a world leader in providing mobile communications, computing and network services that are transforming today’s workplace. As a pioneering technology innovator, we’re committed to continuous evolution, helping you achieve your business goals with solutions that are always evolving to meet your changing needs. From secure and reliable managed network services and back-end host applications to asset tracking and machine-to-machine communications solutions, Qualcomm has answers for you. We can help you work smarter, move faster and reach further. We see success in your future. In fact, we’re passionate about it. We make mobile technology that works for business.

We are wireless futurists.

To learn more, visit www.qualcomm.com

Sara is passionate about new solutions for the mobile networks of tomorrow.

Yukio is engineering new “smart services” for the wireless workplace of tomorrow.

Gail is innovating new user interfaces for the wireless devices of tomorrow.

Where will you be tomorrow?

We use Augmented Reality!

www.volkswagen.com
Learn more about Augmented Reality applications at Volkswagen in the exhibition, together with metaio.


Message from the Symposium General Co-Chairs, Jannick Rolland and Christopher Stapleton

Jannick P. Rolland
Brian J. Thompson Professor of Optical Engineering, The Institute of Optics and Department of Biomedical Engineering, University of Rochester; Courtesy Appointment with CREOL, The College of Optics and Photonics, University of Central Florida

Christopher Stapleton
President, Simiosys Real World Laboratory; Creative Director, Entech Creative; President, Rural Heritage Center, Geneva, FL

Mixed and Augmented Reality melts the boundaries between visceral physical reality and a dynamic virtual reality to enhance human performance and heighten experiences. It extends both the power of our imagination to create and the efficacy of communication and collaboration. Our symposium has expanded to represent and serve a large and diverse community. It is designed to address research challenges in science, business, the professional trades, the arts, media and the humanities.

ISMAR 2009 marks a transformational year: the beginning of an expanded program that includes the Arts, Media and Humanities. We have instituted the first Arts, Media, and Humanities program to interpret and assess the impact of this technology on the world in which we live. For those who are experiencing this field for the first time, we have created a comprehensive tutorial program on Mixed and Augmented Reality. Our industry workshops represent diverse professional sectors that are now applying Mixed and Augmented Reality in a rich collection of experiential environments. Within the last year, the chasm between the early adopters and the mainstream has been bridged by mobile devices; we are seeing how the magic of Mixed and Augmented Reality is enhancing our ability to experience content and communicate information.

Mixed and Augmented Reality is no longer defined by its technology alone; it has become the leading edge of an emerging generation. The innovations in technology are creating exciting opportunities that challenge the way we live our lives. ISMAR 2009 provides a forum for us to assimilate, reflect, and accelerate as we see how mixing realities is changing, and will continue to change, how we live, work, play, and learn. As an international, interdisciplinary symposium, we are not only introducing technology solutions to modalities of communication, breaking down the boundaries between reality and our imagination; we are also in the process of melting the boundaries between nations, cultures, disciplines, sectors, and markets. With this transformational year, ISMAR can emerge as a platform, established during this one-week symposium, that represents and serves a year-round, growing global community and professional network.


Startup Donors __________________________________________________________

For over 45 years, ORA's Engineering Services team has been providing imaginative, cost-effective solutions across the entire spectrum of optical design.

Our engineers share a single vision and a common goal: to meet the challenges of virtually any optical design problem with world-class, cost-effective solutions, and to follow these solutions through to hardware implementation. Our comprehensive approach includes collaborating with you to achieve your quality, performance, schedule, and budget goals.

We provide you with optical system designs that work – the first time. Our design expertise, coupled with an understanding of fabrication processes and careful tolerancing, ensures that our solutions will perform as predicted.

OPTICAL ENGINEERING SERVICES

© 2009 Optical Research Associates.

Corporate Headquarters: 3280 East Foothill Boulevard, Suite 300, Pasadena, CA 91107-3103
Main (626) 795-9101 | Fax (626) 795-9550 | E-mail: [email protected]
Offices: Tucson, AZ | Westborough, MA

Optical Engineering Services That Deliver

www.oraengineering.com


ISMAR 2009 Map Guide

1. Ballroom A
2. Ballroom B
3. Ballroom C
4. Foyer
5. Sabel
6. Garland
7. Executive
8. Amelia
9. Orange
10. Magnolia A
11. Magnolia B
12. Zinfandel’s
13. UCF Center for Emerging Media: Film Production Area
14. UCF Center for Emerging Media: Bridge

[Map: Upper Level and Lower Level floor plans]

Sci&Tech (2, 3)
AMH (10, 11)
Tutorials (M: 14, T-Th: 8)
Tracking Contest (13)
Workshops: Falling in Love with Learning (3), Transforming Lives (12), Manufacturing the Future (8), AR 2.0 (11), Let’s Go Out (2), The Mobile Magic Wand (10)
Posters (4)
Exposition (1, 5, 6)

Press Events (12)
Press Office (7)
Speaker Prep (9)
Media Capture (9)
Interviews (9)

ISMAR 2009 Program at-a-Glance

Monday, October 19

Tuesday, October 20

Wednesday, October 21

Thursday, October 22

[Daily at-a-glance schedule grids appear on pages 10-13]

Platinum Donors __________________________________________________

Gold Donors __________________________________________________

Silver Donors __________________________________________________

Bronze Donors __________________________________________________


Keynote Speaker
Mixing Reality and Magic at Disney Theme Parks
Mark Mine, Director, Technical Concept Design, Walt Disney Imagineering

Abstract Ever since Walt Disney first opened the doors to Disneyland in 1955, Imagineers have been using (and misusing) state-of-the-art technology to immerse our guests in magical worlds. Combined with richly detailed environments, imaginative characters, and compelling stories, these tools have enabled our guests to dance with ghosts, sail with pirates, and fly to the furthest reaches of both inner and outer space. Today, advances in computing power, display technology, and sensing devices, along with ever-accelerating trends of miniaturization and cost reduction, are enabling exciting new ways to create our magical Disney worlds. Our environments are richer, our characters more interactive, and our storytelling more fluid, customizable, and reactive.

In this talk, I will describe the new techniques we are developing to light, animate, and augment our environments, bringing the world of our animated features to life in ways never before possible. I will relate how we are using advanced sensing technology and better awareness of our guests to create smart reactive environments and new forms of entertainment. I will present advances in our characters that make them more responsive, aware, and engaging than ever before. I will show how we are working to break the confines of the conventional computer display in order to better surround and immerse our guests. I will discuss how all of these efforts are bound together by the common goal of bridging the everyday world of reality and the world of magic and imagination.


Bio

Currently in his 12th year with Walt Disney Imagineering, Mark Mine is the Director of the Creative Technology Group. The fundamental mission of the Creative Technology Group is to help Imagineering’s creative and engineering teams build better theme park rides and attractions through new ways to design, evaluate, and present innovative concepts and ideas. This includes the development and integration of real-time and pre-rendered computer graphics technologies and techniques into the blue-sky design process. Mine began his Disney career in 1997 in the Virtual Reality Studio, as a programmer/designer for interactive attractions in the DisneyQuest virtual theme park project. Since then, he has worked on attractions such as Mission: SPACE, Finding Nemo Submarine Voyage and Toy Story Mania! Prior to Disney, Mine worked as an engineer for the Jet Propulsion Laboratory on projects such as the Voyager Spacecraft. Mine has a bachelor’s degree in Aerospace Engineering from the University of Michigan, a master’s degree in Electrical Engineering from the University of Southern California, and a master’s degree and Ph.D. in Computer Science from the University of North Carolina.



Get Better Results with Our Custom Marketing Solutions. DCR will develop creative marketing and public relations concepts along with integrated marketing strategies for your product, company, or startup.

Call today for a free consultation.

www.designconceptsrealization.com • 407.408.4071


Jennifer Reiser, Founder, DCR

Partnering Affiliations ________________________________________________________


Media Partners ________________________________________________________

metaio GmbH

As one of the leading providers of professional, cutting-edge Augmented Reality software, metaio has produced the world’s first AR browser plug-in, the first adoption of AR in live marketing, and the first fully integrated mobile AR application.

metaio designs, develops and markets solutions based on Augmented Reality. This innovative technology allows virtual 3D information to be superimposed seamlessly onto the real environment in real time. metaio was founded in 2003 and is fully owned by CEO Dr. Thomas Alt and CTO Peter Meier.

Infanteriestrasse 19, House 4b
80797 Munich, Germany
Phone: +49 (0)89-5480-1980
Web Site: www.metaio.com

Electrosonic is a worldwide audio-visual company with extensive experience in designing, project managing, engineering and supporting AV systems and products. Electrosonic brings a unique breadth of experience to each project, backed by solid engineering skills, project management and quality production.


The ISMAR 2009 conference and exhibition will be preceded by a full day of optional workshops. Workshops give the community of people working on and studying Mixed and Augmented Reality an open and flexible environment for sharing information and knowledge that has not been through the same rigorous peer-review process as the papers presented in the main conference, but that has been deemed of high quality and extremely valuable to ISMAR 2009 attendees, both researchers and practitioners.

Mobile Augmented Reality Workshops:

Let’s Go Out: Research in Outdoor Mixed & Augmented Reality
Chairs: Christian Sandor (University of South Australia), Itaru Kitahara (University of Tsukuba), Gerhard Reitmayr (Graz University of Technology), Steven Feiner (Columbia University), Yuichi Ohta (University of Tsukuba)

The Mobile Magic Wand: Augmented Reality on Mobile Devices
Chairs: Christine Perey (PEREY Research & Consulting), Ori Inbar (Ogmento), Robert Rice (Neogence), Markus Tripp (Mobilizy), Chetan Damani (AcrossAir), David J. Murphy (Nokia), Peter Meier (Metaio)

Workshops
Monday, October 19, 2009
9:00-10:30am, 10:45am-12:15pm, 1:45-3:15pm, 3:30-5:00pm


Professional Applications:

Falling in Love with Learning: Experiential Learning with Mixed & Augmented Reality
Chair: Eileen Smith (Media Convergence Laboratory, UCF)
This workshop is designed to meet the needs of people who are currently designing memorable and lasting experiences for visitors and students through AR technology, including professionals in the areas of cultural heritage preservation, education and in-situ learning, entertainment and games for learning, and museum curation and design. The leaders of this workshop will discuss how they are currently using MR/AR for education and entertainment and the challenges they face or most wish to tackle in the future.

Transforming Lives: High-Performance, High-Risk Training with Mixed Reality
Chairs: Mary Triers (Capital Communications & Consulting, MT3 Principal), Judith Riess (MT3 Principal)
This workshop is designed for professionals who are in the medical and military training fields, or who are planning a career in these domains, designing services and applications for the betterment of the human condition through technology. The range of conditions in which MR/AR for transforming lives can be imagined spans the spectrum from portable, outdoor training of maneuvers to highly controlled, instrumented indoor spaces for high-precision experiences. The leaders of this workshop will share, and the participants will discuss, how they are currently using MR/AR to change the lives of civilians and people in the armed forces worldwide. With the aid of workshop participants, the group will formulate a roadmap for how MR/AR can become part of the practitioner’s toolkit for transforming lives.

Manufacturing the Future: Use of Mixed and Augmented Reality in Design and Manufacturing
Chair: Noora Guldemond (metaio, Inc.)
This workshop is designed for people who are active in industrial and spatial design and related fields, whether in a factory, design studio, commercial property or marketing firm. Engineers, architects, industrial design agencies, retailers, marketing agencies, software developers, manufacturers and investors are invited to engage in an active and enlightening series of sessions to share experiences in these fields. Participants will discuss the best scenarios and approaches for addressing common obstacles using MR/AR.

Social Networking and Augmented Reality:

AR 2.0: Social Augmented Reality
Chairs: Tobias Höllerer (University of California at Santa Barbara, USA), Dieter Schmalstieg (Technische Universität Graz, Austria), Mark Billinghurst (University of Canterbury/HIT Lab New Zealand)
This workshop is designed for researchers, practitioners, and interested observers of topics at the intersection of Augmented Reality and Social Networking. Massively user-generated content has changed the way the World Wide Web is used and established its role as a ubiquitous, up-to-date information resource, media provider, and communication medium. Augmented Reality has the strong potential to extend these capabilities into the physical world. What will the ultimate interface to social information in the physical world look like? Help us explore topics concerning user-generated content and Augmented Reality, and other areas at the intersection of social networking and AR. Join us for a discussion of the next wave of content creation and information experience in the physical world.


Science & Technology

Photo: JACQUEPHOTO.COM


Gudrun Klinker, Technische Universität München, Germany
Hideo Saito, Faculty of Science and Technology, Keio University, Japan
Tobias Höllerer, University of California, Santa Barbara, USA

While ISMAR is expanding and reaching out to new communities, we proudly continue to present to you the best research papers in the field of Mixed and Augmented Reality. The technical program of ISMAR 2009, in the tradition of the proceedings of seven previous ISMAR, two ISAR, two ISMR, and two IWAR meetings, this year takes the form of a Science and Technology (S&T) track. This program comprises 24 contributed papers and 28 posters, as well as an array of keynote talks, demonstrations, tutorials, workshops and the tracking competition. All of these elements of the program are the result of dedicated hard work by members of the conference committee and additional volunteers, and we would like to recognize their efforts.

This year, the conference received 130 submissions from 23 countries. The accepted papers and posters came from 16 countries, demonstrating the global nature of active research in Mixed and Augmented Reality. The accepted papers reflect the diverse nature of the field. You will find sessions devoted to tracking on standard and mobile computing platforms, real-world modeling, rendering, user interfaces, human factors, and applications. We are confident that these sessions will provide inspiration for your own work, as will the posters, demonstrations, tutorials, and workshops.

ISMAR 2009 continues its recognition of outstanding papers with the Best Paper, Best Student Paper, and Honorable Mention awards. These awards are determined by the Award Committee, composed of pioneering and leading MR and AR researchers from around the globe. We gratefully acknowledge the large amount of time and effort that the Area Chairs and Award Committee invested in this process. We also thank the international reviewers for all their work. It is thanks to the volunteer efforts of all these people that ISMAR stands as the world’s premier symposium for Mixed and Augmented Reality research.

Letter from the Science & Technology Chairs


The core ISMAR program continues its excellence in representing the latest capabilities developed in Science and Technology research. Returning research scientists and engineers will not be disappointed in the breadth and depth of research expanding the capabilities of Mixed & Augmented Reality in areas of computer science, human factors, real-time graphic rendering, optics, ubiquitous computing, computer vision, tracking, imaging and much more.

Societal Advances in Science & Technology
Mixed Reality (MR) and Augmented Reality (AR) allow the creation of fascinating new types of user interfaces and are beginning to show significant impact on industry and society.

Multiple Applications for MR/AR
The field is highly interdisciplinary, bringing together signal processing, computer vision, computer graphics, user interfaces, human factors, wearable computing, mobile computing, computer networks, displays, and sensors, to name just some of the most important influences. MR/AR concepts are applicable to a wide range of applications.


Monday, October 19
06:00 – 07:00 pm Industry Mixer
07:00 – 09:00 pm Keynote: Mark Mine, Technical Concept Design, Walt Disney Imagineering: Mixing Reality and Magic at Disney Theme Parks
Reception sponsored by Intel, Volkswagen and Partnering/Media Affiliates; Demonstrations

Tuesday, October 20 08:00 – 08:45 am Continental Breakfast (sponsored by ISMAR Startups)

08:45 – 09:00 am Introduction to ISMAR 2009, General Co-Chair Jannick Rolland

09:00 – 09:10 am S&T Track Opening Remarks (Gudrun Klinker, Tobias Höllerer, Hideo Saito)

09:10 – 10:30 am Paper Session: User Interfaces (Session Chair: Takeshi Kurata)

09:10 – 09:40 Using Augmented Reality to Facilitate Cross-Organisational Collaboration in Dynamic Tasks. S. Nilsson (University of Linköping), B.J.E. Johansson (Swedish Defence Research Institute), A. Jönsson (University of Linköping)
09:40 – 10:10 Interference Avoidance in Multi-User Hand-Held Augmented Reality. O. Oda, S. Feiner (Columbia University)
10:10 – 10:30 Continuous Natural User Interface: Reducing the Gap Between Real and Digital World. N. Petersen, D. Stricker (DFKI)
10:30 – 11:00 am Morning Coffee Break (sponsored by Optical Research Associates)

11:00 – 11:50 am Paper Session: Rendering (Session Chair: Tobias Höllerer)

11:00 – 11:30 Animatronic Shader Lamps Avatars. P. Lincoln, G. Welch, A. Nashel, A. Ilie, A. State, H. Fuchs (The University of North Carolina at Chapel Hill)
11:30 – 11:50 Augmenting Aerial Earth Maps with Dynamic Information From Videos. K. Kim, S. Oh, J. Lee, I. Essa (Georgia Institute of Technology)

11:50 – 01:30 pm Lunch Break

01:00 – 01:30 pm Tracking Contest (sponsored by Volkswagen): TUM Team
01:30 – 03:20 pm Paper Session: Tracking on Mobile Devices (Session Chair: Greg Welch)

01:30 – 02:00 Multiple Target Detection and Tracking with Guaranteed Frame Rates on Mobile Phones. D. Wagner, D. Schmalstieg, H. Bischof (Graz University of Technology)
02:00 – 02:30 Shape Recognition and Pose Estimation for Mobile Augmented Reality. N. Hagbi, O. Bergig, J. El-Sana (Visual Media Lab, Ben-Gurion University), M. Billinghurst (The HIT Lab NZ)
02:30 – 03:00 Towards Wide Area Localization on Mobile Phones. C. Arth, D. Wagner, M. Klopschitz, A. Irschara, D. Schmalstieg (Graz University of Technology)
03:00 – 03:20 Parallel Tracking and Mapping on a Camera Phone. G. Klein, D. Murray (University of Oxford)

03:20 – 03:45 pm Afternoon Break (sponsored by ART Advanced Real Time Tracking)

03:45 – 04:50 pm One-minute madness: Poster and Demo Teasers

Science & Technology Schedule



Wednesday, October 21
08:00 – 09:00 am Continental Breakfast (sponsored by ISMAR Partnering/Media Affiliations)

09:00 – 09:10 am S&T Chairs Announcements and Preview of Wednesday’s Program

09:10 – 10:20 am Paper Session: Paper AR (Session Chair: Vincent Lepetit)

09:10 – 09:40 In-Place 3D Sketching for Authoring and Augmenting Mechanical Systems. O. Bergig, N. Hagbi, J. El-Sana (The Visual Media Lab, Ben-Gurion University), M. Billinghurst (The HIT Lab NZ)
09:40 – 10:00 Augmenting Text Document by On-Line Learning of Local Arrangement of Keypoints. H. Uchiyama, H. Saito (Keio University)
10:00 – 10:20 Augmented Touch without Visual Obtrusion. F. Cosco (Università della Calabria), C. Garre (URJC Madrid), F. Bruno, M. Muzzupappa (Università della Calabria), M. Otaduy (URJC Madrid)
10:20 – 10:50 am Morning Coffee Break (sponsored by VUZIX)

10:50 am – 12:00 pm Paper Session: Human Factors (Session Chair: Gerry Kim)

10:50 – 11:20 Interaction and Presentation Techniques for Shake Menus in Tangible Augmented Reality. S. White, D. Feng, S. Feiner (Columbia University).
11:20 – 11:40 Influence of Visual and Haptic Delays on Stiffness Perception in Augmented Reality. B. Knoerlein (ETH Zurich), M. Di Luca (Max Planck Institute Tübingen), M. Harders (ETH Zürich).
11:40 – 12:00 A User Study towards Understanding Stereo Perception in Head-worn Augmented Reality Displays. M. Livingston, Z. Ai, J. Decker (Naval Research Laboratory).
12:00 – 01:30 pm Lunch Break

12:30 – 01:30 Tracking Contest (sponsored by Volkswagen): metaio and Fraunhofer IGD Teams
01:30 – 02:30 pm Paper Session: Modeling (Session Chair: Hideo Saito)

01:30 – 01:50 Online Environment Model Estimation for Augmented Reality. J. Ventura, T. Höllerer (University of California, Santa Barbara).
01:50 – 02:10 In Situ Image-based Modeling. A. v.d. Hengel, R. Hill, B. Ward, A. Dick, J. Bastian (University of Adelaide).
02:10 – 02:30 Dynamic Seethroughs: Synthesizing Hidden Views of Moving Objects. P. Barnum, Y. Sheikh, A. Datta, T. Kanade (Carnegie Mellon University).
02:30 – 03:30 pm Posters and Demos
03:30 – 04:00 pm Afternoon Break (sponsored by UCF)
Symposium Gala sponsored by Qualcomm
04:00 – 05:00 pm Keynote: Pattie Maes, MIT Media Lab, SixthSense: Integrating Information in the Real World
05:00 – 06:30 pm Poster Party
06:30 – 09:00 pm Awards Banquet / Panel: ISMAR Past, Present, and Future



Science & Technology Schedule
Thursday, October 22
08:00 – 09:00 am Continental Breakfast (sponsored by Optical Research Associates and Volkswagen)

09:00 – 09:10 am S&T Chairs Announcements and Preview of Thursday’s Program

09:10 – 09:50 am Special Presentation (Invited): Proposal of International Voluntary Activities on Establishing Benchmark Test Schemes for AR/MR Geometric Registration and Tracking Methods. H. Tamura (Ritsumeikan University), H. Kato (Nara Institute of Science and Technology)
09:50 – 12:00 pm Posters
10:30 – 11:00 am Morning Coffee Break (sponsored by Nokia)

12:00 – 02:00 pm Lunch Break
12:30 – 01:30 pm Tracking Contest (sponsored by Volkswagen): Mark Fiala Team
02:00 – 03:30 pm Paper Session: Applications (Session Chair: Gerhard Reitmayr)
02:00 – 02:30 Pick-by-Vision: A First Stress Test. B. Schwerdtfeger, R. Reif, W.A. Günthner, G. Klinker (Technische Universität München), D. Hamacher, L. Schega, I. Böckelmann (Otto-von-Guericke Universität Magdeburg), F. Doil, J. Tümler (Volkswagen AG).
02:30 – 03:00 Real-Time In-Situ Visual Feedback of Task Performance in Mixed Environments for Learning Joint Psychomotor-Cognitive Tasks. A. Kotranza (University of Florida), D.S. Lind (Medical College of Georgia), C.M. Pugh (Northwestern University), B. Lok (University of Florida).
03:00 – 03:30 Evaluating the Benefits of Augmented Reality for Task Localization in Maintenance of an Armored Personnel Carrier Turret. S. Henderson, S. Feiner (Columbia University).

03:30 – 03:45 pm Short Break

03:45 – 05:05 pm Paper Session: Tracking (Session Chair: Yoshinari Kameda)

03:45 – 04:15 A Dataset and Evaluation Methodology for Template-based Tracking Algorithms. S. Lieberknecht, S. Benhimane, P. Meier (metaio GmbH), N. Navab (Technische Universität München).
04:15 – 04:45 Global Pose Estimation using Multi-Sensor Fusion for Outdoor Augmented Reality. G. Schall, D. Wagner (Graz University of Technology), G. Reitmayr (Cambridge University), E. Taichmann, M. Wieser, D. Schmalstieg, B. Hofmann-Wellenhof (Graz University of Technology).
04:45 – 05:05 ESM-Blur: Handling and Rendering Blur in Object Tracking and Augmentation. Y. Park (GIST, U-VR Lab), V. Lepetit (EPFL), W. Woo (GIST, U-VR Lab).

05:05 – 5:35 pm Continuing the Conversation. Building Community: ISMAR–Past, Present, Future

05:35 – 6:00 pm Closing Remarks and Announcement of Tracking Contest Winner


Bio

Playwright, conceptual director and performer Natasha Tsakos works in a brave new form of theatre, where sound, computer-generated images and the performer move meticulously in sync to create a dreamlike yet sharply real stage environment. Within this realm of total possibility, Tsakos muses on the deepest questions of the human soul.

Swiss-born and living in Miami, Tsakos has been a performer with Circ X, Camposition, Big Apple Circus and Cirque du Soleil. She writes, teaches and performs her own work in Miami and worldwide.

Among her recent works is the performance piece UP WAKE, in which she plays ZERO, an everyman worker stuck between Dream and Reality.

Keynote Speaker UPWAKE: Art Performance Fusing Dreams & Technology Natasha Tsakos

Abstract UpWake, Verb:

Reverse awareness, synthesis of being simultaneously dreaming and conscious (a) as she began to up wake, colors smelled differently and sounds looked brighter.

Why…

Synchronizing the multidimensional disciplines of animation, music and performance, Up Wake seeks the constantly changing and developing formulas of integration between technology and performance. Up Wake wishes to transport its audience on a heightened sensory journey, stimulating the imagination, turning on the subconscious's switch and creating a world of exhilaration and visual poetry.


Keynote Speaker
SixthSense: Integrating Information and the Real World
Pattie Maes, MIT Media Lab

Abstract
Pattie Maes will discuss her project, 'SixthSense', a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information.

Although the miniaturization of computing devices allows us to carry computers in our pockets, keeping us continually connected to the digital world, there is no link between our digital devices and our interactions with the physical world. Information is traditionally confined to paper or to a digital screen.

SixthSense bridges this gap, bringing intangible, digital information out into the tangible world, and allowing us to interact with this information via natural hand gestures. ‘SixthSense’ frees information from its confines by seamlessly integrating it with reality, and thus making the entire world your computer.

Bio

Pattie Maes is an associate professor and associate head of MIT's Program in Media Arts and Sciences. She founded and directs the Media Lab's Fluid Interfaces research group <http://fluid.media.mit.edu/>, which develops technologies for the seamless integration of the digital and physical worlds.

Previously, she founded and ran the Software Agents group. Prior to joining the Media Lab, Maes was a visiting professor and a research scientist at the MIT Artificial Intelligence Lab. She holds bachelor's and PhD degrees in computer science from the Vrije Universiteit Brussel in Belgium. Her areas of expertise are human-computer interaction and intelligent user interfaces. Maes is the editor of three books, and is an editorial board member and reviewer for numerous professional journals and conferences. She has received several awards: Newsweek magazine named her one of the "100 Americans to watch for" in the year 2000; TIME Digital selected her as a member of the Cyber-Elite, the top 50 technological pioneers of the high-tech world; the World Economic Forum honored her with the title "Global Leader for Tomorrow"; Ars Electronica awarded her the 1995 World Wide Web category prize; and in 2000 she was recognized with the "Lifetime Achievement Award" by the Massachusetts Interactive Media Council.


Keynote Speakers


1. Integration of Georegistered Information on a Virtual Globe. Zhuming Ai, Mark A. Livingston
2. Forked! A Demonstration of Physics Realism in Augmented Reality. David Beaney, Brian Mac Namee
3. Contextual In-Situ Visualization for Port Placement in Keyhole Surgery: Evaluation of Three Target Applications by Two Surgeons and Eighteen Medical Trainees. Christoph Bichlmeier, Stuart Holdstock, Sandro Michael Heining, Simon Weidert, Ekkehard Euler, Oliver Kutter, Nassir Navab
4. Using Optical Flow as Lightweight SLAM Alternative. Gabriele Bleser, Gustaf Hendeby
5. Advanced Training Methods using an Augmented Reality Ultrasound Simulator. Tobias Blum, Sandro Michael Heining, Oliver Kutter, Nassir Navab
6. Object Recognition and Localization while Tracking and Mapping. Robert O. Castle, David W. Murray
7. Streaming Mobile Augmented Reality on Mobile Phones. David M. Chen, Sam S. Tsai, Ramakrishna Vedantham, Radek Grzeszczuk, Bernd Girod
8. MoleARlert - An Augmented Reality Game Based On Lemmings. Sandy Engelhardt, Annabell Langs, Gerrit Lochmann, Irini Schmidt, Stefan Müller
9. A Setup for Evaluating Detectors and Descriptors for Visual Tracking. Steffen Gauglitz, Tobias Höllerer, Petra Krahwinkler, Jürgen Roßmann
10. Photo-based Industrial Augmented Reality Application Using a Single Keyframe Registration Procedure. Pierre Georgel, Selim Benhimane, Jürgen Sotke, Nassir Navab
11. Evaluating the Trackability of Natural Feature-Point Sets. Lukas Gruber, Stefanie Zollmann, Daniel Wagner, Dieter Schmalstieg
12. Robust Pose Estimation in Untextured Environments for Augmented Reality Applications. Wei Guan, Lu Wang, Jonathan Mooser, Suya You, Ulrich Neumann
13. In-situ Refinement Techniques for Outdoor Geo-Referenced Models Using Mobile AR. Thuong N. Hoang, Bruce H. Thomas
14. Temporal Calibration in Multisensor Tracking Setups. Manuel Huber, Michael Schlegel, Gudrun Klinker
15. Integrating Conversational Virtual Humans and Mannequin Patient Simulators to Present Mixed Reality Clinical Training Experiences. Yongho Hwang, Samsun Lampotang, Nikolaus Gravenstein, Isaac Luria, Benjamin Lok
16. Mobile Augmented Reality based 3D Snapshots. Peter Keitler, Frieder Pankratz, Björn Schwerdtfeger, Daniel Pustka, Wolf Rödiger, Gudrun Klinker, Christian Rauch, Anup Chathoth, John Collomosse, Yi-Zhe Song
17. Effects of Sizes and Shapes of Props in Tangible Augmented Reality. Eun Kwon, Gerard J. Kim, Sangyoon Lee
18. A Replication Study Testing the Validity of AR Simulation in VR for Controlled Experiments. Cha Lee, Scott Bonebrake, Tobias Höllerer, Doug A. Bowman
19. Physical-Virtual Tools for Spatial Augmented Reality User Interfaces. Michael R. Marner, Bruce H. Thomas, Christian Sandor

Science & Technology Posters


20. Head-Mounted Virtual Loupe with Sight-based Activation for Surgical Applications. Anabel Martin-Gonzalez, Sandro-Michael Heining, Nassir Navab
21. Interactive Model Reconstruction with User Guidance. Qi Pan, Gerhard Reitmayr, Tom W. Drummond
22. Egocentric Space-Distorting Visualizations for Rapid Environment Exploration in Mobile Mixed Reality. Christian Sandor, Andrew Cunningham, Ulrich Eck, Donald Urquhart, Graeme Jarvis, Arindam Dey, Sebastien Barbier, Michael R. Marner, Sang Rhee
23. Multitouch Interaction for Tangible User Interfaces. Hartmut Seichter, Raphaël Grasset, Julian Looser, Mark Billinghurst
24. Immersive Image-Based Modeling of Polyhedral Scenes. Gilles Simon
25. Real-time Representation of Inter-reflection for Cubic Marker. Yuki Uranishi, Akimichi Ihara, Hiroshi Sasaki, Yoshitsugu Manabe, Kunihiro Chihara
26. A Solution for Navigating User-Generated Content. Severi Uusitalo, Peter Eskolin, Petros Belimpasakis
27. Vision based People Tracking for Ubiquitous Augmented Reality Applications. Christian A.L. Waechter, Daniel Pustka, Gudrun J. Klinker
28. Consistent Real-time Lighting for Virtual Objects in Augmented Reality. Ryan Christopher Yeoh, Steven ZhiYing Zhou

29. Loosely-coupled Mixed Reality: Using the Environment Metaphorically. Myungho Lee, Gerard J. Kim
30. An Intuitional Interface for Invocation of Chinese Painting. Henry Been-Lirn Duh, Chien-Hsu Chen, Christ Chun-Chin Su, Raymond Koon Chuan Koh
31. EYEPLY: Baseball Proof of Concept - Mobile Augmentation for Entertainment & Shopping Venues. Austin Hurwitz, Alistair Jeffs

Arts, Media & Humanities Posters


Arts, Media & Humanities

Copyright© MCL/UCF
Copyright© Simiosys


Raphaël Grasset
HIT Lab NZ, University of Canterbury, New Zealand

Over the last year, Mixed and Augmented Reality has reached a growing audience, going far beyond the interests of the technical scholarly community. The emergence of more accessible software tools and simple hardware solutions has made it easy for artists, designers, media companies and others to develop augmented reality applications for the masses.

This transformation has been noticeable not only in the range of novel applications produced, but also in the innovative way the art and design community operates. More engaged, experimental, and concerned with the social aspects and aesthetic value of Mixed and Augmented Reality, these new developers offer a new way to think about, design and create AR applications.

We are proud that ISMAR 2009 acknowledges this evolving community by creating the new Arts, Media and Humanities program. It complements the existing Science and Technology program and celebrates the achievements and new perspectives of a unique MR/AR community. The Arts perspective explores the use of MR/AR technology as a form of personal expression. The Media perspective explores MR/AR as a tool for engaging communication through new forms of story, games and play. The Humanities perspective examines how MR/AR offers new conventions for interpreting the human experience.

The Arts, Media and Humanities program is also innovative in its format, adopting the participative and investigational spirit of this novel AR community. High-quality paper presentations and outstanding keynote speakers are combined with a wide range of discussion panels and creative, participative events and sessions.

We wish to thank all those who have contributed to this inaugural ISMAR program, the authors, the program committee, reviewers and the conference organizers. We welcome you to ISMAR 2009, and the inaugural Arts, Media and Humanities program.

Message from the General Program Chair

Arts, Media & Humanities


Jay Bolter
Georgia Institute of Technology, USA

Carl DiSalvo
Georgia Institute of Technology, USA

As Mixed and Augmented Reality takes on a greater position of prominence in the fields of computational media design and in popular culture, the arts and humanities offer opportunities to contextualize this work and to understand AR/MR systems and experiences as distinctive cultural artifacts.

There is a long history of the combination of the arts with science and engineering. What the arts bring to science and engineering in general, and AR/MR specifically, is a mode of aesthetic and critical experimentation. One of the most pressing questions for AR/MR today is: How does AR/MR mediate the world for us, and what do we want this experience of mediation to be? Art can provide novel answers to this question by taking the possibilities of AR/MR as starting points for imaginative trajectories beyond the immediate concerns of usability and efficiency. The humanities, meanwhile, can add their own perspective, particularly by looking at the history of media and examining the place AR/MR is fashioning for itself in that history.

With these ambitious goals in mind, we are pleased to present these inaugural Arts & Humanities papers in the Arts, Media and Humanities program. We believe they represent a diverse set of beginnings for discussion and collaboration between artists, humanists, and technologists in shaping the future of Mixed and Augmented Reality.

Arts & Humanities
Message from the Program Chairs


Jarrell Pair
LP33.tv, USA

In 1998, when the first International Workshop on Augmented Reality was held, AR was a field that could be pursued only within the research lab. AR development required highly specialized computer science and computer vision knowledge, along with access to expensive graphics workstations. These passionate researchers forged ahead despite frustrating limits in processing speed, graphics rendering, and network connectivity. Now, in 2009, millions own internet-connected 3G smart phones and notebook computers that far exceed the capabilities of even the most expensive computing platforms available to AR's pioneers. AR is quickly becoming a tool for mainstream media promotion, as seen in a number of recent advertising campaigns for movies and consumer products. Furthermore, AR applications are emerging that leverage the internet's vast social networks, geographic databases, and crowd-sourcing capabilities. These applications will bring forth a true WWW in which internet-based information, services, and media are seamlessly integrated into our view of the real world.

For those of us who have been a part of the AR community since its infancy and those who are just entering the field, these are very exciting times. At the same time, we are challenged to make sure that augmented reality does not become a victim of its own hype. We must work to manage the expectations of the public and the market. Otherwise, frustrated consumers, investors, and analysts could prematurely dismiss AR as a technological gimmick that did not fulfill its promise. In many ways, we are in a similar position to cinema's pioneers in the early 20th century. Like the Lumière brothers, Cecil B. DeMille, Orson Welles, Louis B. Mayer and too many others to name, we have an unprecedented opportunity to create a new medium. I would like to welcome you to this inaugural program and invite you to participate in defining mixed and augmented reality as an innovative and inspiring domain for human creativity, communication, and commerce.

Media

Message from Media Program Chair


This year, the International Symposium on Mixed and Augmented Reality (ISMAR) launches its inaugural program featuring the latest developments in the Arts, Media and Humanities (AMH) research and applications. Artists, designers, media producers and futurists will present new frontiers in the power of Mixed and Augmented Reality to express, convey, impact and improve human experience and interpretation in the areas of education, training, entertainment, communications, design and media production.

Arts, Media & Humanities


Arts, Media & Humanities Program Schedule

Monday, October 19
Industry Mixer
06:00 – 07:00 pm Keynote: Mark Mine, Technical Concept Design, Walt Disney Imagineering, Mixing Reality and Magic at Disney Theme Parks
07:00 – 09:00 pm Reception (sponsored by Intel, Volkswagen and Partnering/Media Affiliates); Demonstrations

Tuesday, October 20 08:00 – 08:45 am Continental Breakfast (sponsored by ISMAR Startups)

08:45 – 09:00 am Introduction to ISMAR 2009, General Chair Christopher Stapleton

09:00 – 09:10 am AMH Track Opening Remarks (Raphael Grasset, Jay Bolter, Carl Disalvo, Jarrell Pair)

09:10 – 10:30 am Session 1: Mixed and Augmented Reality in 2009 (Session Chair: Jarrell Pair)

09:10 - 09:40 Invited Presentation: Augmented Reality Today, Ori Inbar (Ogmento)

09:40 - 10:10 Reconstruction of Yuanmingyuan, Yetao Huang (Beijing Institute of Technology)

10:10 - 10:30 Coupling Digital and Physical Worlds in an AR Magic Show Performance: Lessons Learned, Anna Carreras (Universitat Pompeu Fabra)
10:30 – 11:00 am Morning Coffee Break (sponsored by Optical Research Associates)

11:00 – 12:30 pm Session 2: Augmented Reality in Sports, Entertainment, and Advertising (Session Chair: Jarrell Pair)
Panel: Augmented Reality in Sports, Entertainment, & Advertising. Greg Davis (Total Immersion), Austin Hurwitz (Eyeply), Brian Selzer (Ogmento), Tish Shute (ugotrade.com)
12:30 – 01:30 pm Lunch Break

01:30 – 03:00 pm Session 3: Tools for Mixed and Augmented Reality Development (Session Chair: Jarrell Pair)
MR/AR Development Tools and Frameworks Panel Presentation and Discussion

03:00 – 03:30 pm Afternoon Break (sponsored by ART Advanced Real Time Tracking)

03:30 – 05:00 pm Session 4: Designing for the MR/AR Experience (Session Chair: Jarrell Pair)

Invited Presentation: Games in AR: Types and Technologies, Andrea Phillips (deusexmachinatio.com)
Invited Presentation: Augmented Reality Roadmap: The Six Elements of the AR Universe, Ori Inbar (Ogmento), Robert Rice (Neogence)
Paper Presentation: Process and (Mixed) Reality: A Process Philosophy for Interaction in Mixed Reality Environments, Timothy Barker (University of New South Wales)
05:00 – 06:00 pm Keynote: Natasha Tsakos, Conceptual Director, Idea Generator, Performer, Up Wake
06:00 – 07:00 pm Dinner Break / Press Event
07:00 – 09:00 pm Posters and Demonstrations


Arts, Media & Humanities Schedule

Wednesday, October 21
08:00 – 09:00 am Continental Breakfast (sponsored by ISMAR Partnering/Media Affiliations)

09:00 – 09:10 am AMH Chairs Announcements and Preview of Wednesday’s Program

09:10 – 10:30 am Session 1: Location-based Media, Arts and Technology, Panel and Discussion: Natasha Tsakos (Session Chair: Jay Bolter)
10:30 – 10:50 am Morning Coffee Break (sponsored by VUZIX)

10:50 – 12:30 pm Session 2: Arts and Humanities Papers (Session Chair: Jay Bolter)

10:50 – 11:20 Paper Presentation: Radiating Centers: Augmented Reality and Human-Centric Designs, Isabel Pedersen (Ryerson University)
11:20 – 11:50 Paper Presentation: Augmented Reality (AR) Joiners, A Novel Expanded Cinematic Form, Helen Papagiannis (York University)
11:50 – 12:30 Paper Presentation: Project SLARiPS: An Investigation of Mediated Mixed Reality Existence, Julian Stadon (Curtin University of Technology)
12:30 – 01:30 pm Lunch Break
01:30 – 03:30 pm Session 3: Arts and Humanities Papers, Poster Teasers (Session Chair: Jay Bolter)

01:30 – 02:00 Paper Presentation: Situated Simulations. Inventing an Augmented Reality Genre for Learning on the iPhone, Gunnar Liestøl (University of Oslo)
02:00 – 02:30 Paper Presentation: Spatialization as Musical Concept, Bijan Zelli
02:30 – 03:00 Invited Presentation: Mixed and Augmented Reality Projects at the Digital Worlds Institute, Arturo Sinclair (University of Florida)
03:00 – 03:30 Poster Teasers

Loosely-coupled Mixed Reality: Using the Environment Metaphorically, Gerard Kim (Korea University)
An Intuitional Interface for Invocation of Chinese Painting, ChunChin Su (National Cheng-Kung University)
EYEPLY: Baseball Proof of Concept - Mobile Augmentation for Entertainment and Shopping Venues, Alistair Jeffs (Eyeply)
03:30 – 04:00 pm Afternoon Break (sponsored by UCF)
Symposium Gala sponsored by Qualcomm
04:00 – 05:00 pm Keynote: Pattie Maes, MIT Media Lab, SixthSense: Integrating Information in the Real World
05:00 – 06:30 pm Poster Party
06:30 – 09:00 pm Awards Banquet / Panel: ISMAR Past, Present, and Future


Arts, Media & Humanities Schedule
Thursday, October 22
08:00 – 09:00 am Continental Breakfast (sponsored by Optical Research Associates and Volkswagen)

09:00 – 09:10 am AMH Chairs Announcements and Preview of Thursday’s Program

09:10 – 10:30 am Session 1: Invited Speaker and Panel (Session Chair: Jay Bolter)
How to Do Things with Layers: Artistic Uses of Locative Media, Prof. Rita Raley
10:30 – 11:00 am Morning Coffee Break (sponsored by Nokia)
11:00 – 12:30 pm Session 2: Imagination: The Third Reality (Session Chair: Christopher Stapleton)

12:30 – 02:00 pm Lunch Break

02:00 – 03:30 pm Session 3 (Session Chair: Christopher Stapleton): Panel: Science Meets Fiction: Imagining the Future of Mixed and Augmented Reality, Joe Tankersley (Disney Imagineering)
03:30 – 05:00 pm Session 4 (Session Chair: Christopher Stapleton): Real-Time Experiment: Exploration in Interactive Performance on Next Generation Mixed Reality, Jeff Wirth (Interactive Performance Lab)
04:30 – 05:00 pm Session 5: Open Discussion: Future of ISMAR and Arts, Media and Humanities
05:05 – 05:35 pm Continuing the Conversation. Building Community: ISMAR – Past, Present, Future

05:35 – 6:00 pm Closing Remarks and Announcement of Tracking Contest Winner


The International Symposium on Mixed and Augmented Reality (ISMAR) will expand its tutorial program into a four-day comprehensive program of ISMAR pioneers presenting the breadth and depth of the art and science of Mixed and Augmented Reality advancements. It will feature the latest techniques in the science, technology, art, media and humanities fields.

Through projects and research in the areas of education, training, entertainment, communications, design and media production, the programs will cover how Mixed and Augmented Reality can be applied to diverse domains.

Who Should Attend?
Students, researchers, artists, entrepreneurs and content producers who are looking to invent, develop or study MR/AR should attend the Tutorials, as should simulation specialists, experience designers, historians, art curators, digital media producers, game designers, experiential marketers, exhibit developers, human performance trainers and other professionals and companies who want to come up to speed with the latest practices of MR/AR.

The Tutorials Experience
The ISMAR 2009 Tutorials will be offered over four days; select one, two, three or four days of offerings. Whether you are just getting involved with MR/AR or looking to explore new applications and disciplines of MR/AR, the tutorials provide a broad and diverse overview of interdisciplinary research, study and innovation. Never before has such a comprehensive program been offered.

Tutorials


MONDAY: UCF Center for Emerging Media: Film Production Area, next to the Marriott
Tracking for AR Researchers (8:15 am - 4:45 pm)
This session will focus on a range of lower-level concepts and techniques related to tracking. We will begin the day with an overview of the available tracking modalities and an introduction to the Kalman filter (sensor fusion). We will then spend several hours focusing on vision-based tracking topics, including camera models, calibration, feature detection, and both model-based and model-free tracking (Visual SLAM). We will end the day with a session on development on mobile platforms.
08:15 - 09:45 am Tracking for AR Researchers 1: Sensing Modalities and Fusion (The Kalman Filter). Greg Welch (The University of North Carolina at Chapel Hill)
10:00 - 11:30 am Tracking for AR Researchers 2: Camera Pose Estimation. Brian Clipp (The University of North Carolina at Chapel Hill)
11:45 - 01:15 pm Tracking for AR Researchers 3: Camera-Based Tracking. Brian Clipp (The University of North Carolina at Chapel Hill), Vincent Lepetit (EPFL)
03:15 - 04:45 pm Tracking for AR Researchers 4: Mobile Platforms. Gerhard Reitmayr (Cambridge University)
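The sensor-fusion idea behind the Kalman filter material in this tutorial can be illustrated with a minimal one-dimensional sketch. This is an editorial illustration only, not tutorial content; the noise variances, readings, and the random-walk model are all made-up example values.

```python
# Illustrative sketch of a scalar Kalman filter (random-walk state model).
# All numeric values below are hypothetical, chosen only for demonstration.

def kalman_step(x, p, z, q=1e-3, r=0.1):
    """One predict/update cycle.

    x, p : prior state estimate and its variance
    z    : new (noisy) measurement
    q, r : process and measurement noise variances (assumed values)
    """
    # Predict: a random-walk model carries the estimate over
    # while its uncertainty grows by the process noise.
    p = p + q
    # Update: blend prediction and measurement via the Kalman gain.
    k = p / (p + r)          # gain: how much to trust the measurement
    x = x + k * (z - x)      # corrected estimate
    p = (1 - k) * p          # reduced uncertainty after the update
    return x, p

# Fuse a short stream of noisy readings of a true value near 5.0.
readings = [4.8, 5.3, 4.9, 5.1, 5.2, 4.95]
x, p = readings[0], 1.0      # initialise from the first reading
for z in readings[1:]:
    x, p = kalman_step(x, p, z)
```

After the loop, the estimate `x` settles near the true value and the variance `p` shrinks well below its initial 1.0, which is the behaviour sensor fusion relies on when combining, say, inertial and vision measurements.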

TUESDAY: (Amelia)
Introduction to MR/AR Experience Creation (9:00 am - 5:00 pm)
This session will provide an introduction to MR/AR research, applications, technology, and the development of MR/AR experiences. Creating diverse content experiences with Mixed and Augmented Reality entails the complex integration of art and science. Expert developers and researchers will share their insights, techniques for content creation, development of design tools, and heuristics for creating stories, games and play for entertainment, education, communication, medical, and training applications. The objective of this session is to offer insight into MR/AR applications and development, and to enable attendees to begin developing their own MR/AR experiences.
09:00 - 10:30 am Introduction to MR/AR Experience Creation. Charles Hughes (University of Central Florida)

11:00 - 12:30 pm Creating MR/AR Experiences for Diverse Applications. Christopher Stapleton (Simiosys, LLC)

This tutorial will introduce participants to the basic concepts and technologies underlying MR/AR. We will review MR as the multisensory, real-time blending of real and synthetic content into a single landscape. MR can be viewed as a continuum from VR to Physical Reality (PR). Within this continuum, AR is towards the PR side, in that it focuses on adding synthetic content (often semantic overlays) to real-world contexts (objects and scenes). This tutorial will not be purely technology-centric, as MR experiences are only effective when they engage their audiences through story.

Tutorials Schedule
Monday - Thursday, October 19-22, 2009


Tutorials Schedule

01:30 - 03:00 pm AR Game Design. Blair MacIntyre (Georgia Tech)
Augmented Reality (AR) games present an exciting opportunity for mobile game designers. By moving games out into the world, AR games have the potential to sidestep the limitations of small mobile displays by giving players the illusion that they are looking through a window into a larger 3D play space merged with the world. By attaching this virtual world to the physical world, AR games create new opportunities for physical and social play. In this tutorial, I will present examples of AR game prototypes to illustrate AR game mechanics, with a focus on social, physical, and tangible interaction.

03:30 - 05:00 pm Software and Tools for AR Development Hirokazu Kato (Nara Institute of Science and Technology), Ohan Oda (Columbia University), Mark Billinghurst This is a tutorial on MR/AR development tools, such as ARToolKit and Goblin XNA.

Wednesday: (Amelia) MR/AR Enabling Technologies (9:00am-3:30pm) In this session, leaders in MR/AR research will provide an in-depth look into the display technology behind MR/AR, such as mobile, projection, HMD, and web-based MR/AR displays. The goal of this session is to offer insight into the major research and development challenges for the various MR/AR displays and to give attendees a deeper knowledge of current MR/AR technologies.

09:00 - 10:30 am 3D Display Technologies: Fundamentals and State-of-the-Art Technologies Hong Hua (University of Arizona), Jannick Rolland (University of Rochester) A wide range of 3D display technologies has been developed or is under development. While some of the available 3D display technologies are familiar to the AR/MR community (e.g. head-mounted displays and projection-based spatially augmented displays), some technologies are emerging and perhaps new to the community. This tutorial will review the aspects of the human visual system relevant to display technologies, cover fundamental concepts and technology advancements for most of the available 3D display technologies, summarize state-of-the-art display capabilities, and demonstrate application examples where applicable, with the hope that the tutorial will not only help MR/AR novices to better understand 3D display technologies but also stimulate thinking about alternative approaches to MR/AR display and interface research. Finally, the presenters will share their experience in developing and evaluating some of these display technologies.

10:50 am - 12:30 pm Developing AR Applications for the Web Mark Billinghurst (HIT Lab NZ) Recently it has become possible to develop browser-based Augmented Reality applications for the Web. This is significant because it means that anyone with a web camera and browser can have an AR experience, lowering the barrier to entry.
This course will give an overview of technologies that can be used to experience AR on the Web. The main focus will be on FLARToolKit, the Flash version of the popular ARToolKit library, and FLARManager, a higher-level rendering and interaction library. Participants will learn how to use these tools to develop their own web-based AR experiences and will also be given an insight into future technologies that can be used to deliver innovative AR applications. Examples will be shown from some of the most popular AR web sites and from applications developed by leading research labs.
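Under the hood, square-marker trackers like ARToolKit and FLARToolKit locate the four corners of a marker in the video frame and solve for the homography mapping marker-plane coordinates to image coordinates, from which the camera pose is then decomposed. The sketch below shows only that homography solve, in pure Python, as an illustration of the general technique; it is not FLARToolKit's actual API.

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def solve_homography(src, dst):
    """Direct Linear Transform: 3x3 H with dst ~ H * src, from four
    2D point correspondences (marker corners); h33 is fixed to 1."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = gauss_solve(A, b)
    return [h[0:3], h[3:6], h[6:8] + [1.0]]

def apply_h(H, p):
    """Map a 2D point through H with perspective division."""
    x, y = p
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

For example, mapping the marker's unit square onto the four detected corner pixels recovers the transform used to draw overlays registered to the marker.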







01:30 - 03:30 pm An Introduction to Handheld AR Mark Billinghurst (HIT Lab NZ), Christine Perey (PEREY Research & Consulting) As mobile phones and handheld devices have become more powerful, there are now many opportunities to develop mobile AR applications that can be used by the general public. In this tutorial we provide an introduction to the field of handheld and mobile Augmented Reality (AR). Beginning with a history of handheld AR, we describe the evolution from backpack-based AR to AR systems based on PDAs and eventually self-contained mobile phones. Developing for a mobile AR device is different from traditional desktop or head-mounted display based AR, so tutorial attendees will learn key design principles from handheld HCI and how to apply them in handheld and mobile AR applications. In addition, they will study handheld AR user interface metaphors and be shown successful case studies. There will be material presented on the software tools that can be used to develop handheld AR applications, and on the development process. There will be a review of current handheld AR commercial applications and industry trends that will affect the development of future applications, and an overview of important directions for future research and ongoing development opportunities.

Thursday: (Amelia) Enhancing Human Performance with MR/AR (9:00am-5:00pm) Performance training, human factors, computer-human interfaces, usability, and human performance modeling all contribute to leveraging the power of MR/AR to advance human abilities. Designers, engineers, and scientists present and discuss the design and evaluation of MR/AR technology, experiences, and applications. The goal of this session is to overview the human benefits of MR/AR and afford a better understanding of experimental design for MR/AR system evaluation.

09:00 - 11:00 am Immersive Multi-Modal Mixed Reality for Assessing and Enhancing Human Performance Kay Stanney (Design Interactive Inc.), Charlie Hughes (University of Central Florida), Cali Fidopiastis (University of Alabama, Birmingham) This tutorial will introduce participants to the use of mixed reality for assessing and enhancing human performance, especially when that performance must occur under stress. Topics include experience design with a focus on multiple senses, experience delivery with considerations of commercial viability, and the capture and analysis of psychophysical and performance data. Examples presented include situational awareness training for military personnel and first responders, and cognitive/physical assessment and rehabilitation. The technologies for delivering these experiences include video see-through head-mounted displays, 3D projection using passive and active stereo, multi-tiered audio, vision-based tracking, and digital puppeteering. The human experience is captured through tracking, wireless EEG, respiration, galvanic skin response, and a variety of other psychophysical metrics. These data are correlated with the user's task performance and the events occurring in the simulation, including what the user is seeing, hearing, and touching. These data are then visualized in a manner appropriate to the goals of the specific experience, e.g., showing a physical therapist the ranges of movement on a human skeletal model. The session will end with a discussion of lessons learned in the systems and applications developed to date.

11:30 - 12:30 pm Evaluating AR Applications Mark Billinghurst (HIT Lab NZ)

In this tutorial we describe how to develop Augmented Reality usability studies that can be used to evaluate prototype AR applications. User Centered Design and user evaluation have been important techniques in traditional Human Computer Interaction, and a wide range of user studies has been conducted in desktop and immersive Virtual Reality. However, fewer experiments have been presented in the wider context of MR/AR. Attendees will learn how to design and conduct qualitative and quantitative evaluations of AR interfaces, and the types of user studies that can be conducted in the key AR research categories of Perception, Interaction, and Collaboration. Specifically, we will cover the following topics: User Centered Design processes, experimental design, qualitative user evaluation, quantitative user evaluation, statistical analysis techniques, user evaluation case studies in Perception, Interaction and Collaboration, and research directions in AR user evaluation. In addition to the presentation material, attendees will be given key papers to read in the area, a summary of a large number of AR user studies that have been conducted, and pointers to online resources. In this way they will be able to educate themselves further on the topics relevant to their own research practices.
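On the quantitative side, comparing, say, task-completion times between two interface conditions often reduces to a two-sample test. As a self-contained illustration (not material from the tutorial), here is Welch's t statistic for two independent samples with possibly unequal variances:

```python
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic for two independent samples; returns
    (t, approximate degrees of freedom via Welch-Satterthwaite)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variance of a
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)   # sample variance of b
    se2 = va / na + vb / nb                          # squared standard error
    t = (ma - mb) / sqrt(se2)
    dof = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, dof
```

The t value and degrees of freedom would then be looked up against a t distribution to obtain a p-value.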



02:00 - 03:30 pm Evaluating Perceptual Quality of MR/AR Mark Livingston (Naval Research Laboratory)

Mixed and Augmented Reality (MAR) systems have been under laboratory development for 40 years, yet few applications have been able to move beyond the laboratory environment. While there are many reasons for this, one of the important ones is our incomplete understanding of the human factors of MAR. This tutorial will focus on one of the most basic of these human factors issues: perception in MAR. Both the hardware and software limitations of existing systems lead to conflicts with our typical perception of our environment. In MAR, these issues can in turn lead to a failure to perceive the virtual objects or their fundamental properties, which prevents assimilating the virtual information into the surrounding environment. Since this is not only the goal but also the assumption of many MAR systems, this often limits the use of MAR to users who are intimately familiar with the design of the application. Another issue is that the nature of MAR visualization offers some perceptual capabilities that are beyond our everyday experience. Thus the design challenge is to endow the user with these "extra" abilities in a way that smoothly merges with our perception of the environment. This has been shown to be a challenging aspect of MAR visualization design, as either the human visual system or the MAR system's capabilities often prevent the desired perceptual goals from being achieved. This tutorial will begin with a review of some basic aspects of perception, with an emphasis on the problems that arise in MAR systems. Next, we will discuss strategies that have been applied to investigate the perceptual issues in MAR. One challenge to consider here is the difficulty of designing perceptual experiments in MAR. Finally, we will (collectively) try to build a road map for exploration of perception in MAR. In all three phases of the discussion, we will draw upon the literature of perceptual user studies in MAR.

03:45 - 05:00 pm How to Design and Evaluate Industrial AR Applications Björn Schwerdtfeger (Technische Universität München), Margarita Anastassova (CEA, LIST)

This tutorial focuses on experience in evaluating AR user interfaces for industrial applications. As the development of such interfaces happens at the limit of what is known or common practice in user interface design, their design and usability evaluation are real challenges. The need for intensive user studies during the design and evaluation of new user interfaces is uncontroversial. However, user studies with AR interfaces, especially for industrial applications, are more complex than user studies of more traditional systems such as the Web. The complexity arises from several aspects (e.g. the innovative technologies, users' scarce experience and knowledge of these technologies, the limited number of applications, and the limitations of HCI knowledge and guidelines in this field). Furthermore, the design space is huge, which allows for various innovative solutions but, at the same time, does not prevent making a lot of mistakes in the design. For these reasons, it is difficult to simply apply established methods to assess AR user interface design. The tutorial presents a discussion of the limitations of current methods for the design and user evaluation of AR interfaces for industrial applications, together with a number of practical guidelines and techniques to improve the user interface of your AR system using targeted approaches with less overhead. We will mainly consider the early and middle design stages of AR applications. The reason for this choice is that these phases are the most demanding ones from both a design and an evaluation perspective (i.e. most users do not know the technology, the technology is in search of applications, and there are few existing HCI guidelines for AR interfaces). Thus, the tutorial mainly covers preliminary user studies (requirements elicitation using methods such as scenarios, field studies, activity analysis, and formative evaluations of prototypes) and the subsequent usability evaluation of existing prototypes. With this tutorial we will also share some evaluation strategies particularly suitable for AR applications. This includes problems which are specific to the design and evaluation of industrial AR user interfaces and which may not appear in the evaluation of classical interfaces.





Building the Future on Fundamentals.

www.evoopticks.com

Welcome ISMAR 2009 Exhibitors


All attendees will have an opportunity to experience hands-on demonstrations in conjunction with the two receptions on Monday and Tuesday evenings. Students, artists, entrepreneurs, and content producers who are looking to invent, develop, or study Mixed and Augmented Reality should attend these demonstrations. It is a great way for simulation specialists, experience designers, historians, art curators, digital media producers, game designers, experiential marketers, exhibit developers, human performance trainers, and other professionals and companies to get first-hand experience with the latest developments in Mixed and Augmented Reality.

Exposition Overview





Message from the Laboratory Demonstrations & Research Showcases Co-Chairs, Dr. Christian Sandor and Dr. Sean White Welcome to the Laboratory Demonstrations and Research Showcases at ISMAR 2009, the Eighth IEEE International Symposium on Mixed and Augmented Reality. Following the established tradition of showcasing late-breaking, state-of-the-art research demonstrations, we are delighted to host a large number of high-quality demonstrations for you! Interact with the latest systems before they become hot topics in mainstream media and techno blogs.

Exhibitors present innovative technologies and applications in many fields, including displays, tracking, input devices, and interaction techniques. The demos are available for attendees to try out and discuss with the creators. We have included a mix of works invited by the organizers and works selected from juried submissions to the ISMAR 2009 online submission system.

Since Augmented and Mixed Reality systems are by definition interactive and real-time, we personally believe that demonstrations have always been one of the most important parts of ISMAR, as they enable participants to experience Augmented and Mixed Reality to the fullest extent.

Laboratory Demonstrations and Research Showcases




1. Put a Spell: Learn to Spell with Augmented Reality Ori Inbar, Ogmento, Contact: [email protected] Point your camera phone at a printed image lying on the kitchen table or the living room floor - and your reading buddy appears in 3D in the real world. The animated reading buddy presents literacy challenges that encourage your child to find REAL letter cards, grab them, and place them in the virtual blank spaces.

The phone becomes a “magic lens” that recognizes letters and words, pronounces them, and rewards your child for making progress. The result: Your child spends less time in front of a screen, interacts with the real world and learns a skill. Kids happy. Parents happier.

2. Animatronic Shader Lamps Avatars Andrew Nashel, The University of North Carolina at Chapel Hill, Contact: [email protected] Peter Lincoln, Greg Welch, Andrew Nashel, Adrian Ilie, Andrei State, Henry Fuchs, The University of North Carolina at Chapel Hill This demonstration presents a new approach for robotic avatars of real people, using cameras and projectors to capture and map the dynamic motion and appearance of a real person onto a humanoid animatronic model. We call these devices animatronic Shader Lamps Avatars (SLA). The demonstration consists of a proof-of-concept prototype comprising a camera, a tracking system, a digital projector, and a life-sized Styrofoam head mounted on a pan-tilt unit. The system captures imagery of a moving, talking user and maps the appearance and motion onto the animatronic SLA, delivering a dynamic, real-time representation of the user to multiple viewers.

3. Parallel Tracking and Mapping on the iPhone Georg Klein, University of Oxford, Contact: [email protected], David Murray, University of Oxford We demonstrate an implementation of the Parallel Tracking and Mapping (PTAM) system running on an iPhone 3G. PTAM is a natural feature tracker for Augmented Reality which does not require markers or pre-known templates, and learns about its environment on-line. The system presented here has been adapted to the iPhone's limited computational resources and non-ideal camera.

4. Real-time physics simulation of a digital puppet Tzung-Han Lin, Industrial Technology Research Institute, Contact: [email protected] This demo shows a human-interaction game named "AR puppet". In our system, users experience being a puppeteer and control the digital puppet via barcodes. When you use the system, make sure the main barcode is placed on the table where the webcam can see it. Once the main barcode is detected, the "ISMAR2009" logo is rendered on it. Then move the handheld barcode above the main barcode. When both barcodes are detected, you can shift, roll, and tilt the handheld barcode to drive the digital puppet. The program also provides several options that let users change the grabbed position; press a number key (1-9) to switch.
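The two-marker control in the AR puppet demo rests on a standard relation: given the camera-from-marker transforms of the table marker and the handheld marker, the handheld marker's pose in the table marker's frame is T_rel = T_table^-1 * T_hand. A sketch with row-major 4x4 rigid transforms (illustrative only, not the demo's code):

```python
def mat_mul(A, B):
    """Multiply two 4x4 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(T):
    """Invert a rigid transform [R|t]: the inverse is [R^T | -R^T t]."""
    R = [[T[j][i] for j in range(3)] for i in range(3)]            # transpose
    t = [-sum(R[i][j] * T[j][3] for j in range(3)) for i in range(3)]
    return [R[0] + [t[0]], R[1] + [t[1]], R[2] + [t[2]], [0, 0, 0, 1]]

def relative_pose(cam_from_table, cam_from_hand):
    """Pose of the handheld marker expressed in the table marker's frame."""
    return mat_mul(invert_rigid(cam_from_table), cam_from_hand)
```

Driving the puppet then amounts to reading rotation and translation out of the relative transform each frame.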







5. Immersive Image-Based Modeling of Polyhedral Scenes Gilles Simon, University of Nancy & LORIA, Contact: [email protected] This demo enables a user to interactively capture the 3D geometry of a polyhedral scene with the aid of its physical presence. The built parts of the scene are immediately shown superimposed on the environment, which allows the user to verify the geometry against the physical world in real-time. A handheld camera is used as both the interaction and tracking device. The system can be seen as an immersive version of the widely used 3D drawing software Google SketchUp(TM).

6. Camera-based Interactions for Augmented Reality Tatu Harviainen, VTT Technical Research Centre of Finland, Contact: [email protected] Otto Korkalo, Charles Woodward, VTT Technical Research Centre of Finland We demonstrate camera-based interaction techniques for generating simple, easy-to-use augmented reality applications. The interaction techniques use only the camera as an input device, so that users can interact with the 3D content of the application by gesturing with camera movements. The interaction techniques are demonstrated using two very different kinds of applications: 1) an entertainment application to play with an animated 3D character, as if it were "aware" of the camera, and 2) an architectural application linking 3D models to 2D floor plans, intended for professional use. In addition, our demos include technical features such as markerless tracking as well as a mobile phone implementation.

7. Object depth and shape extraction for Augmented Reality Interaction Fernando Manuel, YDreams, Contact: [email protected] Fernando Nabais, Goncalo Lopes, Andre Almeida, YDreams This demo illustrates a method for incorporating a representation of a participant into a virtual 3D environment in real-time. A single camera is used to extract contours of the participants along with depth information. A three-dimensional representation of the participant is then automatically generated, allowing for augmented interaction with the virtual environment. A number of possible interactions built around this representation are demonstrated, including physically realistic collisions, real-time object occlusion, and shadow casting.

8. Dismounted Soldier Training Capability Using Mixed-Augmented Reality Frank Dean, U.S. ARMY RDECOM-STTC, Contact: [email protected] Scot Sanders, Pathfinder Systems, Inc. The DST-MAR (Dismounted Soldier Trainer – Mixed & Augmented Reality) is the US Army's mixed and augmented reality test bed for dismounted soldiers. The system is a man-wearable, optical see-through augmented reality system designed to test new and improved component technologies as they are developed. The demonstration will allow a select number of attendees to put on the system. The user will view and shoot a hostile virtual person weaving through three friendly real people using a simulated M16. This demonstration illustrates static and dynamic occlusion, man-wearable computing, see-through head-mounted displays, and see-through registration.







9. Goblin XNA: Infrastructure for Augmented Reality Research and Games Ohan Oda, Columbia University, Contact: [email protected] Steve Feiner, Columbia University Columbia's Computer Graphics and User Interfaces Lab will demonstrate single-player and multi-player AR games created using Goblin XNA. Our games use both tracked head-worn displays and tracked hand-held computers. Goblin XNA is an open-source framework for research on AR, with an emphasis on games. It is based on the Microsoft XNA platform, and written in C#. The framework supports 6DOF position and orientation tracking using optical marker-based tracking (through VTT ALVAR and ARTag), in addition to providing a 3D scene graph, rigid body physics simulation (using Newton Game Dynamics), networking (using Lidgren), shaders, particle systems, and 2D user interface primitives.
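Goblin XNA itself is written in C#, but the scene-graph idea at its core is language-neutral: each node stores a transform relative to its parent, and world transforms are accumulated during a depth-first traversal. A toy sketch, with transforms reduced to plain 3D translations for brevity:

```python
class Node:
    """Scene-graph node; 'local' is the node's transform relative to
    its parent (simplified here to a 3D translation tuple)."""
    def __init__(self, name, local=(0.0, 0.0, 0.0)):
        self.name, self.local, self.children = name, local, []

    def add(self, child):
        self.children.append(child)
        return child

    def world_transforms(self, parent=(0.0, 0.0, 0.0)):
        """Depth-first traversal, composing transforms root-to-leaf."""
        world = tuple(p + l for p, l in zip(parent, self.local))
        yield self.name, world
        for c in self.children:
            yield from c.world_transforms(world)
```

In an AR scene graph, attaching a model node under a tracked-marker node is what keeps the model registered to the marker: updating the marker node's transform moves the whole subtree.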

10. Image Space – A Mixed and Augmented Reality Social Media Sharing Service David Murphy, Nokia Research Center, Tampere, Contact: [email protected] Markus Kähäri, Ville-Veikko Mattilla, Severi Uusitalo, Peter Eskolin, Petros Belimpasakis, Nokia Research Center, Tampere The Nokia Image Space media sharing service comprises three components. A mobile daemon automatically tags media with its position and orientation. This media is accessible via the web application, which provides augmented virtuality interaction in a spatially aligned mirror world featuring 3D reconstruction of dense image clusters, in addition to social media sharing functionality. Finally, the mobile AR client can overlay media items previously captured at the current location, as well as indicate POIs. The service is an MR use case that goes beyond traditional tourist-guide style informational applications, using MR to bring extra dimensions to social media sharing services.

11. A Mixed Reality Painting Experience for Physical Rehabilitation Emiko Charbonneau, Media Convergence Laboratory, University of Central Florida, Contact: [email protected] Steven Braeger, Daniel Mapes, Eileen Smith, Charles Hughes

Physical therapy is a necessary but often frustrating process for a patient. Our system was developed to encourage patients to do required exercises by occupying their minds with an experience that is engaging and involves an adjustable level of challenge based on their current stage of rehabilitation. In our demonstration you can paint virtually by using a real paintbrush. By wearing passive stereo glasses that are head tracked, the viewpoint into the virtual studio changes as you move around, allowing you to paint, look at the still life display, and change brush color with ease.






12. Computing Alpha Mattes in Real-time for Noisy Mixed Reality Video Sources Authors: Nicholas Beato, Media Convergence Laboratory, University of Central Florida Contact: Nicholas Beato ([email protected]) Yunjun Zhang, Mark Colbert, Kazumasa Yamazawa, Charles E. Hughes Mixed Reality (MR) based on video see-through head-mounted displays requires the real-time blending of virtual content into the captured video. Chroma keying, often called blue screening or green screening, is one means of addressing this problem, especially in the case of augmented virtuality in which the virtual content primarily provides a surrounding setting for the real content. We will demonstrate the effectiveness of our noise-tolerant algorithm for interactive chroma-keying in mixed reality. The activity will include a demonstration of the system’s fast calibration method and user experience of an MR environment using both the Canon VH2002 and VH2007 video see-through head-mounted displays. Keywords: Mixed reality, Chroma keying, Blue screening, Alpha matting, Video see-through HMD, GPU
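Chroma keying of this kind can be sketched generically: per pixel, measure the distance from the key color in a chroma plane and ramp alpha between two thresholds, so that sensor noise near the key color does not make the matte flicker. The following is the textbook idea only, not the authors' algorithm, and the threshold values are made up:

```python
def chroma(r, g, b):
    """RGB -> (Cb, Cr) chroma components (BT.601, full range)."""
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return cb, cr

def alpha_matte(pixel, key, t_bg=8.0, t_fg=24.0):
    """Alpha in [0, 1]: 0 near the key chroma (background), 1 far from
    it (foreground), with a linear ramp between the two thresholds to
    absorb camera noise around the key color."""
    pcb, pcr = chroma(*pixel)
    kcb, kcr = chroma(*key)
    d = ((pcb - kcb) ** 2 + (pcr - kcr) ** 2) ** 0.5
    return min(1.0, max(0.0, (d - t_bg) / (t_fg - t_bg)))
```

Working in chroma rather than RGB makes the matte tolerant of brightness variation across the green screen; per-pixel, this kind of computation maps naturally onto a GPU shader, as in the demo.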

13. Augmented Reality Pop-Up Dollhouse Author/Artist: Helen Papagiannis, Augmented Reality Lab, Department of Film, Faculty of Fine Arts, York University, Contact: [email protected] The Augmented Reality (AR) Pop-up Dollhouse combines the tactile pleasure of unfolding, opening, and pulling tabs from pop-up books to discover secret AR elements throughout this miniature house. The viewer is invited to pick up the web camera and explore the miniature environment, revealing the augmented imagery throughout the scenes. The dollhouse is hand-crafted using paper-engineering techniques, with AR markers adding an element of magic, of a make-believe world coming to life. Unlike other AR books that are predominantly 3D object based, this project focuses on integrating 2D planar video, offering viewers a glimpse into an imaginary miniature world.

14. See-through near-to-eye display with integrated gaze tracker Author: Toni Järvenpää, Nokia Research Center, Contact: Toni Järvenpää ([email protected]) We demonstrate a wearable display concept with integrated eye gaze tracking functionality. The Near-to-Eye Display (NED) offers a large mobile screen experience, and a highly integrated eye gaze tracker detects the user's focus point in the displayed image. The gaze tracker can be used as an input device for the NED system. The see-through image of the virtual display can be overlaid on the user's visual field, and the gaze point in the image or in the real world can be used for novel applications. Both technologies are based on our proprietary diffractive optics. The prototype system is mobile and operated from a laptop.






15. Streaming Mobile Augmented Reality on Mobile Phones Authors: David Chen, Nokia Research Center Palo Alto, Stanford University Contact: David Chen, [email protected] Sam Tsai, Ramakrishna Vedantham, Radek Grzeszczuk, Bernd Girod We demonstrate a mobile augmented reality system for continuous recognition of objects in video captured on a phone's viewfinder. The user points the camera at a book or CD cover and sees the object's identity in the viewfinder within 1 second. The object's boundary is displayed and accurately tracked, for easy visibility against a cluttered background. Both the object's identity and geometry are quickly retrieved from a server hosting a database of 20,000 entries. As the user pans across a set of books and CDs, the system recognizes new objects that come into view, without any button being pressed.

16. Mixed Reality in Virtual World Conference Author: Tuomas Kantonen, VTT Technical Research Centre of Finland Contact: [email protected] The demonstration shows how mixed reality techniques can be used for teleconferencing between the real world and the Second Life virtual world. In the virtual world each teleconference participant is represented by an avatar. In the real world, augmented reality is used to display virtual avatars of remote participants among the co-located participants. Augmented virtuality, currently in the form of simple head and hand tracking, is used to relay the body language of participants into the virtual world. Two augmented reality display methods are used: a head-mounted video see-through display and a plain video teleconference on a large screen.

17. Vision based People Tracking for Ubiquitous Augmented Reality Applications Authors: Christian Waechter, TU Munich, Fachgebiet Augmented Reality Contact: [email protected] Daniel Pustka, Gudrun Klinker, TU Munich, Fachgebiet Augmented Reality We demonstrate the capabilities of a ceiling-mounted, single-camera people tracking system for use in Ubiquitous Augmented Reality applications. The position information provided by the system is combined, in a complementary fusion, with the orientation information of an inertial sensor attached to a user-worn UMPC. The resulting pose information can be used in various scenarios, including augmentations on the display of the user's UMPC. For this, the positions reported by the people tracking system must be assigned to the corresponding individuals. This is done without any prior registration of the persons with the people tracking system, and positions are only assigned on request, which also respects privacy concerns.
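Complementary fusion of this kind pairs a sensor that is accurate over short timescales but drifts (an integrated inertial signal) with one that is drift-free but noisy or low-rate (an absolute reference such as the camera). A one-axis complementary filter illustrates the principle; the gain value below is illustrative, not taken from the demo:

```python
def complementary_filter(gyro_rates, ref_angles, dt, gain=0.98):
    """Fuse a drifting high-rate signal (integrated gyro rate) with a
    drift-free absolute reference. 'gain' weights the gyro path; the
    remaining 1 - gain continually pulls the estimate toward the
    reference, bounding the drift."""
    angle = ref_angles[0]
    out = []
    for rate, ref in zip(gyro_rates, ref_angles):
        angle = gain * (angle + rate * dt) + (1.0 - gain) * ref
        out.append(angle)
    return out
```

With a biased gyro, pure integration drifts without bound, while the filtered estimate settles near the reference.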







18. Temporal Calibration in Multisensor Tracking Setups Authors: Michael Schlegel, TU Munich, Fachgebiet Augmented Reality, Contact: [email protected] Manuel Huber, Gudrun Klinker, TU Munich, Fachgebiet Augmented Reality In this demo we present a generic method for temporal calibration, which is crucial for sensor-fusion tracking setups. The approach is described in our poster "Temporal Calibration in Multisensor Tracking Setups", also presented at ISMAR 2009. We show the feasibility of our calibration method by temporally calibrating two different gyroscopes and a marker tracker. The estimated temporal offset is then used to perform a temporal alignment of the sensor data. To illustrate the improvement in accuracy we will use our tool "trackman".

19. Mobile Augmented Reality based 3D Snapshots Authors: Peter Keitler, Technische Universität München, University of Bath Contact: [email protected] Frieder Pankratz, Björn Schwerdtfeger, Daniel Pustka, Wolf Rödiger, Gudrun Klinker, Christian Rauch, Anup Chathoth, John Collomosse, Yi-Zhe Song, Technische Universität München, University of Bath The demo presents a user-friendly mobile phone augmented reality platform based on user-generated content. The core idea is to generate a 3D model of arbitrary small or mid-sized objects from photographs taken with a mobile phone camera. Optical square markers provide the anchor for placing the reconstructed virtual objects in a natural environment such as a living room. Tracking performance is greatly improved by a novel approach based on pixel flow. This dual tracking method also enables a new single-button user interface metaphor for moving virtual objects in the scene.

20. ProFORMA: Probabilistic Feature-based On-line Rapid Model Acquisition Authors: Qi Pan, Gerhard Reitmayr and Tom Drummond, Cambridge University Contact: Qi Pan, [email protected] The generation of 3D models is very useful for many computer vision and augmented reality applications. This demo introduces ProFORMA, a system designed to enable on-line reconstruction of textured 3D objects rotated by a user's hand. Partial models are created very rapidly and displayed to the user to aid view planning, as well as used by the system to robustly track the object pose. The system works by calculating the Delaunay tetrahedralisation of a point cloud obtained from on-line structure-from-motion estimation, which is then carved using a recursive, probabilistic algorithm to rapidly obtain the surface mesh.







21. Global Pose Estimation using Multi-sensor Fusion for Outdoor AR Authors: Gerhard Schall, TU Graz, Institute for Computer Graphics and Vision Contact: [email protected] Daniel Wagner, Gerhard Reitmayr, Dieter Schmalstieg, TU Graz, Institute for Computer Graphics and Vision This demonstration shows an outdoor handheld augmented reality system featuring a tracking system that combines Real-Time Kinematic (RTK) GPS with barometric height measurements, an inertial measurement unit, and a visual orientation tracker to obtain improved robustness and accuracy of pose estimates. The visual tracker learns a map of the environment and allows for correction of the deviations of the 3-axis magnetic compass. The demonstrated application visualizes sub-surface infrastructure such as pipes and electrical installations.

22. Demo Tracking Contest
Authors: Sebastian Lieberknecht, Selim Benhimane; metaio GmbH
Contact: [email protected]
We demonstrate the system we will use for the tracking contest at ISMAR '09. In the contest, a person is guided by a mobile computer to pick several items in a given order. The items' locations are provided as 3D coordinates in a global reference coordinate system. The task of the tracking system is to create a map of the environment, register it to the global coordinate system, and then guide the user to the items one after another.

23. The Zerkin Glove
Author: Noah Zerkin, Guy with Soldering Iron
Contact: [email protected]
The Zerkin Glove (www.ZerkinGlove.com) is a low-cost inertially tracked interface device. It combines multi-axis gyroscopic, accelerometric and magnetometric sensors on the forearm with an elbow-flex sensor to generate scalable coordinate values for the elbow and wrist in 3D space. The glove component itself contains thirteen to sixteen flex sensors and an accelerometer. The device can be used to interact intuitively with virtual and augmented objects and environments without external reference infrastructure, giving the wearer an immersive experience with one-to-one projection of movement and gesture into the virtual world.
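Recovering elbow and wrist coordinates from an orientation estimate plus an elbow-flex angle is essentially a forward-kinematic chain. Below is a minimal planar sketch of that idea; the segment lengths and the reduction to 2D are illustrative assumptions, not the device's actual model.

```python
import math

def arm_positions(shoulder, upper_angle, elbow_flex,
                  upper_len=0.30, fore_len=0.27):
    """Planar forward kinematics: the upper-arm orientation locates the
    elbow, and adding the elbow-flex angle locates the wrist.
    (2D sketch; segment lengths are illustrative guesses in metres.)"""
    ex = shoulder[0] + upper_len * math.cos(upper_angle)
    ey = shoulder[1] + upper_len * math.sin(upper_angle)
    wrist_angle = upper_angle + elbow_flex
    wx = ex + fore_len * math.cos(wrist_angle)
    wy = ey + fore_len * math.sin(wrist_angle)
    return (ex, ey), (wx, wy)

# Upper arm hanging straight down (-90 deg), elbow bent 90 deg forward:
elbow, wrist = arm_positions((0.0, 0.0), math.radians(-90), math.radians(90))
print(f"elbow at {elbow}, wrist at {wrist}")
```

The real device works in 3D with full orientation sensing, but the chain structure, each joint's pose built on the previous segment's, is the same.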

24. Pedestrian Tracking System
Author: Tomoya Ishikawa, Ph.D., National Institute of Advanced Industrial Science and Technology, Japan
Contact: [email protected]
We demonstrate a pedestrian tracking system that links tracking with other services, such as 3D indoor modeling and content authoring, so that each enhances the others. Position and orientation from the tracking system enable efficient 3D indoor modeling from multiple photos, as well as on-site content authoring based on the tracking information. In turn, the created models improve tracking performance through map matching, and the authored content is used for navigation services. As an example of such service linkage, we also show an indoor pedestrian navigation system that uses photorealistic 3D indoor models and content authored with our interactive modeler and authoring tool.





Don’t Miss the ISMAR 2009 Tracking Contest sponsored by Volkswagen at the UCF Center for Emerging Media: Film Production Area.

Tuesday 1:00 - 1:30 pm
Wednesday 12:30 - 1:30 pm
Thursday 1:00 - 1:30 pm

The winner will be announced at 5:30 on Thursday, October 22!

Tracking Contest Chair Message

Daniel Pustka (TU München) and Fabian Doil (Volkswagen)

Many tracking technologies for Augmented Reality have been proposed so far. Some have made their way into commercial products, some are freely available as open-source software, and others are still in the development phase. An application designer therefore faces many potential tracking solutions, and it is not clear which choice is most appropriate. To allow a fair comparison of the state of the art, the first tracking contest was organized at ISMAR '08 in Cambridge. Following last year's success, a sequel is being organized in Orlando.

Compared to last year, the general idea is unchanged: participants are provided with a list of 3D points in a given reference frame. These points correspond to (real) objects in the world which have to be identified using mobile augmented reality techniques. This evaluation method produces results that are relevant for a wide range of AR applications, such as logistics or maintenance. Since the area to be covered is reasonably large and some degree of accuracy is still needed, the task cannot easily be solved with off-the-shelf tracking technologies.

To make this year's contest more exciting, we have introduced a new scoring system. In addition to the number of correctly picked objects, participants can earn bonus points for speed and accuracy. Wrongly picked objects are penalized, to encourage the use of error-estimation techniques. To create a more realistic scenario this year, we have teamed up with Volkswagen, who will provide additional industry-relevant tracking challenges.
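One way to picture such a scoring system is the sketch below. Every weight in it is invented for illustration; the official rules are defined by the contest organizers.

```python
def contest_score(picks, correct_items, pick_times, time_limit=600.0,
                  base=10.0, speed_bonus=5.0, wrong_penalty=4.0):
    """Hypothetical scoring: base points per correct pick, a bonus that
    shrinks as time runs out, and a penalty for each wrong pick.
    (All weights are made up; they are not the contest's actual rules.)"""
    score = 0.0
    for item, t in zip(picks, pick_times):
        if item in correct_items:
            score += base + speed_bonus * max(0.0, 1.0 - t / time_limit)
        else:
            score -= wrong_penalty
    return score

# Two correct picks (one fast, one slow) and one wrong pick:
s = contest_score(["A", "B", "X"], {"A", "B"}, [60.0, 540.0, 300.0])
print(f"score: {s:.1f}")
```

The penalty term is what makes error estimation pay off: a team unsure of a pick loses less by skipping it than by guessing wrong.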




We are privileged to have many more sponsors join our ISMAR community. We look forward to engaging more companies and institutions in support of our future.

Thank You ISMAR 2009 Sponsors.

“Do not wait to make your perfect offering
Ring the bells that still ring
There is a crack in everything
That is how the light gets in.”

- Adapted from singer Leonard Cohen

Looking forward to seeing you again next year and for years to come!