A new future for Augmented Reality

Workshop paper∗

Abbas Tolgay Yilmaz†
Department of Computer Engineering
Izmir Institute of Technology
Urla, Izmir, Turkey

Mark Surrow
Department of Computer Science
Aarhus University
Aabogade 34, Aarhus, Denmark
ABSTRACT
This paper takes a critical look at current research in the field of Augmented Reality. We identify two areas with room for improvement and discuss a new "direction" for AR research to implement said improvements or additions. Two examples are presented which spring from the suggested new initiatives in AR.
General Terms
Augmented Reality, Artificial Intelligence, Pervasive Positioning
1. INTRODUCTION
The state of the art in augmented reality today seems to resemble the state of the art of ten years ago [1]. There have been many examples of research in Augmented Reality (AR) and laboratory prototypes: [8, 7, 9] to mention a few. However, all of these examples and the research revolve around the same basic ideas: marker-based vs. markerless, or position-based (be it GPS, RFID etc.) AR, wearable or not. Either way, it boils down to the same basic approaches as described by Azuma, 2001 [1]. In 1998 Mackay [5] framed the basic strategies of AR, and today they still remain the same.
The technology used also remains the same, albeit perhaps smaller and fancier: handheld devices (phones, tablets etc.), head-mounted displays, cameras and other tracking sensors. A recent advance in technology, or rather personal technology, namely the smartphone revolution, has fueled interest in AR and inspired a lot of new amateur or independent¹
∗For the Spring 2011 Augmented Reality class, Aarhus University, Department of Computer Science
†Visiting Aarhus University
¹Non-researchers
applications utilizing AR. However, even with this newfound popularity and availability to the public, the field of AR is still on the same track.
Some researchers take a different approach using the Scandinavian Model of participatory design (i.e. [5] and [3]). As we will argue later, this approach has its own merits, and our suggestion in principle springs from the same basis.
In the light of these concerns we can identify two elements in current AR research that we can improve on: the application contexts, including their relevance with respect to users, and the scope of technology used in AR systems. In other words, it is time to expand the use of technologies and areas of application in the field of AR. We will in this article present our suggestion for a possible next step for AR. In section 2 we discuss problems with the current approach; in the following sections (2.1 and 2.2) we introduce the foundation for a new approach to Augmented Reality. In sections 3.1 and 3.2 we introduce and elaborate on our suggested approach. Finally, we conclude in section 4.
2. THE FUTURE OF AR
One of the problems for current AR applications and research prototypes is a lack of content or a suitable application context. Some research projects (e.g. [3]) have hit an application area where the augmenting technology seems to have the potential to aid the target user group in their core function. However, the common case (e.g. [2, 6, 4]) seems to work from a technological basis, following Azuma's [1] definition of Augmented Reality closely: combine real and virtual objects in a real environment, in real time and interactively, as well as registering (aligning) real and virtual objects with each other.
Most of the current research in the field of AR is basic research. This scientific process is rather slow moving and does not take full advantage of the possibilities in research. Basic research tends to focus primarily on enabling technologies, developing and exploring the basic ideas of AR.
We thus base our suggestion for a new direction for AR research on the needs of users in their everyday lives and their working lives, in order to get a more relevant application context and content for AR prototypes. We do not suggest or imply that our approach should replace current research directions in AR; however, we do argue that the field of AR stands to gain from an increased focus on real-life applications and users.
2.1 Content for AR Applications
The main idea is that better content and/or contexts make for better AR research and prototypes. The reason to focus on relevant real-life applications and not simply made-up use cases is that the better (real-life) applications are in a better position not only to drive the research into new and perhaps groundbreaking (within the field) projects, but also to drive the research in a meaningful direction: the implications of AR for interface design and user interaction have gained interest over the last ten years, and while this can be studied in the laboratory, using real users and contexts facilitates a more insightful and empirical inquiry into the usability aspects of AR. Experience tells us that users often have different perspectives and opinions on the problems and challenges we present them with. These user insights may very well be able to push AR research towards new results.
2.2 A Need for Cross-Field Research
What then can give us better content and better application contexts for our AR projects, other than using user-needs-based contexts as a starting point? We believe that the AR field can make good use of other related research fields. Some AR applications have already taken up position-based technologies (e.g. Layar²), and pervasive positioning offers both technologies and concepts that may be useful in an AR context. Positioning and tracking are some of the core problems for AR and extend beyond tracking hand or head movement. Users make use of numerous artifacts and objects, both static and dynamic in position, every day. As an example, consider the number of people travelling by bus or other forms of public transport, and how the bus is never on time. Various possible applications spring from this simple part of the user's life: an iPhone app that functions as a window into the city's public transportation world, or perhaps an augmented wall in the home instead of a smartphone. What are the usability implications of having a wall or window in the home that hides the city and shows exactly where the bus to work is?
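As a minimal sketch of what such a bus window might compute, the snippet below selects which approaching bus, if any, the wall should overlay. The `BusSighting` record, its field names, and the data feed are our own assumptions for illustration, not part of any existing transit API:

```python
from dataclasses import dataclass

@dataclass
class BusSighting:
    line: str           # bus line identifier, e.g. "5A" (hypothetical)
    eta_minutes: float  # estimated minutes until arrival at the user's stop

def overlay_for_wall(sightings, watched_line, horizon=15.0):
    """Pick the bus the augmented wall should highlight.

    Only buses on the user's commute line arriving within the display
    horizon are worth rendering; the soonest one wins.
    """
    candidates = [s for s in sightings
                  if s.line == watched_line and s.eta_minutes <= horizon]
    if not candidates:
        return None  # nothing relevant: the wall can stay a plain wall
    return min(candidates, key=lambda s: s.eta_minutes)
```

The point of the sketch is not the filtering itself but that the overlay decision is driven by the user's real context (their commute line, their stop) rather than by what the technology happens to be able to track.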
With AR systems augmenting the user's everyday life, workday etc. with context-sensitive information, overlays and objects, in ways that are more or less invasive in the user's everyday life, these systems could benefit from having an element of intelligence or autonomy. It would be possible to create prototypes that were usable in more than one application context, as well as facilitate different uses: there is no need for the wall to display where the bus to work is if the workday is over. Thus the field of Artificial Intelligence can provide underlying technologies that can be useful in creating AR systems that are extendable to more than one specific problem domain, and certainly in common-use systems with more than one application.
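The simplest form of such autonomy is time-based gating of overlays. The sketch below suppresses the "bus to work" overlay outside a commuting window; the window boundaries are assumed values for illustration, and a real system would of course learn or configure them per user:

```python
from datetime import time

def commute_overlay_active(now, window_start=time(6, 30), window_end=time(9, 0)):
    """Decide whether the 'bus to work' overlay is relevant right now.

    A minimal stand-in for the autonomy discussed above: outside the
    (assumed) morning commute window the same wall is free to serve
    other application contexts.
    """
    return window_start <= now <= window_end
```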
²http://www.layar.com

2.3 Summation
The field of AR presents some interesting initiatives for altering the way users interact with computers, and lays the foundation for fundamentally changing the way computers can benefit users throughout their lives. AR research should utilize these resources as well as input from other research fields to drive the research forward.
3. A FUTURE WITH PERSONAL (AR) PETS
An exemplification of our suggestion then becomes utilizing the mentioned research fields to facilitate the use of more relevant application contexts, based on users. This can in turn facilitate the development of AR applications that are cross/multiple-domain and support different scenarios/perspectives for each domain, as opposed to single-purpose systems.
3.1 Personal Pets
Research fields in AR should be broadened with related Artificial Intelligence (AI) research. AR technologies surround people, and a person by him/herself will not be able to make all decisions in a short period of time, even using AR's time-saving advantages. People interested in AR in their daily lives have been using some kind of goggles that turn them into rough RoboCops. These cyborgs are sacrificing their health for the sake of AR research and some individual improvements in their lives.
In terms of using AR in daily lives, cyborgs are sitting in the ultimate place, sharing their lives with AR technologies. Besides the cyborgs, people need to use technology deeply, not by themselves, but perhaps through their personalized robots, say, artificial pets. The brain of these personalized robots will be structured on many kinds of data, like the user's behavior in manipulating the data: how it is used, and when or why it is edited. In addition, the history of data will be based on the user's social network accounts (e.g. Facebook, StumbleUpon, Reddit, discussion boards) and of course the data itself. The user's data will be valuable input for the artificial pet, allowing it to react with the best timing and in the best way.
3.1.1 Pets as assistant and guide
Using AR just by looking at a monitor and trying to reach the right decision takes time, and we may not always come up with the best decision. An AI is needed to get quick and comparatively better results.
To illustrate with an example, users will give the pet their shopping list before leaving home. The pet will then return the best shopping trip options (e.g. Google Maps destination options) and can take the user to the shops. These pets do not have to be in regular pet shape; they can also come in flying versions. The pet can reach the destination by walking or flying (depending on the robot's abilities) and the user will track the pet. Another version of the trip would be the pet simply accompanying the user when driving a car or bike to a destination. If users decide to walk, they follow the pet and it takes them to the shopping points. If users decide to go by vehicle, the pet behaves like a GPS navigation system and tells them where to turn or stop (the indications may be via 3D arrows popping up on the street, or just via a voice message from the robot). Users need a couple of technologies to carry along with them: a backup hard drive, or even a mobile access point to be used anywhere. All needed peripherals can be added to and upgraded on the pet.
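In a first approximation, the pet's trip planning reduces to covering the shopping list with a set of stops. The greedy sketch below is one naive way to do that; the shop names and inventories are invented for illustration, and a real planner would also weigh distance, opening hours and traffic:

```python
def plan_shopping_trip(shopping_list, shop_inventories):
    """Greedily pick shops until every item on the list is covered.

    Returns the ordered list of stops and any items no shop could supply.
    This sketch only covers item availability, not route quality.
    """
    remaining = set(shopping_list)
    stops = []
    for shop, stock in shop_inventories.items():  # insertion order = visit order
        found = remaining & set(stock)
        if found:
            stops.append(shop)
            remaining -= found
        if not remaining:
            break
    return stops, remaining
```

The uncovered remainder is exactly what the pet would report back to the user before leaving home, which is the kind of quick pre-decision the scenario describes.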
3.1.2 Pet as partner
As we can observe in a park or at the seaside, sometimes two dogs try to get close and force their owners to get close to the other pet. With these AI pets, it will be the same: they will try to match their semantic information in the cloud. The owners may have mutual friends on Facebook and it would be nice to meet. Or one of the owners is looking for a rental house and the other wants to rent his house out. It could also be used in hitchhiking or couchsurfing. In hitchhiking, pets can decide on a driver to pick them up if one is available. Then, when accepted by the car driver, the pet takes the owner to the agreed meeting point.
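One way to sketch this matching, under the assumption that each pet holds a small cloud profile of its owner: the field names, scoring weights and threshold below are purely illustrative, not a claim about any real matching service.

```python
def pets_should_introduce(profile_a, profile_b, threshold=2):
    """Score the overlap between two owners' cloud profiles.

    Mutual friends count once; complementary intents (one owner offers
    what the other seeks, e.g. a rental house or a ride) count double,
    since they are the stronger reason to meet.
    """
    mutual_friends = profile_a["friends"] & profile_b["friends"]
    complementary = ((profile_a["offers"] & profile_b["seeks"])
                     | (profile_b["offers"] & profile_a["seeks"]))
    score = len(mutual_friends) + 2 * len(complementary)
    return score >= threshold
```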
3.2 Augmented Shopping
Shopping centers may have this kind of robot (maybe flying ones like the Parrot AR.Drone³ or presence robots like Anybots⁴) just for augmented shopping. Users could add all items to the shopping cart, and the list appears on the cashier's monitor. The pet will await the user's confirmation and then bring the packages home. The bill can be paid remotely by the user using an augmentative device, perhaps an AR checkbook utilizing an AnotoPen?
4. CONCLUSIONS
In this article we have examined and discussed the current approaches to AR research. We have then presented a suggestion for a new track in AR research, not necessarily to replace current directions of research, but to augment them, as well as to harness the available resources, the users, to a more extensive degree. Finally, we presented examples of futuristic AR research projects springing from our suggested approach.
We started this paper by identifying two elements of AR research with room for improvement (see section 1) and presented two initiatives for improvement (see 2.1 and 2.2), with an exemplification of our suggestion in section 3.
5. REFERENCES
[1] R. Azuma, Y. Baillot, R. Behringer, S. Feiner, S. Julier, and B. MacIntyre. Recent advances in augmented reality. IEEE, 2001.
[2] V. Buchmann, S. Violich, M. Billinghurst, and A. Cockburn. FingARtips: Gesture based direct manipulation in augmented reality. Venue and year not stated.
[3] K. Grønbæk, P. Ørbæk, J. F. Kristensen, and M. A. Eriksen. Physical hypermedia: Augmenting physical material with hypermedia structures. Proceedings of HT '03, 2003.
[4] T. Hollerer and J. Pavlik. Situated documentaries: Embedding multimedia presentations in the real world. Proceedings of ISWC '99, 1999.
[5] W. E. Mackay. Augmented reality: Linking real and virtual worlds. A new paradigm for interacting with computers. Proceedings of AVI '98, 1998.
³http://store.apple.com/us/product/H1991ZM/A?fnode=MTY1NDA3NA&mco=MjEzMDMyNTY
⁴http://www.anybots.com/#front
[6] I. Poupyrev, D. Tan, M. Billinghurst, H. Kato, H. Regenbrecht, and N. Tetsutani. Tiles: A mixed reality authoring interface. Venue and year not stated.
[7] J. Rekimoto and Y. Ayatsuka. CyberCode: Designing augmented reality environments with visual tags. Venue and year not stated.
[8] J. Rekimoto and M. Saitoh. Augmented surfaces: A spatially continuous work space for hybrid computing environments. Venue and year not stated.
[9] J. Rekimoto, B. Ullmer, and H. Oba. DataTiles: A modular platform for mixed physical and graphical interactions. Proceedings of SIGCHI, 2001.