
[bits] Interactive Ecologies

RMIT Industrial Design . Pre-Major Project 7

2011

BENJAMIN CREEK

FORGETTING TO REMEMBER
RECONNECTING OUR ARTEFACTS AND HISTORIES


CONTENTS

INTRODUCTION

RESEARCH & INFLUENCES

SKETCHES

EXPERIMENTS

INITIAL CONCEPT

DEVELOPMENT

DETAILED DEVELOPMENT

PROTOTYPE

CONCLUSION

APPENDIX


INTRODUCTION


This project sits somewhere between a new media arts project and an interactive projection environment. Examining our use of memory tools and how our environments affect us as occupants and users, this project ultimately attempts to provide a tool that allows people to create their own augmented reality.

From a technological standpoint, this project focuses on current developments in portable computing and mobile augmented reality systems such as Pranav Mistry's SixthSense. It also looks into open-source communities such as those centred around Processing and Arduino. In terms of technological media, this project looks at projection as an interface element and the possibilities of micro projectors in portable display systems.

Theoretically, this project has just as much to do with the Situationists and ideas around psychogeography as it does with technological development. Looking into ways in which we navigate and arrange data in physical space, this project has much to do with the idea of souvenirs (from the French for "to remember"), the spatial arrangements of wonder rooms and the role of artefacts in stimulating memory. Memory techniques such as the Roman Room technique and the psychological aspects of our built environment as speculated on by Gaston Bachelard in The Poetics of Space also form part of the theoretical grounding for this project.

The use of off-the-shelf components in unintended ways also sees this project aligned with the modding and DIY communities, through the adaptation of the techniques used in the Wii Remote whiteboard mods. If this system were extended into a public installation, the project would also have the potential to be a public intervention and an alternative street art medium.

This varied and diverse set of influences all played a role in informing the decisions made in the development of this project. Additionally, the important and growing interest among industrial designers in the process of alienation from our objects and commodities played a central role in defining the parameters for the outcome of this project. It is important to examine our relationship to commodities and to look at ways to reconnect with our histories and memories. It is also essential to minimise our impact on the environment and to consider the need for creating new objects. For these reasons, a digital element was always central to this project.

The final outcome of this project was the construction of a working prototype of an interactive Flashlight system that allows people to navigate their digital archives by exploring physical space.

Most importantly, it was essential to this process that all plans, ideas and information developed from this project were uploaded to the internet, both to allow others to construct their own Flashlight system and to facilitate further developments within the relatively young Processing community. As well as allowing others to continue to develop or shift the findings of this project, engaging with these online communities will allow this project to receive important peer feedback that may influence any future direction it takes.


PROJECT STATEMENT

With the increasing impermanence of our material surrounds, how can we find new ways to explore our digital artifacts using our spatial environments?

ABSTRACT

In response to the transition from the physical to the virtual and the cultural shifts this has caused, this project harnesses our obsession with the visual and our capacity for spatial memory in an attempt to re-engage with how we navigate our digital artifacts.

Using a mobile projection interface, it is hoped that this project will create a more intuitive and interesting way to explore our cultural signifiers, such as personal and found photo media. By creating an experience that allows users to navigate their digital photo libraries in a way that resembles navigating physical space, the system will utilise spatial memory techniques currently unrealised in our conventional methods of viewing digital artifacts.


RESEARCH & INFLUENCES


INITIAL RESEARCH SCOPE

This project looks at the effect of impermanent artefacts on our memories and what we can do to prompt us to recall these connections in the domestic environment.

We live in a digital world filled with multi-functional devices and transitory objects. Obsessed with visual culture, we are heading towards a post-object existence. In such a scopophilic society, vision is privileged and we become adept at looking, but this has led to a loss of connection to our histories and artifacts on several levels. Most obviously, we have lost many of these visual signifiers from our daily lives as we minimise the range of objects that we possess. Multi-functional and multi-purpose devices have reduced the number of personal and domestic tools we use, and the consumption trends of late capitalism see us transitioning through the objects we do possess at an alarming rate, preventing us from attaching memories, meaning and significance to them. Paradoxically, as we reduce this ongoing connection to these things, we concurrently become inundated with images and representations in this post-object society and often struggle to cope with the sheer volume of digital information that we possess.

This project attempts to harness our short attention span for visual culture, our tendency to glance, in a system that reintroduces these connections to histories and artifacts. By providing a better link to histories and artefacts, we may be able to lead a more enriching life. Based on the initial research performed, the most successful outcome for this project will involve the creation of an interactive system that utilises visual projection to embed a virtual layer onto physical space. This system will focus on sight and touch as the core feedback modalities.

INITIAL ABSTRACT

This project attempts to harness our short attention span for visual culture, our tendency to glance and spatial memory techniques, in a system that re-engages with how we navigate our digital artifacts.

By using a mobile projection interface I hope to create a more intuitive and intriguing way to explore our cultural signifiers, such as personal and found photo media. By creating an experience that allows users to navigate their digital photo libraries in a way that resembles navigating physical space, the system will utilise spatial memory techniques currently unrealised in our conventional methods of viewing digital artifacts.


PSYCHOGEOGRAPHY & THE FLÂNEUR

Psychogeography explores how elements of the built environment can affect a person’s behavior and their perception of that space. I was drawn to these areas with an interest in exploring whether the ways in which we navigate physical space might also apply to the ways we navigate and visualize our digital environments. Since these digital environments have affected how we store, view and share much of our memory stimuli (such as digital photographs and videos), we may be able to apply similar theories to the digital environment that the Situationists applied to physical space.

The Flâneur - that particular kind of drifter described by Charles Baudelaire - is simply an individual who strolls around the city in order to experience it. This intuitive process could be used to find more playful and expressive ways of connecting to our histories in a system which explores the signifiers that spark our memories.

Various graphic depictions of the Situationist concept of Psychogeography and the Flâneur.


CABINET OF CURIOSITIES

A cabinet of curiosities was an encyclopaedic collection of objects whose categorical boundaries were yet to be defined, assembled from the Renaissance through to the Victorian era. They were also known by various other names, such as Cabinets of Wonder and, in German, Kunstkammer or Wunderkammer (art-room or wonder-room). These spaces were in many ways the precursors to museums and galleries, presenting the viewer with a loosely arranged collection of artifacts meant to spark imagination, memory and mystery.

BLOGS: THE NEW CABINET OF CURIOSITIES

Several internet bloggers describe their sites as Wunderkammer, either because they are primarily made up of links to random things that are interesting or because they inspire wonder in a similar manner to the original Wunderkammer. Interestingly, blogs function as digital Wunderkammer and are therefore removed from the physical signifiers that they represent. Robert Gehl describes internet video sites like YouTube as modern-day Wunderkammers and discusses how they too might be refined into capitalist institutions, “just as professionalized curators refined Wunderkammers into the modern museum in the 18th century.”1

1 http://en.wikipedia.org/wiki/Cabinet_of_curiosities

ROMAN ROOM MEMORY TECHNIQUE

The Roman Room is a memory technique that works by coding information through attaching it to memories of known objects in a room or several rooms. The technique is an ancient and effective way of remembering unstructured information, where the relationship of items of information to other items is not important. It functions by imagining a room, such as your bedroom, and associating an image with each of the objects that you know to be in that room. In order to recall information, one simply takes a mental tour around the room, visualising the known objects and their associated images. The technique can be expanded by going into more detail and keying images to smaller objects. Alternatively, you can open doors from the room you are using into other rooms and use their objects to expand the volume of information stored. Different rooms can be used to store other categories of information. The technique can be applied to other spaces as well, with some users having envisioned a view or a town they know well and populated it with memory images.

This overlaying of information onto mental maps of geographic space overlaps with many ideas explored in the area of augmented reality (which is explored in more depth below) and also illustrates the brain's ability to store spatial information and trigger memories through geographic information.
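A minimal Processing (Java) sketch can make the structure of the technique concrete: an ordered tour of known objects, each keyed to a piece of information to be recalled. The rooms, objects and associations below are illustrative placeholders only, not part of the technique's original description.

// Sketch of the Roman Room structure: an ordered tour of known objects,
// each keyed to an item to remember. All names are illustrative placeholders.

String[] tour = { "door", "bed", "desk", "window", "bookshelf" };
HashMap<String, String> associations = new HashMap<String, String>();

void setup() {
  associations.put("door",      "opening line of the speech");
  associations.put("bed",       "first argument");
  associations.put("desk",      "supporting statistic");
  associations.put("window",    "counter-argument");
  associations.put("bookshelf", "closing remark");

  // Recall is a mental walk through the room, object by object, in order.
  for (String object : tour) {
    println(object + " -> " + associations.get(object));
  }
}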

Graphic depiction of a Curiosity Cabinet. Graphic depiction of a Wonder Room. Screen capture of a blog website.


DESIRE LINES

The term "desire lines" was first developed by the French philosopher and poet Gaston Bachelard in his book The Poetics of Space. It refers to a track developed by the erosion caused by animals or humans repeatedly walking along an unmarked route. They are a common sight: a loosely defined line offering an alternative to the path a council has commissioned. The paths take on an organic appearance, being unbiased toward existing constructed routes. They are almost always the most direct and the shortest routes between two points, though not always the routes anticipated by urban planners. The phenomenon has in fact been used by urban planners in places such as Finland, where planners are known to visit their parks immediately after the first snowfall, when the existing paths are not visible.

Navigating space contrary to the presanctioned manner is nothing new, but I find that within much of my digital environments the paths available to navigate and explore information can be rigidly defined and less than intuitive. Personal digital archives, such as photo albums and home video collections, which many people use to connect with their histories, are a logical playground for these ideas to be implemented.

SOUVENIRS - OBJECTS & PHOTOGRAPHS

Deriving from the French word for ‘to remember’, a souvenir is a memento, keepsake or token of remembrance acquired for the memories the owner associates with it. The term souvenir brings to mind the mass-produced kitsch that is the main commodity of souvenir and gift shops in many tourist traps around the world, but a souvenir can be any object that can be collected or purchased and transported home by the traveller.

A souvenir as an object in itself has no real significance other than the psychological connection the possessor has with the object as a symbol of past experience. Without the owner’s input, the object’s meaning is invisible and cannot be articulated. With many souvenirs mass-produced in places that have no relation to the tourist site beyond the image represented in the object, the signifier role of the object is reinforced as its worth is predominately determined by its relationship to its owner.

The photograph would have to be the most common form of souvenir, serving as a medium to document significant events. The photograph itself carries much theoretical baggage when considered as an object of discursive analysis, particularly in relation to 'seeing' and the role of the viewer. The question of the photograph's status as an 'original' or 'copy' is also at play, as is the photo's ability to substitute for a tangible object in the role of stimulating memory.

As with our increasing consumption of physical objects, our dramatic increase in consumption of digital objects has been aided by developing technologies. Our ability to capture vast quantities of photographs with the advent of the digital camera, and to store them permanently with cheap data storage, has revolutionised the photography industry. Without film restricting our capacity to take photographs or the limitations of physical storage space, the photographer can store an almost infinite number of images. With so many images in our possession, it becomes difficult to develop meaningful relationships to these objects individually, and we become attached to the collection - and collecting more broadly - rather than having intimate relationships with a special few.

Photograph of a desire line within a public park. Film still from Amélie, depicting the emotional response to memory-stimulating objects.


PLANNED OBSOLESCENCE AND MASS CONSUMPTION

Planned obsolescence is a policy or process employed by industry where companies deliberately plan or design a product with a limited useful life, so it will become obsolete or nonfunctional after a certain period. This puts the consumer under pressure to purchase the product again, either from the same manufacturer or from a competitor, creating artificial demand in that industry.

Planned obsolescence was first developed in the 1920s and 1930s when mass production became the industry norm. Since this period - and particularly since the post-war boom - mass consumption has also become normalised. Henry Ford, the father of mass production, and his peers understood that mass production would necessarily lead to mass consumption, and a range of tools have been used to ensure this continued relationship, including advertising, planned obsolescence and a culture of competitiveness.

ALIENATION

Alienation is a process identified by Karl Marx whereby people become foreign to the world they are living in. Individuals often seek to mitigate their alienation through the acquisition of goods, in the hope of buying their way out of their alienated state. It has been argued that by late capitalism the whole sphere of personal consumption had been reorganised according to commercial principles, where cultural products also gain "a life of their own" completely independently of their producers.

SCOPOPHILIA AND GLANCE CULTURE

Scopophilia is literally "a love of looking". Many theorists, including Roland Barthes, have argued that we live in a visual culture where vision is privileged above all other senses. However, in this image-saturated world, while we have become extremely visually literate, we are inundated with so many images that many of them don't register on more than a superficial level. This has led to a glance culture where we no longer pay strict attention to the images before us, often at our own expense. For example, it has been said that the average time spent before an artwork in a gallery or museum is less than 8 seconds.

Photograph of discarded televisions on the pavement.


PROJECTION MAPPING

3D projection is a method of mapping three-dimensional points to a two-dimensional plane. An extension of this principle is projection mapping, a technique of beaming video (with a standard video projector) onto three-dimensional objects and adjusting and masking the image so that it follows the contours of the shape of the target object. The result can be surprisingly effective and eye-catching, as the video is no longer a flat square on the wall but becomes an object in space - in effect, an animated sculpture. Projection mapping isn't a new concept, but with the increase in cheaper technology capable of creating this effect, the technique is becoming more commonplace. It has come to be used more and more in commercial contexts, but the modding and artist communities still often use the technique in unconventional and interesting ways.
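As a rough illustration of the 3D-to-2D step that projection mapping builds on, the hedged Processing sketch below uses the built-in screenX() and screenY() functions to find where the corners of a virtual 3D face land on the 2D canvas, then draws a flat quad over that footprint (where masked video would be textured in a real mapping setup). The geometry and values are arbitrary examples, not taken from any particular installation.

// Core 3D-to-2D step behind projection mapping: find where the corners of
// a 3D face land on the 2D canvas, then draw or mask content onto exactly
// that 2D region. The box and its placement are illustrative values.

void setup() {
  size(800, 600, P3D);
}

void draw() {
  background(0);
  pushMatrix();
  translate(width/2, height/2, 0);
  rotateY(frameCount * 0.01);

  // One square face of a virtual object in 3D space.
  PVector[] face = {
    new PVector(-100, -100, 100), new PVector(100, -100, 100),
    new PVector(100, 100, 100),   new PVector(-100, 100, 100)
  };

  // Project each 3D corner to its 2D screen position.
  float[][] screenPts = new float[4][2];
  for (int i = 0; i < 4; i++) {
    screenPts[i][0] = screenX(face[i].x, face[i].y, face[i].z);
    screenPts[i][1] = screenY(face[i].x, face[i].y, face[i].z);
  }
  popMatrix();

  // Draw a flat quad over the projected footprint - in a real mapping
  // setup this is where the masked video would be textured.
  noStroke();
  fill(255, 80);
  beginShape();
  for (int i = 0; i < 4; i++) vertex(screenPts[i][0], screenPts[i][1]);
  endShape(CLOSE);
}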

FOLLOWING FORM

Artist Barbara Kruger is famous for her use of type-immersed spaces which literally skin gallery spaces with text, making the surfaces of the space into the works themselves. Designer Ian de Gruchy worked with Kruger on her show at ACCA, helping her transpose her text works into mapped projection.

Diagram of projection morphing. Photograph of a projection mapping installation. Photograph of graphics following form. Photograph of an installation by Barbara Kruger. Photograph of a projection mapping installation by Ian de Gruchy.


AUTO CALIBRATION - PROJECTOR-BASED LOCATION DISCOVERY AND TRACKING

Johnny Chung Lee's PhD study at Carnegie Mellon University's Human-Computer Interaction Institute employed the basic idea of using projected light to discover the locations of optical sensors. This location data can be fed back to the computer for use in a projected application. The result is a significant simplification that enables interactive projection and augmented reality applications, eliminating the need for manual calibration and for an external tracking technology.

FOUR POINT CALIBRATION SURFACE MAPPING

Using the Wii Remote to track sources of infrared (IR) light, it is possible to track pens that have an IR LED in the tip. By pointing a Wiimote at a projection screen or LCD display, you can run software to calibrate the area by mapping it to a grid using the four point calibration method formalised by Johnny Lee.
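The sketch below is a deliberately simplified illustration of the calibration idea, assuming the Wiimote sees the screen roughly square-on so that two recorded corners (an axis-aligned rectangle) are enough; Johnny Lee's Wiimote Whiteboard software solves the full four-point perspective warp. The mouse stands in for the IR camera reading, since the Wiimote bridge itself sits outside Processing.

// Simplified calibration sketch: record the tracked IR position while the
// pen sits on two on-screen corner markers, then map later camera readings
// into projected-screen coordinates. mouseX/mouseY stand in for the
// Wiimote's IR camera coordinates; a real setup solves a perspective warp.

PVector camTopLeft = new PVector();
PVector camBottomRight = new PVector();
int cornersDone = 0;

void setup() {
  size(800, 600);
}

void mousePressed() {
  // Record the camera reading for each calibration corner in turn.
  if (cornersDone == 0) {
    camTopLeft.set(mouseX, mouseY);
    cornersDone = 1;
  } else if (cornersDone == 1) {
    camBottomRight.set(mouseX, mouseY);
    cornersDone = 2;
  }
}

void draw() {
  background(0);
  fill(255);
  if (cornersDone < 2) {
    text("Click the IR pen on corner " + (cornersDone + 1) + " of 2", 20, 20);
    return;
  }
  // Map the live camera point into projected-screen coordinates.
  float sx = map(mouseX, camTopLeft.x, camBottomRight.x, 0, width);
  float sy = map(mouseY, camTopLeft.y, camBottomRight.y, 0, height);
  fill(0, 255, 0);
  ellipse(sx, sy, 12, 12);  // the calibrated cursor
}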

WATCHOUT - MULTI-DISPLAY PRODUCTION AND PRESENTATION SYSTEM

Dataton WatchOut is the leading multi-display production and playback system. The software-based system can be used to orchestrate stills, animations, graphics, video, sound and live feeds in a single impressive show across multiple display areas, soft-edged or scattered. RMIT has a three-projector WatchOut setup installed in Building 8 which I hope to use for simulating and testing my project.

MULTI-POINT INTERACTIVE WHITEBOARDS USING THE WII REMOTE

Since the Wiimote can track sources of infrared (IR) light, you can track pens that have an IR LED in the tip. By pointing a Wiimote at a projection screen or LCD display, you can create very low-cost interactive whiteboards or tablet displays. Since the Wiimote can track up to four points, up to four pens can be used. It also works well with rear-projected displays.

Photograph of the projection auto calibration procedure. Photograph of the IR four point calibration procedure. Photographs of Wii IR whiteboard installations. Photographs of WatchOut multi-display installations.


AUGMENTED REALITY

Augmented reality (AR) is a term for a live direct or indirect view of a physical real-world environment whose elements are augmented by virtual computer-generated imagery.

AR systems typically generate a composite view for the user that is a combination of the real scene viewed by the user and a virtual scene generated by the computer that augments the scene with additional information. The virtual scene generated by the computer is designed to enhance the user’s sensory perception of the world they are seeing or with which they are interacting.

MULTIPLE INFORMATION SPACES

It is possible to create multiple information spaces whereby some digital information sits outside of the throw of a projector. Rather than changing the canvas to fit the viewing space, it is possible to change the position of a portable or micro projector to reveal information displayed outside the original position. This creates an illusion of exploring large information spaces embedded in the physical environment as if using a flashlight.

Using this technique, the display and interaction space can be expanded to cover almost an entire physical environment and support interaction concepts that are not possible on traditional desktop or handheld devices.
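A minimal Processing sketch of this idea, under the assumption that the device's aim point is already known (here it is simply steered with the mouse), treats the projector's visible frame as a small window onto a much larger image. The filename and dimensions are placeholders.

// The "flashlight" idea: the information space is far larger than the
// projector's throw, and moving the device reveals a different window onto it.
// The mouse stands in for where the projector is aimed; "large_space.jpg"
// is a placeholder for the embedded information space.

PImage infoSpace;
int viewW = 320;   // the projector's visible window, in pixels
int viewH = 240;

void setup() {
  size(320, 240);
  infoSpace = loadImage("large_space.jpg");  // e.g. a 3000 x 2000 px canvas
}

void draw() {
  // Aim point within the large space, steered here by the mouse.
  float aimX = map(mouseX, 0, width, 0, infoSpace.width - viewW);
  float aimY = map(mouseY, 0, height, 0, infoSpace.height - viewH);

  // Only the window under the "beam" is ever drawn / projected.
  PImage beamView = infoSpace.get(int(aimX), int(aimY), viewW, viewH);
  image(beamView, 0, 0);
}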

Building on these unique affordances of handheld projectors, Xiang Cao and Ravin Balakrishnan have explored techniques for dynamically defining and interacting with multiple information spaces embedded in the physical environment using projection. Enriching the interaction possibilities and leveraging the human ability to perform bimanual tasks, they used a passive pen to support annotations and local interactions.

VIDEO PAINTING - SWEATSHOPPE

Multimedia performers Sweatshoppe have recently been pasting buildings all over New York with moving images. Mapping video projections to LED-lit paint rollers, Sweatshoppe lay their projections onto a surface, paint stroke by paint stroke. They call this new digital performance style "Video Painting".

How it works: the software controlling the video was written in Max. The paint roller does not use any sort of paint; it simply contains green LEDs. The software tracks the colour green and outputs the x and y positions, which are sent to drawing commands, and the strokes are textured with video.
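As a rough Processing approximation of the tracking step described above (Sweatshoppe's own tool was written in Max, and this is not their code), the sketch below scans a webcam frame for the pixel where green most dominates red and blue and treats its x/y position as the brush location. The threshold value is an assumption, and the video texturing of strokes is omitted.

// Rough green-tracking sketch: find the most green-dominant pixel in the
// camera feed and treat its x/y as the brush position. Requires the
// Processing Video library; strokes here are plain marks, not video.

import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  cam = new Capture(this, 640, 480);
  cam.start();
  background(0);
}

void draw() {
  if (!cam.available()) return;
  cam.read();
  cam.loadPixels();

  // Scan for the pixel where green most dominates red and blue.
  int bestX = -1, bestY = -1;
  float bestScore = 60;  // minimum "greenness" threshold (assumed value)
  for (int y = 0; y < cam.height; y += 2) {
    for (int x = 0; x < cam.width; x += 2) {
      color c = cam.pixels[y * cam.width + x];
      float score = green(c) - max(red(c), blue(c));
      if (score > bestScore) { bestScore = score; bestX = x; bestY = y; }
    }
  }

  // Lay down a "paint stroke" wherever the green LED roller was seen.
  if (bestX >= 0) {
    noStroke();
    fill(255);
    ellipse(bestX, bestY, 20, 20);
  }
}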

Sweatshoppe is video artists Bruno Levy and Blake Shaw. They plan on eventually releasing the software, but only after it is much more refined, buffed up with features and is user-friendly.

Photographs of Xiang Cao and Ravin Balakrishnan's multiple information spaces prototypes. Diagram explaining how digital information sits outside of the throw of a projector. Photographs of Sweatshoppe's Video Painting installations.


PERPETUAL STORYTELLING APPARATUS - JULIUS VON BISMARCK & BENJAMIN MAUS

The “Perpetual Storytelling Apparatus” is a drawing machine that illustrates a never-ending story by translating words of text into patent drawings.

Seven million patents — linked by over 22 million references — form the vocabulary. By using references to earlier patents, it is possible to find paths between arbitrary patents. They form a kind of subtext.

The machine attempts to show how new visual connections and narrative layers emerge through the interweaving of the story with the depiction of technical developments.

Basic procedure:

1. The program downloads and parses a part of the text of a recent best-selling book.

2. The algorithm eliminates all insignificant words like "I", "and", "to", "for", "the", etc. The remaining words and their combinations are the keywords for the patent drawings (this filtering step is illustrated in the sketch after this list).

3. Using the keywords in chronological order, it searches for the key-patents.

4. The program now searches for a path connecting the found key patents. This is possible because every patent contains several references to older patents – the so-called “prior art”.1

5. All key-patents and the patents connecting them semantically are arranged and printed.

6. Repeat step 1.

1 http://www.creativeapplications.net/objects/perpetual-storytelling-apparatus-objects/
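The filtering in step 2 can be illustrated with a tiny Processing sketch; the stop-word list and sample sentence below are placeholders and are not drawn from the actual apparatus.

// Tiny illustration of step 2 above: keyword extraction by removing
// insignificant words. Stop words and sample text are placeholders.

String[] stopWords = { "i", "and", "to", "for", "the", "a", "of", "in" };

void setup() {
  String text = "The machine draws a path for the story";
  ArrayList<String> keywords = new ArrayList<String>();

  for (String word : splitTokens(text.toLowerCase(), " ")) {
    boolean significant = true;
    for (String stop : stopWords) {
      if (word.equals(stop)) { significant = false; break; }
    }
    if (significant) keywords.add(word);
  }
  println(keywords);  // [machine, draws, path, story]
}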

WANDERING IN KNOWLEDGE - UNIVERSITY OF THE ARTS BREMEN

In the centre of the 15 metre high stairway at the University of the Arts library in Bremen, a sculpture of folded paper demonstrates the connection between the traditional storage medium and the digital world. This system projects automated graphics that reflect search results that have been entered into the library catalogue at that point in time. The media sculpture highlights the number of mental processes which take place simultaneously in the library.

“Altogether the media installation poses questions about the function of the information in the age of the increasing communicational isolation. In regard to the title, the visitor literally passes through the world of knowledge.”1

1 http://vimeo.com/13544405

http://wandernimwissen.wordpress.com/

Photographs of the Perpetual Storytelling Apparatus. Photographs of the Wandering in Knowledge installation.


VIRTUAL GRAVITY - SILKE HILSING

“Representing the importance and popularity of search terms in the form of physical weight.”1

Virtual Gravity is an interface between the digital and analogue worlds. Built using Processing, Arduino and reacTIVision, the installation allows you to measure the weight (i.e. popularity) of search terms by "loading" the information into pads which are then positioned on mechanical scales. Terms that "weigh" more send messages to Arduino-controlled cylinders, which lower and rise depending on the term's popularity in Google Search Insight. You can search desired terms via keyboard input operated via the same pads. In this way impalpable digital data gains an actual physical existence and becomes a sensually tangible experience.

1 http://www.creativeapplications.net/processing/virtual-gravity-processing/

http://www.silkehilsing.de/

APARTMENT - MARTIN WATTENBERG & MAREK WALCZAK

In Wattenberg and Walczak's Apartment, viewers are confronted with a blinking cursor. As they type, rooms begin to take shape in the form of a two-dimensional plan, similar to a blueprint. The architecture is based on a semantic analysis of the viewer's words, reorganising them to reflect the underlying themes they express. The apartments are then clustered into buildings and cities according to their linguistic relationships.

Each apartment is translated into a navigable three-dimensional dwelling, contrasting abstract plans and texts with experiential images and sounds.

Apartment is inspired by the idea of the memory palace. In this mnemonic technique from a pre-Post-It era, Cicero imagined inscribing the themes of a speech on a suite of rooms in a villa, and then reciting that speech by mentally walking from space to space. Establishing an equivalence between language and space, Apartment connects the written word with different forms of spatial configuration, similar to the Roman Room technique.

Photographs of the Virtual Gravity device. Illustrations of Wattenberg and Walczak's Apartment piece.


BERT BONGERS - INTERACTIVE READING TABLE

Reading Table uses physical objects that people interact with to generate particular new media content such as videos, sounds and web sites. It combines the tangibility and clarity of printed media such as books, papers and articles with the malleability and flexibility of new media. The aim is to create an integrated experience for the users, bringing together traditional media (such as books) with new media (such as video). The table uses RFID technology to link the physical objects to the media content.

INTERACTING WITH DYNAMICALLY DEFINED INFORMATION SPACES USING A HANDHELD PROJECTOR AND A PEN - XIANG CAO, RAVIN BALAKRISHNAN

The recent trend towards miniaturisation of projection technology indicates that handheld devices will soon have the ability to project information onto any surface, thus enabling interfaces that are not possible with current handhelds. Cao and Balakrishnan's project interacts with multiple virtual information spaces embedded in a physical environment using a handheld projector and a passive pen tracked in 3D. They developed techniques for defining and interacting with these spaces, and explored usage scenarios.1

1 http://www.cs.toronto.edu/~caox/uist2006_handheldprojector.pdf

Photographs of the Interactive Reading Table installation. Photographs of Xiang Cao and Ravin Balakrishnan's prototype testing.


BACKGROUND DEVELOPMENT

INTERACTIVE WALLPAPER

In 2010, as part of the Touch Points upper-pool studio, I started my inquiry into how projection touch technology could be utilised in a domestic environment. The project aimed to examine notions of the home and how an alternative interface could be utilised to enrich the lives of a house's inhabitants. The concept used projected imagery and touch surface technology to create a system which allows users to explore media in a more abstract and, in many ways, more intuitive manner than that accommodated by current household computers. It was hoped that through this alternative interface users would find their memories stimulated in a unique manner and thus expand their sense of self, improve optimism and aid their social interaction.

This project focused on the question: "how can we make a house a home?". Alain de Botton, in his popular philosophy work The Architecture of Happiness, claims "the home is a place that succeeds in making more consistently available to its occupants the important truths that the wider world ignores". The Interactive Wallpaper concept aimed to facilitate and develop this human need by allowing occupants to cover surfaces with projected media that can remind them of their history and ideals, things which can easily be lost in the midst of their daily activities.

The technical aspects of this project are illustrated on the following two pages.

Various photographs of tile graphics used in installations. Illustration from my Digital Wallpaper concept.


Diagrams from my Digital Wallpaper concept.


INITIAL SKETCHES


INITIAL EXPERIMENTS


INITIAL DRAWINGS - VISUAL LINES IN ENVIRONMENTS

A series of line drawings in which I experimented with creating abstract graphic representations of physical spaces by making lines where I perceived paths within the environment. This technique is similar to the way Zaha Hadid designs many of her buildings.

Scans of the series of line drawings.


GRAPHIC COLLAGES - IMAGES OF URBAN ENVIRONMENTS

These are a selection of graphic collages that I made whilst in Japan at the beginning of my project. Using a photocopier to create layers through photocopying photocopies, the original images become unrecognisable as their data is over-manipulated. Looking at graphic collage as a precursor to digital collage, this experiment focused specifically on ideas of automated mashing of images, distortion and the idea of images only discernible to the individual for whom they have personal significance.

Scans of the graphic collages of urban environments.


TILE GRAPHICS

One of several experiments into vector line graphics that could be tiled to form seamlessly flowing graphics on surfaces.

Scans of my tile graphic sequence. Photograph of an artist's installation using the tile graphic technique.


TILE GRAPHICS MAPPING INTERACTIVE TERRAINS

Using these tile graphics as the grid to map surfaces in the interactive terrain, this methodology determines and articulates the boundaries of the feedback zones.

PHYSICAL SIMULATION TO DETERMINE DIGITAL AESTHETIC

A simple example of some tests I did to simulate an effect I wanted to create digitally. Using a simple piece of cartridge paper on a screen, this test looked at ideas of blurring and focussing images within an interactive field. Sometimes it is easier to play with hands-on materials to test ideas than try to make them digitally.

Scans of a tile graphic concept from my Digital Wallpaper Project. Photographs of me testing a digital animation idea using physical techniques.


PORTABLE INTERACTIVE DISPLAY

Using the Wii Remote whiteboard techniques, coupled with a digital micro projector, I was able to create a highly portable interactive display. With the ability to simply pick up the device and project onto a range of surfaces, it is easy to use the Wii Remote and IR pen to transform any projected surface into a touch screen. This allowed me to test concepts without having to move cumbersome equipment.

Photographs of my mobile digital whiteboard device.


QUICK 4 POINT CALIBRATION

If the scope of the projected canvas is moved, the focus changes or anything else is altered, then the 30-second manual four point calibration technique is necessary. A similar technique could be used to manually set the projection keystone digitally, allowing the user to quickly draw out the required perspective angle.

IR PENS

In the past I have looked at infrared thimbles to generate touch screen feedback. This time I have opted for the more reliable pen model. I would prefer to eventually develop gesture-based feedback rather than point and click the IR pen to make the system more intuitive and user-friendly.

Photographs of me calibrating a Wii whiteboard installation. Photograph of a DIY IR pen. Photograph of my mobile digital whiteboard device. Photograph of my DIY IR pen.


INTERACTIVE TILE

Recreating the interactive tile simulation I made in my previous studio, this time I projected it with the micro projector. This allowed me to practically experiment with the concept as an individual tile rather than a collage of many, and also to play with scale and surfaces.

Photographs of me testing my Digital Wallpaper Project prototype.


DRAWING WITH LIGHT

This installation involved a Wacom tablet and projector and experimented with digitally adding content to environments without physically manipulating them. The technique requires you to look at the projection surface rather than the tablet or the computer screen. Although Sweatshoppe's video painting makes the tablet redundant, both techniques allow you to digitally draw - or in their case paint - directly onto the projected surface.

Photographs of my drawing with light installation.


PROJECTION MAPPING

Some of my experiments with mapping and then masking 3D surfaces with projection graphics.

Photographs of my projection mapping experiments. Screen capture of the template used to map the rooftop.


SCREENS FROM PLANES IN THE PROJECTION SPACE

What if the projection device were able to scan the 3D surfaces in its projection area and then automatically compile a collection of different sections from the different planes in the space? These sections would then form a networked set of displays, each being an independent screen zone.

Illustration using personal photo collages simulating my projected planes concept.


INTERACTIVE SCREENS FROM PLANES IN THE PROJECTION SPACE

What if these zones were then to become interactive with users manipulating their digital contents through gesture or touch?

For example, each zone could show a different image from the user's library of personal cultural signifiers. The user could navigate through the media as if flicking through a slideshow to compose their desired configuration for the projection collage.

Illustration using personal photo collages simulating my projected planes concept.


INITIAL CONCEPT


PROPOSED OUTCOME - OVERVIEW

This project attempts to harness our short attention span for visual culture, our tendency to glance, in a system that reintroduces these connections to histories and artifacts. By providing a better link to histories and artefacts, we may be able to lead a more enriching life.


PROPOSED OUTCOME - CONCEPT

As illustrated, this project attempts to harness our short attention span for visual culture using a system that reintroduces connections to histories and artifacts.

I propose to create an intuitive system that helps the user prioritise their digital objects and allows them to create more in-depth relationships with these digital artifacts. This new system will transcend the glance-based relationship we currently have with digital objects, permitting us to explore them in a more meaningful and intuitive way. More than just a digital database, this system will create an intimate, curated collection.

The outcome of this project will be the creation of an interactive system that utilises visual projection to embed a virtual layer onto physical space. This system will focus on sight and touch as the core feedback modalities.

At this stage of development, it is envisaged that this system will incorporate gesture-based interaction using Processing software, Arduino hardware and modified Microsoft Kinect sensors to facilitate live interaction within an automated projection mapping zone.

Illustration using personal photo collages simulating my projected planes concept.


PROPOSED OUTCOME - SCENARIO

INSTALLATION

The projection device is installed in the space and angled to the desired surfaces of the room.

TERRAIN MAPPING

The projection device then scans the surfaces it will project onto and automatically establishes a composition of zones to reflect the different planes of the surfaces.

AUTO COMPOSITION

The projection device then assigns a media file from the user's library to each of the zones.

USER MANIPULATION

If the user wants to change the content that the device has automatically compiled, the user gestures in the area of the zone they want to manipulate and the system will suggest another file for that zone.

SECURED COMPOSITION

Once the user is happy with the composition of the projection space, they can lock the composition and use the cultural signifiers displayed by the device as memory triggers.

Illustration using personal photo collages simulating my projected planes concept.


DEVELOPMENT


NEW DIRECTION

Reflecting on the themes of the earlier proposal, at this point in the project I shifted direction. I was keen to move away from the fixed nature of the previous concept, so I began to experiment with more portable hardware. In particular, I became interested in how I could use micro projectors to enhance the spatial environment.

As personal memories and the artifacts people use to stimulate memory were recurring themes from my earlier research, I was worried that the existing direction was too impersonal and that users of the device might be affected by the gaze of others when using the system. The transition from public planes to private portals seemed to align more appropriately with the core ideas behind the project.

From the outset of this project I have been interested in incorporating Situationist approaches to spatial experience. The existing proposal's physical restriction to the throw of a fixed projector seemed unlikely to allow for any interaction, reappropriation or engagement with space outside the immediate area. I came to realise that a portable device could engage with a much larger space, and the ability of users to use the device differently in different spaces seemed consistent with many of the Situationists' ideas.

One final concern was that the project was moving away from artifacts as memory stimuli towards being more about spatial memory and the effect that interaction with certain environments has on triggering memory. Instead of merely presenting the user with media to glance at, this new direction concentrated on providing users the ability to reveal artifacts and personally explore them.

Flash tile graphic projected on door surface navigated by IR Pen and IR whiteboard Application.


SIXTHSENSE - A MOBILE PROJECTION INTERFACE

The SixthSense prototype comprises a pocket projector, a mirror and a camera contained in a pendant-like, wearable device. Both the projector and the camera are connected to a mobile computing device in the user's pocket. The projector projects visual information, enabling surfaces, walls and physical objects around us to be used as interfaces, while the camera recognises and tracks the user's hand gestures and physical objects using computer-vision based techniques. The software program processes the video stream data captured by the camera and tracks the locations of the coloured markers (visual tracking fiducials) at the tips of the user's fingers. The movements and arrangements of these fiducials are interpreted into gestures that act as interaction instructions for the projected application interfaces.

This system, developed by Pranav Mistry, a PhD student in the Fluid Interfaces Group at the MIT Media Lab, was another great precedent for the area I'm exploring with this project. Specifically, his project aims to offer an alternative interface for many of the tasks currently undertaken by our phones and laptops, such as reading an online document or taking a photo and manipulating it. My project, on the other hand, has a much more specific desired function, concentrating mainly on navigating personal digital artifacts. This allows me to adopt some of his methods, such as cursor generation and micro projectors, and customise them to suit specific tasks. Pranav's project has to deal with the inevitable restrictions applied to multifunctional devices. Using one device to do a range of tasks adequately is very different to a single device that does a specific task very well. Pranav has more ambitious and multidisciplinary plans for his concept, but the core idea of a mobile projection interface is a shared goal.

Illustration of the SixthSense prototype. Photograph of the SixthSense prototype being tested.


SIMULATIONS OF PROPOSED PROCESSING EFFECTS

Rough simulations of the visual interface, a tiled grid structure with the main image shown and neighbouring images only partially visible.


Playing on the theme of the spotlight, this technique reveals a digital layer on the surface of an environment.

At a distance the projected image is shown in full but is blurred. When the viewer moves closer to the image it becomes sharper, but only a part of the image is shown.

I could achieve this effect by using a depth sensor and utilising the projector's fixed focus.

Illustrations simulating ideas for visual interfaces.
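A hedged Processing sketch of this distance-dependent reveal is below: far away, the whole image is shown but blurred; up close, only a sharp crop is shown. The mouse's vertical position stands in for a depth-sensor reading, and "photo.jpg" is a placeholder filename.

// Distance-dependent reveal: far = whole image, heavily blurred;
// close = tight crop, sharp. mouseY stands in for a depth reading.

PImage photo;

void setup() {
  size(640, 480);
  photo = loadImage("photo.jpg");  // placeholder image
}

void draw() {
  background(0);
  float distance = map(mouseY, 0, height, 1.0, 0.0);  // 1 = far, 0 = close

  // Crop size grows with distance; blur also grows with distance.
  int cropW = int(lerp(photo.width * 0.3, photo.width, distance));
  int cropH = int(lerp(photo.height * 0.3, photo.height, distance));
  PImage view = photo.get((photo.width - cropW) / 2,
                          (photo.height - cropH) / 2, cropW, cropH);
  view.filter(BLUR, distance * 8);
  image(view, 0, 0, width, height);
}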


PROCESSING TO CREATE IMAGE NAVIGATION INTERFACES

Building on my experiments with image tile navigation in my Digital Wallpaper Project in 2010, I am now building a Processing script that will automatically generate a related effect. The desired result will be similar to a pile of photographs stacked roughly on top of each other, so that you can see part of the lower-layer photographs but can only see the whole image of the one on top. The user will be able to gesture to reconfigure the composition to reveal an image from lower in the pile.
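The script itself is still in progress; the sketch below is a minimal stand-in for the layout it aims at, with images stacked at slight offsets so lower layers peek out and a key press (standing in for the gesture) cycling which image sits on top. The filenames are placeholders for the user's own library.

// Photo-pile layout: images stacked with slight offsets so lower layers
// peek out; a key press cycles which image is drawn on top.

PImage[] pile;
int topIndex = 0;

void setup() {
  size(800, 600);
  String[] files = { "img0.jpg", "img1.jpg", "img2.jpg", "img3.jpg" };
  pile = new PImage[files.length];
  for (int i = 0; i < files.length; i++) pile[i] = loadImage(files[i]);
}

void draw() {
  background(30);
  imageMode(CENTER);
  // Draw from the bottom of the pile up, so topIndex is drawn last (on top).
  for (int i = 1; i <= pile.length; i++) {
    int idx = (topIndex + i) % pile.length;
    float jitterX = sin(idx * 1.7) * 60;  // rough, fixed offsets per image
    float jitterY = cos(idx * 2.3) * 40;
    image(pile[idx], width/2 + jitterX, height/2 + jitterY, 500, 375);
  }
}

void keyPressed() {
  topIndex = (topIndex + 1) % pile.length;  // reveal the next image in the pile
}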

Screen capture of the photo collage layout used in the prototype for my Digital Wallpaper Project. Film still from Matthew Lewis's Cycling '74 Processing piece. Screen captures from my initial Processing tile graphic.


ALTERNATIVE SOURCES OF IR LIGHT

Infrared light comes from the sun and most other light sources, so I started to look into unconventional sources that could provide IR light points to use with the Wiimote whiteboard system.

I came across an example of a Processing sketch activated and deactivated by igniting and extinguishing a candle.

This brought me to experiment with a naked flame to create an IR point that I could use with the Wiimote to generate cursor-based navigation feedback.

IR ARRAY

Johnny Lee, the Wii remote hacking master, has an interesting technique of using an array of infrared LEDs to illuminate the area in front of his Wii Remote in order to create IR points through reflection. He was able to use the surface of his fingertips to generate IR points readable by the Wiimote software, but only at short range. In order to create reflection-based IR points at more useful ranges, he cut small cubes of highly reflective material and adhered them temporarily to his fingertips.

I made a rough array of IR LEDs to emulate the effect, but I was unsuccessful as my IR light source was not strong enough to cause sufficient reflection. This method still has potential if I could make a much brighter IR source, and is a potential avenue of investigation if other lines of research fail.

Screen capture of the Wii whiteboard software. Photograph of me testing the Wii with alternative IR light sources. Photograph of Johnny Lee testing an IR LED array with the Wii remote. Photograph of me testing the Wii with a prototype IR LED array.


STATIC IR PEN & MOVING RECEIVER

In order to cut down the number of devices within the planned system, I decided to experiment with keeping the IR pen static whilst moving the Wii Remote and micro projector to generate feedback. Although this method severely limits the accuracy of the system, the final interface needs only simple gestures to generate the desired image transitions.
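A small Processing sketch of the gesture layer this arrangement allows is below: the apparent motion of the static IR point (stood in here by the mouse) is too coarse for precise pointing, but accumulating its horizontal movement is enough to detect a left or right swipe and trigger an image transition. The threshold value is an assumption.

// Swipe detection from coarse relative cursor movement. The mouse stands
// in for the apparent position of the static IR point as the device moves.

float prevX;
float accum = 0;                    // accumulated horizontal movement
final float SWIPE_THRESHOLD = 120;  // pixels; assumed value

void setup() {
  size(640, 480);
  prevX = mouseX;
}

void draw() {
  float dx = mouseX - prevX;
  prevX = mouseX;

  // Accumulate consistent horizontal motion, letting it decay otherwise.
  accum = accum * 0.9 + dx;

  if (accum > SWIPE_THRESHOLD) {
    println("swipe right -> next image");
    accum = 0;
  } else if (accum < -SWIPE_THRESHOLD) {
    println("swipe left -> previous image");
    accum = 0;
  }
}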

Photograph of me testing a moving Wii with static IR pens. Illustration explaining how I plan to use multiple information spaces.


IR CODES

An IR remote works by turning the LED on and off in a particular pattern. However, to prevent interference from IR sources such as sunlight or lights, the LED is not turned on steadily, but is turned on and off at a modulation frequency (typically 36, 38 or 40 kHz). The time when a modulated signal is being sent is called a mark, and when the LED is off is called a space.

Each key on the remote has a particular code (typically 12 to 32 bits) associated with it, and broadcasts this code when the key is pressed. If the key is held down the remote usually repeatedly broadcasts the key code.

On the receiving end, the IR detector demodulates this signal and outputs a logic-level signal indicating if it is receiving a signal or not.

There are many online resources explaining how to create your own custom remote using Arduino. Although I found no exclusive examples of using IR signals to navigate personal digital artifacts, it seems feasible that the techniques I have come across could be applied to such a system.

As an alternative to using an RFID tag system, I could use the IR light source I need to enable the Wii Remote to be used as a cursor-based navigation tool to also act as the location sensing system. To achieve this, each beacon will have a specific IR pulsing pattern. Within the Digital Flashlight device there will be an IR receiver that will decode the pattern and activate a command within the Processing script, which will import the file library associated with that particular beacon.
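A sketch of the Processing side of this arrangement is below, assuming the Arduino has already decoded the beacon's pattern (whether IR or RFID) and sends the beacon ID as a single line over serial such as "BEACON:3". The serial port index, baud rate, message format and file naming convention are all placeholder assumptions.

// Processing side of the beacon idea: listen for a decoded beacon ID on
// the serial port and import the file library associated with that beacon.

import processing.serial.*;

Serial arduino;
PImage[] library = new PImage[0];   // images linked to the active beacon

void setup() {
  size(800, 600);
  arduino = new Serial(this, Serial.list()[0], 9600);
  arduino.bufferUntil('\n');
}

void serialEvent(Serial port) {
  String line = port.readStringUntil('\n');
  if (line == null) return;
  line = trim(line);
  if (line.startsWith("BEACON:")) {
    int beaconId = int(line.substring(7));
    loadLibraryFor(beaconId);
  }
}

void loadLibraryFor(int beaconId) {
  // Assumed naming convention: beacon_<id>_<n>.jpg in the sketch's data folder.
  library = new PImage[3];
  for (int i = 0; i < library.length; i++) {
    library[i] = loadImage("beacon_" + beaconId + "_" + i + ".jpg");
  }
}

void draw() {
  background(0);
  if (library.length > 0 && library[0] != null) {
    image(library[0], 0, 0, width, height);   // show the first image for now
  }
}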

Image of an IR code signal. Photograph of a project that uses Arduino, IR receivers and transceivers to make a custom SLR camera remote. Screen capture of a Processing sketch that decodes IR signals.


SECOND CONCEPT


THEORY AND INSPIRATION

The values expressed by Gaston Bachelard in The Poetics of Space around the emotional qualities of our residential spaces sparked my interest in voids, corners, corridors and conventionally unappreciated (but in my view important) spaces. Combined with my interest in Cabinets of Curiosity, where owners curate collections of artifacts into meaningful displays, and the Situationists' conception of the dérive, I began to look at using the spatial environment to navigate digital visual media by means of instinctive meandering.

This developed into the idea of physically exploring our digital artifacts, in particular our digital photographs, which are increasingly the prominent medium through which we collect and recall memory. This new system creates unexpected and/or previously unrealised links between images and allows the user to curate narrative more intuitively than is possible with traditional interfaces. Projection quickly asserted itself as the most appropriate means of materialising these ideas outside the screen-based nature of desktop computing. Thinking about this idea of the portable projection device, I was struck by its relationship to the flashlight. The flashlight's ability to illuminate that which is hidden or repressed, as well as its limited capacity for partial revelation, sits neatly with the aim of this project.

Image of the paperback cover of The Poetics of Space. Graphic depiction of the Situationist concept of Psychogeography. Graphic depiction of a Curiosity Cabinet. Image of a common flashlight.


DIGITAL FLASHLIGHT - FINAL CONCEPT

This project explores themes of spatial memory, narrative and the act of reminiscing.

The Flashlight System projects a digital layer onto physical surfaces, enabling people to explore digital artifacts by navigating physical space. The user can associate personal cultural signifiers, in the form of visual media, with geographic locations by linking images to specific beacon objects that the user distributes within their physical environment. The user can then access the media by physically shining the digital flashlight onto the surfaces where the beacons are located, revealing an interactive collage that can be navigated intuitively, with characteristics similar to how you operate a traditional flashlight.

Diagram explaining features of the Digital Flashlight concept.


Illustration explaining features of the Digital Flashlight concept.


IMAGE PATHS & MULTIPLE INFORMATION SPACES

It is possible to create multiple information spaces where some digital information sits outside of the throw of a projector.

Rather than changing the canvas to fit the viewing space, it is possible to change the position of a portable or micro projector to reveal information displayed outside the original position. This creates an illusion of exploring large information spaces embedded in the physical environment as if using a flashlight.

I plan to use this method to create the experience of physically exploring paths of virtual media. In particular, photo library slideshows could be navigated by moving location and thus changing the images displayed.

Diagrams explaining features of the Digital Flashlight concept.


COMPONENT INTERACTIONS

Overview of how the different components of the Digital Flash Light system will interact with each other and are positioned relative to each other.

BEACON

Arduino Mini Pro (5V)

IR Transmitter Breakout Board

Watch Batteries (5V)

DIGITAL FLASHLIGHT

Wii Remote

3M M120 Micro Projector

Arduino Uno

IR Receiver breakout board


IMAGE TILE

Simulation of the image tile that the Digital Flashlight will project onto the beacon embedded surfaces.

Illustrations simulating how the user would transition through different images within the projected interface of the Digital Flash Light. Illustration simulating how someone would hold the Digital Flash Light device.


DETAILED DEVELOPMENT


INFRA-RED LIGHT FOR NAVIGATION

Following my experiment with a naked flame as a reference point to simulate cursor navigation, I was keen to again try using an IR LED array to create reflected points of IR light. Given that Johnny Lee had successfully demonstrated that the technique was possible, I set out to replicate his findings.

Unfortunately, I could not find a cheap pre-fabricated IR LED array to purchase, so my first tests used a breadboard to make a basic IR LED array circuit. The initial configuration wasn't very bright, but it was bright enough for the Wii remote to sense reflected IR light. Since my IR array was not exceptionally bright, I found that I needed to use highly reflective surfaces, and after some further testing I found that a segment of a bicycle reflector worked well for the task.

My first working concept prototype consisted of a breadboard housing an IR array of a dozen 1.2 V IR LEDs, each paired with a 330 Ohm resistor, running off a 5 V power source (roughly (5 - 1.2) / 330 ≈ 11.5 mA per LED, a modest drive current that helps explain the limited brightness). With the breadboard positioned beneath the Wii remote, I was able to reflect enough IR light from a 10 mm square of bicycle reflector for the Wiimote whiteboard software to detect IR points using its IR camera monitor function, proving that an IR array would work with my desired application.

I then decided to invest in a commercial IR LED spotlight. The one I eventually purchased was designed to be used with IR night vision security cameras. When I set up my new array I was disappointed to find that the Wii remote was completely unresponsive to this new IR light source, and concluded that the IR light it transmitted must not be suitable for the Wii's IR camera to function. Frustrated, I decided that my best course of action would be to construct my own IR array, and so after many hours of soldering and trial-and-error circuit making I created a DIY version of what Johnny Lee had used in his experiments. Fortunately, this led to some successfully replicated results.

Photographs of Johnny Lee testing an IR LED array with the Wii remote.


Scan of sketches working through the details of the prototype. Photograph of me testing my initial IR LED array prototype with the Wii remote. Photograph of my third IR LED array prototype. Photograph of my second IR LED array prototype.

Photograph of my purchased and failed IR LED array. Photograph of an off-the-shelf IR LED array. Scan of a circuit diagram of an IR LED array. Photograph of my final IR LED array prototype.

IR LED ARRAY PROTOTYPING


PROTOTYPE FOR TESTING CONCEPTS

After constructing my IR LED array, my next challenge was to create a device that would allow me to test my Processing sketches and physically test them within the spatial environment.

As I was between houses and did not have easy access to many tools, I soon became an advocate for cable tie construction. Using kitchen renovation scraps I was able to create a handheld prototype that in most ways would function the same as my final model.

With a mess of cables sprawled towards power boards and computer outputs, I was able to start working out what I could feasibly create within the boundaries of this project.

Photograph of an early Digital Flash Light device prototype. Photograph of cable tie experiments with the Digital Flash Light device prototype. Photograph of testing a Digital Flash Light device prototype with a reflector beacon. Photographs of the construction of a Digital Flash Light device prototype.


Illustration simulating how someone would hold the Digital Flash Light device. Photograph of the developed Digital Flash Light device prototype.


PROOF OF CONCEPT PROTOTYPE

My initial prototype consisted of a rough wooden frame holding a 3M micro projector and a Wii remote mounted with my DIY IR LED array and an RFID Tag reader/USB break-out board. All of these components were held together by an assortment of cable ties and wood screws. The prototype was completed using a kitchen cupboard handle and a mixture of electrical wires running to powerpoint transformers and a VGA monitor output and a USB port on a laptop.

The proof of concept prototype was not aesthetically pleasing, but it served its function - it was a canvas ready to test the software component.

Photographs of the developed Digital Flash Light device prototype.


BEACONS

The digital flashlight required a point of reference to tag digital media and from which to create cursor movement. The solution that I found was a component named a Beacon.

Initially the IR reflection was made using bicycle reflectors. These worked, but created a cluster of IR points instead of one defined point, and they were also intrusive, both physically and aesthetically. Later I found that the reflective adhesive tape used in the marine and auto industries created a much better result, as it was visually minimal and created a single point of reflection.

The Beacon's other function was to identify the point within the environment, and to do this I employed RFID tags. Although I initially explored the possibility of using IR remote signals, I decided against that technique as I preferred the Beacons' unobtrusiveness and was concerned that having complicated electronics in each beacon would create restrictive parameters, both economically and physically. I settled on placing all the active components within the Flashlight device and leaving the Beacons as passive objects.

My first tests used card-sized RFID tags, which offer the best range on a budget. I found these too intrusive, so I opted for 16mm RFID buttons, which offer slightly less range but fit the aesthetic of the iconic push pin, which I found more suited to the project. I attempted to pierce large push pins through the RFID buttons to disguise them, but this stopped them working. Instead I had to glue them on top of shorter push pins, which after some refinement looked fine and worked well.

Photograph of the adhesive reflective tape.
Photograph of the RFID tag used in the Beacons.
Photograph of common push pins.
Photograph of the developed Beacon prototypes.
Photograph of the early Beacon prototypes.
Photograph of testing of the Beacon prototypes.

122 123

CONCEPT SUMMARY

Diagram explaining how the Flashlight System can project a digital layer onto physical surfaces, enabling people to explore digital artifacts by navigating physical space.

124 125

DIGITAL TERRAINS & MULTIPLE INFORMATION SPACES

My initial research into multiple information spaces led me to the idea of having digital information sitting outside of the throw of a projector.

The prospect of achieving this by changing the relative position of a micro projector to reveal information displayed outside the original position became a core aspect of my investigations. The illusion of exploring large information spaces embedded in the physical environment, as if using a torch to reveal surfaces, led to the device's title: Digital Flashlight.

My earlier ideas of using IR points to map planes within spaces evolved to using the IR beacons to identify zones within spaces. With this shift, my initial interest in psychogeography and desire lines merged into the idea of using the beacons as points along a path. The user is able to follow different paths by moving from one particular beacon to the next. This joining-the-dots experience referenced many areas. The act of exploring the space by impulse and revealing unplanned paths had links to the subconscious drift process of exploring built environments conceptualised by the Situationists. The ability of users to link their own digital artifacts to specific locations and then walk through this information by revealing it within the space could be seen as an extension of the Roman Room memory technique cited earlier. Another important feature of the imagined experience would be the ability of people to use the same space in different ways. One use that stood out would be to have multiple exhibitions running in the same space and having the user choose which exhibition she wanted to view by choosing which path of beacons she wanted to follow.
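
To make the path idea more concrete, a minimal sketch along these lines might simply group beacon RFID IDs into named, ordered paths. This is illustrative only, not the code used in the prototype, and the tag IDs and path names are hypothetical placeholders.

// Minimal illustration of grouping beacon RFID IDs into named paths.
// Tag IDs and path names here are hypothetical placeholders.
HashMap<String, String[]> paths = new HashMap<String, String[]>();

void setup() {
  // Each path is an ordered list of beacon tag IDs that a visitor can follow.
  paths.put("exhibitionA", new String[] { "TAG_A1", "TAG_A2", "TAG_A3" });
  paths.put("exhibitionB", new String[] { "TAG_B1", "TAG_B2", "TAG_B3" });
  printPath("exhibitionA");
}

// Print the beacons of one path in the order they should be visited.
void printPath(String pathName) {
  String[] beacons = paths.get(pathName);
  for (int i = 0; i < beacons.length; i++) {
    println(pathName + " stop " + (i+1) + ": beacon " + beacons[i]);
  }
}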

Diagrams explaining how the Beacons could be used to form narrative paths with the Digital Flashlight system.

126 127

INTERACTION DIAGRAM

128 129

DESIGNING FOR PROCESSING

Designing for virtual interfaces is rather different to designing for physical interfaces. One challenge is how to communicate relationships that aren’t physically based. Storyboards and still image animation are some tools that I used to prototype and explore ideas. Mindmaps were also handy to summarise the different functions, factors and elements within the proposed system.

As the relative cursor movement generated from the reflected IR point was the only way to navigate the libraries once they were opened via the RFID, the complexity of the interface was severely restricted. The movement generated by moving the device is inaccurate and inconsistent, so the interface had to match this with its simplicity. My initial ideas centred around a tile system with active areas that would change content as the cursor moved over them; a rough illustration of this idea is sketched below.
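
The following sketch is illustrative only, not the final interface code: it lays out a coarse grid of large tiles and highlights whichever tile the cursor is currently over, with the mouse standing in for the IR-driven cursor.

// Rough sketch of the early tile idea: a coarse grid of large active areas,
// with the tile under the cursor highlighted. The mouse stands in for the IR-driven cursor.
int cols = 3;
int rows = 2;

void setup() {
  size(800, 600);
  noStroke();
}

void draw() {
  background(0);
  float tw = width / (float) cols;
  float th = height / (float) rows;
  for (int y = 0; y < rows; y++) {
    for (int x = 0; x < cols; x++) {
      // A tile is "active" when the cursor sits inside its bounds.
      boolean active = mouseX > x*tw && mouseX < (x+1)*tw &&
                       mouseY > y*th && mouseY < (y+1)*th;
      fill(active ? 200 : 60); // the active tile is drawn brighter
      rect(x*tw + 4, y*th + 4, tw - 8, th - 8);
    }
  }
}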

Understandably, my ideas of what is possible are guided by the tools I have experience in using. As a result, my ideas tended to circulate around aesthetics similar to what I had previously built in Adobe Flash or Dreamweaver.

Further, I was hesitant to invest too much time in the finer details of the interface design, as I knew I would be greatly restricted and influenced by the precedents I could find online and the skills that my computer science collaborator had to offer.

Sketches of layouts for the projected interface of the Digital Flash Light.Illustrations simulating layouts for the projected interface of the Digital Flash Light.

130 131

DIGITAL PROTOTYPING

Learning a new tool without a set guide or agenda is, I find, more about detecting patterns than learning the fundamentals. I started by playing around with software prototyping tools such as Pure Data and Processing/Arduino. Initially I found Pure Data more accessible due to its more visual interface, but I later found more precedents with similarities to my project within the Processing community, so I concentrated solely on Processing.

After some initial reading into the basics of the Processing and Arduino software, I set about finding code that other people had uploaded to share. I found OpenProcessing (www.openprocessing.org) to be a great source of material.

Downloading existing sketches allowed me to dissect and manipulate the code by a process of trial and error, mashing lines of code together until I got the result I was after or, better still, was presented with a phenomenon I had not thought of. This process, although imprecise and very messy, allowed me to play with ideas without having to grasp the whole picture.

Here are some sketches I compiled to create features that I hoped to combine into a final script to run the projected interface for the Digital Flashlight.

PANNING A CANVAS LARGER THAN THE SCREEN

FLASH LIGHT EFFECT

ALBUM ART SLIDE SHOW INTERFACE

AUTOMATED TILE GENERATION

Screen captures of the Processing sketches used to test functions for the software of the Digital Flash Light's projected interface.
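
As an indication of what the flash light effect sketch listed above involved, a minimal reconstruction (not the original downloaded code) could reveal a circular window of an image around the cursor. It assumes an image file, here called reveal.jpg, sitting in the sketch's data folder.

// Simple flashlight effect: only a circular region around the cursor is revealed.
// Assumes an image named "reveal.jpg" in the sketch's data folder (hypothetical name).
PImage img;
PGraphics maskG;

void setup() {
  size(800, 600);
  img = loadImage("reveal.jpg");
  img.resize(width, height);
  maskG = createGraphics(width, height, JAVA2D);
}

void draw() {
  background(0);
  // Build a mask each frame: black hides the image, a white circle reveals the "beam".
  maskG.beginDraw();
  maskG.background(0);
  maskG.noStroke();
  maskG.fill(255);
  maskG.ellipse(mouseX, mouseY, 240, 240);
  maskG.endDraw();

  PImage lit = img.get();  // copy of the full image
  lit.mask(maskG);         // apply the circular mask
  image(lit, 0, 0);
}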

132 133

COMPUTER SCIENCE COLLABORATION

My collaboration began quite late in the project, which meant that I had already built up some prototype sketches from which my computer science collaborator, Matt, could work. The sketches I had produced independently were used to prove that the methods I wanted to use in my final prototype could work if developed further. This meant I concentrated on proving individual functions rather than creating a collective system. One sketch demonstrated that cursor movement using the IR reflection point would be adequate for users to navigate images; another made RFID cards open specific files; and another curated images into a collage. Having these sketches already built made it much easier to communicate to Matt what I wanted from the final sketch. All I needed him to do was combine these multiple sketches into one standalone sketch.

METHODS, ROLES & CONTRIBUTIONS

Luckily for me, Matt was already working with another student on a project that overlapped with many of the functions I was working on. Because of this, our work time was much more streamlined, as he was already knowledgeable about RFID tag systems and image file loading techniques. At our first meeting I explained the details of my project and showed him the sketches I had already produced working with the initial physical prototype. I also showed Matt some reference images of what I had envisioned for the final outcome's aesthetic and left him to materialise a response.

Within a week he had taken the sketches I supplied and constructed a new standalone sketch that provided the tile navigation system we used in the final system. We then had a further meeting where I provided him with the RFID hardware, and he set out to make the tile navigation system work with multiple libraries activated by specific RFID tags. We decided on a few tweaks he could implement to make the tile system more attractive and work better with the end application, but for the most part I was happy to trust his judgement on minor decisions.

A week later, after he had finished the update, we had a further meeting. By this point I had the final prototype casing produced, which allowed us to fine-tune the sketch to suit the final configuration. Matt, ever generous, patiently worked to adjust the variables within the sketch to what he and I thought worked best with the device. We took turns using the device, which made the work feel more productive and also gave us the satisfaction of seeing the software working within a physical setting. Being able to quickly adjust variables within the sketch and then quickly test them facilitated a productive method of finalising the system.

Matt's role moved between technician and designer. He had prior experience in the architecture industry, so he understood more than most what I was aiming for and was more accepting of the seemingly undefined nature of the design process. Much of the time he was materialising what I had already proposed, but many times he offered alternative solutions and suggested opportunities that I had not seen.

As my sketches were built on messy juxtapositions, it was a welcome change to see the code being built and designed from step one. At these points he in many ways became the designer, with my role shifting to that of the client.

Screen capture of my Processing sketch used to test Wii-generated cursor control to navigate images.
Photograph of my computer science collaborator Matt.
Screen capture of Matt's Processing sketch to test Wii-generated cursor control to navigate images.

134 135

FUTURE HARDWARE OPTIONS

There was the possibility of dissecting the off-the-shelf components and removing unnecessary elements to make the device's form more compact. In the end I opted against this because of the high likelihood of destroying the components' circuits and the prohibitive expense of going through multiple micro projectors, which was outside the scope of the budget. The fundamental decision lay in my wish for the device to be easily reproducible. What better compliment could there be than for others to replicate the device and construct their own? The use of standard components allows people to acquire the parts easily, and if I were to make the Processing sketch code open source there is potential for others to take the project further than I have the resources to take it. Given that much of the project's success rested on the generosity of others, such as Johnny Lee and the members of the openprocessing.org and Wii whiteboard communities, it is appropriate to keep within the open-source tradition with the hardware as well as the software.

CABLE MANAGEMENT

The umbilical cord that connects the device's hardware to the power and the computer running the software consists of:

- 3.3V DC - Wii Power
- 5V DC - IR LED Array
- 3.7V DC - Micro Projector Power
- USB - RFID
- VGA - Display

Initially I held these cables together with cable ties, upgrading to spiral wrap and finally combining the spiral wrap with a cover of velcro-sealed cord-concealing mesh. I had hoped to use a more organic material, such as rope skin or a leather tube, but the need for easy access to the cables should one become damaged, and the inconsistency of the bundle's thickness, made these options difficult to materialise. In the end I opted for the off-the-shelf solution, deciding that it fit with the open-source nature of the project and that cable handling was a secondary concern, not of core importance to the outcome.

Scan of a sketch of the details of the prototype.
Photograph of a dissection of the 3M micro projector.
Photograph of an alternative RFID Reader.
Photographs of the different cable management options explored.

136 137

DEVICE CASING FORM

From the moment I began to think of the device as a digital form of the traditional flashlight, the form of the iconic Dolphin flashlight came to mind. Not only is its boxy yellow and black casing well known and instinctively linked to our cultural definition of a flashlight, but its form also fit the casing requirements of the interior components.

I did purchase a Dolphin flashlight, though you do pay a premium for the brand recognition, but after many hours working out how I could fit the components into the surprisingly small interior space of the casing, I decided against it. Instead, I opted to reference its form in a custom-built casing. Given that I was now only referencing the cultural signs by which people intuitively identify a flashlight, I decided I could work outside of the traditional plastic and use the more natural material of wood, as long as the outer form still held enough of the visual characteristics.

The return in recent years of wooden casings to electronic devices has brought about a number of torch-like devices utilising wood as their core casing material, which I used as precedents to evaluate what was feasible. I also researched the more handmade and geometric forms used for antique flashlights. In the end I decided the shape should be rather square, simply acting as a skin wrapping the interior squarish components. It shouldn't be an overly designed object either: this is not a casing project, and the casing functions as a means of facilitating an experience.

Photograph of a wooden flashlight.
Photograph of an antique flashlight.
Photograph of an antique flashlight.
Photograph of a retro slide projector.
Photographs of the Eveready Dolphin flashlight.

138 139

PROGRESSED CONCEPT
TREASURE HUNT

The idea of introducing the device into public space became a foreseeable future application. People could take their own standalone digital device and walk the streets finding beacons left by others, like time capsules or graffiti. The device could facilitate a form of treasure hunt, where people create their own beacons whose RFID codes are assigned to an online database where their personal media is uploaded. The self-made beacons can then be adhered to surfaces within public space, allowing others to uncover them and reveal the messages curated by the beacons' creators.

Illustration simulating how the Digital Flash Light system could be used within public space.

140 141

PROTOTYPE

142 143

CASING

With the components already configured in the earlier prototype, producing the final prototype was an exercise in housing the concept in an appropriate shell.

With time running short, I was keen to get the casing produced as soon as possible. Having chosen the Dolphin flashlight as the reference form, and with the dimensions of the interior components already set, the 3D modelling was a process of filling the gap between the two.

Within SolidWorks I modelled basic versions of the individual interior components, then built a skin over these parts and from that skin carved out the stylised and simplified shape of the Dolphin flashlight. After some fine-tuning, making sure the wall thickness was generous enough to be structurally sound but not so thick as to be heavy or unbalanced, I cut the model in two and added guide holes for the magnets and dowels.

At this point I sent the model to the CNC router technicians. They refined the fillets and finer details of the casing to get the best finished model out of the machine, sent it back for my technical approval, and we set forth to machine it.

Renders of the CAD model of the final prototype casing fitted with the interior components.

144 145

CNC

The casing was machined using a Roland MDX-650 CNC router with a single 6mm bit. Because of the depth of the casing, each side had to be routed twice, the first pass creating enough clearance for the bit to reach the final depth required.

Luckily, the geometric shape of the casing made programming the machine relatively straightforward, and with the casing consisting of two mirrored parts the process was simpler than it would have been for organic forms. The CNC took a night to mill each side of the casing, and within a week of fitting into times when others weren't using the equipment I had my casing cut.

Although I initially believed I would be limited to using a synthetic wood, I was grateful to be offered a rainforest timber that is the equivalent of a natural MDF. The result was a strong but light timber that didn't suffer any of the weaknesses associated with knots and imperfections.

The accuracy achieved by having the casing CNC machined meant that both sides fit together precisely, with the magnets and dowels aligned and the components able to be fitted in exact positions. I finished the casing by sanding the exterior surfaces smooth and then applying layers of clear timber wax to seal the wood.

Photographs of the final prototype’s casing being machined, tested and finished.

146 147

HARDWARE

Erring on the side of caution, I deliberately allowed a few millimetres of space between the casing and the components. To hold everything in place I used a combination of adhesive foam tape and screws, which allowed me to fine-tune the positions of the components. The RFID reader was pushed further forward than originally planned to extend its range, and the IR LED array was shifted forward to minimise the reflection noise created by the perspex screen. The projector was moved back to allow a wider image at a close distance from the projected surface.

The casing has proved sturdy and ergonomic, with the handle quite comfortable. It has also shown itself to be reasonably balanced, even with the heavy cable protruding from the device's rear. Although I would put the dowels and magnets on opposite sides to where they have been placed if I were to make the casing again, I am more than happy with the outcome. The casing is easy to pull apart to reset the components, but equally holds together firmly when closed.

Photographs of the completed final prototype.

148 149

Photograph of the completed final prototype with the casing separated to access the controls of the interior components.
Screen capture of display output from the Processing sketch for the final prototype's projected interface.

150 151

Photographs of the Digital Flashlight system being tested.

152 153

FINAL COMPONENTS OF THE DIGITAL FLASHLIGHT SYSTEM

Photograph of the final Beacons.
Photograph of the final Digital Flashlight system's components.

154 155

FINAL TESTING

Although much testing was done with the earlier prototypes, I have continued to test the system in its final configuration.

The earlier prototypes, although functional, were fragile and distracting, but with the final robust casing I have been able to sit back and let others play with the device. Setting the system up in the corner of the studio has allowed me to keep it running for long periods, ironing out connection problems and minor issues that were not apparent in the earlier, shorter tests.

People have instantly identified the device as a flashlight and go to pick it up and reach their thumb out for the 'on' button. The RFID system is not quite so intuitive, but once people are shown the 'reach in and pull out' procedure they go on to explore the remaining beacons. Navigation within the projected image tiles has proven obvious to most, and the instant feedback quickly allows them to recognise their ability to control the image.

One common point of confusion occurs when the Wii remote loses sight of the reflective point but the projected tile does not communicate this. Luckily, people tend to keep moving the flashlight, which normally brings the point back into the Wii remote's detection scope. This feedback issue is an example of the hardware limitations not being communicated fluidly through the experience of using the device. With future tweaking of the image files and the variables within the sketch code, the opportunity for the projected tile navigation to become de-synced from the physical movement of the device could be minimised.
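
One possible tweak along these lines, assuming the Wii whiteboard software continues to drive the standard mouse cursor as in the current prototype, would be to dim the projection when the cursor has not moved for a while, hinting that the IR point has left the camera's view. This is only a sketch of the idea, not part of the final code, and the timeout value is arbitrary.

// Minimal sketch: dim the display when the cursor has not moved for a while,
// hinting that the IR point may have left the Wii remote's field of view.
// Assumes the Wii whiteboard software is emulating the mouse, as in the final prototype.
int lastMoveTime = 0;          // millis() when the cursor last moved
int prevX, prevY;              // previous cursor position
final int LOST_TIMEOUT = 1500; // ms of stillness before we assume tracking is lost

void setup() {
  size(800, 600);
  noStroke();
}

void draw() {
  background(0);
  // ... tile drawing would happen here ...
  fill(255);
  ellipse(mouseX, mouseY, 20, 20); // stand-in for the cursor / tile content

  if (mouseX != prevX || mouseY != prevY) {
    lastMoveTime = millis();
    prevX = mouseX;
    prevY = mouseY;
  }

  // If the cursor has been still for too long, overlay a dark veil as a "tracking lost" hint.
  if (millis() - lastMoveTime > LOST_TIMEOUT) {
    fill(0, 180);
    rect(0, 0, width, height);
    fill(255);
    textAlign(CENTER);
    text("move the flashlight to find the beacon", width/2, height/2);
  }
}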

Sketch of prototype installation ideas.
Photograph of the timber polish used to finish the timber surfaces of the final prototype's casing.
Photograph of the prototype being synced with the Wii Whiteboard Application.
Photograph of the first configuration of output wires used in the final prototype testing.
Photograph of the prototype's projected interface.

156 157

CONCLUSION

158 159

Photograph of the Digital Flashlight system being tested.

160 161

EXHIBITION INSTALL

For the RMIT exhibition I plan to have the Flashlight system installed within a corridor or darkened space. The device will be installed so that members of the public can come and use it freely. It will be left running at all times, with beacons distributed on the surrounding surfaces. I will curate the beacons into two or three sets, which I will then arrange in paths that people can follow. Each set will be painted a different colour to identify its path. I will also project a video next to the plinth where the device will sit when not in use. The video will show people using the device within the space; when people enter the installation area they can mimic the actions they see on the video and hopefully grasp how to use the device.

Illustration simulating the exhibition installation.
Photograph of a possible space for the exhibition installation.

162 163

FUTURE DEVELOPMENTS

MORE TESTING

I would like to extend the testing of the device by installing it in people's homes and observing how they interact with and manipulate it. I could eventually leave it installed for extended periods and see whether usage patterns change over time.

WIRELESS

Research into making the device wireless quickly persuaded me that the idea was outside the resources available to this project. The main problem lay in transmitting the video signal wirelessly, which can be done but requires expensive and cumbersome equipment. Another option would be to build a custom mini computer into the device using BeagleBoard hardware, making it standalone but also far more complex. If I were to develop the project further, this is one avenue I would be keen to experiment with, but given the time and resource restrictions of the project at this stage I opted not to explore these options.

MULTIPLE DEVICES

Another option outside the resources of this project, having multiple Flashlight devices would give rise to some intriguing installations. Given that budget wired versions could be made for around $400 each, I may in future make another two to see how this extends the social side of using the device. The ability for people to use them in a social activity similar to a treasure hunt, showing each other what they have found, could be an interesting experience.

PUBLIC INSTALLATION

Installing the device within a laneway exhibition context is something I am keen to explore. Having an installation that encourages public intervention could generate some interesting feedback and alternative ways of using the system. My vision is to extend the exhibition installation to multiple devices and multiple beacon paths within a large public space. I would like to show the existing system within an exhibition context outside of RMIT and, depending on how that is received, apply for funding to extend it to an installation as part of the Gertrude St Projection Festival or as part of a fringe event for the next Experimenta (International Biennial of Media Art). The project could also be extended to allow other artists and curators to compile their own sets of images, which could be assigned to specific beacons. People could then choose an artist and follow their path of beacons, allowing different exhibitions to be seen within the same physical space: a kind of visual version of the iPod audio tours found in galleries.

Photograph of the Digital Flashlight system's components ready to be set up in a new space for testing.
Illustration simulating the exhibition installation.
Photograph from the Gertrude St Projection Festival.
Illustration simulating how the Digital Flash Light system could be used within public space.

164 165

REFLECTIONS
POSITION

Where does this project sit in the world? With the technology it utilises, Flashlight could be described as a new media arts project; however, it is also an interactive projection environment. Theoretically, this work looks into the techniques of memory tools and how environments affect their occupants, allowing people to create their own augmented reality. In many ways it also touches on interior design and multi-use spaces, with the device allowing people to use a space differently while leaving minimal permanent presence. The use of off-the-shelf components in unintended ways also aligns this project with the modding and DIY communities. Finally, if I were to extend the system into a public installation, the project would have the potential to be a public intervention and an alternative street art medium.

GOALS

Apart from extending the project to a wider audience, one goal would be to see the Flashlight system adopted by others. In keeping with the values of the relevant open-source communities, it would be amazing to see others take my work and build on it just as I have built on the work of others. I plan to publish on my webpage much of the detail needed for people to make their own Flashlight, and to send links to the web communities where I found much of the precedent information. Hopefully this will lead to some feedback on the project or, better yet, someone making their own version of the Digital Flashlight.

WHY

This project is not like anything discovered in the relevant research, although it owes much to the work of others in similar fields. It offers a small, new contribution to those fields, and even if the outcome is not particularly useful in itself, hopefully the research that went into it might help someone somewhere build something useful within another, more meaningful context.

Aside from its contribution to this area of research, much of the drive for this project came from a personal reflection on, and response to, the alienation felt towards the objects-turned-commodities around us, and a desire for historical reflection. It is important to examine our relationship to commodities and to look at ways to reconnect with our histories and memories. This project began as a small attempt to engage with these ideas.

Due to the scope and breadth of the relevant fields, this project was always in danger of trying to cover too diverse an area in the initial research stage. The original abstract was too broad and the project was trying to cover too much. It would have been stronger if it had concentrated solely on the projection medium and memory. The much broader philosophical readings, which influenced the fundamental values underlying the project but had very little direct influence on its outcome, could have been trimmed back. This is partly the nature of working in largely undefined and cross-disciplinary areas, where broad research is necessary to discover the relevant parameters, and partly due to the management of the project itself.

The feedback at the start of the project included questions about whether projection was appropriate for this project. These were legitimate questions, but in the end exploring opportunities within the projection medium became an important part of the project and has been a dominant trend in my work over the past few years. My persistence with this medium was due to two main factors. Firstly, it gave me an opportunity to build on the skills and knowledge I already had, which meant I could concentrate on design applications rather than learning basic skills. Secondly, light seemed like the most appropriate design medium given the philosophical underpinnings of the project, which did not lend themselves to simple product design and the creation of a commodity. I am increasingly interested in how light can be used within an industrial design context, and its transient, non-rigid materiality proved a successful design solution for this project.

Working in collaboration with Matt from the computer science department also helped keep up the momentum of the project. The parallels with fellow student Theo's work allowed me to talk over detailed aspects of the project with someone knowledgeable about the technology, which also helped iron out issues. Much of the work was in areas too specific for most people to grasp quickly, so having prototypes was a welcome communication tool.

I couldn't agree more with the early advice I received about the importance of starting to prototype physically early on. Working with prototypes and physical tests alongside my theoretical studies worked well, particularly in allowing me to stay productive, but also in generating new connections and ideas.

166 167

168 169

APPENDIX

170 171

APRIL

Determine Project Direction

Initial Research: Cultural Studies, Scientific Theory, Interaction Design, Design Precedents, Technology

MAY

Re-Define Project Direction

Continued Research: Spatial Memory, Visual Culture, Projection

Start DVR

JULY

Developmental Research

Initial Physical installation-based Experiments & Simulations

Proposed Project Outcome

Research, Ideation, Project Outcome DVR Chapters Finalised

JUNE

Developmental Research: Portable Projection, Augmented Reality Precedents

Secondary Ideation

Compiling DVR

AUGUST

Arduino and Processing Software Exploration

Developed Physical Installation-based Experiments & Simulations

Project Outcome Clarified

Compiling DVR

SEPTEMBER

Processing sketch construction

Early Prototype construction & Testing

Compiling DVR

NOVEMBER

DVR Document Completed

Presentation

Exhibition Installation

TIME LINE

OCTOBER

Final Prototype Construction
Casing Machined
Processing Sketch Finalised

Final Prototype Testing

Project Reflection

DVR Document Finalised

172 173

P13
Debord.jpeg, http://artofmapping.blogspot.com/2010/09/debord-psychogeography-1957.html, [accessed 30 October 2011].
parismap.jpg, http://www.ethanzuckerman.com/blog/2011/01/03/games-that-help-us-wander/, [accessed 30 October 2011].
paris-1957.jpg, http://sebroslil.blogspot.com/2010_03_01_archive.html, [accessed 18 September 2011].

P14
cabinet_of_curiosity.jpg, http://blog.echovar.com/?p=1955, [accessed 12 October 2011].

P17
treasure2.html, http://meetmeatmikes.blogspot.com/2011/09/treasure.html, [accessed 18 September 2011].

P21
Barbara-Kruger-l1.jpg, http://thedesigninspiration.com/fonts/barbara-kruger/, [accessed 30 October 2011].
deg.jpg, http://georgiepaynebillard.files.wordpress.com/2009/03/deg.jpg, [accessed 30 October 2011].

P22
thesis.jpeg, http://johnnylee.net/projects/, [accessed 30 October 2011].

P23
mind_opera_demo.jpg, http://www.microvision.com/displayground/innovation/video-wall-with-multiple-showwx-laser-projectors/, [accessed 30 October 2011].

P24
Screen captures from: http://www.cs.toronto.edu/~caox/uist2006_handheldprojector.pdf

P25
8.jpeg, http://www.sweatshoppe.org/presskit/, [accessed 30 October 2011].

P26
perpetual_patent_storyteller-1.jpeg, http://storyteller.allesblinkt.com/, [accessed 6 October 2011].

P27
wandern-im-wissen-1.jpeg, http://www.asquare.org/networkresearch/2010/wandern-im-wissen-wandering-in-knowledge, [accessed 29 September 2011].

P28
Screen-shot-2011-10-10-at-7.41.21-PM.png, http://cmuems.com/2011/a/category/project/page/4/, [accessed 29 September 2011].

P29
apartment.jpeg, http://victoriavesna.com/dataesthetics/?p=11, [accessed 29 September 2011].

P30
Screen captures from: http://www.educ.dab.uts.edu.au/interactivation/ReadingTable.pdf

P31
Screen captures from: http://www.cs.toronto.edu/~caox/uist2006_handheldprojector.pdf

P82
Screen captures from: http://www.pranavmistry.com/projects/sixthsense/, [accessed 29 September 2011].

P92
fetch.jpeg, http://www.ladyada.net/learn/sensors/ir.html, [accessed 30 October 2011].

P93
04371_invervalcam_t.jpeg, http://dev.squarecows.com/2010/10/07/how-to-sensor-tutorial-ir-remote-receiver-decoder-tutorial/, [accessed 30 October 2011].
a2dc8_images_sensors_intervalometer.jpeg, http://dev.squarecows.com/2010/10/07/how-to-sensor-tutorial-ir-remote-receiver-decoder-tutorial/, [accessed 29 October 2011].

P96
the-poetics-of-space.jpeg, http://www.borders.com.au/book/the-poetics-of-space/1647850/, [accessed 30 October 2011].
RitrattoMuseoFerranteImperato.jpeg, http://en.wikipedia.org/wiki/File:RitrattoMuseoFerranteImperato.jpg, [accessed 30 October 2011].
Debord.jpeg, http://artofmapping.blogspot.com/2010/09/debord-psychogeography-1957.html, [accessed 30 October 2011].

P97
eveready_music.jpeg, http://www.ohgizmo.com/2006/06/13/eveready-boombox/, [accessed 30 October 2011].

P105
09950-02.jpeg, http://www.sparkfun.com/products/9950, [accessed 30 October 2011].
08554-03-L.jpeg, http://www.sparkfun.com/products/8554, [accessed 30 October 2011].
Wiimote.png, http://en.wikipedia.org/wiki/Wii_Remote, [accessed 30 October 2011].
15261-a5wx8t_l.jpg, http://www.coolest-gadgets.com/20091007/hands-review-3m-mpro-120-pocket-projector/, [accessed 23 September 2011].
09218-02.jpg, http://dlnmh9ip6v2uc.cloudfront.net/images/products/09218-02.jpg, [accessed 30 October 2011].
10732-02.jpeg, http://www.sparkfun.com/products/10732, [accessed 30 October 2011].
1_5V_watch_battery_AG_battery_.jpeg, http://www.alibaba.com/product-gs/279512333/1_5V_watch_battery_AG_battery_.html, [accessed 30 October 2011].

P111
wii-mote-finger-tracking.jpeg, http://hackaday.com/2007/11/09/wiimote-ir-finger-tracking/, [accessed 30 October 2011].
track-fingers-with-the-wii-remote.jpeg, http://hacknmod.com/topics/wii/page/5/, [accessed 30 October 2011].
_Tracking%20fingers%20with%20the%20Wii%20Remote.jpeg, http://hackawii.com/category/video/page/5/, [accessed 30 October 2011].

P113
18LEDlight.jpeg, http://www.haydnallbutt.com.au/2008/11/25/how-to-assemble-your-own-140-led-infrared-light-source-part-1/, [accessed 20 October 2011].
24-LED-Lamp-Board-Plate-for-CCTV-Security-Camera-FY2024-GA.jpeg, http://www.dinodirect.com/24-LED-CCTV-Camera-IR-Board-FY-2024-GA.html?cur=AUD, [accessed 16 September 2011].

P134
Wii-Remote-4.jpeg, http://www.sparkfun.com/tutorials/43, [accessed 30 October 2011].
28140-m-230x300.jpeg, http://blog.datasingularity.com/?cat=5, [accessed 150 October 2011].

P136
wooden-flashlight.jpeg, http://www.coolest-gadgets.com/20090212/the-wooden-flashlight-from-areaware/, [accessed 30 October 2011].
torch_mk6.jpeg, http://www.whereaboutsupply.com.au/index.php/hardware/torches/torch-dolphin-torch-mk5.html, [accessed 30 October 2011].

P163
Brooklyn_Arts_Hotel_Hands_09_jpg_643x450_crop_q85.jpg, http://www.broadsheet.com.au/media/images/2010/07/02/Brooklyn_Arts_Hotel_Hands_09_jpg_643x450_crop_q85.jpg, [accessed 30 October 2011].

FIGURE OF IMAGES

Image Credits (clockwise from left)
Uncited images courtesy the author

174 175

SKETCH CODE EXAMPLE 1
FINAL PROTOTYPE

import processing.opengl.PGraphicsOpenGL;
import processing.serial.*;

/************** HACK */
String currentAlbum;

class Point2d {
  float x;
  float y;
}

class ImageTile {
  Point2d pos = new Point2d();
  PImage img = new PImage();
  ImageTile(PImage i, float px, float py) {
    img = i;
    pos.x = px;
    pos.y = py;
  }
}

class TileCanvas {
  private int nTiles_x;
  private int nTiles_y;
  private int tileIndex_x = 0;
  private int tileIndex_y = 0;
  public int pixelWidth = 0;  // Total width of tiles
  public int pixelHeight = 0; // Total height of tiles
  // ArrayList of all the tiles on the canvas
  private ArrayList<ImageTile> imageTiles;

  // Initialise with the number of tiles for x and y
  public TileCanvas(int x, int y) {
    nTiles_x = x;
    nTiles_y = y;
  }

  void loadTiles(String[] files) {
    PImage img = new PImage();
    int tileHeight = 200;
    int img_x = 0;
    int img_y = 0;
    int countImg_x = 0;
    int countImg_y = 0;
    int canvasShift = 0; // Row pixel width, for setting the canvas width
    imageTiles = new ArrayList<ImageTile>();

    if (files.length < (nTiles_x*nTiles_y)) println("Not enough tiles for all images!");

    // Set up the image array
    for (int i=0; i<files.length; i++) {
      println(files[i]);
      if (files[i] != null) {
        img = loadImage(currentAlbum+"/"+files[i]);
        img.resize(0, tileHeight); // Resize to get correct aspect ratio
        ImageTile it = new ImageTile(img, img_x, img_y);
        //println("pos: " + img_x + " " + img_y);
        //println("ic: " + countImg_x + " " + countImg_y);
        //println("wh: " + img.width + " " + img.height);
        img_x += img.width;
        canvasShift += img.width;
        tCanvas.imageTiles.add(it);
        countImg_x++;
        if (countImg_x == nTiles_x) {
          pixelWidth = max(canvasShift, pixelWidth);
          //println("CanvasWidth "+pixelWidth);
          countImg_y++;
          img_y = tileHeight*countImg_y;
          countImg_x = 0;
          img_x = 0;
          canvasShift = 0;
        }
      }
    }
    pixelHeight = countImg_y * tileHeight;
    //println("width: "+pixelWidth);
  }

  // Draw the tiles.
  void drawTiles() {
    for (int i=0; i<imageTiles.size(); i++) {
      ImageTile imgTile = imageTiles.get(i);
      PImage img = imgTile.img;
      image(imgTile.img, imgTile.pos.x, imgTile.pos.y);
      /*
      int cz = -5;
      translate(imgTile.pos.x, imgTile.pos.y);
      beginShape();
      texture(img);
      vertex(0, 0, cz, 0, 0);
      vertex(img.width, 0, cz, img.width, 0);
      vertex(img.width, img.height, cz, img.width, img.height);
      vertex(0, img.height, cz, 0, img.height);
      endShape();
      */
    }
  }
}

/***********************************
 * Global variables
 ***********************************/
TileCanvas tCanvas = new TileCanvas(4, 3); // Canvas dimensions; the original dimensions were (3, 2)
final float ZOOM = 50;
final float CURSOR_SPEED = 1.5;
final boolean DEBUG = true;
boolean albumLoaded = false;
/************************************/

Serial myPort; // Create object from Serial class
HashMap tagsMap = new HashMap();
String tagID = null;

/* RFID tags - add tags here */
String[] albumTags = {
  "NOT_A_TAG",
  "4B0082F6330C", // BLUE   album1
  "4D00617C7E2E", // GREEN  album2
  "4B0082F91F2F", // YELLOW album3
  "4B0082A74628", // RED    album4
};

String[] readAlbum(String albumName) {
  currentAlbum = albumName;
  final String filterTypePNG = ".png";
  final String filterTypeJPG = ".jpg";
  FilenameFilter filter = new FilenameFilter() {
    public boolean accept(File dir, String name) {
      if (!name.startsWith("."))
        return name.toLowerCase().endsWith(filterTypeJPG); // changed to JPG
      return false;
    }
  };
  // Returns all the files in a directory as an array of Strings
  File file = new File(sketchPath("data/"+albumName));
  if (file.isDirectory()) {
    String[] names = file.list(filter);
    println("Files: "+names[0]);
    return names;
  } else {
    // If it's not a directory
    return null;
  }
}

void setup() {
  size(800, 600, OPENGL);
  noCursor(); // Hides cursor
  frame.setLocation(1440, 0);
  String portName = Serial.list()[0];
  myPort = new Serial(this, portName, 9600);
  myPort.bufferUntil('\n');

  // Insert all the tags into the hashMap
  for (int i=1; i<albumTags.length; i++) {
    tagsMap.put(albumTags[i], i);
  }
}

// Called when serial data is read
void serialEvent(Serial myPort) {
  tagID = myPort.readStringUntil('\n');
  if (tagID != null) {
    tagID = trim(tagID);
    println("READING TAG: " + tagID);
    if (tagsMap.get(tagID) != null) {
      tCanvas.loadTiles(readAlbum("album"+tagsMap.get(tagID)));
      albumLoaded = true;
    }
  }
}

public void init() {
  frame.removeNotify();
  frame.setUndecorated(true);
  super.init();
}

void draw() {
  background(0);
  if (albumLoaded) {
    int cx = width/2;
    int cy = height/2;
    //pushMatrix();
    //popMatrix();
    pushMatrix();
    //translate( -tCanvas.pixelWidth/2, tCanvas.pixelHeight/2 );
    translate((cx-mouseX*CURSOR_SPEED), (cy-mouseY*CURSOR_SPEED), ZOOM);
    translate(-(tCanvas.pixelWidth/8), -(tCanvas.pixelHeight/4)); // bc-zoom?
    translate(300, 400); // An offset for canvas size
    tCanvas.drawTiles();
    popMatrix();
    if (DEBUG) {
      float d = dist(cx, cy, mouseX, mouseY);
      line(cx, cy, mouseX, mouseY);
      text(mouseX+":"+mouseY+"|"+d, mouseX, mouseY);
    }
  }
}
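
For reference, this sketch expects the images for each tag to sit in the sketch's data folder, in sub-folders named album1 to album4 (matching the order of the albumTags array), and it only loads .jpg files. The cursor position is supplied externally by the Wii whiteboard software emulating the mouse, which is why the sketch only reads mouseX and mouseY.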

176 177

SKETCH CODE EXAMPLE 2
FIRST COLLABORATION PROTOTYPE

import processing.opengl.PGraphicsOpenGL;

class Point2d {
  float x;
  float y;
}

class ImageTile {
  Point2d pos = new Point2d();
  PImage img = new PImage();
  ImageTile(PImage i, float px, float py) {
    img = i;
    pos.x = px;
    pos.y = py;
  }
}

class TileCanvas {
  private int nTiles_x;
  private int nTiles_y;
  private int tileIndex_x = 0;
  private int tileIndex_y = 0;
  public int pixelWidth = 0;  // Total width of tiles
  public int pixelHeight = 0; // Total height of tiles
  // ArrayList of all the tiles on the canvas
  private ArrayList<ImageTile> imageTiles = new ArrayList<ImageTile>();

  // Initialise with the number of tiles for x and y
  public TileCanvas(int x, int y) {
    nTiles_x = x;
    nTiles_y = y;
  }

  void loadTiles(String[] files) {
    PImage img = new PImage();
    int tileHeight = 200;
    int img_x = 0;
    int img_y = 0;
    int countImg_x = 0;
    int countImg_y = 0;
    int canvasShift = 0; // Row pixel width, for setting the canvas width

    if (files.length < (nTiles_x*nTiles_y)) println("Not enough tiles for all images!");

    // Set up the image array
    for (int i=0; i<files.length; i++) {
      println(files[i]);
      if (files[i] != null) {
        img = loadImage(files[i]);
        img.resize(0, tileHeight); // Resize to get correct aspect ratio
        ImageTile it = new ImageTile(img, img_x, img_y);
        //println("pos: " + img_x + " " + img_y);
        println("ic: " + countImg_x + " " + countImg_y);
        println("wh: " + img.width + " " + img.height);
        img_x += img.width;
        canvasShift += img.width;
        tCanvas.imageTiles.add(it);
        countImg_x++;
        if (countImg_x >= nTiles_x) {
          pixelWidth = max(canvasShift, pixelWidth);
          countImg_y++;
          img_y = tileHeight*countImg_y;
          countImg_x = 0;
          img_x = 0;
          canvasShift = 0;
        }
      }
    }
    pixelHeight = countImg_y * tileHeight;
    println("width: "+pixelWidth);
  }

  // Draw the tiles.
  void drawTiles() {
    for (int i=0; i<imageTiles.size(); i++) {
      ImageTile imgTile = imageTiles.get(i);
      PImage img = imgTile.img;
      image(imgTile.img, imgTile.pos.x, imgTile.pos.y);
      /*
      int cz = -5;
      translate(imgTile.pos.x, imgTile.pos.y);
      beginShape();
      texture(img);
      vertex(0, 0, cz, 0, 0);
      vertex(img.width, 0, cz, img.width, 0);
      vertex(img.width, img.height, cz, img.width, img.height);
      vertex(0, img.height, cz, 0, img.height);
      endShape();
      */
    }
  }
}

/***********************************
 * Global variables
 ***********************************/
TileCanvas tCanvas = new TileCanvas(3, 2);
final float ZOOM = 120;
final boolean DEBUG = true;

String[] readAlbum() {
  final String filterType = ".png";
  FilenameFilter filter = new FilenameFilter() {
    public boolean accept(File dir, String name) {
      if (!name.startsWith("."))
        return name.toLowerCase().endsWith(filterType);
      return false;
    }
  };
  // Returns all the files in a directory as an array of Strings
  File file = new File(sketchPath("data/."));
  if (file.isDirectory()) {
    String[] names = file.list(filter);
    println("Files: "+names.length);
    return names;
  } else {
    // If it's not a directory
    return null;
  }
}

void setup() {
  size(800, 512, OPENGL);
  tCanvas.loadTiles(readAlbum());
}

void draw() {
  background(0);
  int cx = width/2;
  int cy = height/2;
  pushMatrix();
  //translate( -tCanvas.pixelWidth/2, tCanvas.pixelHeight/2 );
  translate((cx-mouseX), (cy-mouseY), ZOOM);
  translate(-(tCanvas.pixelWidth/4), -(tCanvas.pixelHeight/2));
  tCanvas.drawTiles();
  popMatrix();
  if (DEBUG) {
    float d = dist(cx, cy, mouseX, mouseY);
    line(cx, cy, mouseX, mouseY);
    text(mouseX+":"+mouseY+"|"+d, mouseX, mouseY);
  }
}

178 179

SKETCH CODE EXAMPLE 3
IMAGE NAVIGATION PROTOTYPE

int maxImages = 13;
int imageIndex = 0;

PImage[] images = new PImage[maxImages];

void setup() {
  size(800, 600);
  //noCursor(); // Hides cursor
  for (int i = 0; i < images.length; i++) {
    images[i] = loadImage("test" + i + ".jpg");
  }
  imageIndex = int(random(0, 13)); // Randomises the initial image.
}

void draw() {
  background(0);
  image(images[imageIndex], 0, 0);
  frame.setLocation(1440, 0);

  // If the cursor moves to the right-hand side of the screen the image will randomly change.
  if (mouseX < 600) {
  } else {
    imageIndex = int(random(images.length));
    delay(700); // delays the next action by 700 milliseconds
  }

  // If the cursor moves to the left-hand side of the screen the image will randomly change.
  if (mouseX > 200) {
  } else {
    imageIndex = int(random(images.length));
    delay(700);
  }
}

public void init() {
  frame.removeNotify();
  frame.setUndecorated(true);
  super.init();
}

SKETCH CODE EXAMPLE 4
IMAGE COLLAGE PROTOTYPE

int imax = 30;
PImage[] Image = new PImage[imax];

float x;
float y;
float z;
float alp = 0;
float tet = PI/2;
float ray = 2000;
float[] xim = new float[imax];
float[] yim = new float[imax];
float[] zim = new float[imax];

void setup() {
  size(800, 600, P3D);
  background(0);
  smooth();
  imageMode(CENTER);
  for (int i = 0; i < Image.length; i++) {
    Image[i] = loadImage("Image0" + i + ".jpg");
  }
  genere_positions();
  frame.setLocation(1440, 0);
}

void draw() {
  actualise_camera();
  affiche_images();
}

void keyPressed() {
  if (keyCode == ENTER) { saveFrame(); }
  if (keyCode == BACKSPACE) { genere_positions(); }
}

void genere_positions() {
  for (int i = 0; i < imax; i++) {
    xim[i] = random(-500, 500);
    yim[i] = random(-500, 500);
    zim[i] = random(-500, 500);
  }
  if ((keyPressed == true) && (keyCode == LEFT)) { tet += 0.1; println("p"); }
  if ((keyPressed == true) && (keyCode == RIGHT)) { tet -= 0.1; }
  if ((keyPressed == true) && (keyCode == UP)) { alp -= 0.1; }
  if ((keyPressed == true) && (keyCode == DOWN)) { alp += 0.1; }
  if ((keyPressed == true) && (key == '+')) { ray -= 10; }
  if ((keyPressed == true) && (key == '-')) { ray += 10; }
}

void gestion_souris() { // controls the composition of the collage with the mouse
  float dplcmt_x = map(mouseX, 0, width, -0.08, 0.08);
  float dplcmt_y = map(mouseY, 0, height, 0.08, -0.08);
  if (sq(dplcmt_x) < sq(0.01)) { dplcmt_x = 0; }
  if (sq(dplcmt_y) < sq(0.01)) { dplcmt_y = 0; }
  alp += dplcmt_y;
  tet += dplcmt_x;
}

void affiche_images() {
  background(0);
  for (int i = 0; i < imax; i++) {
    line(0, 0, 0, xim[i], yim[i], zim[i]);
    translate(xim[i], yim[i], zim[i]);
    image(Image[i], 0, 0);
    translate(-xim[i], -yim[i], -zim[i]);
  }
}

void actualise_camera() {
  // Compute the camera position
  x = ray * cos(alp) * cos(tet);
  z = ray * cos(alp) * sin(tet);
  y = ray * sin(alp);
  camera(x, y, z, 00, 00, 00, 00, 01, 00);
}

public void init() {
  frame.removeNotify();
  frame.setUndecorated(true);
  super.init();
}

SKETCH CODE EXAMPLE 5
RFID READER PROTOTYPE

import processing.serial.*;

Serial myPort;
String inString;

String RFID1 = "2500ABF07F01"; // Card 1 RFID
String RFID2 = "2500AC1754CA"; // Card 2 RFID
String RFID3 = "2500AC086AEB"; // Card 3 RFID
String RFID4 = "2500ABE36409"; // Card 4 RFID
String RFID5 = "4500B8BECE8D"; // Tiny glass RFID
String RFID6 = "4500BE2ED005"; // Button RFID

PImage jpg;
PFont fontA;

void setup() {
  size(500, 500);
  myPort = new Serial(this, Serial.list()[0], 9600);
  myPort.bufferUntil('\n');
  fontA = loadFont("HelveticaNeue-24.vlw");
  textFont(fontA, 14);
  jpg = loadImage("scan.jpg");
  imageMode(CENTER);
  textAlign(CENTER);
}

void draw() {
  background(255);
  image(jpg, width/2, height/2);
  if (inString != null) {
    fill(230, 255);
    text("RFID = " + inString, width/2, 24);
  }
}

void serialEvent(Serial myPort) {
  inString = myPort.readStringUntil('\n');

  if (inString != null) {
    inString = trim(inString); // trim off any whitespace
    print("RFID = " + inString + " : ");
    if (RFID1.equals(inString) == true) { // checks whether inString matches Card 1's RFID
      println("Card 1");
      jpg = loadImage("card1.jpg");
    }
    if (RFID2.equals(inString) == true) { // checks whether inString matches Card 2's RFID
      println("Card 2");
      jpg = loadImage("card2.jpg");
    }
    if (RFID3.equals(inString) == true) { // checks whether inString matches Card 3's RFID
      println("Card 3");
      jpg = loadImage("card3.jpg");
    }
    if (RFID4.equals(inString) == true) { // checks whether inString matches Card 4's RFID
      println("Card 4");
      jpg = loadImage("card4.jpg");
    }
    if (RFID5.equals(inString) == true) { // checks whether inString matches the glass tag's RFID
      println("Glass RFID");
      jpg = loadImage("glass.jpg");
    }
    if (RFID6.equals(inString) == true) { // checks whether inString matches the button tag's RFID
      println("Button RFID");
      jpg = loadImage("button.jpg");
    }
  }
}