Talking TMAP: automated generation of audio-tactile maps using Smith-Kettlewell's TMAP software

by Joshua A. Miele, Steven Landau and Deborah Gilden

© 2006 SAGE Publications. The British Journal of Visual Impairment, Volume 24, Number 2, 2006. ISSN 0264-6196. DOI: 10.1177/0264619606064436


Traditional tactile cartography is complicated by problems associated with braille labeling and feature annotation. Audio-tactile display techniques can address many of these issues by associating spoken information and sounds with specific map elements. This article introduces Talking TMAP – a collaborative effort between The Smith-Kettlewell Eye Research Institute and Touch Graphics, Inc. Talking TMAP combines existing tools such as the World Wide Web, geographic information systems, braille embossers and touch tablet technology in new ways to produce a system capable of creating detailed and accurate audio-tactile street maps of any neighborhood. The article describes software design, user interface and plans for future implementation.

Introduction

The Smith-Kettlewell Eye Research Institute's Tactile Maps Automated Production (TMAP) project addresses the blind or severely visually impaired traveler's need for geographical information about unfamiliar environments (Miele, 2004). To help meet this need, a prototype website for rapidly producing high-quality, low-cost tactile street maps has been developed. It brings together online Geographic Information Systems (GIS), accessible Web design practices, tactile graphics design principles and modern tactile hardcopy production methods to enable a totally blind individual to independently produce customized tactile street maps (Miele and Marston, 2005). By incorporating touch tablet technology in the form of the Talking Tactile Tablet (TTT) from Touch Graphics, Inc. (Landau and Gourgey, 2001), the scope of the project is now being expanded to include the automated production of audio-enabled tactile street maps.

The necessity for tactile street maps for orientation and mobility

Sighted people take for granted ready access to maps, signs and other visual aids for navigation. In planning a trip to an unfamiliar town, or finding one's way around upon arrival, using maps is routine, and most sighted people could not imagine a world without them. For blind people, on the other hand, access to a tactile street map that is detailed enough to use for travel within any particular area is an extreme rarity. The dearth of tactile maps has far-reaching implications, well beyond the orientation and mobility domain (Golledge, 1993). The expense and difficulty of producing tactile graphic materials in general means that young blind children have inadequate exposure to them, thereby precluding optimal development of spatial, graphical and map reading skills.

Tactile street maps are beneficial to blind pedestrians across a wide range of age groups, from elementary school-age well into late adulthood (Espinosa and Ochaíta, 1998). Although some early studies questioned the cognitive ability of blind individuals to interpret maps, most recent results indicate that with some training, they can and do make use of maps to inform their internal cognitive representation of space (Uttal, 2000). Compared to control participants, blind and visually impaired people who are able to feel tactile diagrams, and who have had the opportunity to study a tactile map of an area of interest, show improved ability to independently navigate within that area (Ungar, Blades and Spencer, 1993).



Tactile maps can also be an extremely effective tool for representing spatial information for the orientation and mobility (O&M) student (Bentzen, 1997). Many O&M instructors devote a significant portion of their lesson preparation time to the creation of detailed tactile maps to be used only once or twice by the student. An automated technique for producing highly detailed, 'one-off' maps would be of great benefit to these instructors and their students, freeing the instructor's time for actual O&M instruction, instead of the arduous task of tactile cartography.

The TMAP software uses a geographic database to build its maps. Currently, the data being used are the US Census Bureau's Topologically Integrated Geographic Encoding and Referencing System (TIGER®) line data. This data set includes the names and locations of virtually every street in the United States. As the TMAP project expands its scope, a commercial data set will be incorporated that will have richer and more accurate data, as well as geographic data for many other countries. Regardless of the source, street locations are stored as vectorized latitude and longitude points, rather than as images. This makes it possible to construct tactile maps from scratch, rather than trying to transform existing 'visual' maps to tactile ones. The TMAP software generates tactile maps according to a set of labeling, street density, line style and scaling criteria intended to maximize tactile map readability. The maps can be printed out on the user's own braille embosser at minimal cost, or embossed remotely and mailed to the user.
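To make the scaling step concrete, the sketch below shows one way that vectorized latitude/longitude street segments could be projected onto an embosser page at a chosen scale. It is an illustration only: the function names, the page size and the conversion constants are assumptions for the example, not details of the TMAP software.

```python
# Minimal sketch (not the actual TMAP code): projecting vectorized
# latitude/longitude street segments onto an embosser page at a fixed scale.
# Names, constants and page dimensions are illustrative assumptions.

import math

FEET_PER_DEGREE_LAT = 364000  # rough average; adequate for a sketch

def project_segments(segments, center_lat, center_lon, feet_per_inch, page_w_in, page_h_in):
    """Convert (lat, lon) vertex lists to page coordinates in inches,
    centered on the requested address and limited to the page."""
    feet_per_degree_lon = FEET_PER_DEGREE_LAT * math.cos(math.radians(center_lat))
    projected = []
    for name, vertices in segments:
        page_pts = []
        for lat, lon in vertices:
            dx_ft = (lon - center_lon) * feet_per_degree_lon
            dy_ft = (lat - center_lat) * FEET_PER_DEGREE_LAT
            x = page_w_in / 2 + dx_ft / feet_per_inch
            y = page_h_in / 2 - dy_ft / feet_per_inch  # page y grows downward
            page_pts.append((x, y))
        # keep only segments with at least one vertex on the page
        if any(0 <= x <= page_w_in and 0 <= y <= page_h_in for x, y in page_pts):
            projected.append((name, page_pts))
    return projected

# Example: a single hypothetical block near a requested address,
# drawn at an assumed 250 feet of street per inch of paper
streets = [("Fillmore St", [(37.7749, -122.4330), (37.7756, -122.4331)])]
print(project_segments(streets, 37.7750, -122.4330,
                       feet_per_inch=250, page_w_in=11, page_h_in=11.5))
```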

A visual representation of a sample tactile map generated by prototype TMAP software appears in Figure 1. It was generated simply by specifying the street address, along with some scale and labeling parameters. Map features include lines representing streets, abbreviated braille street labels, a scale indicator in the upper-right corner, a map title at the top of the page and a donut-shaped marker indicating the location of the requested street address or intersection. This illustration happens to show the area of San Francisco in the immediate neighborhood of The Smith-Kettlewell Eye Research Institute, but could just as easily show any other neighborhood.

A tactile map can readily reveal the shapes of intersections, the direction a street curves, whether a street dead-ends and many other geographical features which are evident visually but are extremely difficult for an independent, visually impaired traveler to distinguish without previous knowledge. A tactile map also allows the traveler to inspect an unfamiliar area in advance and to plan a route to the desired destination, saving time and frustration and avoiding dangerous situations such as attempting a street crossing at an unusually configured intersection.

Technical description: the Tactile Maps Automated Production (TMAP) model

Web-based map generation

The TMAP model consists of a user interface, the TMAP Engine and a map production step. The user interface will include both a website and a telephone interface, although to date only the website has been initiated. The map files – digital graphics files specially prepared for production on a braille embosser or other device – are produced by the TMAP Engine. Among other functions, the TMAP Engine incorporates algorithms for applying the individual user's preferences, finds space on the map for abbreviated braille labels and determines if the user-specified scale is reasonable for the density of geographical features. Once the digital map file has been generated, the user can produce the physical map either by downloading the file from the Web interface and sending it to a local braille embosser, or by requesting that the digital file be sent to a third party capable of producing the tactile map. If produced by a third party, the map and its associated legend are sent to the user via regular mail subsequent to production. Figure 2 illustrates the steps a consumer would go through when using TMAP.


Figure 1: A visual representation of a sample TMAP. Labeled elements include the map title, the requested location marker, lines indicating streets, braille street labels and a 500 ft scale indicator.


One of the most difficult aspects of automating the generation of tactile maps is placing braille labels to identify particular map features. Visual maps often use small print and flexible text orientation to associate a large number of labels with their referents. Printed text labels can even be curved to fit the contour of a road or other map feature, thus associating the two by shape as well as proximity. Nothing of this sort is possible with braille. Potential confusion from mixing braille with tactile graphics imposes significant limitations on how map features can be labeled. Also, braille takes up much more room than print and cannot be resized to fit an available space: there is no such thing as 'fine print' in braille.

TMAP uses an algorithm to generate a unique set of abbreviations for a given list of feature names. The abbreviated labels are placed on the tactile map adjacent to the associated feature, and the abbreviation and full feature name are included in a legend provided on a separate page. The algorithm allows the user to select the number of characters to be used in the label. For example, Fillmore Street might be abbreviated 'FL', 'FLM' or 'FLMR', depending on whether the user chooses to use two, three or four characters per label. The advantage of including more characters is that it makes the associated feature names easier to understand and remember. The advantage of fewer characters per label is that it leaves more room available for map features and other labels.
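The abbreviation step can be illustrated with a short sketch. The code below is not the TMAP algorithm itself; it simply shows one plausible way to derive unique, fixed-length labels from a list of street names, using hypothetical rules (a preference for consonants and a numeric suffix for collisions) that echo the 'FL'/'FLM'/'FLMR' example above.

```python
# Illustrative sketch (not the TMAP algorithm itself): derive unique,
# fixed-length abbreviations from street names, preferring consonants and
# resolving collisions with a numeric suffix.

def abbreviate(names, length=3):
    """Map each feature name to a unique abbreviation of about `length` characters."""
    labels = {}
    for name in names:
        letters = [c for c in name.upper() if c.isalpha()]
        consonants = []
        for c in letters:
            # skip vowels and immediate repeats, so 'Fillmore' yields F, L, M, R
            if c not in "AEIOU" and (not consonants or consonants[-1] != c):
                consonants.append(c)
        base = "".join((consonants + letters)[:length]) or "X"
        label, n = base, 2
        while label in labels.values():           # keep labels unique
            label = base[: length - 1] + str(n)
            n += 1
        labels[name] = label
    return labels

# 'Fillmore Street' -> 'FLM' at three characters, matching the example above
print(abbreviate(["Fillmore Street", "Fulton Street", "Webster Street"], length=3))
```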

The TMAP researchers have investigated several labeling schemes, but have focused primarily on a technique that places feature labels around the perimeter of the map. In this approach each street label is adjacent to the point where the street intersects the edge of the map. The advantage is that the user knows that all of the labels are around the perimeter of the map. This significantly reduces map clutter and label ambiguity. The disadvantage is that streets that do not intersect the edge of the map are not labeled. Also, the limited space at the edge of the map leads to labeling conflicts, making it necessary to omit labels for some streets. One solution to the problem of effective TMAP annotation is to add interactive audio tags, an approach that will be discussed in detail later in this article.
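A minimal sketch of the perimeter-labeling idea follows. It assumes the street polylines have already been clipped to the page, so that each street which leaves the map has a vertex lying on the border; the greedy conflict rule and the minimum gap value are illustrative assumptions, not the TMAP implementation.

```python
# Sketch of the perimeter-labeling idea (an illustration, not the TMAP code):
# a street that reaches the map edge gets its abbreviation placed at the exit
# point, and labels that would crowd an already-placed label are omitted.

def edge_exit_points(streets, width, height, tol=1e-6):
    """Yield (name, (x, y)) for the first vertex of each street lying on the map border."""
    for name, vertices in streets:
        for x, y in vertices:
            on_edge = (abs(x) < tol or abs(x - width) < tol or
                       abs(y) < tol or abs(y - height) < tol)
            if on_edge and -tol <= x <= width + tol and -tol <= y <= height + tol:
                yield name, (x, y)
                break

def place_perimeter_labels(streets, abbreviations, width, height, min_gap=0.75):
    """Greedy placement: skip any label closer than min_gap (map units) to one already placed."""
    placed, omitted = [], []
    for name, (x, y) in edge_exit_points(streets, width, height):
        if all(abs(x - px) + abs(y - py) >= min_gap for _, (px, py) in placed):
            placed.append((abbreviations[name], (x, y)))
        else:
            omitted.append(name)   # conflicting label; left to the legend or to audio tags
    return placed, omitted

streets = [("Fillmore St", [(2.0, 0.0), (2.0, 11.5)]), ("Clay St", [(0.0, 4.0), (11.0, 4.0)])]
print(place_perimeter_labels(streets, {"Fillmore St": "FLM", "Clay St": "CL"}, 11.0, 11.5))
```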

Tactile output

The maps provided by the TMAP Project are primarily designed to be produced using braille embossers. The raised dots can be arrayed in tight formations to produce tactile figures such as lines, curves and shaded areas. This technology has a solid and proven track record, and has been used effectively for producing tactile graphics for over 30 years. There are large numbers of braille embossers currently in use in homes, offices, schools, libraries and other public and private institutions. This means that with the availability of the TMAP tools, many visually impaired users will be capable of producing their own tactile maps.

Tactile graphics file formats and Scalable Vector Graphics (SVG)

In addition to supporting file formats for all graphics-capable braille embossers, TMAP can provide tactile maps for any number of additional production techniques by using a universal graphics file format known as Scalable Vector Graphics (SVG). A number of alternatives to embossed tactile output exist, the most common of these being micro-capsule (or 'swell') paper (Pike, Blades and Spencer, 1992). A significant amount of research has also been conducted in the area of using inkjet technology for producing raised lines and textures (McCallum and Ungar, 2003). In order to avoid the necessity of providing an unspecified number of tactile graphics file formats as new technologies emerge, the TMAP software can produce output as SVG files. This file format is based on the extensible markup language (XML) as specified by the World Wide Web Consortium (W3C), and will allow third-party researchers and developers to render tactile graphics from a standardized data format.

Figure 2: The steps for getting a map via TMAP

Using the SVG file format, each geographic feature can have individual attributes assigned to it, such as title and description tags. Furthermore, objects can be grouped together and assigned various collective attributes. For example, many streets are composed of multiple segments, or 'blocks'. These segments are represented as individual elements in the SVG file, but can be grouped together and given shared attributes such as the street name or description. The SVG format is extremely flexible and will allow other features such as points of interest, map regions and bodies of water to be added in the future. By providing this kind of detailed structural and semantic information about the maps generated by TMAP, the number of uses for the TMAP data is greatly expanded.
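As an illustration of this grouping idea, the sketch below builds a tiny SVG document in which the blocks of one street share a single group and title element. The element names and attributes are generic SVG, but the overall layout and coordinate values are assumptions for the example and not the actual TMAP output schema.

```python
# Illustrative sketch (not the actual TMAP SVG schema): grouping the blocks of
# one street into a single <g> element that carries a shared <title>, so a
# downstream renderer or audio-tactile viewer can treat them as one feature.

import xml.etree.ElementTree as ET

SVG_NS = "http://www.w3.org/2000/svg"
ET.register_namespace("", SVG_NS)

def street_group(svg_root, street_name, blocks):
    """Add one street as a group of polylines; each block is a list of (x, y) points."""
    group = ET.SubElement(svg_root, f"{{{SVG_NS}}}g")
    title = ET.SubElement(group, f"{{{SVG_NS}}}title")
    title.text = street_name                      # shared attribute for the whole street
    for pts in blocks:
        ET.SubElement(group, f"{{{SVG_NS}}}polyline", {
            "points": " ".join(f"{x},{y}" for x, y in pts),
            "fill": "none",
            "stroke": "black",
            "stroke-width": "2",
        })
    return group

root = ET.Element(f"{{{SVG_NS}}}svg", {"width": "11in", "height": "11.5in"})
street_group(root, "Fillmore St", [[(10, 10), (10, 120)], [(10, 120), (10, 230)]])
print(ET.tostring(root, encoding="unicode"))
```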

The Talking Tactile Tablet (TTT) and TMAP

One way to get around the difficulties of braille-based labeling of tactile maps is through interactive audio tagging. To test this approach, the TMAP developers at The Smith-Kettlewell Eye Research Institute in San Francisco have teamed up with Touch Graphics, Inc., a New York City company that has developed the Talking Tactile Tablet, nicknamed TTT or T3. The TTT is a portable, rugged and inexpensive computer peripheral device that acts as a 'viewer' for tactile graphic materials. Users place one of a collection of raised-line and textured overlay sheets on the device, which measures about 15 inches wide, 12 inches deep and 1 inch thick (see Figure 3, a photograph of the device with a sample overlay sheet mounted). Overlays are held in position against a touch-sensitive surface by a heavy hinged frame that the user can open and close. The TTT can detect a finger press through the tactile overlay sheet and transmits that position to a PC via a USB connection. The computer runs a program that compares the positions of each finger press on the picture, map or diagram with a list of pre-defined hotspots, and responds appropriately.
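A hedged sketch of this comparison step is shown below. It assumes rectangular hotspots and hypothetical coordinates; Touch Graphics' actual software may represent regions quite differently.

```python
# Minimal sketch of hotspot lookup for a touch-tablet press (an assumption of
# how such a comparison could work, not Touch Graphics' implementation): the
# tablet reports an (x, y) position and the program finds the region it hits.

from dataclasses import dataclass

@dataclass
class Hotspot:
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x, y):
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def hotspot_at(hotspots, x, y):
    """Return the first hotspot containing the press, or None."""
    for spot in hotspots:
        if spot.contains(x, y):
            return spot
    return None

hotspots = [Hotspot("main menu (plus sign)", 14.0, 0.0, 15.0, 1.0),
            Hotspot("workspace", 1.0, 1.0, 14.0, 11.0)]
hit = hotspot_at(hotspots, 14.3, 0.5)
print(hit.name if hit else "no response")
```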

Many researchers have contributed to the field of audio-enabled haptic exploration and tactile graphics (Gardner and Bulatov, 2001; Krueger and Gilden, 1999; Parkes, 1988). The TTT system adds a Tactile Graphic User Interface (TGUI) consisting of a uniform set of simple tactile symbols arrayed identically around the central workspace on each overlay sheet (see Figure 4, the Japan and Korea map from the National Geographic Talking Tactile Atlas of the World). The standardization of the TGUI makes it possible to construct audio-tactile applications that rival mainstream mouse and video-based multi-media presentations for interactive richness (Landau, Russell, Gourgey, Erin and Cowan, 2003).


Figure 3: The Talking Tactile Tablet with the UK map sheet mounted (photo credit: Devon Jarvis)

Figure 4: The Japan and Korea map. Labels show functions of various elements of the Tactile Graphics User Interface, and do not appear on the overlay sheet.


An identification routine allows quick and easy access to the digital data associated with each tactile figure. After placing the sheet on the TTT, the user presses three specific points, which appear as short vertical bars along the horizontal ID strip located just below the top edge of every sheet. Since the position of the short vertical bars is unique for each sheet, the system is able to identify a sheet by comparing these positions to a database of known combinations. Upon completing the identification process, the data for that overlay sheet are loaded. The user can then touch any position on the tactile figure to hear the information associated with that location. When the user wishes to change sheets, he or she simply repeats the process with the ID bar of any new sheet.
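The identification step can be sketched as a simple lookup with a tolerance for imprecise finger placement. The bar positions and sheet names below are hypothetical; the sketch only illustrates the matching idea described above, not the TTT's actual routine.

```python
# Sketch of the sheet-identification idea (illustrative assumption): three
# pressed positions along the ID strip are matched against a table of known
# bar layouts, allowing a small tolerance for finger placement.

KNOWN_SHEETS = {
    "japan_korea_map": (2.0, 6.5, 11.0),    # hypothetical bar positions, in inches
    "talking_tmap_001": (3.5, 7.0, 12.5),
}

def identify_sheet(pressed_positions, tolerance=0.4):
    """Return the sheet whose ID-bar positions all lie within `tolerance` of the presses."""
    presses = sorted(pressed_positions)
    for sheet, bars in KNOWN_SHEETS.items():
        if all(abs(p - b) <= tolerance for p, b in zip(presses, sorted(bars))):
            return sheet
    return None

print(identify_sheet([3.4, 7.1, 12.6]))   # -> "talking_tmap_001"
```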

The Talking TMAP application

Touch Graphics has developed a software application, known as Talking TMAP, that uses a TTT and an SVG file (downloaded from the TMAP website) to completely automate the process of adding audio annotations to a tactile map. The application, implemented in the Macromedia Director® authoring environment, facilitates interaction between a user and the TTT's host computer. This section will describe its operation and some design aspects.

The Talking TMAP program automatically loads map data that have been generated by the TMAP server. These data arrive at the user's computer via Internet download, as a single text-based file in the SVG format, permitting the definition of any number of shapes and associated text-based information. Each block of each street shown on the map is described as a series of vertices along a linear path. The program loops through the SVG data, and builds an on-screen version of the map by drawing a thick line linking each of the vertices for a given street. The TTT is seen by the host computer as a pointing device. Thus, when a user places a TMAP on the TTT and then presses tactile lines on its surface, the computer identifies which street is being pressed, just as if a sighted user had clicked a mouse on a visual representation of the map.
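The press-to-street matching described here amounts to finding the street polyline nearest to the pressed point. The following sketch shows that calculation in Python with hypothetical coordinates and tolerance; the actual application is implemented in Macromedia Director and may work differently.

```python
# Sketch of press-to-street matching (illustrative, not the Macromedia Director
# implementation): find the street block whose polyline passes closest to the
# pressed point, within a small tactile tolerance.

import math

def point_segment_distance(p, a, b):
    """Distance from point p to the line segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def street_at_press(press, streets, tolerance=0.2):
    """streets: list of (name, vertices); return the nearest street name within tolerance."""
    best_name, best_dist = None, tolerance
    for name, vertices in streets:
        for a, b in zip(vertices, vertices[1:]):
            d = point_segment_distance(press, a, b)
            if d < best_dist:
                best_name, best_dist = name, d
    return best_name

streets = [("Fillmore St", [(2.0, 1.0), (2.0, 9.0)]), ("Clay St", [(0.5, 4.0), (10.0, 4.0)])]
print(street_at_press((2.1, 6.0), streets))   # nearest line is Fillmore St
```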

Making new maps for Talking TMAP is straightforward. Users simply visit the TMAP website and use the online forms to request a tactile map. During this process, the user specifies that he or she will be viewing the map using the TTT by selecting 'TTT' from a number of possible output formats. The user then opens the resulting SVG file with the Talking TMAP Creator program. The SVG information is automatically converted to a format appropriate for the user's braille embosser, as well as a computer file that associates objects and attributes with positions on the TTT. The Talking TMAP Creator program sends the map file to the embosser, which produces the finished, ready-to-use tactile overlay. The overlay includes the TGUI elements and the unique ID bar that associates the overlay sheet with the digital file containing the map specification.

At the simplest level, Talking TMAP allows the user to freely explore a map of a neighborhood by pressing on streets to hear their names spoken via the computer's speech synthesizer, but a number of other types of geographical content and software functionality are also available to the user. A layering strategy permits access to additional classes of information so that they are easily accessed by the advanced user, but in a controlled way, to ensure that a less-sophisticated user is not overwhelmed. When a street is pressed and quickly released on the tactile map, just the name of the street is spoken. But by tapping repeatedly on the same street, other layers are revealed. In the current version of the application, the second tap causes the system to speak the address ranges for the right and left sides of that particular block, the third tap announces the length of the block and the fourth tap causes the system to spell the street name.
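The layering behavior can be sketched as a simple dispatch on the number of taps. The block data below are invented for illustration; only the ordering of the layers follows the description above.

```python
# Sketch of the tap-layering behavior described above (names and data are
# illustrative assumptions): each additional quick tap on the same street
# reveals the next layer of information.

def layered_response(tap_count, block):
    """Return the spoken text for the given tap count on one street block."""
    name = block["name"]
    if tap_count == 1:
        return name
    if tap_count == 2:
        return (f"Addresses {block['right_range'][0]} to {block['right_range'][1]} on the right, "
                f"{block['left_range'][0]} to {block['left_range'][1]} on the left")
    if tap_count == 3:
        return f"This block is {block['length_ft']} feet long"
    return " ".join(name)          # fourth tap and beyond: spell the street name

block = {"name": "Fillmore St", "right_range": (2300, 2398),
         "left_range": (2301, 2399), "length_ft": 412}
for taps in range(1, 5):
    print(layered_response(taps, block))
```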

As with all applications for the TTT, Talking TMAP includes a Main Menu of special functions that are accessed whenever the plus sign shape in the upper-right corner of the overlay sheet is pressed (Figure 5 shows a program logic flow diagram). Users navigate among three Main Menu choices by pressing the up or down arrows to move incrementally through a list, then press the circle button to choose one. The Main Menu options are:

● Index. All map readers need a way of searching a list of every place shown on a map, as well as an easy method for finding the position of any item in the list. With Talking TMAP, the user selects the Index tool, and then scrolls through an alphabetical list to find a feature of interest by repeatedly pressing arrow buttons, followed by the circle button upon hearing the name of the desired feature. Then, he or she is guided to the requested destination on the map through a process of verbal coaching. To accomplish this the user is instructed to touch the map anywhere and then follow the guiding instructions, which incrementally lead to the target. Two advanced index options are available for more sophisticated users. The first of these guides the user to an intersection of two streets, first by selecting a street from the index, and then by selecting another street from a list of only those streets that cross the first one. The second advanced option allows the user to select a street from a list, and then enter an address using the tactile number keypad on the overlay sheet. If the address appears on the map, the user's finger is then led to that location.

● Distance calculator. Because maps are produced to a known scale, it is possible to determine the linear distance between any sequence of way-points. This is useful in trip planning, and in Orientation and Mobility training, where students are taught to think about distances traveled along each leg of a route. When the distance calculator is active, pressing a point on the map for more than one second causes a way-point to be added to the route. A tone sounds and the user is prompted to press the next point along the route. During this process, brief touches of the map continue to announce feature names, permitting easy exploration of the map while building a route. When finished, the user simply presses the circle button of the TGUI to hear the total distance along the route, calculated in the preferred units of measure (a minimal sketch of this calculation appears after this list).

● Settings. User preferences can be set in three areas: sensitivity, which controls the firmness with which the user must press on the overlay sheet to cause it to respond; units of measure for distance calculations (choices are meters, feet, kilometers and miles); and speech rate for the synthetic voice. The arrows and circle button of the TGUI are used to select desired preferences, and those settings are saved for use in future sessions.
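The distance calculation referenced in the list above reduces to summing the straight-line distances between successive way-points and converting the total into the preferred units. The sketch below assumes way-point coordinates already scaled to real-world feet; the unit factors and function names are illustrative, not the Talking TMAP code.

```python
# Sketch of the distance-calculator idea (an illustration, not the Talking TMAP
# code): sum the straight-line distances between way-points pressed on the map,
# then convert the total to the preferred units.

import math

# hypothetical conversion factors from feet to each supported unit
UNIT_FACTORS = {"feet": 1.0, "meters": 0.3048, "miles": 1 / 5280, "kilometers": 0.0003048}

def route_distance(waypoints_ft, units="feet"):
    """waypoints_ft: list of (x, y) positions already scaled to real-world feet."""
    total_ft = sum(math.dist(a, b) for a, b in zip(waypoints_ft, waypoints_ft[1:]))
    return total_ft * UNIT_FACTORS[units]

route = [(0, 0), (300, 0), (300, 450)]          # three pressed way-points
print(f"{route_distance(route, 'meters'):.0f} meters along the route")
```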


Figure 5: A diagram illustrating the Talking TMAP program flow


Future directions for talking tactile maps

The work described here is part of a six-month feasibility study funded under a Small Business Innovation Research (SBIR) grant from the National Institute on Disability and Rehabilitation Research (NIDRR), a division of the US Department of Education. Depending on outcomes and the availability of future funding, it may lead to an additional 24-month follow-on project. This would allow further development of the Talking TMAP concept to the point where it can be disseminated among the user community. The Phase 2 work would set out the following goals:

1. Add additional classes of information. In the current version, the amount of information available is limited by what can be extracted from the public-domain TIGER® line data. However, one of the great virtues of the audio-tactile approach is that there is theoretically no limit to the amount of data that can be usefully embedded in a map, as long as the layering strategy described earlier is used to provide adequate reader control over playback. Some additional classes of information that might be added are: number of lanes of traffic; direction of traffic flow; whether streets are considered major arteries; locations of important buildings and landmarks; and information about non-street cartographic elements such as railroad tracks, coastlines, bodies of water, etc.

2. Expand Talking TMAP to other kinds of maps. The same technology and methods that permit the creation of talking tactile street maps of individual neighborhoods can be extended to include other kinds of map products. Phase 1 concentrates on materials mostly intended for use in way-finding and orientation and mobility training, but in Phase 2 it would be possible to start producing maps showing entire cities and towns, regions and countries. There are many potential uses for these maps. For example, a person living in New York City might want to examine a map of the entire state to find out about the relative distances between cities, prior to deciding whether to take a new job. Expanding the scope of the TMAP system will require additional study of the kinds of symbols and textures that would be needed to show a wider range of map entities that could be made effectively with current braille embossers (Rowell and Ungar, 2003).

3. Develop a Talking TMAP production service. Many potential users will not have access to a braille embosser, and others may not have Internet connections, or will lack the sophistication to place requests for maps on line. It must be recognized that, while some blind and low-vision individuals are highly capable computer users, many others lack experience and have difficulty working with various access technologies. For these individuals, a map-request system that uses the telephone, email or postal mail would be desirable. Touch Graphics might establish such a service, allowing the map and associated computer files to be sent to the user via post. Upon its arrival, the user would place the digital medium (such as a CD or compact flash card) in his or her computer, place the raised-line map overlay on the TTT and begin. This distribution model is already in place and familiar to customers (in the USA) who order braille materials and talking books via mail from the National Library Service or Recording for the Blind and Dyslexic. While this solution still requires some computer literacy and access to equipment, it will be easier for those who might have only minimal computer skills.

Conclusion

The lack of accurate, detailed and accessible street maps has long been a significant impediment to many visually impaired individuals as they strive to lead independent and productive lives. The work discussed here on the TMAP and Talking TMAP systems tries to address this deficiency in practical and effective ways. If upcoming user trials yield positive outcomes, this work may lead to the introduction of real-life solutions to this persistent problem in the very near term.

References

Bentzen, B. (1997) 'Orientation aids', in B. Blasch, R. Wiener and R. Welsh (eds) Foundations of Orientation and Mobility, 2nd edn, pp. 284–316. New York: AFB Press.

Espinosa, M.A. and Ochaíta, E. (1998) 'Using tactile maps to improve the practical spatial knowledge of adults who are blind', Journal of Visual Impairment & Blindness, 92(5), 338–45.

Gardner, J. and Bulatov, V. (2001) 'Smart figures, SVG, and accessible Web graphics', paper presented at the CSUN International Conference on Technology and Persons with Disabilities, Los Angeles, CA, March.

Golledge, R.G. (1993) 'Geography and the disabled: a survey with special reference to vision impaired and blind populations', Transactions of the Institute of British Geographers, 18, 63–85.

Krueger, M.W. and Gilden, D. (1999) 'KnowWare™: virtual reality maps for blind people', in J.D. Westwood (ed.) Medicine Meets Virtual Reality, pp. 191–7. Amsterdam: IOS Press.

Landau, S. and Gourgey, K. (2001) 'Development of a Talking Tactile Tablet', Information Technology and Disabilities, VII(1).

Landau, S., Russell, M., Gourgey, K., Erin, J. and Cowan, J. (2003) 'Use of the Talking Tactile Tablet in mathematics testing', Journal of Visual Impairment & Blindness, 97(2), 85–96.

McCallum, D.R.J. and Ungar, S. (2003) 'Producing tactile maps using new inkjet technology: an introduction', Cartographic Journal, 40(3), 294–8.

Miele, J. (2004) 'Tactile Map Automated Production (TMAP): using GIS data to generate Braille maps', paper presented at the CSUN International Conference on Technology and Persons with Disabilities, Los Angeles, CA, March.

Miele, J. and Marston, J. (2005) 'Tactile Map Automated Production (TMAP): on-demand accessible street maps for blind and visually impaired travelers', paper presented at the Annual Meeting of the American Association of Geographers, Denver, CO, April.

Parkes, D. (1988) 'NOMAD: an audio-tactile tool for the acquisition, use, and management of spatially distributed information by partially sighted and blind people', paper presented at the Second International Conference on Maps and Graphics for Visually Disabled People, Nottingham, June.

Pike, E., Blades, M. and Spencer, C. (1992) 'A comparison of two types of tactile maps for blind children', Cartographica, 29(3&4), 83–8.

Rowell, J. and Ungar, S. (2003) 'A taxonomy for tactile symbols: creating a useable database for tactile map designers', Cartographic Journal, 40(3), 273–6.

Ungar, S., Blades, M. and Spencer, C. (1993) 'The role of tactile maps in mobility training', British Journal of Visual Impairment, 11, 59–62.

Uttal, D.H. (2000) 'Seeing the big picture: map use and the development of spatial cognition', Developmental Science, 3(3), 247–64.

Joshua A. Miele, Research Fellow, The Smith-Kettlewell Eye Research Institute, San Francisco, CA, USA. Email: [email protected]

Steven Landau, President, Touch Graphics, Inc., New York, USA. Email: [email protected]

Deborah Gilden, Senior Scientist, The Smith-Kettlewell Eye Research Institute, San Francisco, CA, USA. Email: [email protected]
