Interactive Wayfinding for the Visually Impaired

The Last Mile

For the blind and visually impaired, wayfinding in public places is getting easier thanks to new touch-responsive talking maps and models.

Imagine arriving at an unfamiliar school building for the first time. You need to navigate a complex maze of rooms, corridors, stairwells, and spaces to find the classroom or office that is your final destination—all in a concentrated period of time.

Now imagine you are blind.

Unlike most users of this building, you can’t rely on traditional environmental graphics, such as signs, maps, or directories, to get a sense of the size and configuration of this new place. And your audible GPS stops working the moment you enter the building, because the structure itself blocks the GPS signal. While GPS for the blind and visually impaired is an amazing breakthrough that verbally identifies landmarks and constructs outdoor walking routes, this “last-mile” problem means that complete travel independence may still not be possible. So you will have to resort to asking a sighted stranger to read signs for you or, worse, request physical guidance, ruining the feeling of independence granted by successfully navigating to the building entrance on your own.

To address this “last-mile” problem, developers at the Center for Inclusive Design and Environmental Access (IDeA) at the University at Buffalo and Touch Graphics, Inc. designed, fabricated, installed, and evaluated a series of touch-responsive talking models for visually impaired travelers. The interactive models were placed in three locations frequented by blind staff and visitors: the Technology Center, Carroll Center for the Blind, Newton, Mass.; Chicago Lighthouse for the Blind; and Grousbeck Center, Perkins School for the Blind in Watertown, Mass.
Each talking map presents the spatial layout of its immediate surroundings in a multi-sensory format that is usable by everyone, with a particular emphasis on the needs of the blind.

The prototype models and maps represent spaces as 3D buildings in a landscape (for a campus), or as a raised-line and textured surface (for a building interior). In each case, forms were generalized to focus on only those features that are relevant to wayfinding and orientation, with all superfluous information omitted for tactile clarity and legibility.
The models are touch responsive; that is, as you explore them with your hands and fingers, they announce the name of the thing you are touching, followed by a description of activities occurring at that place and, finally, spoken directions for walking there. By explaining the configuration of the building or campus, the systems are intended to make it possible for a determined independent blind traveler to identify and travel to any location. The models strive to be appealing and user-friendly for everyone, including those with other disabilities, or no disability, without compromising accessibility.

Goals of Universal Design
The design team started by considering the eight goals of universal design, as articulated in Universal Design: Creating Inclusive Environments (Steinfeld & Maisel, 2012):

1. Body Fit - Accommodating a wide range of body sizes and abilities. Each map is placed on a horizontal or slightly sloping counter with knee space so that users in wheelchairs can pull up close. Controls are large and easy to use.

2. Comfort - Keeping demands within desirable limits of body function and perception. Trigger strength for activating speech by pressing on the map adapts to meet user needs. The height of the map allows both standing and seated users to easily reach its entire surface.

3. Awareness - Ensuring that critical information for use is easily perceived. Maps are visual, tactile, and they describe themselves through spoken language, large-print captions, and refreshable Braille. Sound effects embedded in the map capture environmental sounds like fountains or bells; this may help non-verbal users recognize landmarks they may encounter when traveling through the environment.

4. Understanding - Making methods of operation and use intuitive, clear and unambiguous. The horizontal orientation gives users a “bird’s-eye” view in the same relationship that they would experience it in the real world, unlike a vertical orientation that requires a more symbolic understanding of maps. Projected satellite images increase the realism of the simulation and allow focusing attention on specific parts of the model when necessary.

5. Wellness - Contributing to health promotion, avoidance of disease, and protection from hazards. The edges of the map table and undersides are designed to avoid injury. The material used for the model is relatively impervious and can be cleaned easily.

6. Social Integration - All users are invited to use the talking maps. They are designed to be enjoyable and useful for users with and without disabilities. In the latest version, several users can utilize the map at the same time.

7. Personalization - Incorporating opportunities for choice and the expression of individual preferences. Users can just touch parts of the model to hear place names and directions, or they can select other options through a simple three-button user interface. Detailed spoken instructions are available, but experienced users can skip over them by pressing a button.

8. Appropriateness - Respecting and reinforcing cultural values, and the social and environmental contexts. Each talking map is located in an obvious position at a building’s main entrance and, if on a campus, at the visitors’ center, consistent with user expectations. The content presented by the maps can be customized and adjusted to emphasize the priorities of the users and the campus or building owners. For example, the history of the campuses can be conveyed as well as the physical features to provide access to culturally important information.

The requirements of accessibility regulations are minimal. For example, they only require that the number on a room sign be tactile and in Braille. Nothing about the room is required to be understandable by people who cannot read signs visually. Moreover, there are no requirements to provide information about the plan of a building or campus or directions from one place to another. Accessibility regulations give no guidance to designers for going beyond the minimum requirements. The Goals of Universal Design provide product developers with a simple checklist of outcomes that can help them create a higher level of accessibility and usability for all building users. 

As the population becomes familiar with navigational aids like GPS devices, the business case for universal design applications that exceed accessibility regulations becomes stronger because building users will have higher expectations. Moreover, the aging of the population will drive the market for increased usability. We may soon see a movement to certify universally designed products and environments through a point system like LEED certification for green buildings. And, regulations are being expanded to address unmet needs like those satisfied by the talking models. The work reported on here anticipates these possible developments, and foresees a day when enhanced accessibility features like those demonstrated here are routinely added to public information displays.

How touch-responsive models and maps work
All three installations rely on capacitance sensing to measure multi-finger touches on opaque, textured surfaces and shapes. The need to sense touches against irregularly shaped surfaces requires a different approach compared to flat touchscreens. In these examples, conductive paint was applied to plastic forms produced by 3D printing or CNC milling. Rooms, buildings, walking paths, roads, bus stops, or other map features that react when touched are created as individual, electrically isolated painted regions. The regions are connected by thin wires to sensors housed in the pedestal, and a computer handles all interactions and displays relevant media stored as sound clips and visual imagery. The sensors use a patented method (Landau & Eveland, 2014) of measuring finger pressure, and software permits building staff to “tune” the model, equalizing trigger-thresholds for each zone, to produce a convincing illusion of pressure sensitivity. Through user testing, the developers optimized sensing algorithms to ensure that users with different degrees of hand strength and dexterity found the systems easy and enjoyable to use.
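The per-zone “tuning” described above can be pictured as a simple calibration loop. The sketch below is an illustrative assumption, not the patented Touch Graphics method: zone names, capacitance values, and the midpoint threshold rule are all invented for demonstration, but it shows how equalizing trigger thresholds across zones with different resting readings can produce a uniform feel of pressure sensitivity.

```python
# Hypothetical sketch of per-zone threshold tuning for a capacitive
# talking map. All values and the calibration rule are assumptions.

class TouchZone:
    def __init__(self, name, baseline):
        self.name = name
        self.baseline = baseline          # reading when zone is untouched
        self.threshold = baseline * 1.2   # default trigger level

    def tune(self, touch_samples):
        """Set the trigger threshold from sample readings taken while
        staff press the zone, equalizing sensitivity across zones."""
        touched = sum(touch_samples) / len(touch_samples)
        # Trigger halfway between resting and typical touched readings.
        self.threshold = (self.baseline + touched) / 2

    def is_triggered(self, reading):
        return reading >= self.threshold


zones = {name: TouchZone(name, base)
         for name, base in [("Dormitory A", 100.0), ("Fountain", 140.0)]}

# Calibration pass: staff press each zone a few times.
zones["Dormitory A"].tune([200.0, 210.0, 190.0])   # threshold becomes 150.0
zones["Fountain"].tune([260.0, 255.0])             # threshold becomes 198.75

reading = 160.0
print(zones["Dormitory A"].is_triggered(reading))  # True
print(zones["Fountain"].is_triggered(reading))     # False
```

In a real installation the thresholds would be refined through the kind of user testing the article describes, so that light and heavy touches both register reliably.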

While these experimental systems were designed to accommodate the needs of blind pedestrians, information was also displayed visually to be beneficial to all users and enhance social integration (Goal 6). In the model designed for the Carroll Center Campus, buildings were painted in colors that distinguished dormitories from academic buildings, and roads were painted white to set them off from black lawn areas. After user testing, developers determined that dynamic visual information should be added, so at Chicago Lighthouse for the Blind, the developers installed an overhead video projector that shines down on the shiny gray map surface (gray optimizes reflectivity in daylight conditions). This innovation animates the 3D tactile surface with light, and permits a variety of useful and beautiful effects, such as projecting matching satellite imagery on the model from above. The enhanced dynamic presentation techniques allow customization of the image to fit specific needs or values of the sponsors and to address unique features of the place; for example, valued natural features or acoustic qualities (Goal 8).

Universal Design Goal 4 calls for design that’s easy to understand, meaning that new users should be able to figure out what to do with a minimum of instructions. Any action taken should result in a satisfying, understandable result that moves the user closer to sought-after information. To this end, developers created dual interaction modes that can be used separately or in combination.

Direct touch. The first thing new users think to do when trying out one of these systems is to simply touch the tactile surface. The first time one of the touch-responsive zones is activated by a direct touch, the name of that room, building, or outdoor space is spoken and a visual spotlight appears there. Maintaining finger pressure causes the system to play a description of activities or occupants at that location, followed by walking directions to travel to that location.
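The press-and-hold progression above (name, then description, then directions) can be sketched as a dwell-time state machine. The stage timings below are assumptions for illustration only; the article does not specify them.

```python
# Illustrative sketch of the direct-touch behavior: a brief press speaks
# the place name; holding longer plays the description, then walking
# directions. The 2-second stage duration is an assumed value.

STAGES = ["name", "description", "directions"]
STAGE_DURATION = 2.0  # seconds per stage (assumption)

def stages_played(hold_seconds):
    """Return the content stages reached by a continuous press."""
    count = min(len(STAGES), 1 + int(hold_seconds // STAGE_DURATION))
    return STAGES[:count]

print(stages_played(0.5))   # ['name']
print(stages_played(10.0))  # ['name', 'description', 'directions']
```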

Main menu. For those users seeking general information about the building or campus or who want to customize the way information is delivered, a simple three-button user interface provides easy access to a main menu of options. Users move forward or backward through menu options using right and left arrow buttons, and they select the current option by pressing the circle button between the triangles. The most powerful option in the main menu is the index, which permits users to move through a list of all places shown on the map, and then select one to be guided there on the map through a process of incremental voice coaching. For blind users, this is crucial, because it serves the same purpose as the alphabetical listing of offices in many mainstream building directories: if you know the name of the place where you are going but not its location, you can look it up.
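The three-button index navigation can be modeled as a small cursor over a sorted list. This is a minimal sketch under stated assumptions; the place names are invented, and the real systems attach voice prompts and incremental coaching to each step.

```python
# Hedged sketch of the three-button interface: left/right arrow buttons
# move through the alphabetical index, and the center circle button
# selects the current entry. Place names are hypothetical examples.

class IndexMenu:
    def __init__(self, places):
        self.places = sorted(places)  # alphabetical, like a directory
        self.pos = 0

    def press(self, button):
        """Handle one button press; return a place name on 'select'."""
        if button == "right":
            self.pos = (self.pos + 1) % len(self.places)
        elif button == "left":
            self.pos = (self.pos - 1) % len(self.places)
        elif button == "select":
            return self.places[self.pos]  # begin voice-coached guidance
        return None


menu = IndexMenu(["Gymnasium", "Admissions", "Cafeteria"])
menu.press("right")           # move from Admissions to Cafeteria
print(menu.press("select"))   # Cafeteria
```

Wrapping at the ends of the list is a design choice assumed here; a real installation might instead announce the start or end of the index.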

Upon completion of each installation, staff from the IDeA Center implemented an on-site, user-centered evaluation to measure the effectiveness of various features, leading to recommendations that not only improved interaction on the evaluated model, but also highlighted features to eliminate, add, or modify in the next project.

Evaluation called for blind, low vision, and mobility-impaired users to carry out a five-minute free exploration of the map, then execute a series of tasks of escalating difficulty. Subjects were asked to try each menu feature, and then to use the alphabetical index to locate an unfamiliar destination. Then, they attempted to physically navigate to the place they had found on the map as one of the researchers followed to see if they reached the destination successfully. Findings from the first two studies informed subsequent design modifications, and allowed innovative features to be incorporated in response to observations made experimentally.

Since universally designed products accommodate the needs of multiple users, their development always requires human testing with as large a variety of subjects as possible. And because these installations are site-specific, they could not be successfully evaluated in the laboratory. The three-part development cycle of design-prototype-evaluate proved to be an effective way to promote user-centered innovation. Information obtained from each testing cycle fed forward to generate new ideas and features for the next installation. Feedback cycles and user participation are key characteristics of universal design practice since the knowledge needed to address the needs of very diverse populations is limited.

Future applications
While this study focused exclusively on orientation and wayfinding, these techniques can be applied to other cases where universal access to visual materials is needed. For example, Touch Graphics has added touch-responsiveness to museum exhibits. As in the maps and directories, these installations allow everyone to interact through direct touching, a visceral and intuitive way to learn. Touch has become an important aspect of our interactions with devices like tablet computers, but it appears that we have only scratched the surface. As touch moves into the third dimension and the Internet of Things begins to take shape, we may see sculptures, steering wheels, and exercise equipment controlled through touch and gesture. Meanwhile, these three talking map installations give a glimpse of what is possible.

--By Steve Landau, Heamchand Subryan, and Edward Steinfeld, eg magazine No. 11, 2014

Editor's note: Steve Landau is president of Touch Graphics, a company that develops universally designed products and exhibits that include interactive touch. Heamchand Subryan is a designer and researcher interested in creating inclusive interaction in products and environments at the IDeA Center. Edward Steinfeld is a SUNY Distinguished Professor of Architecture at the University at Buffalo and Director of the IDeA Center.

The research described in this article was made possible by federal grant number H133E050004 from the National Institute for Disability and Rehabilitation Research, part of the U.S. Department of Education. Touch Graphics and the IDeA Center partnered on the research.

Research and Design Team
Touch Graphics Inc.: Steve Landau (design), Nicole Rittenour (graphics), Zach Eveland (technology)

IDeA Center, University at Buffalo: Heamchand Subryan (evaluation), Edward Steinfeld (principal investigator)

