
Museums as Experiential Learning Labs:
Developing User-centric Interactive Exhibits through Observational and Participatory Research
Lisa Fontaine
Iowa State University

Abstract

This paper presents the results of a curricular approach that provides graphic design students with an opportunity to engage in extensive observational and participatory research to better understand user experience in interactive museum exhibits. Conducting research in museums can circumvent many of the difficulties associated with user studies in a classroom setting.

It can be challenging for educators to provide opportunities for students to engage in extensive observational and participatory research to understand user experience. This can be even more difficult when attempting to study a large body of users performing a wide range of tasks. In many situations, this would require the design and approval of formal observational studies, in which a small sample would be observed performing a task. Students in the Interactive Exhibition Design studio course at Iowa State University conduct research in museum settings, where they can watch users (museum visitors) of many different ages, abilities, aptitudes, and interest levels as they attempt to engage with the museum's interactive exhibit stations.

After documenting many hours of both observation and participation, the students design their own interactive museum exhibits. Armed with insights into how visitors really use exhibits, the students are able to design prolonged engagements that successfully facilitate learning, avoiding methods that won't effectively engage, motivate, or educate the museum's audience.

The Challenge of Observational Research

It can be challenging for educators to provide opportunities for students to engage in extensive observational and participatory research to understand user experience. This can be even more difficult when attempting to study a large body of users performing a wide range of tasks. In many situations, this would require the design and approval of formal observational studies, in which a small sample would be observed performing a task. In university settings, such studies involving human subjects require an extensive proposal to be approved or waived by the Institutional Review Board (IRB) to ensure the ethical treatment of study subjects. Further complications involve the recruitment of a wider range of subjects and the scheduling of user testing. These challenges can be eliminated by conducting observational and participatory research in museums, using interactive museum exhibits as the research focus.


Methodology

For the past 28 years, the author has incorporated museum exhibition into the graphic design curriculum at Iowa State University in order to introduce students to critical thinking and problem solving. Students in the Interactive Exhibition Design studio course begin the learning process by conducting research in museum settings, where they can observe users (museum visitors) of many different ages, abilities, aptitudes, attention spans and interest levels that are attempting to engage with the museum’s interactive exhibit stations. 

Since 2008, this has evolved into an ongoing collaboration with the Field Museum of Chicago, which has now included the design of exhibitions about Conservation, Sharks, Ants, Egypt, Biomechanics, Paleontology, Vikings, the Colorado River, and Titanosaurus.

Approximately 18 students per year participate in the semester-long studio course. For two days during the course, onsite research is conducted in Chicago at the Field Museum, the Children's Museum, and the Museum of Science and Industry. These museums were selected for their intensive use of interactive learning throughout their exhibits. The onsite research requires students to develop an understanding of each museum's visitors by observing their preferences, needs, and limitations (fig. 1). Using a taxonomy developed by the instructor, they are asked to recognize six different interaction types. During the two-day field study, they examine approximately 45 different interaction stations (fig. 2).

In order to be effective, interactive exhibits must present a clear message, teach the intended lesson, and reward the visitor’s participation. They must appeal to visitors who have come to the museum for alternative educational experiences. At each exhibit station, the students determine the effectiveness of the intended learning outcomes by watching users’ interactions, then interacting with the exhibit themselves (fig. 3). 

In both written and photographic documentation, they describe and analyze interactions that succeed at their intended outcomes, and those that fall short of their goals. As students observe and test the possible reasons why some interactions don’t seem to work, they are able to identify many problem areas. These can be categorized as follows: 


Observing Problems with Communication

Some exhibit instructions are too complex or unclear for visitors to understand what to do.

The intentions might be presented in a confusing way, or too much is being expected of the visitor. 

In an exhibit that intends to explain the difference between asexual and sexual cellular reproduction (fig. 4), Monica Pearson is concerned that “The topic was too complicated for parent to want to explain it to their kids. One kid pointed to the cells and said ‘Are those potatoes?’ The parent just said yes and kept walking through the exhibit.”  

One exhibit the students observed asks the visitor to look for a seed fern in a makeshift forest (fig. 5). Nancy A. Acosta found the instructions lacking. She states “The call to action –‘can you see this seed fern in the forest?’ is confusing enough, without wondering if the seed fern fossil has anything to do with searching for the initial object. Most people looked at the panel below, and moved on, without interacting with this.” 

Some interactions don’t answer their own questions.

In some exhibits, the visitor’s curiosity is not rewarded. Brennan Scott observed an interaction where the intent was to spin the wheel and watch as the energy from the sun is transferred to different aspects of the life cycle (fig. 6). “However, what happens is a circle moves around on a track and lights up different things as it passes them. It doesn’t work because it doesn’t actually teach you anything about how energy is transferred. The user just ends up spinning the wheel as fast as they can and doesn’t even pay attention to why they have been asked to do this activity.” 

Some exhibits make it hard for the visitor to know if they did it right or not.

As they observe and interact with numerous displays, the students become especially cognizant of gaps in an exhibit’s feedback loop. In an exhibit about genetic mutations (fig. 7), Huiwon Lim says “This interaction stand was about DNA fingerprinting. So maybe people could know about the fingerprinting through this interaction…(but there’s) no way to know if you did it right or not.”

Observing Problems With Audience Engagement 

Some exhibits lack sufficient interactivity.

Many students were surprised to observe this problem in the Children's Museum, where a tiny room displays glass cases of varied object collections (fig. 8). They watch how unappealing these are to children. As Brennan Scott notes, "Once the child or user has entered through this small door, they see a bunch of objects that they can't touch or interact with, so they almost immediately leave the space. It draws the user in but it doesn't give them any reason to stay and experience the exhibit."

Sometimes the targeted audience is too young or too old to enjoy or learn from the activity.

One example of this is an exhibit showing a burning house on the wall with flip boards underneath; the flip boards have questions on them that children are supposed to answer and then flip to see if they were right (fig. 9). Ashton Temple observed children ignoring it, and stated, "Most of the children were too young to understand the purpose of flip books and the children that were old enough were bored by this exhibit."

Ryan Hubbard watched teenage boys playing with an exhibit that invites visitors to move packets from node to node to interrupt the frequency (fig. 10). Although they appeared to be the target audience, he notes that they “…seemed to be confused as to what exactly they were trying to prove in this exhibit. It was hard enough to use that small children would become uninterested almost immediately.” 

Some interactions take too long to maintain interest.

Brennan Scott found evidence of this problem in a display that requires visitors to punch in a code and listen to information (fig. 11). He observes, "In theory this is a great idea. However, the controls are unresponsive, the voice takes too long to start, and they don't give you a good enough idea of what you are about to hear to be interested. Personally, I think that without some sort of visual aid the voice over interactions are boring and from the lack of use I think the other visitors agree."

Ashton Temple found a similar problem when observing an exhibit whose goal was to compare how different sized materials fall through a small opening (fig. 12). She states, "Overall the exhibit is very boring; adults spent maybe 2 minutes tops looking at the exhibit and children spent even less time with the exhibit."

Poorly designed exhibits result in kids just playing with buttons or moving parts.

Frequently the students notice exhibits whose learning objectives are being ignored, leaving visitors to play with the buttons in a meaningless way. In one exhibit, the intention is to match colored glass pieces with the butterfly wings on the wall, ostensibly to teach the patterns that different butterflies have on their wings (fig. 13). Nancy A. Acosta considers the experience to be far too open-ended. She notes, "…there are no instructions at all. No indication on what to do anywhere. Most kids came to play with the glass pieces, as if these pieces were the real puzzle." In another exhibit, Kirsten Baxter notes that she "saw a toddler just moving (the lever) up quickly and down and his grandpa trying to get him to stop."

Some experiences are too similar to those in video games or online activities.

Kirsten Baxter saw this as a shortcoming in an exhibit where families are invited to videotape themselves talking about their day (fig. 14). She finds this unappealing, stating that "everyone already has a phone and can video themselves at any time…it doesn't have the 'cool' factor it may have had 10 years ago."

Observing Design or Fabrication Problems

Some exhibits haven’t given enough attention to the user interface.

Observation and interaction with exhibit displays shows the students the importance of designing the user experience with care and testing. While testing an interaction that shows how airflow creates a tornado vortex (fig. 15), Tom Bos found both visual and functional challenges: "You were supposed to be able to affect the tornado by controlling the speed of the fans with levers at the control panel, however when I tried to increase the speed of a number of different fans, no change was reported on the LCD screen showing wind speed or with the tornado itself. The layout of the controls was another problem. The eight fans were organized by letter and laid out vertically, A on bottom, H on top. The controls were laid out vertically, but not with all eight in a row. Instead A through D were on the left, and E through H were on the right. This incongruous layout was not intuitive."

Text and image placement were also found to be unsuccessful in a display that aims to illustrate how mammoths could hold their massive heads up (fig. 16). Holden King notes, "Though the example was successful, the mammoth's shadow covers explanatory text, so that it's difficult to read."

Alison Schwartzhoff noticed the importance of prototype testing (or the lack thereof) in a 'storm chaser' role-playing interaction (fig. 17), stating that it "…wasn't conducive to learn for people that were not the initial viewer. The screen that the 'player' is looking at is not the same as the large screen the rest of the viewers can see. This makes it hard for people not playing to understand what is being taught."

Two students expressed concerns about the complexity of the user interface in an exhibit intended to help visitors understand genomes (fig. 18). Gongming Yang observes, "The instruction is too complicated because there are a lot of buttons and one controller wheel and a scan instrument. The operation is unclear, which make visitor easy to give up this interaction design." Ryan Hubbard agrees, adding, "A basic flight simulator (is used) to navigate through display parts but it's really hard to do, especially for kids. Educational parts of the exhibit are not shown until you have completed the flight part."

Sometimes it is just too hard for some users to move, turn, or activate the knobs and levers.

This is of special note where the intended audience is young children. In a water exhibit where children are supposed to turn a wheel fast enough to move water through a windmill (fig. 19), Devlen Dailey Gempis notes, "The wheel was really difficult for me to turn and even get a drop of water in the wheel; the picture shows Rachel trying to turn this wheel but it hardly has any water moving inside it."

Not all of the exhibits are accessible for wheelchairs. 

Students become very attentive to the ways that disabled visitors have been excluded from some of the museum's experiences. Observing an exhibit where visitors sit in a spinning chair and 'flap their wings' (fig. 20), Brennan Scott notes, "…it would be incredibly difficult for them to experience this interaction. They would either have to miss out or wait for someone else to experience it and see if they could gain some understanding from just watching."

Some exhibits haven’t been tested with users. 

This surprised the students who tried to engage with an exhibit where visitors place toy cars at the top of a ramp to watch gravity at work (fig. 21). Tom Bos notes “Some of the cars are too big to fit in the track. Large tweezers were provided to remove the cars, but they weren’t long enough to remove one of the cars I saw stuck.”

Implications of theory and process

After documenting many hours of both observation and participation, the students are then challenged by the Field Museum's Design Director Alvaro Amat to design their own interactive museum exhibits based on the museum's upcoming topics. This directive could be described as an 'ill-defined problem': the museum has very specific intentions regarding the learning outcomes of their natural history themes, but intentionally offers no suggestions about the students' deliverables. Armed with insights into how visitors really use exhibits, the students are able to design prolonged engagements that successfully facilitate learning, avoiding methods that won't effectively engage, motivate, or educate the museum's audience. During their observations, for example, they are surprised to note how often users fail to read instructions, resulting in superficial or meaningless interactions. They engage with the exhibit stations themselves with a critical eye for clarity of message and ease of use.

In response to their field research, the students have come to realize that form-driven or style-driven design has no place in these exhibits; usability drives the design process. It would be impossible to build this exhaustive body of knowledge without leaving the studio classroom.

In their own design process, they are now able to review and reflect on the effectiveness of their own solutions in comparison to those they observed and experienced. 

Jesslyn Carroll and Kate Roth’s Water Footprint exhibit (fig. 22) responds to what they learned of the importance of clear instructions and of self-sustaining interactions; one can also see they’ve become attentive to the needs of wheelchair visitors. 

In Ruby Hotchkiss’ exhibit on Titanosaurus (fig. 23), she responded to her observations of the need for interaction spaces to be partially enclosed in order for users to feel comfortable and not distracted. 

Rachel Krysa and Kirsten Baxter were informed by their observations of varied approaches to concealing and revealing information in their exhibit on Extinction Prevention (fig. 24).

Ashton Temple and Monica Pearson experienced many different methods for designing matching games. Their observations of the comprehension difficulties within science exhibits caused them to take special care in explaining comparative river flows (fig. 25).

Tom and Brooke Bos observed the importance of presenting clear cause/effect relationships when explaining physical or scientific phenomena. This drove their ‘control deck’ exhibit, an interaction which shows what happens if the delicate balance of the Colorado River is disturbed (fig. 26).

Francis Szynskie and Amir Yusof noticed the effectiveness of combining 3D models and digital games in order to provide alternative learning paths for different users. This led to their integration of both methods in their Colorado River Geology exhibit (fig. 27).


Contributions to the Field

For educators wanting to introduce new and effective ways to include user-centered research in a design studio course, interactive museum exhibits can provide an efficient opportunity for students to inconspicuously observe many users without setting up formal observational studies. As public learning ‘laboratories’ these environments are an ideal way to introduce students to user-centric design. The nature of the museum environment allows design students to not only observe others but also to experience the interactives themselves, developing a respect for the importance of user-testing and prototyping. That this can all be conducted at any time and without formal study protocols is another distinct advantage.

By designing interactive learning exhibits that engage a broad range of users, students who participate in this experience are challenged to identify graphic design as a problem-solving discipline more than one of form-making or self-expression. They become skilled in user-centered design and gain a deeper understanding of how people experience interfaces. 

The benefits of observational research in museums are not limited to exhibit design courses. Most of the interactive experiences at science and natural history museums now involve the design of educational game interfaces. This could inform graphic design students in any coursework that involves user experience; it could contribute to user-centric pedagogy being developed by current educators that encourages a deepened understanding of successful user experiences.

 

Bibliography

Beale, Katy, editor. Museums at Play: Games, Interaction and Learning. Edinburgh: MuseumsEtc. Ltd. 2011.

Bowers, John. Introduction to Graphic Design Methods and Processes. Hoboken, NJ: John Wiley and Sons. 2011.

Fostering Active Prolonged Engagement. San Francisco: The Exploratorium. 2005.  

Goldowsky, Alexander, and Maureen McConnell. “The one-two punch: synergy between simulation games and other interactive approaches in exhibitions.” In Museums at Play: Games, Interaction, and Learning. Beale, Katy, editor. Edinburgh: Museums Etc. Ltd. 2011.

Jacob, George. Exhibit Design: The Future. Charleston, South Carolina: CreateSpace. 2011.

Laurel, Brenda. Design Research Methods and Perspectives. Cambridge, Massachusetts, London, England: MIT Press. 2003.

Vermeeren, Arnold, editor. Museum Experience Design: Crowds, Ecosystems and Novel Technologies. New York: Springer. 2018.

Zaharias, P., Michael, D., and Chrysanthou, Y. "Learning through Multi-touch Interfaces in Museum Exhibits: An Empirical Investigation." Educational Technology & Society 16, no. 3 (2013): 374–384.