Multitouch Exhibit Design 2: Elements, Objects, and Environments

This post is the second in a series of three exploring multitouch and multiuser design. Our company, Ideum, develops computer-based interactive exhibits for museums.

The first post addresses user interaction and feedback, the second focuses on User Interface (UI) elements, objects and environments, while the third looks more broadly at how multitouch and multiuser exhibits can shape the visitor experience.

Objects, Environments, and Navigation
The last post explored the types of gestures and the ways in which visitors interact. This post focuses on the elements that visitors interact with. As you’ll see, there is a strong focus on objects. Multitouch and multiuser applications also require that we rethink standard navigation schemes.

Objects. Resizing and rotating objects is a common feature in many multitouch applications. This type of interaction is very intuitive, and we observed museum visitors spending a lot of time viewing photographs in this way. For a multitouch, multiuser table, an object-based approach makes sense. Objects can be rotated to accommodate visitors from any side of the table, and multiple users can interact independently with objects. Extending this object-based approach to connect with museum collections seems natural.
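One common way to implement this kind of rotate-and-resize interaction is to track the two touch points on an object and derive scale and rotation from how the segment between them changes. This is a minimal sketch of that math, not how any particular multitouch framework does it; the function name is illustrative.

```python
import math

def two_finger_transform(p1_start, p2_start, p1_end, p2_end):
    """Derive a scale factor and rotation (radians) from two touch
    points moving from their start positions to their end positions."""
    def delta(a, b):
        return (b[0] - a[0], b[1] - a[1])

    start = delta(p1_start, p2_start)  # vector between fingers at touch-down
    end = delta(p1_end, p2_end)        # vector between fingers now

    scale = math.hypot(*end) / math.hypot(*start)  # >1 means fingers spread apart
    rotation = math.atan2(end[1], end[0]) - math.atan2(start[1], start[0])
    return scale, rotation
```

Applying the returned scale and rotation (anchored at the midpoint between the fingers) is enough to give a photograph the familiar stretch-and-spin behavior.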

Environments. There are a number of interesting possibilities when we look at environments in multitouch design. In the Yahoo! Maps and Flickr mashup application we developed, the map was the environment. It provided context for the geocoded photographs. However, the environment could be even more dynamic, with the ability to transform objects or trigger new actions.

For example, one could imagine an exhibit where visitors move objects from one side of the table to the other. This movement could represent time, and we could see that object or photograph change over time. The table could effectively work as a dynamic timeline: move right and you move forward in time, move left and you move back.
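The timeline idea boils down to mapping an object's horizontal position on the table to a point in time. A hedged sketch, with the table width and year range as hypothetical parameters:

```python
def x_to_year(x, table_width, start_year, end_year):
    """Map a horizontal table position to a year on the timeline.
    Moving right moves forward in time; moving left moves back."""
    fraction = max(0.0, min(1.0, x / table_width))  # clamp to the table edges
    return round(start_year + fraction * (end_year - start_year))
```

As a visitor drags a photograph, the exhibit would look up the image (or rendering) for `x_to_year(...)` and crossfade to it, so the object appears to age or rejuvenate under the visitor's finger.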

Additionally, a table environment could be divided into zones; as objects are moved into these areas, events are triggered or the objects are transformed in some way. There are interesting possibilities here for works of art: exhibits could be developed where visitors manipulate sketches that transform into finished paintings. For scientific images, different views of an object, shot with different types of imaging equipment or at different wavelengths, could be used. Finally, physics games where gravity, the coefficient of friction, or other parameters can be changed are another possibility for science exhibits.
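The zone idea can be sketched as simple hit-testing plus an "entered a new zone" check, so a handler fires once when an object crosses into an area rather than on every frame. The zone names and handlers below are hypothetical.

```python
def zone_at(x, y, zones):
    """Return the name of the zone containing (x, y), or None.
    Each zone is (name, (left, top, right, bottom))."""
    for name, (left, top, right, bottom) in zones:
        if left <= x < right and top <= y < bottom:
            return name
    return None

def on_object_moved(obj, x, y, zones, handlers):
    """Fire the zone's handler only when the object enters a new zone."""
    zone = zone_at(x, y, zones)
    if zone is not None and zone != obj.get("zone"):
        handlers[zone](obj)  # e.g. swap the sketch for the finished painting
    obj["zone"] = zone
```

The sketch-to-painting example would register one zone per state and have each handler swap the artwork's displayed view.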

Universal, Omni-Directional Navigation. Since multiuser, multitouch applications can be accessed from any side of the table, some applications need a universal navigation element. In our mapping application, we wanted to give museum visitors the ability to control the map environment. We found that panning could be a shared function, but the ability to zoom and select the map type couldn’t be shared in the same way. Our answer was a movable navigational “orb.”

Visitors can “call the orb” with a quick double-tap on the table surface, and the orb even contains a compass to help visitors orient themselves to the map they are controlling.
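Recognizing a double-tap on a table means checking that two taps land close together in both time and space. This is a minimal sketch of that check; the thresholds are assumptions, not values from our application.

```python
DOUBLE_TAP_WINDOW = 0.4   # seconds between taps; assumed threshold
DOUBLE_TAP_RADIUS = 40.0  # pixels between taps; assumed threshold

def is_double_tap(last_tap, tap):
    """Decide whether two taps, each given as (time, x, y), form a
    double-tap that should summon the orb at the second tap's position."""
    if last_tap is None:
        return False
    (t0, x0, y0), (t1, x1, y1) = last_tap, tap
    close_in_time = (t1 - t0) <= DOUBLE_TAP_WINDOW
    close_in_space = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 <= DOUBLE_TAP_RADIUS
    return close_in_time and close_in_space
```

On a multiuser table the spatial check matters more than on a phone: two different visitors tapping in quick succession on opposite sides should not register as one double-tap.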

The Microsoft Surface team recently released a Newsreader example with universal, omni-directional navigation. In this case, the navigational element is fixed and rotates around the center of the table.

Symmetric Navigation. Another possible approach to navigation on a large table where users can approach from any direction is what we call symmetric navigation. In this case, the navigational items are repeated in two places on the table. For example, a simple menu could be present in opposite corners of the table. Another configuration would have sets of tabs or buttons appearing on opposite sides of the table.  The information would be repeated, but would be properly oriented for each side.
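Symmetric navigation amounts to placing the same menu twice and rotating one copy 180 degrees so it reads correctly from the opposite side. A hedged sketch, with positions in table pixels and rotation in degrees; the margin value is a hypothetical parameter:

```python
def mirrored_menu_placements(table_width, table_height, margin):
    """Return positions and rotations for one menu repeated on
    opposite sides of the table, oriented for each side."""
    return [
        # copy for visitors standing on the near side
        {"x": margin, "y": table_height - margin, "rotation": 0},
        # same menu, opposite corner, flipped to face the far side
        {"x": table_width - margin, "y": margin, "rotation": 180},
    ]
```

The same scheme extends to tab sets or button rows along opposite edges: identical content, mirrored placement, 180-degree rotation.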

User Interface Elements
Many of the user-interface elements are similar to those found in touch screen and mouse-driven applications. However, some new elements have appeared, mostly attributable to the development of the iPhone.

Buttons / Icons / Thumbnails. These conventional navigational elements appear in most traditional applications. Of course, these graphic elements need to be large enough to be pressed by a fingertip, as they would in a touch screen application.

Dials. Apple’s iPhone uses “slot machine”-style dials in some of its applications. While I haven’t seen these dials appear in other multitouch applications, one could imagine them finding their way into computer-based exhibits.

We explored the possibility of using dials, although ours were oriented right-to-left. This was a very early concept for a floor-to-ceiling multitouch-enabled health exhibit. In this rough mock-up, we imagined that users would be able to move the dials from right to left and see changes in life expectancy and other important health information.

Drawers. Pull-down menus don’t work well for touch or multitouch applications. However, a variation requiring two touches can be used: one touch opens the menu (or drawer) and a second touch makes a selection. This works with symmetric navigation or with universal, omni-directional navigation. Additionally, drawers can be used to provide additional information, not just navigational choices.
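The two-touch drawer is a tiny state machine: closed until touched, then open until a selection (or a stray touch) closes it again. A minimal sketch, with illustrative item names:

```python
class Drawer:
    """Minimal two-touch drawer: the first touch opens the menu,
    the second touch picks one of its items (or closes it)."""

    def __init__(self, items):
        self.items = items
        self.open = False

    def touch(self, item=None):
        if not self.open:
            self.open = True   # first touch: slide the drawer open
            return None
        self.open = False      # second touch: close, report the selection
        return item if item in self.items else None
```

Because each touch is a discrete event, this pattern avoids the press-and-drag motion that makes conventional pull-down menus awkward on a table.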

Flip (or Flick) Elements. The HP TouchSmart and Apple iPhone have applications that allow users to “flip” or “flick” through photographs. We used a similar photo viewer for our panoramic viewer exhibit.


In this exhibit, visitors can pan and zoom an image of the Birmingham skyline and tap points of interest. Each point has a photograph that a visitor can select from the panoramic image or flip (or flick) through using the movable photo viewer. I should mention that this exhibit was developed for the HP TouchSmart. It is part of an installation for Vulcan Park and Museum in Birmingham, Alabama.
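What separates a flick from an ordinary drag is release velocity: a fast horizontal stroke advances the viewer, a slow one just repositions the photo. A sketch of that classification, with an assumed speed threshold:

```python
FLICK_SPEED = 500.0  # pixels per second; assumed threshold

def flick_direction(x_start, x_end, t_start, t_end):
    """Classify a horizontal touch stroke as 'next', 'previous',
    or None if it was too slow to count as a flick."""
    dt = t_end - t_start
    if dt <= 0:
        return None
    speed = (x_end - x_start) / dt
    if speed <= -FLICK_SPEED:
        return "next"       # fast swipe left advances to the next photo
    if speed >= FLICK_SPEED:
        return "previous"   # fast swipe right goes back
    return None
```

A viewer would typically pair this with a short animation, so the photo visibly slides off in the direction of the flick.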

As you can see, there are some challenges as well as interesting possibilities when it comes to navigation and design for multitouch exhibits and applications. In my next post, I’m going to look more broadly at how multitouch and multiuser exhibits can shape the visitor experience.
