
Ideum Multitouch Table at Ontario Science Centre

There is a video on YouTube of our MT2 multitouch table in action at the Ontario Science Centre. The video shows our multitouch mash-up that uses Yahoo! Maps and Flickr photos. (The video also includes various applications from NUI’s Snowflake software suite, which comes bundled with each table.)

We worked with the Ontario Science Centre to customize the mapping program, and they have opened it up to the Flickr community. Visitors can join their group, Innovation in Toronto, and upload geotagged photos to be included in the exhibit on the museum floor. This is the first time our mashup has been used in this way.

Update July 27, 2009: Ontario Science Centre just posted a video on YouTube that shows the Multi-touch Mapping Mashup in detail.

Multitouch Exhibit Design 3: The Visitor Experience

This post is the third in a series of three posts exploring multitouch and multiuser design. Our company, Ideum, develops computer-based interactive exhibits for museums.

The first post addresses user interaction and feedback, the second focuses on User Interface (UI) elements, objects and environments, while the third looks more broadly at how multitouch and multiuser exhibits can shape the visitor experience.

Traditional Computer-Based Interactives
Part of our attraction to multitouch and multiuser exhibits has to do with their ability to enhance the visitor experience. Much of the criticism surrounding traditional computer-based exhibits and kiosks is valid:

1. They tend to isolate visitors
2. They are often too information heavy
3. Interactivity is often limited
4. In many cases the experience can be easily replicated at home, school, or work

The traditional computer-based exhibit seems to have more in common with an ATM than it does with other interactive exhibits found in the museum. Traditional computer-based interactives can be functional, but are not often inspiring.

[Image: an ATM running Windows 2000]

Multitouch, Multiuser
Making computer-based exhibits more interactive and engaging has always been the challenge. Ideally, these exhibits act as catalysts for visitor communication and conversation. In the past we’ve developed computer-based interactives that have used push-button interfaces (also see Jukebox Memories), capacitive touch sensors, and even spherical displays. Multitouch and multiuser capabilities are a very welcome development, and I don’t think I’m overstating the case in saying they may be the most important innovation for computer-based exhibits since the development of the World Wide Web.

Multitouch, multiuser exhibits present a number of advantages over their single-touch, single-user counterparts:

1. Multiuser exhibits bring museum visitors together and encourage social interaction
2. Interaction is more physical and intuitive
3. Multitouch tables are still novel and present an experience not found at home, school, or work

[Image: players using a Wii]

With the popularity of large-screen TVs and gaming platforms like the Wii, we weren’t sure what type of response we would receive during our initial testing, but we were overwhelmed by the enthusiasm for the platform. There is something very compelling about the tabletop format itself: it allows visitors to interact not only with the program running on the surface, but also with each other.

[Image: kids gathered around the multitouch table]

Even with early prototypes, we found the table drew large crowds and kept visitors’ interest. How long this novelty factor lasts is hard to judge. Obviously, good design and interesting content are required to make great exhibits. Looking forward, multitouch and multiuser capabilities have a lot to offer the future of exhibit design.

Hardware, Now More Software
Since last summer, we’ve had to work on both hardware and software solutions. However, now that we have a stable multitouch hardware platform, we are shifting focus onto software design and development along with more in-depth visitor testing. I’ll keep posting what we find. Your thoughts and comments are welcome.

Finally, if you’re looking for more about multitouch, you can visit the Multitouch Blogs directory site or the following sites: Interactive Multimedia Technology, The NUI Group Blog, Natural User Interface blog, Point & Do, Microsoft Surface Blog, MultiTouch Blog, Multitouch Barcelona, Stimulant, and Touch User Interface.


Multitouch Exhibit Design 2: Elements, Objects, and Environments

This post is the second in a series of three exploring multitouch and multiuser design. Our company, Ideum, develops computer-based interactive exhibits for museums.

The first post addresses user interaction and feedback, the second focuses on User Interface (UI) elements, objects and environments, while the third looks more broadly at how multitouch and multiuser exhibits can shape the visitor experience.

Objects, Environments, and Navigation
The last post explored the types of gestures and the ways in which visitors interact. This post focuses on the elements that visitors interact with. As you’ll see, there is a strong focus on objects. Multitouch and multiuser applications also require that we rethink standard navigation schemes.

Objects. Resizing and rotating objects is a common feature in many multitouch applications. This type of interaction is very intuitive, and we observed museum visitors spending a lot of time viewing photographs in this way. For a multitouch, multiuser table, an object-based approach makes sense: objects can be rotated to accommodate visitors on any side of the table, and multiple users can interact with objects independently. Extending this object-based approach to connect with museum collections seems natural.

Environments. There are a number of interesting possibilities when we look at environments in multitouch design. In the Yahoo! Maps and Flickr mashup application we developed, the map was the environment. It provided context for the geocoded photographs. However, the environment could be even more dynamic, with the ability to transform objects or trigger new actions.

For example, one could imagine an exhibit where visitors move objects from one side of the table to the other, with this movement representing time, so that an object or photograph changes over time. The table could effectively work as a dynamic timeline: move an object right and you move forward in time; move it left and you move back.
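The timeline idea boils down to a linear mapping from an object’s horizontal position to a date. Here is a minimal sketch in Python, where the table width, year range, and function name are all invented for illustration:

```python
# Hypothetical sketch: map an object's horizontal position on the table
# to a point on a timeline. All constants are illustrative, not taken
# from any shipping exhibit.

TABLE_WIDTH_PX = 1920          # assumed table resolution
START_YEAR, END_YEAR = 1900, 2000

def year_for_x(x: float) -> int:
    """Linearly interpolate: left edge = START_YEAR, right edge = END_YEAR."""
    x = max(0.0, min(float(TABLE_WIDTH_PX), x))  # clamp to the table surface
    t = x / TABLE_WIDTH_PX
    return round(START_YEAR + t * (END_YEAR - START_YEAR))
```

As the visitor drags a photograph, the exhibit would look up `year_for_x` each frame and swap in the version of the image closest to that year.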

Additionally, a table environment could be divided into zones, and as objects are moved into these areas, events are triggered or objects are transformed in some way. There are interesting possibilities here for works of art: exhibits could be developed where visitors manipulate sketches that transform into finished paintings. For scientific images, different views of objects shot by different types of imaging equipment, or in different wavelengths, could be used. Finally, physics games where gravity, the coefficient of friction, or other parameters can be changed are another possibility for science exhibits.
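Zone triggers can be sketched as simple hit-testing against named regions. The zone names and rectangles below are made up for the sketch-to-painting example:

```python
# Illustrative sketch of zone triggers: the table surface is divided into
# named rectangular zones; dropping an object in a zone fires an event.
# Zone names and rectangles are invented for the example.

ZONES = {
    "sketch":   (0, 0, 960, 1080),      # left half: show the artist's sketch
    "painting": (960, 0, 1920, 1080),   # right half: show finished painting
}

def zone_at(x, y):
    """Return the name of the zone containing point (x, y), or None."""
    for name, (x0, y0, x1, y1) in ZONES.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None
```

When an object is released, the application would check `zone_at` for its center point and trigger the matching transformation.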

[Image: the navigation orb]

Universal, Omni-Directional Navigation. Since multiuser, multitouch applications can be accessed from any side of the table, some applications need a universal navigation element. In our mapping application, we wanted to give museum visitors the ability to control the map environment. We found that panning could be a shared function, but the ability to zoom and select the map type couldn’t be shared in the same way.

Visitors can “call the orb” with a quick double-tap on the table surface, and the orb even contains a compass to help visitors orient themselves to the map they are controlling.
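Under the hood, double-tap detection comes down to two taps close together in both time and space. A hypothetical sketch follows; the thresholds are assumptions for illustration, not measured values from our exhibit:

```python
import math

# Sketch of double-tap detection for "calling the orb": two taps count
# as a double-tap if they land close together in time AND in space.
# Threshold values are assumptions, not tuned exhibit settings.

MAX_INTERVAL_S = 0.4    # taps must land within 400 ms of each other
MAX_DISTANCE_PX = 40    # ...and within 40 px of each other

def is_double_tap(tap1, tap2):
    """Each tap is a (timestamp_seconds, x, y) tuple."""
    (t1, x1, y1), (t2, x2, y2) = tap1, tap2
    close_in_time = abs(t2 - t1) <= MAX_INTERVAL_S
    close_in_space = math.hypot(x2 - x1, y2 - y1) <= MAX_DISTANCE_PX
    return close_in_time and close_in_space
```

If the check passes, the application would animate the orb over to the second tap’s location.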

The Microsoft Surface team recently released a Newsreader example with universal, omni-directional navigation. In this case, the navigational element is fixed and rotates around the center of the table.

Symmetric Navigation. Another possible approach to navigation on a large table where users can approach from any direction is what we call symmetric navigation. In this case, the navigational items are repeated in two places on the table. For example, a simple menu could be present in opposite corners of the table. Another configuration would have sets of tabs or buttons appearing on opposite sides of the table. The information would be repeated, but properly oriented for each side.
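One way to sketch symmetric navigation is to compute two placements for the same menu: one upright in a near corner and one rotated 180 degrees in the opposite corner, so each copy reads correctly from its side of the table. All dimensions below are illustrative:

```python
# Sketch of symmetric navigation: place one menu copy in the bottom-left
# corner (upright) and a second in the top-right corner (rotated 180°).
# Table and menu dimensions are invented for the example.

TABLE_W, TABLE_H = 1920, 1080
MENU_W, MENU_H = 300, 200

def symmetric_placements():
    """Return [(x, y, rotation_degrees), ...] for the two menu copies."""
    near = (0, TABLE_H - MENU_H, 0)        # bottom-left, readable from near side
    far = (TABLE_W - MENU_W, 0, 180)       # top-right, readable from far side
    return [near, far]
```

Selections made in either copy would drive the same underlying menu state, so both sides stay in sync.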

User Interface Elements
Many of the user-interface elements are similar to those found in touch screen and mouse-driven applications. However, some new elements have appeared, mostly attributable to the development of the iPhone.

Buttons/Icons/Thumbnails. These conventional navigational elements appear in most traditional applications. Of course, these graphic elements need to be large enough to be pressed by a fingertip, as they would in a touch screen application.

[Image: iPhone dials]

Dials. Apple’s iPhone uses “slot machine”-style dials in some of its applications. While I haven’t seen these dials appear in other multitouch applications, one could imagine them finding their way into computer-based exhibits.

We explored the possibility of using dials, although ours were oriented right to left. This was a very early concept for a floor-to-ceiling multitouch-enabled health exhibit.

[Image: health exhibit mock-up]

In this rough mock-up we imagined that users would be able to move the dials right to left and see changes in life expectancy and other important health information.

Drawers. Pull-down menus don’t work well for touch or multitouch applications, although a variation requiring two touches can be used: one touch opens the menu (or drawer) and a second touch makes a selection. This can be used with symmetric navigation or with universal, omni-directional navigation. Additionally, drawers can provide additional information, not just navigational choices.

Flip (or Flick) elements. The HP TouchSmart and Apple iPhone have applications that allow visitors to “flip” or “flick” through photographs. We used a similar photo viewer for our panoramic viewer exhibit.

[Image: panoramic viewer exhibit]

In this exhibit, visitors can pan and zoom an image of the Birmingham skyline and tap on points of interest. Each point has a photograph that a visitor can select from the panoramic image or flip (or flick) through using the movable photo viewer. I should mention that this exhibit was developed for the HP TouchSmart. It is part of an installation for Vulcan Park and Museum in Birmingham, Alabama.

As you can see, there are some challenges and interesting possibilities when it comes to navigation and design for multitouch exhibits and applications. In my next post, I’ll look more broadly at how multitouch and multiuser exhibits can shape the visitor experience.


Multitouch Exhibit Design 1: Interaction and Feedback

With the development of our first interactive exhibits, and a few rounds of informal user testing, we’ve begun to explore approaches in multitouch and multiuser design. We’ve created both a multitouch mashup that uses Flickr and Yahoo! Maps, and a panoramic viewing application that allows visitors to access detailed photographs from points found on the larger image.

We developed these applications for our multitouch table (MT2) and for the HP TouchSmart platform. After developing touch screen exhibits for nearly a decade, the differences between standard touch and multitouch are very much in focus.

From the beginning, it has been clear that mouse or even standard touch-screen conventions wouldn’t be completely applicable. Multitouch and multiuser design requires new thinking, more experimentation, and careful user study. I want to share some of what we’ve learned and the areas we are still investigating. I’m also doing this in preparation for a workshop we’ll be conducting at Museums and the Web (called “Make it Multitouch”) and a short presentation for the Canadian Museum Association’s annual meeting (called “Doers and Dreamers”) in Toronto at the end of March.

This discussion has been divided into three blog posts: The first explores user interaction and feedback, the second focuses on User Interface (UI) elements, objects and environments, while the third looks more broadly at how multitouch and multiuser exhibits can shape the visitor experience.

Interactions
How do users interact with interface elements and content on a multitouch screen or surface? And how are these interactions different from those we observe in standard mouse-driven or touch-screen applications? Below is a list of some of the unique ways visitors can interact with a multitouch interface. As you’ll see, some are very natural and others are more obscure: a strange blend of intuitive gestures and secret handshakes.

Touch. The same as standard touch screen interactions, although touch areas are made larger than those in mouse- or trackball-driven kiosks and exhibits to accommodate a fingertip.

Drag. With either one finger or multiple points, this type of interaction is similar to what we see with a mouse and pointer.

Pinch & Expand. This is an intuitive way to increase or decrease the size of objects in multitouch environments. In one case, we saw that just the act of placing a hand on the table surface slightly expanded an object (the hand opened a bit more as it impacted the surface). This allowed the visitor to immediately understand how to size the object. Pinch & Expand is common in ordinary hand gestures when talking about how big or small something is.

Rotate. As a visitor drags or pinches and expands an object it becomes apparent whether it can be rotated or not. Since multitouch tables have multiple points of approach, most applications provide visitors with the ability to rotate objects.
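The math behind pinch/expand and rotate is the same two-point comparison: the scale factor is the ratio of the distances between the two touch points before and after a move, and the rotation is the change in the angle of the line between them. A sketch, with a function name of our own invention rather than one from any toolkit:

```python
import math

# Sketch of the geometry behind pinch/expand and rotate. Compare the two
# touch points before (a0, b0) and after (a1, b1) a movement frame:
# the distance ratio gives the scale, the angle change gives the rotation.

def two_finger_transform(a0, b0, a1, b1):
    """Points are (x, y) tuples; returns (scale, rotation_in_radians)."""
    d0 = math.dist(a0, b0)
    d1 = math.dist(a1, b1)
    scale = d1 / d0 if d0 else 1.0
    angle0 = math.atan2(b0[1] - a0[1], b0[0] - a0[0])
    angle1 = math.atan2(b1[1] - a1[1], b1[0] - a1[0])
    return scale, angle1 - angle0
```

An application would apply the returned scale and rotation to the touched object each frame, which is why the same two-finger motion can resize and rotate an object at once.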

Double-Tap. We’ve used this type of interaction in a mapping mashup to “call over” a floating universal navigation element. We found this helpful for our large table, where the floating navigational item could be out of reach. However, our testing showed that this was not as intuitive as some of the other types of interaction, although, once observed, most visitors found it simple and helpful.

[Image: kids playing the Gravitor application]

Draw. Some multitouch applications allow visitors to draw shapes, such as NUI’s Gravitor application (seen above). It is also possible to draw “commands.” For example, you could draw an “x” on an object to close it. This assumes, however, that the object could not be dragged or resized, since those interactions would interfere with the ability to draw.

Flip or Flick. This gesture allows visitors to quickly browse through “stacks” of photographs or other fixed-size objects. It works well with “dual touch” technologies like the iPhone and the HP TouchSmart.

Feedback
Visitors can benefit from additional feedback as they interact with multitouch applications. Occasionally, there can be a lag in direct feedback for some of the interactions listed earlier. This can be especially true in multiuser environments where the application is trying to process dozens of simultaneous points.

[Image: a visitor resizing a painting, with a tracer following his finger]

Tracers/Trails/Auras. As each finger point is detected as a “blob” by the “touch core” software, a small graphic or animation follows the point across the surface or screen. You can see a tracer (above) following the visitor’s finger as he resizes the painting. (His finger is slightly off the table so you can clearly see the “tracer.”)
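A tracer can be sketched as a short history of positions kept per blob id, so a fading trail can be rendered behind each finger. The trail length and callback name below are invented for the example:

```python
from collections import defaultdict, deque

# Sketch of a tracer trail: for every tracked blob id, keep the last few
# positions so a fading trail can be drawn behind the finger. The trail
# length and the callback name are arbitrary choices for the example.

TRAIL_LENGTH = 10
trails = defaultdict(lambda: deque(maxlen=TRAIL_LENGTH))

def on_blob_update(blob_id, x, y):
    """Called for each detected blob each frame; returns points to render."""
    trails[blob_id].append((x, y))
    return list(trails[blob_id])  # oldest first; render oldest most faded
```

Because the deque has a fixed maximum length, old positions fall off automatically, which is what makes the trail appear to fade and follow the finger.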

Highlights and Ghosting. As visitors touch an object, it can be made to highlight or animate in some way. Ghosting can be helpful for dragging as you can still see where the item originated. Highlights provide the user with instantaneous feedback and reinforcement of their current action.

Connections. Lines (or other indicators) that connect objects can be helpful for wayfinding, particularly in multiuser environments. For our multitouch mapping application, we created connection lines from photographs to their points on the map, knowing that one user may be manipulating a photograph while another is controlling the map. This allows a visitor to trace the connection line back to the geographical point where the photograph was taken.

In my next post, I’ll explore how these interactions are applied to User Interface (UI) elements, objects and environments. As you’ll see, things get really interesting when we look to adapt and invent new ways for visitors to interact.


Multitouch Table and Mapping Exhibit Install

Earlier this week, we installed our first multitouch table at the Don Harrington Discovery Center in Amarillo, Texas. The touch table is right in the entranceway to the museum, near a large satellite photograph of Amarillo and its environs.

[Image: multitouch table installation]

The table runs a custom multiuser, multitouch application we developed with the Don Harrington Discovery Center and Vulcan Park and Museum. This multitouch mashup application uses Flickr and Yahoo! Maps. There is more on the design and software development process in the Ideum portfolio. The video below shows some of the features found in the application.

The press came out to see the exhibit: the local newspaper and all three network news channels showed up. Below, DHDC’s Executive Director, Joe Hastings, is interviewed by the local press.

[Image: local press interviewing DHDC’s Executive Director]

Our next installation is in two weeks in Birmingham, Alabama. We’re going to be installing another table along with two multitouch-enabled HP TouchSmart kiosks. As far as we know, this will be the first time multitouch technology has been used exclusively throughout a permanent exhibit space.


New Mexico Stories – Flickr and Yahoo! Maps Mashup

New Mexico Stories is a map-based Flickr mashup that we developed for the Museum of New Mexico Foundation. The Foundation is a private, non-profit organization dedicated to the four museums and six historical state monuments that comprise the Museum of New Mexico.  

This mashup site is open to visitor contributions, and the photographs are drawn from a Flickr group, also called New Mexico Stories, which is administered by the Foundation. I blogged about the planning process last October (see Planning for Social Sites).

[Image: New Mexico Stories mashup]

New Mexico Stories pulls geocoded images from Flickr and places them on a map of New Mexico. Images are divided by county, so the site should be able to easily hold thousands of images and remain navigable. In addition, a Gallery view creates a random collage of 50 images from the collection.
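Dividing geocoded photos by county amounts to point-in-region tests. A simplified sketch follows, with real county polygons replaced by invented bounding boxes; the coordinates and function names are illustrative only:

```python
# Illustrative sketch of dividing geocoded photos by county. Real county
# borders are polygons; here each county is approximated by a bounding
# box, and the boxes below are invented for the example.

COUNTY_BOXES = {
    # name: (min_lat, min_lon, max_lat, max_lon)
    "Santa Fe":   (35.2, -106.3, 35.9, -105.7),
    "Bernalillo": (34.9, -107.2, 35.2, -106.1),
}

def county_for(lat, lon):
    """Return the first county whose box contains the point, or None."""
    for name, (lat0, lon0, lat1, lon1) in COUNTY_BOXES.items():
        if lat0 <= lat <= lat1 and lon0 <= lon <= lon1:
            return name
    return None

def group_by_county(photos):
    """photos: iterable of (photo_id, lat, lon) -> {county: [photo_id, ...]}."""
    groups = {}
    for pid, lat, lon in photos:
        county = county_for(lat, lon)
        if county:
            groups.setdefault(county, []).append(pid)
    return groups
```

Binning photos this way keeps each map view to one county’s worth of images at a time, which is what lets the site scale to thousands of photos without becoming cluttered.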

The site has just become available and the Foundation will be formally announcing its release later this spring. Please let us know what you think, and if you have a New Mexico Story to share, please do.
