Over on the Open Exhibits website, Jeff Heywood of Vancouver Aquarium has just shared a comprehensive field study on two multitouch tables in the Canada’s Arctic gallery space. The study was developed by The InnoVis Group, Interactions Lab at the University of Calgary.
We built the tables and worked with Vancouver Aquarium back in the summer of 2009 to create the software. The report looks at the “general experience of the digital tables”, including the form factor, and then it takes a closer look at the applications.
The study shows, as Jeff points out in his post, that “not everything was a success with the tables, but they are, overall, successful.” Considering the emergent nature of these types of exhibits, we were pleased to see that the study was generally very positive.
Still, some things didn’t work as well as we would have liked. There were significant usability issues with the early version of the Collection Viewer. I’m happy to report that many of the issues cited in the report have been fixed in the newer version of the Collection Viewer that is available on the Open Exhibits site. We built in the ability to easily change some of the design parameters via XML; for example, button size and spacing can be modified by editing the XML (a hypothetical sketch appears below). In addition, we remapped many of the gestures so that the Collection Viewer objects respond better to visitor interaction. Still, some issues remain, and we’ll be taking a closer look at this report and making additional changes.
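To give a sense of what that looks like, here is a hypothetical sketch of the kind of XML settings involved. The element and attribute names below are illustrative only; the actual configuration schema ships with the Collection Viewer on the Open Exhibits site.

    <!-- Hypothetical sketch only: the real Collection Viewer schema may use
         different element and attribute names. -->
    <CollectionViewer>
        <!-- Enlarging buttons and their spacing addresses the touch-target
             usability issues cited in the report. -->
        <Buttons size="80" spacing="24"/>
    </CollectionViewer>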
Studies like this are incredibly valuable (and far too rare in the field). As designers and developers, we can only learn so much through testing and observation in the studio. The museum (or aquarium) setting and the sheer number and range of different visitors provide us with a new picture of the exhibit. You can download and read the full report on the Open Exhibits website, Interactive Tables at the Vancouver Aquarium.
[Cross-posted from Open Exhibits Blog]
We’ve recently released two new modules on Open Exhibits. The gigapixel viewer module allows Open Exhibits and GestureWorks users to plug any gigapixel image into our Flash application and drag and zoom it using multitouch inputs. We recently demo’d this app for the first time at CES 2011 and it was a big hit.
MT-Kinect, our other new module, allows users to interface with a Kinect to manipulate multitouch applications using gesturing (like in the movie Minority Report) rather than directly touching a screen. We combined this module with a gigapixel viewer to create an application that allows you to move and zoom by waving your arms.
So how does our application convert Kinect data to multitouch-compatible input that our Flash application can read? We wrote a DirectShow source filter, a virtualized webcam device that reads data from the drivers released by OpenKinect.
After adjusting the depth data to amplify the edges (which optimizes this application for gestures from a single user centered in the Kinect’s camera), we output a simple webcam feed. We route this feed to a vanilla installation of CCV (in theory, other trackers should work too), which runs various filters, finds the blobs, and outputs the data in whatever format we would like to consume (in our case, via “flosc,” which enables Flash apps to receive OSC messages). Our gigapixel viewer software can then read this input as though it came from any multitouch device.
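For developers curious about the Flash end of this pipeline, here is a rough ActionScript 3 sketch of consuming flosc output. flosc relays each OSC packet to Flash as XML over an XMLSocket connection; the host, port, and TUIO address pattern below are assumptions that depend on how your tracker and gateway are configured.

    // Rough AS3 sketch: reading OSC data relayed by flosc over an XMLSocket.
    // The port (3000) and the /tuio/2Dcur address are configuration-dependent.
    import flash.events.DataEvent;
    import flash.events.Event;
    import flash.net.XMLSocket;

    var socket:XMLSocket = new XMLSocket();
    socket.addEventListener(DataEvent.DATA, onData);
    socket.connect("localhost", 3000); // assumed flosc gateway address

    function onData(e:DataEvent):void {
        var packet:XML = new XML(e.data);
        // flosc wraps each OSC message in a MESSAGE node with ARGUMENT children.
        for each (var msg:XML in packet..MESSAGE) {
            if (msg.@NAME == "/tuio/2Dcur" && msg.ARGUMENT[0].@VALUE == "set") {
                var id:int = int(msg.ARGUMENT[1].@VALUE);      // session id
                var x:Number = Number(msg.ARGUMENT[2].@VALUE); // normalized 0..1
                var y:Number = Number(msg.ARGUMENT[3].@VALUE);
                trace("touch", id, x, y); // hand off to the gesture layer here
            }
        }
    }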
These modules are free to download and use; you just need to be an Open Exhibits member. The gigapixel viewer requires that you have either Open Exhibits Core or GestureWorks software. Open Exhibits Core is available free to educational users. Commercial users can try GestureWorks free or purchase a license.
And if you’re wondering about the stunning gigapixel image of El Capitan, it was taken by xRez Studio, who were kind enough to let us use it for this demo.
After months of development and ten nervous days in the Apple App Store approval process, we’ve just released the NASA Space Weather Media Viewer iPhone application. The Space Weather app allows you to view real-time and near-real-time imagery from a variety of NASA satellites, as well as videos and more!
Ideum, in partnership with Goddard Space Flight Center, was awarded a grant to extend the tremendously popular web-based Space Weather Media Viewer to the mobile platform. The application ships with informational videos, visualizations, and NASA mission information, and it enables near-real-time observation and social-network sharing of space weather phenomena.
This was our first foray onto the rocky road of iPhone development, but with the help of libraries like Three20, we were able to complete a full-featured, well-performing application relatively quickly. We will say that the iPhone development process is not as simple as we were led to believe when the iPhone first launched. Our next goal is an Android version of the application, and we’re examining other rapid development platforms, some of which, due to licensing issues, were not available for our use with the iPhone.
So, check out the App Store page to download the mobile version of the Space Weather Media Viewer. It’s free. You can also use the QR code to the right to access the page from your phone!
I’ve just read Shelley Bernstein’s response to the NY Times “From Picassos to Sarcophagi, Guided by Phone Apps” article over on the Brooklyn Museum blog and she brings up some great points about the use of emergent technology and experimentation.
Edward Rothstein at the Times didn’t seem to be too impressed by any of the apps he tried, and from a contextual or information standpoint, he may have a point. If you are looking for an extended, interactive version of the wall plaques that detail the artist’s life, history, and context, these apps may fall short. But in our work designing interactive exhibits, we’ve found that it is the social component that can make or break an exhibit, and the Brooklyn Museum is pushing how mobile technologies connect people through the art they’re viewing as well as inform them about that art.
If used well, these new technologies can change the museum from a place where people connect with exhibits in solitude (audio tour headphones on, reading quietly to themselves, or quietly tapping a single computer screen) to a place where people are able to actively connect, recommend and participate with other visitors and the exhibit. Enabling a “like” or similar feature, as the Brooklyn Museum has done, allows visitors to connect long after they leave the museum floor. And such connections aren’t just wishful thinking; as Ms. Bernstein points out, the app statistics show that people are actually using the Like feature to find and recommend objects to other visitors.
Such connections may add to the “scarcely literate cybergraffiti” for Mr. Rothstein, but to us, they’re what make Facebook, Twitter, and a new crop of interactive museum technologies exciting: the ability to share with and learn from people you know personally and the opportunity to forge new personal connections over shared exhibit interests.
Of course there’s always room to grow, especially when working with new and largely untried technologies. Even if the concept is perfect, technological, networking, and financial limitations often frustrate the creation of that ur-application or exhibit. The perfect museum app might well act as Wikipedia, Share This!, FourSquare and a brilliant curator all in one. But we’d like to give a thumbs up to the Brooklyn Museum for having the guts to experiment with these technologies in a thoughtful and interesting way.
The New Mexico Technology Excellence awards recognize “exceptional individual and organizational excellence in technology throughout the State.” We’re honored that two of our projects, the 100″ multitouch table exhibit and GestureWorks (our multitouch framework for Flash & Flex), are finalists for the awards.
Sponsored by the New Mexico Technology Council, the NM TechEX awards help to fund technology education for K-12 students in New Mexico. This year’s awards focus on two categories: Solution Innovation, for novel technologies that have potential for future impact, and Solution Impact, focusing on solutions that have already had a demonstrable impact on an individual or community level.
We’re glad to be a part of continuing efforts to build the technology community in New Mexico, and look forward to seeing everyone at the ceremony May 6.
The new Imaginarium Discovery Center at the Anchorage Museum is set to open May 22nd, and one of our exhibit technicians, Chris, was lucky enough to get a sneak peek when he went up this past weekend to install an MT-50 Multitouch Table.
The new Imaginarium has over 9,000 square feet of exhibit space, with several galleries focusing on different scientific disciplines. The MT-50 will be part of the Earth and Life Sciences gallery. Designed in conjunction with Ansel Associates, the gallery will feature touch tanks, an aquarium, and even alligators. Reptiles can’t survive in tundra climates, so for some lifelong Alaskans, the gallery displays could be their first reptile sighting ever!
The MT-50 features a custom multitouch multiuser exhibit, designed by the Imaginarium & Ideum using Adobe Flash and GestureWorks, that allows visitors to compare and contrast two different species of animal by dragging their pictures into a spherical information interface in the center. Many of the animals in the virtual table exhibit will be featured in the live animal exhibits or can be seen in Alaska wilderness areas, allowing visitors to learn more about animals they can actually observe.
This week, we are installing a number of new technology-based exhibits for the Wonders of the Universe | Space Chase Gallery exhibition at Adventure Science Center in Nashville, Tennessee. One of the exhibits we collaborated on is a large-scale multitouch table that allows visitors to explore and learn about the Electromagnetic Spectrum in new ways. Taking advantage of the table’s super-wide screen format, we’ve created a digital representation of the EM Spectrum from radio waves to gamma rays. Visitors can move images across the table to see how they are imaged in each waveband, and tabs on each image allow them to access information about what they are seeing.
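As a simplified illustration of the core mechanic (not the production code), the exhibit can be thought of as mapping an object’s horizontal position on the super-wide screen to a region of the spectrum:

    // Simplified AS3 illustration: map an x position on the 2304px-wide screen
    // to a waveband. The real exhibit logic is considerably more involved.
    var bands:Array = ["radio", "microwave", "infrared", "visible",
                       "ultraviolet", "x-ray", "gamma"];

    function bandAtX(x:Number, screenWidth:Number = 2304):String {
        var index:int = int(x / screenWidth * bands.length);
        if (index < 0) index = 0;
        if (index >= bands.length) index = bands.length - 1;
        return bands[index];
    }

    trace(bandAtX(100));  // "radio" (left edge)
    trace(bandAtX(2200)); // "gamma" (right edge)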
With a 100″ surface and an 86″ viewable area, it is one of the largest contiguous multitouch tables yet developed. The screen has a roughly 16:5 aspect ratio and a resolution of 2304 x 800 pixels. The table can support over 50 simultaneous touch points, allowing several people to interact with it at the same time.
The exhibit displays various celestial and terrestrial images in a variety of wavelengths. NASA images of the sun and various nebulae can be seen in all wavelengths . . .
. . . as well as common and iconic objects, specially photographed. For example, a birthday cake with lit candles, a toy robot, an alarm clock, and even a hand holding an iPhone are seen in visible, infrared, ultraviolet, and x-ray. The images that appear here along with high-resolution images of the 100″ table can be found on the Ideum Flickr site.
The custom software was developed with Adobe Flash and Ideum’s own GestureWorks framework, which allows Flash developers to easily develop their own custom multitouch applications. GestureWorks will be available for sale to other developers in early December. The table design is based on Ideum’s commercially available MT-50 multitouch table.
The Space Chase exhibition opens to the public on November 7th at the Adventure Science Center in Nashville, Tennessee.
There is a video on YouTube of our MT2 multitouch table in action at the Ontario Science Centre. The video shows our multitouch mash-up that uses Yahoo! Maps and Flickr photos. (The video also includes various applications from NUI’s Snowflake software suite, which comes bundled with each table.)
We worked with Ontario Science Centre to customize the mapping program and they have opened it up to the Flickr community. Visitors can join their group, Innovation in Toronto, and upload geo-tagged photos to be included in the exhibit on the museum floor. This is the first time our mashup has been used in this way.
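For the curious, pulling geotagged photos from a Flickr group looks roughly like this in ActionScript 3. The method and parameters are from the public Flickr REST API; the API key and group ID below are placeholders, not the exhibit’s real values.

    // Rough sketch: fetch geotagged photos from a Flickr group via the REST API.
    import flash.events.Event;
    import flash.net.URLLoader;
    import flash.net.URLRequest;

    var url:String = "http://api.flickr.com/services/rest/"
        + "?method=flickr.photos.search"
        + "&api_key=YOUR_API_KEY"   // placeholder
        + "&group_id=GROUP_ID"      // placeholder for the group's NSID
        + "&has_geo=1"              // only geotagged photos
        + "&extras=geo";            // include lat/lon in the response

    var loader:URLLoader = new URLLoader();
    loader.addEventListener(Event.COMPLETE, function(e:Event):void {
        var rsp:XML = new XML(loader.data);
        for each (var photo:XML in rsp.photos.photo) {
            trace(photo.@title, photo.@latitude, photo.@longitude);
        }
    });
    loader.load(new URLRequest(url));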
Update July 27, 2009: Ontario Science Centre just posted a video on YouTube that shows the Multi-touch Mapping Mashup in detail.
Since the release of the Wii gaming system, developers have been experimenting with connecting game controllers to other computer systems. Our recent entry into multitouch has deepened our interest in all forms of physical computing. So, we decided to take a quick look ourselves to see what possibilities Wii Remote controllers and Adobe Flash might offer for exhibit development.
In no time, Jonathan here at the studio had some examples working with the Wii controller and the Wii balance board. The controller examples took advantage of the motion sensing built into the device. (You can learn more about how the Wii works at the NY Times website.) As you move the Wii controller, a 3D image of a plane rotates and moves in unison on the screen.
We also tried out DarwiinRemote, which turns the Wii Remote’s infrared sensor data into mouse coordinates. Both this application and the WiiFlash server connect via a Bluetooth device in your computer. Any mouse or keyboard action can be mapped to the Wiimote buttons.
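As a sketch of what the Flash side looks like, here is roughly how WiiFlash exposes the controller to ActionScript 3. The class, property, and event names below follow our reading of the WiiFlash documentation and may differ between versions, so treat them as assumptions.

    // Sketch: rotating a display object ("plane", assumed to be on the stage)
    // with Wiimote motion data via WiiFlash. API names are our best recollection
    // of the WiiFlash docs and should be checked against your version.
    import flash.events.Event;
    import org.wiiflash.Wiimote;

    var wiimote:Wiimote = new Wiimote();
    wiimote.addEventListener(Event.CONNECT, onConnect);
    wiimote.connect(); // talks to the WiiFlash Server running locally

    function onConnect(e:Event):void {
        addEventListener(Event.ENTER_FRAME, onFrame); // poll once per frame
    }

    function onFrame(e:Event):void {
        // Map the accelerometer-derived orientation onto the 3D plane so it
        // rotates in unison with the physical controller.
        plane.rotationX = wiimote.pitch * 90;
        plane.rotationZ = wiimote.roll * 90;
    }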
We also came across examples that use the controller as a “receiver,” with LEDs serving as input devices. A good example of this approach is WiiSpray.
Securing the Wii controller in a museum environment is a major concern, as is power to the controller. Still, one could imagine providing constant power through some sort of tether that might simultaneously secure the device.
The Wii balance board shows a lot of promise for museum exhibits. It provides a simple way to measure a visitor’s weight and get that data into the computer. One could easily picture a “your weight on other worlds” exhibit (see a simple online version at the Exploratorium). The board and Flash can also be used to detect the weight on each quadrant along with the total weight registered on the board, so side-to-side and front-to-back movement can be detected by tracking how weight shifts across the four quadrants.
The Wii Balance Board.
The output in WiiFlash displaying weight from the different quadrants.
The WiiFlash demo showing the total weight in kilograms.
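To make the quadrant idea concrete, here is a small ActionScript 3 sketch that turns four quadrant readings into a total weight, a center-of-pressure point, and a “weight on Mars” figure for an exhibit like the one imagined above. The quadrant values themselves would come from the WiiFlash server; this function just does the arithmetic.

    // Given the four quadrant readings (kg) from the balance board, compute
    // the total weight and a normalized center-of-pressure point.
    function analyzeBoard(tl:Number, tr:Number, bl:Number, br:Number):Object {
        var total:Number = tl + tr + bl + br;
        // Ranges -1..1: x > 0 means leaning right, y > 0 means leaning forward.
        var x:Number = total > 0 ? ((tr + br) - (tl + bl)) / total : 0;
        var y:Number = total > 0 ? ((tl + tr) - (bl + br)) / total : 0;
        // Mars' surface gravity is about 0.38 g, hence "weight on other worlds."
        return { totalKg: total, centerX: x, centerY: y, marsKg: total * 0.38 };
    }

    var r:Object = analyzeBoard(20.1, 19.7, 20.5, 19.9);
    trace(r.totalKg, r.centerX, r.centerY, r.marsKg); // 80.2 ... 30.476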
One limitation is that the WiiFlash server cannot read the board’s current battery level, though in a museum exhibit you’d need to wire direct power to the battery compartment anyway. You’d also need to find a way to secure the board. The biggest hurdle for museum use is that the board needs to establish its Bluetooth connection with the computer on start-up, which requires pressing a button on the bottom of the board when the computer boots. So, constant power would be necessary. Of course, this is not a very “green” option.
We’ll post more about the Wii and Flash as a potential exhibit development platform as we continue to experiment.
On Wednesday, Paul Lacey and I will be conducting a full-day workshop entitled Make It Multitouch at the Museums and the Web Conference in Indianapolis. We’ll be explaining the technical aspects of multitouch and exploring emerging design practices through a series of exercises. I posted some initial thoughts on designing multitouch and multiuser exhibits back in February, in three parts: Interaction and Feedback; Elements, Objects, and Environments; and the Visitor-Experience. During the workshop, we’ll revisit the concepts presented in those posts, along with some new activities and additions. An important new area for discussion is the use of physical objects (with fiducial markers) in conjunction with multitouch tables. A very interesting example came out just last week from the Media Computing Group, part of the Computer Science Department at RWTH Aachen University in Germany. Called SLAP Widgets, these physical user-interface components work in conjunction with multitouch tables. The Media Computing Group has put together a short video explaining how Silicone ILluminated Active Peripherals, or SLAP widgets, work.
In our workshop, we are primarily concerned with the implications of multitouch and multiuser interactions, but it is hard not to think about the possibilities that physical objects present for computer-based exhibits. I don’t think a silicone slider or knob would last a day on the museum floor, but after working with hands-on science centers and other museums for so many years, I find the connection between computing and more physical interaction presented in this example certainly intriguing. We’ll post more about the conference and the workshop later this week.
Update April 20, 2009: Shelly Mannion has some photos of multitouch tables at MW2009. You can see the photos tagged multitouch in her Flickr group.
Also, here is a picture we took of our multitouch table and the Microsoft Surface. We were moving out of our exhibit space when we came across a Microsoft Surface set up for a demo the next day. You can see more pictures in our Flickr set, multitouch exhibits.
Update April 17, 2009: Paula Bray wrote a story about the workshop for the Fresh + New(er) blog, entitled MW2009 – Multi-touch: what does this technology hold for future museum exhibits?