Last week I spoke at The Tech Museum’s Interfaces for the New Decade Conference and Gallery Opening. It was a great opportunity to meet and connect with folks from many Bay Area museums (and local companies) who share an interest in new HCI (Human-Computer Interaction) technologies. Along with the day-long conference, there was an evening gallery opening that unveiled The Tech Test Zone exhibit.
At the opening party we demoed Heist, our experimental project that allows digital objects to be shared from a multitouch table to visitors’ smartphones and tablets. We brought an MT55 Platform multitouch table for the demo and conference.
While Heist is not part of the permanent exhibit, our Open Exhibits Kinect and Gigapixel Viewer software is. Visitors to the Test Zone can use gestures to navigate an amazing gigapixel image of Yosemite taken by xRez Studio. This exhibit is among the first installations in our year-old, National Science Foundation-sponsored project. (You can join Open Exhibits for free.)
CNET News has pictures of all of the exhibits, including ours, on its website; see Future tech exhibit plugs museum interactivity.
Our Open Exhibits multitouch software initiative has just completed its first year. Last year, we received funding from the National Science Foundation and we launched our full community site last November. We’ve learned a lot in year one and we are gearing up for an exciting second year.
If you haven’t been following developments on the Open Exhibits site, here’s an update:
The Heist project was announced today. Heist is an experimental project that uses Open Exhibits and GestureWorks software and is powered by SENSUS server technology to enable effortless networking. It allows museum visitors to “steal” digital objects, easily placing them on their smartphones or tablets.
The system uses a captive WiFi portal to push an HTML5 app to visitors, so there is no need to download an iOS or Android app. The visitor just connects to the WiFi network and opens a browser. We are planning a testbed with ten museums this winter. Learn more and check out a video of Heist.
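For readers curious about the mechanics, a captive portal of this kind boils down to a web server that answers every request on the exhibit’s network with a redirect to the HTML5 app. Below is a minimal sketch of that idea in Python; it is not the Heist implementation, and the app address and port are placeholder values.

```python
# Minimal captive-portal sketch (illustrative only, not the Heist code).
# Every HTTP request on the exhibit's WiFi network gets redirected to
# the HTML5 app, so visitors never install a native iOS or Android app.
from http.server import BaseHTTPRequestHandler, HTTPServer

APP_URL = "http://exhibit.local/heist"  # placeholder address for the HTML5 app

class PortalHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Answer every page request with a 302 redirect to the app.
        self.send_response(302)
        self.send_header("Location", APP_URL)
        self.end_headers()

if __name__ == "__main__":
    # A real deployment would also point the router's DNS at this server;
    # that piece is omitted here.
    HTTPServer(("0.0.0.0", 80), PortalHandler).serve_forever()
```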
Open Exhibits is on the road in October and November. There are presentations and workshops planned on both coasts and in Europe. We’ll be at the Association of Science-Technology Centers (ASTC) annual conference in Baltimore, the British Museum in London, and The Tech Museum in San Jose. We will have one of our MT55 Platform multitouch tables at the British Museum if you want to check it out.
Work has begun on a new version of our most-downloaded software module, the multitouch-enabled Collection Viewer. We’ve posted preliminary designs and explained the new features that will be available in the new version.
Open Exhibits surpassed 10,000 software downloads last month and our community now has over 1,700 members. If you haven’t already done so, please join us. We are looking forward to an eventful second year.
Tune Grid is a 16-step multitouch audio sequencer built with GestureWorks. It allows multiple users to create harmonic rhythms or play sequences of notes using multitouch input. We originally conceived the idea with our friends at the Terry Lee Wells Nevada Discovery Museum after seeing Andre Michelle’s ToneMatrix web application.
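For those curious how a step sequencer works under the hood, the idea reduces to a grid of on/off cells scanned column by column at a fixed tempo. Here is a minimal sketch of that core loop; it is our own illustration, not Tune Grid’s actual code, and the scale, tempo, and grid size are arbitrary choices.

```python
# Minimal 16-step sequencer loop (an illustration of the idea,
# not Tune Grid's actual implementation).
import time

STEPS, ROWS = 16, 8
BPM = 120
SCALE = ["C4", "D4", "E4", "G4", "A4", "C5", "D5", "E5"]  # one pitch per row
grid = [[False] * STEPS for _ in range(ROWS)]  # cells toggled by touch input

def toggle(row: int, step: int) -> None:
    """A touch on a cell flips it on or off."""
    grid[row][step] = not grid[row][step]

def play_note(pitch: str) -> None:
    print(pitch, end=" ")  # stand-in for triggering audio playback

def run(loops: int = 2) -> None:
    step_seconds = 60.0 / BPM / 4  # sixteenth notes at the chosen tempo
    for i in range(loops * STEPS):
        step = i % STEPS  # sweep left to right, then wrap around
        for row in range(ROWS):
            if grid[row][step]:
                play_note(SCALE[row])
        print("|", end=" ", flush=True)
        time.sleep(step_seconds)

toggle(0, 0); toggle(2, 4); toggle(4, 8)  # seed a simple pattern
run()
```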
Tune Grid now comes standard on every Ideum multitouch table. It’s a fun multi-user sound application that does a great job showing off the integrated Bose audio system found in the MT55 Pro. Below is a video showing Tune Grid in action on the MT55 Pro multitouch table. We’d love to hear your feedback.
This video demonstrates the results of a recent collaboration between SENSUS and our own Open Exhibits software initiative. The concept is simple: make networking and sharing transparent across multitouch devices and operating systems. The demo video shows an Android tablet (Samsung Galaxy), a multitouch table (our own, new MT55), a Windows 7 multitouch kiosk, and an iPod, all sharing media items (images, video, and a Google Map) effortlessly. This easy sharing is made possible by KonnectUs, a new cloud-computing platform developed by SENSUS.
The KonnectUs software and the Open Exhibits modules will be available later this summer. And, yes, these “network friendly” software modules will also work with our GestureWorks multitouch framework.
Here’s a bit more about KonnectUs and our partners at SENSUS…
KonnectUs is a new cloud-computing software platform by SENSUS designed to make sophisticated networking functions easy and intuitive for users across a range of devices, including multitouch tables, desktop computers, tablets, and mobile phones. KonnectUs “Natural Networking Technology” (NNT) empowers users to connect seamlessly across all major platforms, from Windows to Android to iOS. The new software aims to deliver a desktop user experience for key cloud-based services such as file sharing, social networking, and location-relevant distribution of content. Additionally, the KonnectUs APIs give developers the opportunity to leverage the power of SENSUS networking technology by integrating it into third-party applications.
You can read the full press release on the SENSUS website.
For the latest installments, see Building a High-Resolution Multitouch Wall (Part 3) and Building a High-Resolution Multitouch Wall (Part 4). (Update 9/7/12: You might want to check out the Presenter Touch Wall, a 65″ multitouch wall built for public spaces.)
Back in November, I first blogged about building a 7-foot, round, high-resolution multitouch wall. At that point in the process, we had just received the large ring and built the computer system. Now, we have installed the glass and built out the rigging for the cameras and projectors.
The glass is haptic: it has a texture, doesn’t show fingerprints, and still displays the image beautifully. We purchased the glass from a company called Sevasa. They make an acid-etched architectural glass that has a great feel to it. The tempered glass is 10mm thick.
Due to the size of the glass, we are not adhering the projection material directly onto the glass; instead, a piece of acrylic with projection material will go directly behind the glass. We have already done projection tests and the combination works great. (I will post more about that once we put the acrylic in place.)
Behind the 7-foot ring, we have built a rigging frame out of Bosch aluminum. The rigging holds the four IR (infrared) cameras in place and will also support the projectors.
We will be using an IR illumination method called Laser Light Plane (LLP). The system will have four cameras that gather the tracking information. The exhibit, which is being built for a major North American aquarium, will be installed in early summer. We will post another update on this project in the next week or two.
To see the previous step in the process, check out Building a High-Resolution Multitouch Wall Part 1.
Our module for Kinect provides a simple solution for authoring gesture-based applications in Flash. Lately, we’ve been using it in conjunction with our other free Open Exhibits software modules. While the Kinect device itself doesn’t have the necessary precision for use with every module, we have successfully paired it with our gigapixel image viewer, our new VR Panoramic image viewer, and with our Google Maps module.
Our free Kinect module works with Community Core Vision (CCV), an open-source software package for computer vision. We’ve used this software in the past with various multitouch tables and other installations. Our Kinect module is a DirectShow source filter: a virtualized webcam device that reads data from the drivers released by OpenKinect.
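As a rough sketch of what such a virtual camera does, the snippet below reads a depth frame through the OpenKinect (libfreenect) Python bindings and converts it into the kind of 8-bit grayscale image a blob tracker like CCV expects. Our actual module is a DirectShow filter written against the same drivers, so treat this as a conceptual stand-in; the depth threshold is a tuning guess.

```python
# Conceptual stand-in for our DirectShow source filter: read a Kinect
# depth frame via the OpenKinect (libfreenect) Python bindings and turn
# it into the 8-bit grayscale "webcam" image a tracker like CCV expects.
import freenect
import numpy as np

NEAR_CUTOFF = 700  # raw depth units; a tuning guess, adjust per installation

def depth_as_webcam_frame() -> np.ndarray:
    depth, _timestamp = freenect.sync_get_depth()  # 11-bit raw depth values
    # Keep only objects close to the sensor (hands); black out the rest.
    near = np.where(depth < NEAR_CUTOFF, depth, 0)
    if near.max() == 0:
        return near.astype(np.uint8)  # nothing in range; empty frame
    # Scale the surviving values into the 0-255 range of a grayscale camera.
    return (near / near.max() * 255).astype(np.uint8)

if __name__ == "__main__":
    print(depth_as_webcam_frame().shape)  # (480, 640) on a Kinect
```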
Here’s a video showing the Kinect module working with other Open Exhibits software.
The Kinect module and the others are all free and open on the Open Exhibits website. The Open Exhibits core software is free for students, educators, nonprofits, and museums. (Commercial users can download a free GestureWorks trial.) Add a $150 Kinect 3D Motion Controller to our software and you have a very cheap and flexible authoring solution.
There are photographs of the Kinect and Open Exhibits modules on the OE Flickr site.
This article is cross-posted on the Open Exhibits website. In the video, the gigapixel image of El Capitán was provided by xRez Studio. The cubic VR image of Chichen Itza was taken by Ideum back in 2005 and is part of the Traditions of the Sun project.
Over on the Open Exhibits website, Jeff Heywood of Vancouver Aquarium has just shared a comprehensive field study on two multitouch tables in the Canada’s Arctic gallery space. The study was developed by The InnoVis Group, Interactions Lab at the University of Calgary.
We built the tables and worked with Vancouver Aquarium back in the summer of 2009 to create the software. The report looks at the “general experience of the digital tables”, including the form factor, and then it takes a closer look at the applications.
The study shows, as Jeff points out in his post, that “not everything was a success with the tables, but they are, overall, successful.” Considering the emergent nature of these types of exhibits, we were pleased to see that the study was generally very positive.
Still, some things didn’t work as well as we would have liked. There were significant usability issues with the early version of the Collection Viewer. I’m happy to report that many of the issues cited in the report have been fixed in the newer version of the Collection Viewer available on the Open Exhibits site. We built in the ability to easily change some of the design parameters via XML; for example, button size and spacing can be modified by editing the XML. In addition, we remapped many of the gestures so that the Collection Viewer objects respond better to visitor interaction. Still, some issues remain, and we’ll be taking a closer look at this report and making additional changes.
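To give a flavor of what that configuration looks like, here is a hypothetical fragment; the element and attribute names below are illustrative, not the Collection Viewer’s actual schema.

```xml
<!-- Hypothetical example; the actual Collection Viewer schema may differ. -->
<CollectionViewer>
    <Buttons size="64" spacing="12"/>   <!-- pixel values tuned for touch -->
    <Gestures drag="true" scale="true" rotate="true"/>
</CollectionViewer>
```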
Studies like this are incredibly valuable (and far too rare in the field). As designers and developers, we can only learn so much through testing and observation in the studio. The museum (or aquarium) setting and the sheer number and range of different visitors provide us with a new picture of the exhibit. You can download and read the full report on the Open Exhibits website: Interactive Tables at the Vancouver Aquarium.
[Cross-posted from Open Exhibits Blog]
We’ve recently released two new modules on Open Exhibits. The gigapixel viewer module allows Open Exhibits and GestureWorks users to plug any gigapixel image into our Flash application and drag and zoom it using multitouch input. We recently demoed this app for the first time at CES 2011 and it was a big hit.
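The heart of any drag-and-zoom viewer is a transform that scales the image about the point between the user’s fingers, so the content under the touch stays put. Here is a minimal sketch of that math; it is independent of our Flash implementation.

```python
# Minimal pan/zoom transform for a gigapixel image viewer. Zooming is
# anchored at the pinch's focal point so the pixel under the user's
# fingers stays fixed on screen. (Illustrative math, not our Flash code.)
class Viewport:
    def __init__(self) -> None:
        self.scale = 1.0      # screen pixels per image pixel
        self.offset_x = 0.0   # image coordinates of the screen's top-left corner
        self.offset_y = 0.0

    def drag(self, dx_screen: float, dy_screen: float) -> None:
        # Dragging moves the image with the fingers, so the viewport's
        # image-space origin shifts the opposite way.
        self.offset_x -= dx_screen / self.scale
        self.offset_y -= dy_screen / self.scale

    def pinch(self, focal_x: float, focal_y: float, factor: float) -> None:
        # Find the image point under the fingers before zooming...
        img_x = self.offset_x + focal_x / self.scale
        img_y = self.offset_y + focal_y / self.scale
        self.scale *= factor
        # ...then re-anchor the offset so that same point is still
        # under the fingers afterward.
        self.offset_x = img_x - focal_x / self.scale
        self.offset_y = img_y - focal_y / self.scale
```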
MT-Kinect, our other new module, allows users to interface with a Kinect to manipulate multitouch applications using gesturing (like in the movie Minority Report) rather than directly touching a screen. We combined this module with a gigapixel viewer to create an application that allows you to move and zoom by waving your arms.
So how does our application convert Kinect data into multitouch-compatible input that our Flash application can read? We wrote a DirectShow source filter: a virtualized webcam device that reads data from the drivers released by OpenKinect.
After adjusting the depth data to amplify the edges (which optimizes this application for gestures from a single user centered in the Kinect’s camera), we output a simple webcam feed. We route this information to a vanilla installation of CCV (theoretically, other trackers should work), which runs various filters, finds the blobs, and outputs the data in whatever format we would like to consume (in our case, “flosc,” which enables Flash apps to receive OSC information). Our gigapixel viewer software can then read this input as though it came from any multitouch device.
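To make the pipeline a little more concrete, here is a rough sketch of the two custom pieces described above: boosting edges in the depth image before handing it to the tracker, and the kind of OSC message that ends up relayed to Flash. The library calls, message schema, and port number are illustrative assumptions, not our production filter.

```python
# Sketch of the edge-amplification step and the OSC hand-off downstream.
# (Illustrative assumptions throughout; the production path is a
# DirectShow filter feeding CCV, with flosc bridging OSC to Flash.)
import cv2
import numpy as np
from pythonosc.udp_client import SimpleUDPClient

def amplify_edges(depth_frame: np.ndarray) -> np.ndarray:
    """Boost depth discontinuities so a single centered user's hands
    stand out as crisp blobs for the tracker."""
    frame8 = cv2.convertScaleAbs(depth_frame, alpha=255.0 / 2047.0)  # 11-bit -> 8-bit
    edges = cv2.Canny(frame8, 50, 150)
    # Blend the edge map back over the frame to sharpen hand contours.
    return cv2.addWeighted(frame8, 0.5, edges, 0.5, 0)

# Downstream, each blob the tracker finds becomes an OSC message; a
# bridge like flosc would relay something of this shape to the Flash app.
client = SimpleUDPClient("127.0.0.1", 3333)  # port borrowed from the TUIO default
client.send_message("/blob", [0, 0.42, 0.57])  # id, normalized x, y (made-up schema)
```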
These modules are free to download and use; you just need to be an Open Exhibits member. The gigapixel viewer requires that you have either Open Exhibits Core or GestureWorks software. Open Exhibits Core is available free to educational users. Commercial users can try GestureWorks free or purchase a license.
And if you’re wondering about the stunning gigapixel image of El Capitán: it was taken by xRez Studio, who were kind enough to let us use it for this demo.
After months of development and ten nervous days in the Apple App Store approval process, we’ve just released the NASA Space Weather Media Viewer iPhone application. The Space Weather app allows you to view real-time and near-real-time imagery from a variety of NASA satellites, as well as videos and more!
Ideum, in partnership with Goddard Space Flight Center, was awarded a grant to extend the tremendously popular web-based Space Weather Media Viewer to the mobile platform. The application ships with informational videos, visualizations, and NASA mission information, and enables near-real-time observation and social-network propagation of space weather phenomena.
This was our first foray onto the rocky road of iPhone development, but with the help of libraries like Three20, we were able to complete a full-featured, high-performing application relatively quickly. We will say that the iPhone development process is not as simple as we were promised when the iPhone first launched. Our next goal is the Android version of the application, and we’re examining other rapid-development platforms, some of which, due to licensing issues, were not available for our use on the iPhone.
So, check out the app store page to download the Space Weather Media Viewer, mobile version. It’s free. You can also use the QR code to the right to access the page from your phone!
I’ve just read Shelley Bernstein’s response to the NY Times “From Picassos to Sarcophagi, Guided by Phone Apps” article over on the Brooklyn Museum blog and she brings up some great points about the use of emergent technology and experimentation.
Edward Rothstein at the Times didn’t seem too impressed by any of the apps he tried, and from a contextual or informational standpoint, he may have a point. If you are looking for an extended, interactive version of the wall plaques that detail the artist’s life, history, and context, these apps may fall short. But in our work designing interactive exhibits, we’ve found that it is the social component that can make or break an exhibit, and the Brooklyn Museum is pushing mobile technologies to connect people through the art they’re viewing, as well as to inform them about that art.
If used well, these new technologies can change the museum from a place where people connect with exhibits in solitude (audio-tour headphones on, reading quietly to themselves, or quietly tapping a single computer screen) to a place where people actively connect, recommend, and participate with other visitors and the exhibit. Enabling a “like” or similar feature, as the Brooklyn Museum has done, allows visitors to connect long after they leave the museum floor. And such connections aren’t just wishful thinking; as Ms. Bernstein points out, the app statistics show that people are actually using the Like feature to find and recommend objects to other visitors.
Such connections may add to the “scarcely literate cybergraffiti” for Mr. Rothstein, but to us, they’re what make Facebook, Twitter, and a new crop of interactive museum technologies exciting: the ability to share with and learn from people you know personally, and the opportunity to forge new personal connections over shared exhibit interests.
Of course, there’s always room to grow, especially when working with new and largely untried technologies. Even if the concept is perfect, technological, networking, and financial limitations often frustrate the creation of that ur-application or exhibit. The perfect museum app might well act as Wikipedia, Share This!, FourSquare, and a brilliant curator all in one. But we’d like to give a thumbs up to the Brooklyn Museum for having the guts to experiment with these technologies in a thoughtful and interesting way.