This video demonstrates the results of a recent collaboration between SENSUS and our own Open Exhibits software initiative. The concept is simple: make networking and sharing transparent across multitouch devices and operating systems. The demo video shows an Android tablet (a Samsung Galaxy), a multitouch table (our own, new MT55), a Windows 7 multitouch kiosk, and an iPod, all sharing media items (images, video, and a Google Map) effortlessly. This easy sharing is made possible by KonnectUs, a new cloud-computing platform developed by SENSUS.
The KonnectUs software and the Open Exhibits modules will be available later this summer. And yes, these “network friendly” software modules will also work with our GestureWorks multitouch framework.
Here’s a bit more about KonnectUs and our partners at SENSUS…
KonnectUs is a new cloud-computing software platform by SENSUS designed to make sophisticated networking functions easy and intuitive for users across a range of devices, including multitouch tables, desktop computers, tablets, and mobile phones. KonnectUs’s “Natural Networking Technology” (NNT) empowers users to connect seamlessly across all major platforms, from Windows to Android to iOS. The new software aims to deliver a desktop-class user experience for key cloud-based services such as file sharing, social networking, and location-relevant content distribution. Additionally, the KonnectUs APIs give developers the opportunity to leverage SENSUS networking technology by integrating it into third-party applications.
You can read the full press release on the SENSUS website.
For the latest installments see: Building a High-Resolution Multitouch Wall (Part 3) and Building a High-Resolution Multitouch Wall (Part 4). (Update 9/7/12: You might want to check out the Presenter Touch Wall, a 65″ multitouch wall built for public spaces.)
Back in November, I first blogged about building a 7-foot, round, high-resolution multitouch wall. At that point in the process we had just received the large ring and built the computer system. Now, we have installed the glass and have built out the rigging for the cameras and projectors.
The glass is haptic: it has a texture, doesn’t show fingerprints, and still displays the image beautifully. We purchased the glass from a company called Sevasa. They make an acid-etched architectural glass that has a great feel to it. The tempered glass is 10mm thick.
Due to the size of the glass, we are not adhering the projection material directly to the glass; instead, a piece of acrylic with projection material will go directly behind the glass. We have already done projection tests, and the combination works great. (I will post more about that once we put the acrylic in place.)
Behind the 7-foot ring, we have built a rigging frame out of Bosch aluminum. The rigging holds the four IR (infrared) cameras in place. It will also hold the projectors in place.
We will be using an IR illumination method called Laser Light Plane (LLP). The system will have four cameras that gather the tracking information. The exhibit, which is being built for a major North American aquarium, will be installed in early summer. We will post another update on this project in the next week or two.
To see the previous step in the process check out: Building a High-Resolution Multitouch Wall Part 1.
Our module for Kinect provides a simple solution for authoring gesture-based applications in Flash. Lately, we’ve been using it in conjunction with our other free Open Exhibits software modules. While the Kinect device itself doesn’t have the necessary precision for use with every module, we have successfully paired it with our gigapixel image viewer, our new VR Panoramic image viewer, and with our Google Maps module.
Our free Kinect module works with Community Core Vision (CCV), an open-source software package for computer vision. We’ve used this software in the past with various multitouch tables and other installations. Our Kinect module is a DirectShow source filter: a virtualized webcam device that reads data from the drivers released by OpenKinect.
Here’s a video showing the Kinect module working with other Open Exhibits software.
The Kinect module and the others are all free and open on the Open Exhibits website. The Open Exhibits core software is free for students, educators, nonprofits, and museums. (Commercial users can download a free GestureWorks trial.) Add a $150 Kinect 3D Motion Controller to our software and you have a very cheap and flexible authoring solution.
There are photographs of the Kinect and Open Exhibits modules on the OE Flickr site.
This article is cross-posted on the Open Exhibits website. The gigapixel image of El Capitán that appears in the video was provided by xRez Studio. The cubic VR image of Chichen Itza was taken by Ideum back in 2005 and is part of the Traditions of the Sun project.
Over on the Open Exhibits website, Jeff Heywood of the Vancouver Aquarium has just shared a comprehensive field study of two multitouch tables in the Canada’s Arctic gallery space. The study was conducted by the InnoVis Group of the Interactions Lab at the University of Calgary.
We built the tables and worked with Vancouver Aquarium back in the summer of 2009 to create the software. The report looks at the “general experience of the digital tables”, including the form factor, and then it takes a closer look at the applications.
The study shows, as Jeff points out in his post, that “not everything was a success with the tables, but they are, overall, successful.” Considering the emergent nature of these types of exhibits, we were pleased to see that the study was generally very positive.
Still, some things didn’t work as well as we would have liked. There were significant usability issues with the early version of the Collection Viewer. I’m happy to report that many of the issues cited in the report have been fixed in the newer version of the Collection Viewer that is available on the Open Exhibits site. We built in the ability to easily change some of the design parameters via XML. For example, button size and spacing can be modified by changing the XML. In addition, we remapped many of the gestures, so that the Collection Viewer objects respond better to visitor interaction. Still, some issues remain, and we’ll be taking a closer look at this report and making additional changes.
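To give a sense of how that kind of XML-driven tuning works, here is a minimal sketch in Python. The element and attribute names are hypothetical; the actual Collection Viewer schema may differ.

```python
import xml.etree.ElementTree as ET

# Hypothetical settings snippet in the spirit of the Collection Viewer's
# XML-configurable parameters; the real schema may differ.
SETTINGS = """
<collectionViewer>
    <button size="72" spacing="16"/>
    <gesture name="scale" enabled="true"/>
</collectionViewer>
"""

root = ET.fromstring(SETTINGS)
button = root.find("button")
size = int(button.get("size"))        # button size in pixels
spacing = int(button.get("spacing"))  # space between buttons in pixels
print(size, spacing)
```

Configuring an exhibit this way means museum staff can adjust interface dimensions on the floor without recompiling the Flash application.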
Studies like this are incredibly valuable (and far too rare in the field). As designers and developers, we can only learn so much through testing and observation in the studio. The museum (or aquarium) setting and the sheer number and range of different visitors provides us with a new picture of the exhibit. You can download and read the full report on the Open Exhibits website, Interactive Tables at the Vancouver Aquarium.
[Cross-posted from Open Exhibits Blog]
We’ve recently released two new modules on Open Exhibits. The gigapixel viewer module allows Open Exhibits and GestureWorks users to plug any gigapixel image into our Flash application and drag and zoom it using multitouch inputs. We recently demo’d this app for the first time at CES 2011 and it was a big hit.
MT-Kinect, our other new module, allows users to interface with a Kinect to manipulate multitouch applications using gesturing (like in the movie Minority Report) rather than directly touching a screen. We combined this module with a gigapixel viewer to create an application that allows you to move and zoom by waving your arms.
So how does our application convert Kinect data into multitouch-compatible input that our Flash application can read? We wrote a DirectShow source filter: a virtualized webcam device that reads data from the drivers released by OpenKinect.
After adjusting the depth data to amplify the edges (which optimizes this application for gestures from a single user centered in the Kinect’s camera), we output a simple webcam feed. We route this feed to a vanilla installation of CCV (in theory, other trackers should work too), which runs various filters, finds the blobs, and outputs the data in whatever format we would like to consume; in our case, “flosc,” which enables Flash apps to receive OSC information. Our gigapixel viewer software can then read this input as though it came from any multitouch device.
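For readers curious about the wire format at the end of that pipeline, here is a minimal Python sketch of how a TUIO-style cursor update is packed as an OSC message. This illustrates the general OSC/TUIO encoding that trackers like CCV emit, not Ideum’s actual code, and the session ID and coordinates below are made up.

```python
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated, then padded to a multiple of 4 bytes.
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode a minimal OSC message supporting int, float, and string args."""
    tags, payload = ",", b""
    for arg in args:
        if isinstance(arg, float):
            tags += "f"
            payload += struct.pack(">f", arg)  # 32-bit big-endian float
        elif isinstance(arg, int):
            tags += "i"
            payload += struct.pack(">i", arg)  # 32-bit big-endian int
        else:
            tags += "s"
            payload += _pad(arg.encode())
    return _pad(address.encode()) + _pad(tags.encode()) + payload

# A TUIO 1.1 cursor update like the ones a blob tracker sends per touch:
# "set", session id, normalized x/y position, x/y velocity, acceleration.
msg = osc_message("/tuio/2Dcur", "set", 42, 0.5, 0.25, 0.0, 0.0, 0.0)
```

A bridge like flosc then forwards such packets to Flash over an XMLSocket, which is how a Flash app ends up seeing ordinary touch input.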
These modules are free to download and use; you just need to be an Open Exhibits member. The gigapixel viewer requires that you have either Open Exhibits Core or GestureWorks software. Open Exhibits Core is available free to educational users. Commercial users can try GestureWorks free or purchase a license.
And if you’re wondering about the stunning gigapixel image of El Capitán, it was taken by xRez Studio, who were nice enough to let us use the image for this demo.
After months of development and ten nervous days in the Apple App Store approval process, we’ve just released the NASA Space Weather Media Viewer iPhone application. The Space Weather app allows you to view real-time and near-real-time imagery from a variety of NASA satellites, as well as videos and more!
Ideum, in partnership with Goddard Space Flight Center, was awarded a grant to extend the tremendously popular web-based Space Weather Media Viewer to the mobile platform. The application ships with informational videos, visualizations, and NASA mission information, and enables near-real-time observation and social-network sharing of space weather phenomena.
This was our first foray onto the rocky road of iPhone development, but with the help of libraries like Three20, we were able to complete a full-featured, well-performing application relatively quickly. We will say that the iPhone development process is not as simple as what was promised when the iPhone first launched. Our next goal is an Android version of the application, and we’re examining other rapid-development platforms, some of which, due to licensing issues, were not available for our use on the iPhone.
So, check out the App Store page to download the mobile version of the Space Weather Media Viewer. It’s free. You can also use the QR code to the right to access the page from your phone!
I’ve just read Shelley Bernstein’s response to the NY Times “From Picassos to Sarcophagi, Guided by Phone Apps” article over on the Brooklyn Museum blog and she brings up some great points about the use of emergent technology and experimentation.
Edward Rothstein at the Times didn’t seem to be too impressed by any of the apps he tried, and from a contextual or information standpoint, he may have a point. If you are looking for an extended, interactive version of the wall plaques that detail the artist’s life, history, and context, these apps may fall short. But in our work designing interactive exhibits, we’ve found that it is the social component that can make or break an exhibit, and the Brooklyn Museum is pushing how mobile technologies connect people through the art they’re viewing as well as inform them about that art.
If used well, these new technologies can change the museum from a place where people connect with exhibits in solitude (audio tour headphones on, reading quietly to themselves, or quietly tapping a single computer screen) to a place where people can actively connect with other visitors, recommend objects, and participate in the exhibit together. Enabling a “like” or similar feature, as the Brooklyn Museum has done, allows visitors to connect long after they leave the museum floor. And such connections aren’t just wishful thinking; as Ms. Bernstein points out, the app statistics show that people are actually using the Like feature to find and recommend objects to other visitors.
Such connections may add to the “scarcely literate cybergraffiti” for Mr. Rothstein, but to us, they’re what make Facebook, Twitter, and a new crop of interactive museum technologies exciting: the ability to share with and learn from people you know personally and the opportunity to forge new personal connections over shared exhibit interests.
Of course there’s always room to grow, especially when working with new and largely untried technologies. Even if the concept is perfect, technological, networking, and financial limitations often frustrate the creation of that ur-application or exhibit. The perfect museum app might well act as Wikipedia, Share This!, FourSquare and a brilliant curator all in one. But we’d like to give a thumbs up to the Brooklyn Museum for having the guts to experiment with these technologies in a thoughtful and interesting way.
The New Mexico Technology Excellence awards recognize “exceptional individual and organizational excellence in technology throughout the State.” We’re honored that two of our projects, the 100″ multitouch table exhibit and GestureWorks (our multitouch framework for Flash & Flex), are finalists for the awards.
Sponsored by the New Mexico Technology Council, the NM TechEX awards help to fund technology education for K-12 students in New Mexico. This year’s awards focus on two categories: Solution Innovation, for novel technologies that have potential for future impact, and Solution Impact, focusing on solutions that have already had a demonstrable impact on an individual or community level.
We’re glad to be a part of continuing efforts to build the technology community in New Mexico, and look forward to seeing everyone at the ceremony May 6.
The new Imaginarium Discovery Center at the Anchorage Museum is set to open May 22nd, and one of our exhibit technicians, Chris, was lucky enough to get a sneak peek when he went up this past weekend to install an MT-50 Multitouch Table.
The new Imaginarium has over 9,000 square feet of exhibit space, with several galleries focusing on different scientific disciplines. The MT-50 will be part of the Earth and Life Sciences gallery. Designed in conjunction with Ansel Associates, the gallery will feature touch tanks, an aquarium, and even alligators. Since reptiles can’t survive in tundra climates, the gallery displays could give some Alaskans their first reptile sighting ever!
The MT-50 features a custom multitouch, multiuser exhibit, designed by the Imaginarium and Ideum using Adobe Flash and GestureWorks, that allows visitors to compare and contrast two different animal species by dragging their pictures into a spherical information interface at the center. Many of the animals in the virtual table exhibit will be featured in the live animal exhibits or can be seen in Alaska wilderness areas, allowing visitors to learn more about animals they can actually observe.
This week, we are installing a number of new technology-based exhibits for the Wonders of the Universe | Space Chase Gallery exhibition at Adventure Science Center in Nashville, Tennessee. One of the exhibits we collaborated on is a large-scale multitouch table that allows visitors to explore and learn about the electromagnetic spectrum in new ways. Taking advantage of the table’s super-wide screen format, we’ve created a digital representation of the EM spectrum from radio waves to gamma rays. Visitors can move images across the table to see how they appear at each wavelength, and tabs on each image let them access information about what they are seeing.
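As a rough sketch of that interaction logic, the horizontal position of a dragged image can be mapped to a band of the spectrum. The equal-width layout and band boundaries below are assumptions for illustration; the actual exhibit’s layout isn’t published.

```python
# Illustrative only: the exhibit's real band layout isn't published.
BANDS = ["radio", "microwave", "infrared", "visible",
         "ultraviolet", "x-ray", "gamma"]

def band_at(x: int, screen_width: int = 2304) -> str:
    """Map a horizontal touch position (in pixels) to the EM band
    displayed there, assuming equal-width columns across the table."""
    index = min(x * len(BANDS) // screen_width, len(BANDS) - 1)
    return BANDS[index]

print(band_at(0), band_at(1152), band_at(2303))  # radio visible gamma
```

As a visitor drags an image into a new band, the exhibit can then swap in the photograph taken at that wavelength.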
With a 100″ surface and an 86″ viewable area, it is one of the largest contiguous multitouch tables yet developed. The screen has a 16:5 aspect ratio and a high resolution of 2304 x 800 pixels. The table can support over 50 simultaneous touch points, allowing several people to interact with it at the same time.
The exhibit displays various celestial and terrestrial images in a variety of wavelengths. NASA images of the sun and various nebulae can be seen in all wavelengths, as can common and iconic objects that were specially photographed: a birthday cake with lit candles, a toy robot, an alarm clock, and even a hand holding an iPhone appear in visible, infrared, ultraviolet, and x-ray. The images that appear here, along with high-resolution images of the 100″ table, can be found on the Ideum Flickr site.
The custom software was developed with Adobe Flash and Ideum’s own GestureWorks framework, which allows Flash developers to easily develop their own custom multitouch applications. GestureWorks will be available for sale to other developers in early December. The table design is based on Ideum’s commercially available MT-50 multitouch table.
The Space Chase exhibition opens to the public on November 7th at the Adventure Science Center in Nashville, Tennessee.