The HCI+ISE Conference, supported by the National Science Foundation (NSF), will bring together museum exhibit designers and developers, learning researchers, and technology industry professionals to explore the potential impact of new human-computer interaction (HCI) technologies in informal science education (ISE) settings. The theme of the Conference is the balance between exhibit technology and the visitor experience.
Multitouch, motion recognition, radio frequency identification (RFID), near field communication (NFC), voice recognition, augmented reality, and other emerging technologies are already beginning to shape the visitor experience. In this three-day conference, presenters and participants will share effective practices and explore both the enormous potential and the possible pitfalls that HCI advancements present for exhibit development.
The Conference will have activities, discussions, and group interactions exploring technical, design, and educational topics centered on HCI+ISE. A variety of technology examples such as multitouch tables, touch walls, Arduino controllers, Kinect hacks, RFID tags, and other prototypes and gear will be demonstrated.
HCI+ISE will focus on the practical considerations of implementing new HCI technologies in educational settings with an eye on the future. Along with a survey of how HCI is shaping the museum world, participants will be challenged to envision the museum experience a decade into the future.
Conference events will be held in Albuquerque, New Mexico at the Indian Pueblo Cultural Center, New Mexico Museum of Natural History and Science, and at Ideum’s Corrales studio.
Attendance at the HCI+ISE Conference is limited to 60 participants, some of whom will be invited because of their specific point of view and expertise. Conference participant interactions will provide opportunities for collaboration and potential partnerships that will persist well into the future. Applications are welcome from exhibit developers, designers, educators, researchers, evaluators, and technologists. Funding from the NSF is available to help support the travel costs of attendees.
The application form and more information about the Conference can be found on this website at: http://www.openexhibits.org/research/hci-ise
Conference co-chairs: Kathleen McLean of Independent Exhibitions and Jim Spadaccini of Ideum, Inc.
Last week I spoke at The Tech Museum’s Interfaces for the New Decade Conference and Gallery Opening. It was a great opportunity to meet and connect with folks from many of the Bay Area museums (and local companies) who share the same interest in new HCI (Human Computer Interaction) technologies. Along with the day-long conference there was an evening Gallery opening which unveiled The Tech Test Zone exhibit.
At the opening party we demoed Heist, our experimental project that allows digital objects to be shared from a multitouch table to visitors’ smartphones and tablets. We brought an MT55 Platform multitouch table for the demo and conference.
While Heist is not part of the permanent exhibit, our Open Exhibits Kinect and Gigapixel Viewer software is. Visitors to the Test Zone can use gestures to navigate an amazing gigapixel image of Yosemite taken by xRez Studio. This Open Exhibits software exhibit is among the first installations in our year-old, National Science Foundation-sponsored project. (You can join Open Exhibits for free.)
CNET News has pictures of all of the exhibits, including ours, on their website; see Future tech exhibit plugs museum interactivity.
One of the great things about creating authoring software is seeing the creative things that people make with it. In the last week, we’ve come across two interesting installations: one built with GestureWorks (our commercial multitouch framework) and one developed with Open Exhibits software (the open, educational software initiative that we are leading).
Just last week in TechCrunch (Video: Kinect-Controlled Gigapixel Image Exploration) we saw an installation at the University of Lincoln in the UK in which visitors interact with gigapixel photographs. The installation was developed by Sam Cox, who used our Open Exhibits TUIO Kinect and Gigapixel Viewer module. He added the ability for visitors to “step switch” between gigapixel scenes. He also added ambient sound and the ability to print zoomed-in scenes. Check out the video below.
You can learn more about this installation on the Gigalinc website.
Yesterday we came across an interesting three-screen kiosk installation developed by the Spinifex Group for the Sydney Theatre Company. They used GestureWorks and FDT to develop an Adobe AIR desktop application which allows visitors to learn how the Sydney Theatre Company is reducing their carbon footprint. You can learn more about this installation and see a video of it in action on Karkaris.com.
This fall I will be teaching a course on exhibit development for the Cultural Resource Management Program at the University of Victoria. The course will be held in Vancouver at the Museum of Vancouver from September 26-28. It is a blended course, so an online component precedes the three in-person days, starting on September 12th. You can learn more about Emerging Exhibits: Exploring New Models of Human Computer Interaction (HCI) and register on the UVic website. Here’s a short description of the course:
Computer-based interactive exhibits are undergoing a major transformation. The lone, single-user kiosk is beginning to be replaced by multitouch tables and walls, motion-sensing spaces, networked installations, and RFID-based exhibits. Advances in augmented reality, voice recognition, eye tracking, and other technologies promise even more radical change for exhibits in the near future.
Collectively, these new technologies represent a fundamental advance in Human Computer Interaction (HCI). This course will look at a new generation of computer-based exhibits that are more physical, more intuitive, and more social than their predecessors.
For decades, museum and education professionals have understood that interesting and provocative exhibitions and exhibits can encourage dialogue and deepen the visitor experience. However, until recently, the vast majority of the computer-based exhibits have been information-heavy kiosks with limited interactivity, providing only solitary experiences for visitors. The new models for HCI provide us with opportunities to rethink how technology is used in museums and other public spaces. Computer technology is on the cusp of finally living up to its promise in the museum world, providing a platform for developing compelling and authentic experiences for the public.
cross posted from Open Exhibits Blog
Open Exhibits has just passed 4,000 software downloads! If you’re not a programmer, or you haven’t downloaded the code yet, we thought we’d give you an idea of what you may be missing.
This video demonstrates several of the free multitouch software modules already posted on the Open Exhibits site. These modules are designed to simplify Flash and ActionScript exhibit development, and many are compatible with the Microsoft Kinect (using MT-Kinect). The source code for every module shown in the video can be downloaded today on the Open Exhibits Software page.
We’ll continue to post videos as we release new software. If you have any comments or feedback, we’d love to hear from you.
Our module for Kinect provides a simple solution for authoring gesture-based applications in Flash. Lately, we’ve been using it in conjunction with our other free Open Exhibits software modules. While the Kinect device itself doesn’t have the necessary precision for use with every module, we have successfully paired it with our gigapixel image viewer, our new VR Panoramic image viewer, and with our Google Maps module.
Our free Kinect module works with Community Core Vision (CCV) software, an open source software package for computer vision. We’ve used this software in the past with various multitouch tables and other installations. Our Kinect module is a “directshow” source filter, a virtualized webcam device that reads data from the drivers released by OpenKinect.
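The filter itself is a Windows/C++ DirectShow component, but the basic idea can be sketched in a few lines. The snippet below is a rough illustration (not our actual code) of how a raw Kinect depth frame, read through the OpenKinect freenect Python bindings, can be reduced to the kind of bright-blobs-on-black “webcam” image a tracker like CCV expects; the depth cutoffs are made-up values you would tune for your own setup.

```python
# Rough sketch only: reduce a Kinect depth frame to a binary "webcam" image
# that a blob tracker such as CCV could consume. Requires the OpenKinect
# freenect bindings, NumPy, and OpenCV; the depth window is a made-up value.
import freenect
import numpy as np
import cv2

NEAR, FAR = 600, 900   # hypothetical interaction zone, in raw 11-bit depth units

def depth_to_frame():
    depth, _ = freenect.sync_get_depth()          # 480x640 array of raw depth values
    mask = (depth > NEAR) & (depth < FAR)         # keep only hands in the "touch" zone
    frame = np.zeros(depth.shape, dtype=np.uint8)
    frame[mask] = 255                             # bright blobs on a black background
    return frame

if __name__ == "__main__":
    while True:
        cv2.imshow("virtual webcam feed", depth_to_frame())
        if cv2.waitKey(10) == 27:                 # press Esc to quit
            break
```

In the real module this per-frame processing happens inside the source filter itself, so CCV simply sees another camera.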
Here’s a video showing the Kinect module working with other Open Exhibits software.
The Kinect module and the others are all free and open on the Open Exhibits website. The Open Exhibits core software is free for students, educators, nonprofits, and museums. (Commercial users can download a free GestureWorks trial.) Add a $150 Kinect 3D Motion Controller to our software and you have a very cheap and flexible authoring solution.
There are photographs of the Kinect and Open Exhibits modules on the OE Flickr site.
This article is cross-posted on the Open Exhibits website. In the video, the gigapixel image of El Capitán that appeared in the example was provided by xRez Studio. The cubic VR image of Chichen Itza was taken by Ideum back in 2005 and is part of the Traditions of the Sun project.
The Blur Conference focuses on the new ways in which people are interacting with computers. This is the first time the event has been held. So what is Blur about? From the conference webpage…
It’s easy to forget that the computer mouse is over 45 years old.
What’s not as easy to forget is that we’re now collectively getting used to interacting with computers via means and interfaces that have moved way beyond the keyboard and the mouse — the iPhone and Wii being the most prominent examples.
The truth is that we stand on the verge of a major revolution in the models of Human Computer Interaction (HCI). A revolution that will fly right past academia and into a world of retail, medical, gaming, military, public event, sporting, personal and marketing applications.
From multi-touch to motion capture to spatial operating environments, over the next 10 years, everything we know about HCI will change.
Blur is the only conference that is exploring the line of interaction between computers and humans in a substantive, real-world and hands-on way.
I’ll be presenting, “New Museum Experiences: Learning from Multitouch and Multiuser Installations” on February 22nd. I’ll also be on a panel that same afternoon talking about Kinect and our Open Exhibits module. You can learn more about the Blur conference on their website.
[Cross-posted from Open Exhibits Blog]
We’ve recently released two new modules on Open Exhibits. The gigapixel viewer module allows Open Exhibits and GestureWorks users to plug any gigapixel image into our Flash application and drag and zoom it using multitouch input. We demoed the app for the first time at CES 2011, and it was a big hit.
MT-Kinect, our other new module, allows users to interface with a Kinect to manipulate multitouch applications using gestures (like in the movie Minority Report) rather than directly touching a screen. We combined this module with the gigapixel viewer to create an application that allows you to move and zoom by waving your arms.
So how does our application convert Kinect data to multitouch-compatible input that our Flash application can read? We wrote a “directshow” source filter, a virtualized webcam device that reads data from the drivers released by OpenKinect.
After adjusting the depth data to amplify the edges (which optimizes this application for gestures from a single user centered in front of the Kinect’s camera), we output a simple webcam feed. We route this feed to a vanilla installation of CCV (in theory, other trackers should work as well), which runs various filters, finds the blobs, and outputs the data in whatever format we’d like to consume (in our case, “flosc,” which enables Flash apps to receive OSC data). Our gigapixel viewer software can then read this input as though it came from any multitouch device.
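To make that last hop concrete: CCV publishes its blob data as TUIO messages over OSC, and flosc relays the stream to Flash over an XMLSocket. The listener below is a sketch only (it uses the python-osc package and is not part of our modules; CCV’s conventional TUIO port 3333 is assumed) that prints the cursor updates so you can see the normalized x/y coordinates the viewer ends up consuming.

```python
# Sketch only: listen to the TUIO/OSC stream that CCV emits and print cursor
# positions. Uses the python-osc package; port 3333 is the TUIO convention.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_cursor(address, *args):
    # TUIO 1.1 cursor "set" messages carry: session id, x, y, x-velocity,
    # y-velocity, motion acceleration (x and y are normalized 0..1).
    if args and args[0] == "set":
        session_id, x, y = args[1], args[2], args[3]
        print(f"cursor {session_id}: x={x:.3f} y={y:.3f}")

dispatcher = Dispatcher()
dispatcher.map("/tuio/2Dcur", on_cursor)

server = BlockingOSCUDPServer(("127.0.0.1", 3333), dispatcher)
print("listening for TUIO on udp/3333 ...")
server.serve_forever()
```

In the actual exhibit the equivalent data reaches ActionScript through flosc’s XMLSocket connection, and the gigapixel viewer treats each cursor like a finger on a multitouch surface.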
These modules are free to download and use; you just need to be an Open Exhibits member. The gigapixel viewer requires that you have either Open Exhibits Core or GestureWorks software. Open Exhibits Core is available free to educational users. Commercial users can try GestureWorks free or purchase a license.
And if you’re wondering about the stunning gigapixel image of El Capitán, it was taken by xRez Studio, who were nice enough to let us use it for this demo.