
Touchless Gesture-Based Exhibits, Part Three: Touchless Design

A New Initiative for Touchless Interaction Focuses on Onboarding and Feedback Systems
Jul 09, 2020
Authored by Jim Spadaccini, Founder & Creative Director

Amid the coronavirus crisis, the last few months have seen a whirlwind of conversations, proposals, prototyping, and research into touchless interaction. This third post in our series of articles about touchless interaction is quite different from the first two posts, which were retrospectives. Touchless Gesture-Based Exhibits, Part One: High-Fidelity Interaction looked at prototype software we developed with Intel. Our second post, Touchless Gesture-Based Exhibits, Part Two: Full Body Interaction, examined several touchless exhibits we created for zoos and museums. (You may want to check out our new Digital Dinos project with the Las Vegas Natural History Museum, which also uses full-body touchless interaction.) This post focuses on our recent touchless prototyping and R&D, and where we are headed with touchless technology in the coming months.

The Touchless Interactive System

We started our recent prototyping using a Leap Motion device to detect hand motions and have designed several new prototypes that incorporate this technology into existing interactive exhibits. One is a modified version of the Bayou to Battlefield interactive we developed with The National World War II Museum. Another touchless prototype is based on Flight Director, one of a suite of exhibits we developed with the Smithsonian National Air and Space Museum. Additionally, we are currently working on a third prototype, modifying the exhibit information app developed by ARTECHOUSE, which runs on an Ideum Drafting Table. Our plan was to modify a number of interactive exhibits to determine what it would take to create an effective mouse emulation overlay that could work with a variety of applications and designs.

Early on, we decided to create an integrated system around the Leap Motion device to assist in onboarding and real-time feedback, both of which are critical in creating an intuitive touchless interface. This prototype system is based on mouse emulation: visitors point and move their finger to click and interact with the exhibit, and “grab” with a fist to drag an object on the screen. If a visitor does touch the screen, they receive a gentle reminder that the system is touchless. In addition to the Leap Motion device itself, the integrated system includes a small 3.5” monitor to provide instructions and LED lights for feedback. The video below shows the integrated system at work and explains our design approach.
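
To make the mouse-emulation idea concrete, here is a minimal sketch of how hand data from a Leap Motion device might be mapped to cursor movement and grab-to-drag behavior. It assumes the classic Leap Motion Python bindings and the pyautogui library; the thresholds, polling rate, and the omission of a click gesture (such as a forward poke or dwell) are illustrative assumptions, not the code behind our integrated system.

```python
# Illustrative sketch only: map Leap Motion hand data to mouse emulation
# using the classic Leap Python bindings and pyautogui (both assumed installed).
import time

import Leap          # classic Leap Motion SDK Python bindings (assumed)
import pyautogui     # cross-platform mouse emulation

GRAB_THRESHOLD = 0.8     # fist closed enough to count as a "grab" (assumption)
RELEASE_THRESHOLD = 0.4  # hysteresis so the drag doesn't flicker on and off


def run():
    controller = Leap.Controller()
    screen_w, screen_h = pyautogui.size()
    dragging = False

    while True:
        frame = controller.frame()
        if not frame.hands.is_empty:
            hand = frame.hands[0]

            # Normalize the palm position into a 0..1 box, then scale to the screen.
            box = frame.interaction_box
            norm = box.normalize_point(hand.stabilized_palm_position, True)
            x = norm.x * screen_w
            y = (1.0 - norm.y) * screen_h  # Leap's y axis points up; screens point down
            pyautogui.moveTo(x, y)

            # A closed fist starts a drag; opening the hand releases it.
            if not dragging and hand.grab_strength > GRAB_THRESHOLD:
                pyautogui.mouseDown()
                dragging = True
            elif dragging and hand.grab_strength < RELEASE_THRESHOLD:
                pyautogui.mouseUp()
                dragging = False
        elif dragging:
            # Hand left the tracking volume mid-drag: release the button.
            pyautogui.mouseUp()
            dragging = False

        time.sleep(1.0 / 60)  # poll at roughly the tracking frame rate


if __name__ == "__main__":
    run()
```

The gap between the grab and release thresholds is a simple form of hysteresis: it keeps a drag from stuttering when a visitor's fist opens slightly mid-gesture.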

In addition to the software, the hardware system was also designed and developed in-house, using acrylic and walnut instead of the powder-coated aluminum we use for our tables and displays. This allows us to handle all of the manufacturing with our CNC machine, laser cutter, and other tools.

Touchless Interaction: A User Interface (UI) Challenge, Not a Technological Challenge

We began our prototyping and research with the premise that touchless interaction is novel: many users won’t have experienced it yet. That means that for visitors to understand what it is and how to use it, they need to learn how to get started and then get real-time feedback along the way. But the underlying technology is robust: touchless technology has been around for years and works reasonably well, especially once you are oriented to the key movements and gestures.

The computer mouse offers a good analogy. For those of us old enough to recall the early days of personal computers, there was a time when people had to learn how to use that strange object connected to the PC. In fact, navigating a touchless interface is easier, because nearly everyone today knows how to use a mouse, and this touchless system emulates that interaction.

For every exhibit design challenge, we remind ourselves that museums are free-choice learning environments, so every interaction needs to be clear and intuitive, or most visitors will simply move on to another exhibit. That’s why building feedback into the integrated touchless system, via custom cursors, the small display, and the LEDs, was key to its success.

Besides onboarding help and immediate, continuous feedback, the touchless system allows any messaging about interaction to be “offloaded” from the large display. This is particularly advantageous for retrofitting existing exhibits, since changing the underlying application to add such messaging can be costly and time-consuming. And since we all hope the COVID-19 crisis will last only a year or two, a kiosk or table that employs this system could simply have the touchless add-on removed and revert to a touch-based interface when the pandemic ends.

A Solution Looking for a Problem

Touchless interaction probably won’t replace touch permanently, because direct touch will likely remain the most intuitive way to interact for years to come. For example, the gesture-based prototype we built with Intel 5 years ago worked well, but pre-COVID-19, it was in some ways a solution looking for a problem. Why use touchless if touch (and mouse) are more efficient and intuitive? Today, however, COVID-19 is the problem, and touchless is a powerful solution for as long as that problem lasts.

But this raises another question: when the pandemic is over, will touchless technology fade away? In many instances, it may well do so. In other cases, however, such as heavily trafficked public spaces, touchless interaction may stick around for some time. It’s also worth noting that expanding the ways visitors interact with exhibits presents new opportunities to make exhibits more inclusive. In that sense, touchless interaction, like assistive audio, might become an alternative option for interaction. Regardless of the specifics, the lessons we learn about strong, flexible user interfaces will always apply to future projects.

Touchless.Design

What comes next? The prototyping and development we are currently doing is the opening salvo of our new open-source initiative, Touchless.Design. As part of that initiative, we recently began work on a new proof-of-concept application with the National Gallery of Art that will orient visitors to the works on view and allow them to learn about masterpieces, artists, and art movements with a wave of the hand. This kiosk, which was made possible by a grant to the National Gallery of Art from the Alice L. Walton Foundation, will offer suggested tours of the collections based on a variety of themes, artists, and art movements, making the artwork much more accessible to visitors. The kiosk will debut in the fall and will help inform the development of the open-source software that will eventually be available on Touchless.Design for use by other museums and cultural organizations. We are also glad to report that this initiative is funded in part by Intel through their Pandemic Response Technology Initiative.

On the website for the Touchless.Design initiative, we will share results from our testing of the National Gallery of Art kiosk and other prototypes and proof-of-concept applications, additional research findings, and technical details for DIY hardware, in addition to the software itself. We are already exploring other technical approaches, using the Intel RealSense D435 depth camera for touchless interaction and developing new software to support gesture recognition. We will make further announcements as the project progresses. Stay tuned!
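
As a rough illustration of the kind of data the RealSense approach provides, the sketch below uses Intel's pyrealsense2 Python bindings to stream depth frames from a D435 and flag when something, such as a visitor's hand, enters a near-field zone in front of the sensor. The stream settings and distance threshold are placeholder assumptions; this is only a starting point, not the gesture-recognition software described above.

```python
# Minimal sketch: stream depth from an Intel RealSense D435 and flag when
# something enters a near-field "interaction zone" in front of the sensor.
# The stream settings and threshold below are illustrative placeholders.
import numpy as np
import pyrealsense2 as rs

NEAR_ZONE_M = 0.6  # anything closer than ~60 cm counts as "in range" (assumption)

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
profile = pipeline.start(config)

# Scale factor that converts raw 16-bit depth values to meters.
depth_scale = profile.get_device().first_depth_sensor().get_depth_scale()

try:
    while True:
        frames = pipeline.wait_for_frames()
        depth_frame = frames.get_depth_frame()
        if not depth_frame:
            continue

        # Convert the raw depth image to meters.
        depth_m = np.asanyarray(depth_frame.get_data()) * depth_scale

        # Ignore zero (invalid) pixels, then check the closest valid point.
        valid = depth_m[depth_m > 0]
        if valid.size and valid.min() < NEAR_ZONE_M:
            print("Something detected in the interaction zone")
finally:
    pipeline.stop()
```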