Building an Interactive Video Wall



Our most ambitious technical project of 2016 was the DinoStomp 3D interactive video wall that we developed with the Fort Worth Museum of Science and History. The DinoStomp exhibit consists of a video wall 8’ high and 20’ wide composed of 15 LG 55” monitors, video array controllers, and three Microsoft Kinect motion-sensing cameras. We’d like to share some of our experiences designing, developing, testing, and installing this immersive exhibit. (You can learn more about what we can do on our interactive video wall page.)

Presenting a high-resolution interactive application on a curved, large-scale video wall posed complex technical and design challenges for our team. The first was assembling a computer system powerful enough to drive the full resolution of the video wall while also handling the 3D graphics for the application. We wanted to use every physical pixel available across the video wall, which meant outputting 9600 x 3240 pixels from the same computer that was rendering our custom application.
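The arithmetic behind that figure, assuming the fifteen 1080p panels are arranged five wide by three high (our reading of the stated wall dimensions, not a spec from the exhibit):

```python
# Hypothetical sketch of the wall-resolution arithmetic.
# Assumes a 5-wide by 3-high grid of 1920 x 1080 panels (inferred
# from the 15 displays and the 9600 x 3240 total).
PANEL_W, PANEL_H = 1920, 1080
COLS, ROWS = 5, 3

wall_w = COLS * PANEL_W          # 9600
wall_h = ROWS * PANEL_H          # 3240
total_pixels = wall_w * wall_h   # roughly 31 million pixels per frame

print(f"{wall_w} x {wall_h} = {total_pixels:,} pixels")
```

Rendering that many pixels every frame, on top of the 3D scene itself, is what drove the GPU requirements described below.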

To run the video wall, we built a custom computer that could handle both the application and output to the displays. We found that our application ran much better on GeForce-based GPUs, so we ended up with dual GTX 1080 graphics cards. We then looked into multi-display controllers, since we could only get eight outputs from our machine, and having the GPUs maintain the video wall configuration while also rendering the application could cause issues. There are many video wall controllers on the market, but we weren’t building a “traditional” video wall. One of our partners pointed us to Datapath, which manufactures the Fx4 display wall controller. Each Fx4 controller can accept a single 4K video signal from the computer, output that signal to up to four displays, and maintain the video wall configuration, so the operating system and the GPUs never need to manage all 15 displays individually. With five Fx4 controllers in place and five video outputs from the computer, our video wall came to life.
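One plausible way to picture the split (a hypothetical sketch, not Datapath documentation: it assumes each GPU output carries one vertical column of the wall, which its Fx4 then crops into three 1080p sections; the actual signal timings and crop geometry may have differed):

```python
# Hypothetical mapping: five GPU outputs, one Fx4 per output, each
# Fx4 cropping its incoming column into three 1920 x 1080 sections.
# Controller/display numbering and crop layout are illustrative.
PANEL_W, PANEL_H = 1920, 1080
COLS, ROWS = 5, 3

layout = []
for col in range(COLS):          # one Fx4 controller per column
    for row in range(ROWS):      # one cropped region per display
        layout.append({
            "controller": col + 1,
            "display": col * ROWS + row + 1,
            "crop": (0, row * PANEL_H, PANEL_W, PANEL_H),  # x, y, w, h
        })

assert len(layout) == 15         # every panel accounted for
```

The point of the arrangement is that the crop-and-route logic lives in the controllers, so the OS only ever sees five outputs.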

Figuring out how to hang a matrix of 15 displays on a curved wall presented a number of challenging questions. How would we run the cabling? How would we access the ports on each display? How would the displays be serviced once installation was complete? We knew that in the museum space we would be unable to get behind the displays once they were installed. Fortunately, our team knew about the ConnexSys mounting system from Chief Manufacturing. Each mount in this system can be adjusted for depth, height, and tilt, and can be pulled outward for easy access to the back of the display. Once all of the displays were mounted and in place, we went through and adjusted each mount to get the most seamless arrangement of the screens. We used 55” LG commercial video wall displays with ultra-thin bezels: just 3.5 mm from active screen to active screen. At Ideum, we built our own curved wall in our Usability and Prototyping Studio so we could fully test the video wall system. Check out the time-lapse video below of the wall being built.

The DinoStomp application was developed in Unity3D. The prehistoric scene, and the dinosaurs that appear in it, took several months to design and develop in conjunction with our partners at Ft. Worth. This work included 3D modeling, rigging and animation, textures, and lighting. A nearly 10,000-pixel-wide scene also brought technical challenges from a performance standpoint. For example, due to the scale of the scene, Unity3D’s built-in lighting tools took up to 10 hours to rebuild the lighting every time we made significant changes to the lighting environment.

Beyond developing the scene and characters, the interactive elements of the exhibit posed the greatest challenge. The primary interaction in DinoStomp is accomplished by tracking people walking in front of the wall with Kinect motion sensors, so that the application can have dinosaurs follow visitors across the displays. Accurate body tracking was complicated: we had to account for lighting conditions, varying distances between people and the wall, and the curve of the wall itself. In early prototyping, we used a single Microsoft Kinect with skeletal tracking plus additional qualifiers applied to the tracking data. The curve of the wall came into play again: we found that a single Kinect couldn’t cover the entire length of the 20-foot wall. Ultimately, we needed to add more Kinect devices to our setup.
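A minimal sketch of what such tracking qualifiers might look like (illustrative Python, not actual Kinect SDK code; the thresholds are invented for the example):

```python
# Qualify raw skeletal-tracking results before trusting them, e.g.
# by distance from the sensor and by how many joints are confidently
# tracked. Threshold values here are illustrative, not from DinoStomp.
MIN_DEPTH_M, MAX_DEPTH_M = 0.8, 4.5   # usable sensing range
MIN_TRACKED_JOINTS = 10               # reject partial/ghost skeletons

def qualifies(body):
    """body: dict with 'depth_m' and 'tracked_joints' fields."""
    return (MIN_DEPTH_M <= body["depth_m"] <= MAX_DEPTH_M
            and body["tracked_joints"] >= MIN_TRACKED_JOINTS)

bodies = [
    {"id": 1, "depth_m": 2.1, "tracked_joints": 18},
    {"id": 2, "depth_m": 6.0, "tracked_joints": 20},  # too far away
]
visible = [b for b in bodies if qualifies(b)]
```

Filtering like this helps reject reflections, partial skeletons, and people outside the intended interaction zone before the data reaches the application.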

We ended up using three strategically placed Kinect devices to fully capture any movement. The position and orientation of each Kinect was different, so they all presented unique views of visitors in the action space. In order to persistently track visitors across the exhibit, we had to develop an algorithm to calibrate all three Kinect devices so that they agreed on the location of people in the real world. Once calibrated, we could associate a unique identifier with each tracked body. This tracking data was then passed to the artificial intelligence (AI) system we developed for the dinosaurs.
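As an illustration of the kind of calibration and merging involved (our own sketch with made-up transforms and thresholds, not the production algorithm): each sensor's detections are mapped through a per-sensor rigid transform into a shared floor-plane frame, and detections that land close together are treated as the same visitor.

```python
import math

# Per-sensor 2D rigid transform (rotation + translation), found
# during calibration, maps camera-space points into a shared frame.
def to_world(pt, angle_rad, tx, ty):
    x, y = pt
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x - s * y + tx, s * x + c * y + ty)

def merge(detections, radius=0.3):
    """Greedily merge world-space detections within `radius` metres —
    two sensors reporting nearly the same spot means one person."""
    people = []
    for p in detections:
        for q in people:
            if math.dist(p, q) < radius:
                break                # same visitor, seen twice
        else:
            people.append(p)
    return people

# Two sensors with different poses see the same visitor:
a = to_world((1.0, 2.0), 0.0, 0.0, 0.0)
b = to_world((2.0, -1.1), math.pi / 2, 0.1, 0.0)
assert len(merge([a, b])) == 1
```

Once merged, each surviving world-space position can carry a persistent identifier, which is what lets a dinosaur keep following the same visitor as they cross between sensor coverage zones.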

Some of the 3D dinosaur characters in DinoStomp, such as the Tyrannosaurus and the Brachiosaurus, appear on timed loops. The smaller raptors, however, are programmed to appear when one of the three Kinect devices detects a body in motion. This interactive feature lets participants have their motions mimicked by 3D raptors, making the scene more participatory and fun. The mimicking raptors also diminish the “fear factor” for younger visitors, since they are able to control the dinosaurs in the scene.

Directing the dinosaurs within the application was an interesting AI problem. We needed the application to play stage director to a cast of actors (the raptors), each with its own personality for reacting to and playing with the audience. Providing the right number of raptors to accommodate interaction in a public space was a challenge: we had to ensure the raptors wouldn’t run into each other when the scene got crowded, and that all the dinosaurs would run in fear when the Tyrannosaurus or Brachiosaurus entered the scene.
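A toy steering sketch in the spirit of that behavior (our own illustration, not the production AI): each raptor seeks its visitor, pushes away from nearby raptors, and flees when a large dinosaur is on stage.

```python
# Illustrative steering function; positions are (x, y) tuples and
# all names and values here are invented for the example.
def steer(raptor, others, target, predator=None, sep=2.0):
    """Return a 2D steering vector for one raptor."""
    if predator is not None:
        # A big dinosaur entered the scene: flee directly away from it.
        return (raptor[0] - predator[0], raptor[1] - predator[1])
    # Seek the tracked visitor...
    dx, dy = target[0] - raptor[0], target[1] - raptor[1]
    # ...while pushing away from raptors closer than `sep` (separation),
    # so the cast doesn't pile up when the scene gets crowded.
    for o in others:
        ox, oy = raptor[0] - o[0], raptor[1] - o[1]
        d2 = ox * ox + oy * oy
        if 0 < d2 < sep * sep:
            dx += ox / d2
            dy += oy / d2
    return (dx, dy)
```

Per-raptor personality could then be layered on by weighting these forces differently for each actor, though that refinement is beyond this sketch.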

After besting all of our hardware and software challenges in building this large-scale interactive video wall, and testing the complete experience thoroughly at Ideum, we installed it at the Fort Worth Museum of Science and History. Museum visitors really enjoyed it, as you can see from this time-lapse video shot on the day the exhibit opened to the public.

Developers Ryan Leonski and Cairn Overturf contributed to this blog post.
