This week the new site for the Informal Science website (informalscience.org) was launched. Ideum, in collaboration with CAISE (The Center for Advancement of Informal Science Education) and the Lawrence Hall of Science at UC Berkeley, designed the site and compiled the resources of three previous educational sites into a single, comprehensive online destination for informal science education (ISE) professionals. The site was developed with funding from the National Science Foundation (NSF).
Along with a new and improved design, the Informal Science website features an active member community, interactive guides, a robust project and research search engine, a discussion forum, an in-depth wiki, and much more.
Learn more about the Informal Science web project in our portfolio.
Later this spring, I will be teaching a course on exhibit development for the Cultural Resource Management Program at the University of Victoria. The course will be held in Toronto at the Ontario Science Centre from April 22-24. It is a blended course, so an online component precedes the three days, starting on April 9. You can learn more about Emerging Exhibits: Exploring New Models of Human Computer Interaction (HCI) and register on the UVic website. (FYI, when I taught the course at the Museum of Vancouver last fall it sold out fast.) Here’s a short description of the course:
Computer-based interactive exhibits are undergoing a major transformation. The lone, single-user kiosk is now being replaced by multi-touch tables and walls, motion-sensing spaces, networked installations, and RFID-based exhibits. Advances in augmented reality, speech recognition, eye tracking, and other technologies promise even more radical change for exhibits in the near future.
Collectively these new technologies represent a fundamental advance in Human Computer Interaction (HCI). This course will look at a new generation of computer-based exhibits that are more physical, more intuitive, and have more social qualities than their predecessors.
The new models for HCI provide us with opportunities to rethink how technology is used in museums and other public spaces. Computer technology is on the cusp of finally living up to its promise in the museum world, providing a platform for developing compelling and authentic experiences for the public.
This is the fifth blog post about our multitouch wall installation. To see the previous ones, see: Building a High-Resolution Multitouch Wall Part 1, Part 2, Part 3 & Part 4. (Update 9/7/12: You might want to check out the Presenter Touch Wall, a 65″ multitouch wall built for public spaces.)
As I have mentioned in previous posts, while we received permission to share the development process, we’ve been unable to say just which “major North American aquarium” we have been working with. Now we can share the name, and we are proud to say it is the Monterey Bay Aquarium.
The 7-foot, round multitouch wall that we’ve been developing will be part of the Open Sea exhibition, which fully opens to the public on July 2nd. You can learn more about the Open Sea exhibition in the Monterey Bay Aquarium Pressroom.
The previous blog posts detail the methods, materials, hardware, software, and other aspects of the development process. So I won’t go too far in depth here, but I wanted to mention a few more details about the visitor experience and the software.
The large size and round form-factor of the multitouch wall should make for an engaging visitor experience. The wall is big enough to accommodate multiple visitors simultaneously. It also supports hundreds of simultaneous touch points.
As I mentioned earlier, the exhibit will allow aquarium visitors to “touch” phytoplankton and learn more about them. The fact that microscopic plankton are the base of the marine food web and produce most of the oxygen in the Earth’s atmosphere makes the exhibit all the more significant. We hope this installation will provide a compelling way for visitors to learn about these important tiny organisms.
The exhibit software was created in Unity 3D, and the programming and design were done by Lindsay Digital (they also took the photographs that appear below). This is one of the first projects where we concentrated just on hardware.
Here are a few photographs of the installation at the Monterey Bay Aquarium. We will share photos and video of the exhibit in full operation after the opening on July 2nd!
Ideum’s Paul Lacey and Chris Steinmetz work on calibrating the multitouch wall. The number “2560” which appears on the screen is the resolution of the round multitouch wall. It is 2560 x 2560 pixels, which is better than HD resolution.
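As a quick back-of-the-envelope check (not part of the exhibit software), the "better than HD" claim is easy to quantify by comparing the wall's pixel count to a full-HD frame:

```python
# Pixel-count comparison: the 2560 x 2560 wall vs. a 1920 x 1080 HD frame.
wall_pixels = 2560 * 2560   # 6,553,600 pixels on the round wall's framebuffer
hd_pixels = 1920 * 1080     # 2,073,600 pixels in a full-HD frame
ratio = wall_pixels / hd_pixels
print(round(ratio, 1))      # roughly 3.2x the pixels of full HD
```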
The NASA Space Weather Viewer is now available in the Google Android Market. You can download it here. Becoming a Google Developer and posting the app to the market was a very simple process.
Back in November, I shared some of the difficulties we encountered developing and publishing the iOS version of the application. (By the way, I still believe that, long term, Apple will have difficulties with their model, but certainly Android and the iPad alternatives have stumbled quite a bit with the Honeycomb release.) Still, in the end, the iOS version has been very successful (see: Over 100K Downloads for NASA Space Weather iPhone App in March).
Below is a video we made showing the NASA Space Weather Viewer running on a Samsung Galaxy tablet and an Android phone. The video is also embedded in the app listing in the Android Market. A simple but smart feature for previewing apps in the market.
We will let you know how the Space Weather app does in the Android Market and if we see anything like the success we’ve had in the iTunes Store.
We’ve just completed our first release of the Android version of the NASA Space Weather Media Viewer. Like the version we developed last fall for the Apple iPhone & iPod, the Space Weather Viewer for Android features near-real-time imagery from a wide variety of NASA missions, as well as video interviews with prominent scientists.
The new Android version will be available in the Google Android Market Place and on the Amazon App Store later this month.
If you’d like to get a sneak peek of this new NASA app, you can download the alpha version right here: NASASpaceViewer.apk (4.6 MB). Update, May 31: It’s now live in the Google Market Place: NASA Space Weather Viewer
The NASA Space Weather Viewer is now available in the Google Android Market.
You can download it here.
It requires Android 2.2 or greater and Adobe AIR 2.6. We’ve run it with Android 3.0 “Honeycomb” and it runs great.
The app is optimized for phones and tablets, and we’ve tested it on the following devices: HTC EVO, Motorola Xoom, Nexus One, Samsung Galaxy Tablet, and the Samsung Galaxy Epic. Please let us know what you think. We will be making the source code for this Android app available later this summer.
If you’re looking for more information about the Apple iOS version and source code, see our last blog post on that version, “Over 100K Downloads for NASA Space Weather iPhone App in March.”
(Update 9/7/12: You might want to check out the Presenter Touch Wall, a 65″ multitouch wall built for public spaces.)
This will be the last installment of this series on our multitouch wall project here at the Ideum studio. Next month, we’ll be able to show you the exhibit installed. For the last two weeks we’ve been working on blending the two high-resolution images and the infrared illumination and tracking. Also, yesterday we began to pack up the pieces for shipping.
Blending the two high-resolution projectors has taken a bit of time. The two projectors are dVision 35 WQXGA XB LED projectors by Digital Projection; the blending hardware is also from Digital Projection. The combined resolution of our round, 7-foot projected image is 2560 x 2560. Before settling on a hardware solution, we tried a few different blending methods in software. One of the software methods we first explored used the corners of the two projected images to calibrate. Sort of a non-starter when your image is round! The blending hardware we are using is Digital Projection’s Fusion 3D.
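The exhibit itself does this blend in Digital Projection's Fusion hardware, but the underlying idea is worth sketching. Two 2560 x 1600 projectors covering a 2560 x 2560 image must share a 640-pixel overlap band, and each projector fades across that band so the combined brightness stays constant. The sketch below is only an illustration of that principle (the gamma value and ramp shape are assumptions, not the Fusion hardware's actual curve):

```python
# Illustrative soft-edge blend for two vertically stacked projectors.
PROJ_H = 1600                    # native vertical resolution of each projector
WALL_H = 2560                    # combined vertical resolution of the wall
OVERLAP = 2 * PROJ_H - WALL_H    # 640-pixel shared band

def blend_weight(row, gamma=2.2):
    """Attenuation for the TOP projector at a given row of the overlap band.

    A linear ramp in light output, raised to 1/gamma to pre-compensate the
    projector's gamma response. The bottom projector uses the complementary
    ramp, so the two outputs sum to constant brightness in linear light.
    """
    t = 1.0 - row / (OVERLAP - 1)   # 1.0 at the top of the band, 0.0 at the bottom
    return t ** (1.0 / gamma)
```

With gamma disabled (gamma=1.0), the top and bottom weights at any row of the band sum to exactly 1.0, which is what keeps the seam invisible.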
In the photograph above you can see one of the calibration tests. Paul Lacey (Senior Multitouch Engineer at Ideum) and Chris Steinmetz (Support Specialist at Ideum) are examining a test blend.
Infrared illumination and tracking was another major challenge. As we mentioned in our last blog post (see Part 3), we’re using low-powered lasers for illumination. For tracking we are using NUITEQ’s Snowflake software. The software will use the four cameras (see Part 2) to track the IR touch points on the wall’s surface. The IR cameras we are using are from Point Grey.
The photograph above shows a calibration test using the Snowflake software, which works with Point Grey cameras and supports four-camera input.
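To give a sense of what four-camera tracking involves, here is a minimal sketch of the two core steps: mapping each camera's blob centroids into a shared wall coordinate space, and merging duplicate points where camera views overlap. This is purely hypothetical; Snowflake handles all of this internally (with full per-camera homographies rather than the simple quadrant offsets assumed here):

```python
import math

# Hypothetical four-camera merge. Assumption: each camera watches one
# 1280 x 1280 quadrant of the 2560 x 2560 wall coordinate space.
WALL = 2560
CAMERA_OFFSETS = [(0, 0), (1280, 0), (0, 1280), (1280, 1280)]

def to_wall(cam_index, u, v):
    """Map a normalized camera coordinate (u, v in 0..1) to wall pixels."""
    ox, oy = CAMERA_OFFSETS[cam_index]
    return (ox + u * 1280, oy + v * 1280)

def merge_touches(points, radius=20):
    """Collapse near-duplicate points reported by overlapping cameras."""
    merged = []
    for p in points:
        for i, q in enumerate(merged):
            if math.dist(p, q) < radius:
                # Same physical touch seen by two cameras: average it.
                merged[i] = ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)
                break
        else:
            merged.append(p)
    return merged
```

A touch landing on the seam between two camera views gets reported twice, once by each camera, and `merge_touches` collapses the pair into a single point before it reaches the exhibit software.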
Finally, packing up all of these materials for shipment is a major task in itself. Many of the items are extremely fragile, such as the 7-foot acrylic projection screen. Others, like the large ring, are just hard to handle and move.
The round acrylic projection surface needed to be wrapped very carefully for transport.
The 10mm haptic glass is both fragile (when not mounted) and heavy.
In the next post, we’ll show you the installation! Also, we’ll finally be able to share the name of the aquarium that has partnered with us to develop the exhibit. (They gave permission to share the progress, but wanted their name kept private until the exhibition opens.) To see the previous steps in the process check out: Building a High-Resolution Multitouch Wall Part 1, Part 2 and Part 3.
For the latest installment see: Building a High-Resolution Multitouch Wall (Part 4). (Update 9/7/12: You might want to check out the Presenter Touch Wall, a 65″ multitouch wall built for public spaces.)
We’ve made a great deal of progress since our last blog post on the multitouch wall project (see: Building a Multitouch Wall (Part 2)). The big news is that we’ve completed an illumination test and the image looks really great! The exhibit is using dVision 35 WQXGA XB LED projectors by Digital Projection. These awesome, short-throw projectors have a resolution of 2560 x 1600. The combined resolution of the round, 7-foot multitouch wall will be better than HD at 2560 x 2560.
To mount these two projectors, we designed custom, adjustable projector mounts out of aluminum. These mounts, along with the cameras and first-surface mirrors, connect to the aluminum rigging we designed (mentioned in the last installment). All of these pieces need to fit and work within a tight 3-foot space!
In the last blog post, I mentioned that we are using 10mm haptic tempered glass fronting a piece of acrylic with projection material that goes directly behind it. The acrylic is from Draper; it is 1/4″ Cine 13 with an optical coating and a dark grey tint. Locating a seamless 84″ piece of projection material was one of the challenges we faced in developing this exhibit.
For our illumination test we displayed an early version of the software. The exhibit will allow visitors to “touch” plankton and learn more about them. As I mentioned in a previous post, this exhibit is going to be installed in a major North American aquarium. (We have received permission from them to share the development process as long as we don’t divulge their name.)
Finally, we have designed and fabricated all of the laser mounts for the outside wall. The image above shows a close-up. There are 8 laser mounts with a total of 16 lasers creating two interlaced grids. The lasers are 5mW, similar in power to a laser pointer. These lasers will be used to track visitors’ fingers and hands as they interact with the wall, a method called Laser Light Plane (LLP). This application of LLP is completely safe; along with the low power, these lasers will not come in contact with visitors’ eyes. A metal flange will completely cover the laser mounts.
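The laser layout is easy to reason about from the numbers in this post alone: 8 mounts evenly spaced around the 84-inch ring, 2 lasers each, for 16 total. The sketch below works that geometry out; the even spacing and coordinate convention are assumptions for illustration, not Ideum's actual mount angles or fan orientations:

```python
import math

# Back-of-the-envelope laser-mount geometry for the 7-foot (84-inch) ring.
RADIUS_IN = 84 / 2        # 42-inch radius
NUM_MOUNTS = 8
LASERS_PER_MOUNT = 2

def mount_positions():
    """(x, y) of each mount in inches, evenly spaced around the ring."""
    step = 2 * math.pi / NUM_MOUNTS   # 45 degrees between mounts
    return [(RADIUS_IN * math.cos(i * step), RADIUS_IN * math.sin(i * step))
            for i in range(NUM_MOUNTS)]

total_lasers = NUM_MOUNTS * LASERS_PER_MOUNT   # 16, matching the build
```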
In our next installment we will show you how we are blending the two high-resolution images together. To see the previous steps in the process check out: Building a High-Resolution Multitouch Wall Part 1 & Part 2.
For the latest installment see: Building a High-Resolution Multitouch Wall (Part 3), and Building a High-Resolution Multitouch Wall (Part 4). (Update 9/7/12: You might want to check out the Presenter Touch Wall, a 65″ multitouch wall built for public spaces.)
Back in November, I first blogged about building a 7-foot, round, high-resolution multitouch wall. At that point in the process we had just received the large ring and built the computer system. Now, we have installed the glass and have built out the rigging for the cameras and projectors.
The glass is haptic: it has a texture, doesn’t show fingerprints, and still displays the image beautifully. We purchased the glass from a company called Sevasa. They make an acid-etched architectural glass that has a great feel to it. The tempered glass is 10mm thick.
Due to the size of the glass, we are not adhering the projection material directly onto the glass; instead, a piece of acrylic with projection material will go directly behind the glass. We have already done a projection test, and the combination works great. (I will post more about that once we put the acrylic in place.)
Behind the 7-foot ring, we have built a rigging frame out of Bosch aluminum. The rigging holds the four IR (infrared) cameras in place. It will also hold the projectors in place.
We will be using an IR method called Laser Light Plane (LLP) illumination. The system will have four cameras that will gather the tracking information. The exhibit will be installed in early summer; it is being built for a major North American aquarium. We will post another update on this project in the next week or two.
To see the previous step in the process check out: Building a High-Resolution Multitouch Wall Part 1.
For anybody interested in the NASA Space Weather Media Viewer and iPhone/iPad/iPod development, we’ve released the source code for the NASA Space Weather Media Viewer mobile edition! You can find it on its new GitHub home page (https://github.com/ideum/NASA-Space-Weather-Media-Viewer). If you’re looking for the app itself, you can download it for free in the iTunes store.
We’ve utilized the awesome Three20 library originally engineered by the folks at Facebook, and a simple CoreData store organizes the media assets. Though they’re streamed from the NASA server to your device, you’ll find all the video content in the source (be prepared for a long checkout process). The RichContentViewController displays HTML content with text sizing options and ShareKit integration, and the SegmentedNavigationController provides an alternative interface to the icon-based buttons available with the standard navigation controller.
As we mentioned in a previous post on the Space Weather Viewer app, the iPhone development process was not as smooth as we might have hoped. By releasing the source code, we hope to aid students and educational programs that may want to try building their own iPhone app as well as programmers just getting into iPhone development.
We’ve just posted a video on our YouTube channel (multitouchexhibits) showing the features of the NASA Space Weather Viewer iPhone app that we released at the end of October. The app connects to near-real-time views of the Sun from NASA satellites. Check out the video below:
The app is free. Look for the NASA Space Weather Viewer on the iTunes Preview page. You can scan the QR code here to visit the page. Also, in case you missed it, we posted more about the iPhone development process a few weeks ago. See “NASA Space Weather Media Viewer Mobile – Uncaged and Wild.”