When we were first presented with the concept of the Eclipse Megamovie Mobile project, we had some concerns. Could a legion of citizen-scientists along the path of totality, armed only with their smartphones, capture scientifically meaningful images of a total solar eclipse? But the more we talked with astronomers Laura Peticolas, Hugh Hudson, and their colleagues at the Space Sciences Laboratory at UC Berkeley, the more we began to believe it was not only possible, but revolutionary.
The data that the app could capture for scientific analysis would be important enough. And if we could pull it off, the project could also significantly advance the boundaries of what is possible to capture with increasingly advanced smartphones. But even more fundamentally, we knew the project had the power to change the perspective of the people who would participate. Citizen science projects work on multiple levels: they can produce real science, but they also empower the public by making them part of a real scientific enterprise and contribute to the vital task of helping people understand the techniques and processes of science itself. More broadly, they excite the imagination and spark the curiosity of the people they engage.
The app we would develop was part of a much larger initiative, the Eclipse Megamovie Project. This partnership between Google and UC Berkeley planned to enlist and train amateur photographers to use their own DSLR cameras to capture and upload high-res images. Google would then stitch these images together to create a “megamovie” showing the progress of the eclipse across the country. Our primary focus was working with Space Sciences Laboratory scientists to design the smartphone app that would complement the movie.
The Eclipse Megamovie app allowed users to automatically capture a series of images at varying exposures spanning the entire period of totality. Precise timing and control of exposure settings were critical to capturing scientifically valuable images. Our UC Berkeley partners played a key role by providing precise data on the path of the eclipse; we then needed to marry that information to GPS data to get the timing of photo captures just right.
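The capture-scheduling idea can be sketched roughly as follows. This is a simplified illustration, not the app's actual code: the totality window, exposure values, and readout gap are all hypothetical, standing in for values that would really come from the eclipse-path data and the device's camera characteristics.

```python
from datetime import datetime, timedelta

def bracketing_schedule(totality_start, totality_end, exposures_s):
    """Spread bracketed-exposure cycles across the totality window.

    totality_start / totality_end: datetimes computed for the user's
    GPS position from the eclipse-path data (hypothetical here).
    exposures_s: exposure times in seconds to cycle through.
    Returns a list of (capture_time, exposure_seconds) pairs.
    """
    duration = (totality_end - totality_start).total_seconds()
    # Budget each cycle generously so no frame spills past totality.
    cycle_budget = sum(exposures_s) + 0.5
    n_cycles = int(duration // cycle_budget)
    schedule, t = [], totality_start
    for _ in range(n_cycles):
        for exp in exposures_s:
            schedule.append((t, exp))
            t += timedelta(seconds=exp + 0.1)  # hypothetical readout gap
    return schedule

# Hypothetical two-minute totality and three bracketed exposures.
start = datetime(2017, 8, 21, 17, 20, 0)
end = start + timedelta(seconds=120)
plan = bracketing_schedule(start, end, [1 / 1000, 1 / 60, 1 / 4])
```

Cycling a short, medium, and long exposure captures both the bright inner corona and the faint outer corona during the same pass.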
Capturing multiple images during totality turned out to be a fairly simple operation compared to the challenge of helping users aim their cameras at the right point in the sky. Smartphone cameras have very wide-angle lenses, which are easy to align, but once you add a telephoto lens (which we recommended to maximize image size), precise aiming becomes much more difficult. We designed the app to work with lenses as powerful as 50x. To make the app successful at that magnification, we needed to adjust direction and tilt to within one degree of the eclipsed sun. To do this, we used the compass and accelerometer built into the smartphone, as well as the gyroscope found in some newer high-end phones.
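The one-degree alignment check boils down to a small angular-distance calculation. The sketch below uses hypothetical sensor readings and sun coordinates; the app's real implementation fused compass, accelerometer, and gyroscope data rather than taking a single clean (azimuth, altitude) pair.

```python
import math

def angular_separation(az1, alt1, az2, alt2):
    """Great-circle angle in degrees between two sky directions,
    each given as an (azimuth, altitude) pair in degrees."""
    az1, alt1, az2, alt2 = map(math.radians, (az1, alt1, az2, alt2))
    cos_sep = (math.sin(alt1) * math.sin(alt2)
               + math.cos(alt1) * math.cos(alt2) * math.cos(az1 - az2))
    # Clamp against floating-point drift before taking acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

def aligned(phone_az, phone_alt, sun_az, sun_alt, tolerance_deg=1.0):
    """True when the phone's pointing direction (derived from its
    sensors) is within tolerance of the sun's computed position."""
    return angular_separation(phone_az, phone_alt,
                              sun_az, sun_alt) <= tolerance_deg

# Hypothetical readings: sun at azimuth 135.0°, altitude 60.0°.
print(aligned(135.4, 60.2, 135.0, 60.0))  # → True (about 0.3° off)
```

Note that a raw azimuth difference overstates the error at high altitudes, which is why the great-circle formula, not a simple subtraction, is the right comparison.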
Another challenge lay in the fact that we wanted to support the app on as many devices as possible. Only a handful of phone models run iOS, and only a few versions of the operating system are in wide use, so tuning the app for iPhones was relatively manageable. The Android platform was another story entirely. We were ultimately able to support hundreds of Android devices going back to Android version 5.0. The fact that the app was designed for an unusual event happening on a specific date complicated the story further, meaning we needed to develop innovative ways to test the app in advance of the event it was designed for. We approached this by building in a Practice Mode that allowed users to orient their phones and take test images of either the sun or moon to get used to the setup procedure. We also benefitted from an early test during an annular eclipse in Patagonia in February 2017.
In total, app users captured close to 60,000 photographs during the eclipse, totaling approximately 210 GB of imagery. These images come from across the path of totality—literally from sea to shining sea. Hundreds of users’ smartphone cameras automatically uploaded images to the server. (Note that many people who downloaded the app were either not in the path of totality on eclipse day or chose not to participate. The bulk of our photographs came from a small subset of the people who downloaded the app.)
With so many users and so many camera setups, the images we obtained vary widely in quality. Some users' smartphones had fairly basic cameras, while others had high-resolution equipment. Also, the app didn't function as expected with some users' phones, a puzzle we're still investigating. The experience of app users likewise ranged from excellent to poor. Many people gave the app very strong reviews, but fully half of our users downloaded the app on the morning of the eclipse, which meant that they were unable to take full advantage of the app's Practice Mode or to buy and learn to use the recommended external telephoto lens.
Ultimately, these factors meant that a significant percentage of smartphone photos fell short of expectations. While citizen science can be a powerful tool for engaging the public, it often requires some advance preparation. Megamovie Mobile offers illuminating proof that the success of a large public science project requires not only innovative technologies but clear planning and marketing strategies as well.
Nonetheless, even in the face of these challenges, we collected a large number of powerful and scientifically useful images. For example, the dazzling image of totality at the top of this post was taken with a user’s iPhone. People also captured surprisingly subtle aspects of the event, such as Baily’s Beads and the sun’s corona glowing near the star Regulus. Images like these illustrate the high image quality made possible by the app in combination with the increasingly sophisticated cameras found in today’s smartphones.
The eclipse of 2017 is over, but we still have plenty of work to do and gigabytes of data to examine. While our partners at the Space Sciences Laboratory are sifting through the images to see what might be learned from a scientific standpoint, we want to better understand how citizen scientists used the app. What aspects of the app worked well for them, and what pieces need refinement? How might we improve the app's user interface and functionality? How could we better get the word out in advance that the app would be most successful with time to practice and the addition of an external lens? With tens of thousands of images to investigate, we have our work cut out for us.
Finally, smartphones are continuing to evolve, and the idea that they can be used in increasingly sophisticated ways in the name of science is very powerful. The Eclipse Megamovie Mobile concept (originally presented by Hugh Hudson and Mark Bender; see Sky & Telescope, July 2017) will almost surely be extended and improved for future eclipses. As part of the project, we have open-sourced the app code on GitHub. We may well be back for Eclipse 2019 in South America or Eclipse 2024 here in the U.S. Imagine what smartphone technology will allow when the moon next moves across the face of the sun!