Google’s Gcam research team focuses on computational photography, developing the algorithms that help smartphones and other small cameras shoot better photos. When one of the team’s then-researchers, Florian Kainz, went out to shoot under a full moon with a high-end DSLR, he got one photograph he was particularly proud of, so he took it into work. His teammates challenged him to recreate the image with a phone. Florian had shot it with a Canon 1DX and a Zeiss Otus 28mm f/1.4 ZE lens; recreating that kind of quality in a nighttime landscape with a phone’s camera and its tiny sensor was not going to be easy. But Florian is a Google researcher, so he’s not one to back down from a challenge.
The Gcam team is responsible for things like the HDR+ mode that resides in the camera app on the Nexus and Pixel phones. This technology shoots a burst of up to ten exposures and averages them into a single image. It helps reduce camera shake, increase dynamic range, and produce pretty decent results. But not decent enough for the kinds of exposures Florian needed.
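Burst averaging works because random sensor noise shrinks by roughly the square root of the number of frames averaged, while the scene itself stays constant. Here is a minimal sketch of the idea using NumPy on synthetic frames; this is an illustration of the principle, not Google’s actual HDR+ code:

```python
import numpy as np

rng = np.random.default_rng(42)
true_scene = 100.0  # a flat gray patch, in arbitrary sensor units

# Ten noisy "exposures" of the same scene, noise std = 8 (hypothetical numbers)
burst = [true_scene + rng.normal(0.0, 8.0, size=(64, 64)) for _ in range(10)]

# Noise in a single frame vs. noise after averaging the burst
single_noise = float(np.std(burst[0] - true_scene))
averaged = np.mean(np.stack(burst), axis=0)
avg_noise = float(np.std(averaged - true_scene))

print(round(single_noise, 1), round(avg_noise, 1))  # noise drops by ~sqrt(10)
```

With ten frames, the residual noise falls to roughly a third of a single frame’s, which is why a burst of short exposures can stand in for one longer, cleaner exposure.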
So he set to work. Inspired by SeeInTheDark, an app by Stanford professor Marc Levoy, Florian created an app that performs a similar task. Unlike HDR+ and SeeInTheDark, which shoot exposures of around 1/10th of a second, Florian’s app creates long exposures. And it captures a lot of them: where HDR+ builds its results from 10 frames, Florian’s app shoots up to 64, with the phone mounted on a tripod.

The first major challenge was focusing. Fortunately, for landscapes, pretty much everything is sharp when focused at infinity. The app lets him control shutter speed and ISO, and saves each image out as a raw DNG. Florian would shoot a set of exposures of the scene itself, then shoot the same number of “dark frame” exposures with opaque tape over the lens.

But that’s where the mobile side of things ends. The images were then transferred to the desktop and brought into Photoshop. First, the exposures of the scene were combined and averaged together. Then the dark frames were combined and averaged together. The dark frame average was then subtracted from the scene average, reducing the noise as much as possible and creating a cleaner image.

And while he’s still not getting quite the results he’d get from a DSLR, it’s definitely a step up from what we typically shoot at night with a phone, even if you do have to move to the desktop for the final part of the process. Florian hopes that may change, though.

You can see some sample images from Florian’s experiment in this Google Photos album. And for more in-depth information on his process, check out the Google Research Blog. Unfortunately, right now, the app Florian created isn’t available for download. Whether or not he’ll take the experiment further remains to be seen. But, hopefully, somebody will pick this up if Florian doesn’t continue it.

Photography by Florian Kainz/Google
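The average-then-subtract workflow described above can be sketched in a few lines of NumPy. This is a simplified illustration on synthetic data, not Florian’s actual Photoshop pipeline: the dark frames capture fixed-pattern sensor noise (hot pixels, bias) that is also present in the scene frames, so subtracting the dark average removes it, while averaging each stack washes out the random noise.

```python
import numpy as np

def average_frames(frames):
    """Average a stack of frames; random noise drops by ~sqrt(N)."""
    return np.mean(np.stack(frames), axis=0)

def dark_frame_subtract(scene_frames, dark_frames):
    """Average the scene and dark stacks, then subtract the dark average
    to remove fixed-pattern sensor noise shared by both stacks."""
    scene_avg = average_frames(scene_frames)
    dark_avg = average_frames(dark_frames)
    return np.clip(scene_avg - dark_avg, 0.0, None)

# Synthetic demo (all numbers hypothetical): a flat scene at level 100,
# per-frame random noise, and a fixed hot pixel shared by both stacks.
rng = np.random.default_rng(0)
hot_pixels = np.zeros((4, 4))
hot_pixels[1, 2] = 50.0  # one hot pixel
scene = [100.0 + hot_pixels + rng.normal(0.0, 5.0, (4, 4)) for _ in range(64)]
darks = [hot_pixels + rng.normal(0.0, 5.0, (4, 4)) for _ in range(64)]

clean = dark_frame_subtract(scene, darks)
print(np.round(clean.mean(), 1))  # close to the true scene level of 100
```

After subtraction, the hot pixel is gone and the remaining random noise is a fraction of a single frame’s, which is essentially what stacking 64 exposures and 64 dark frames buys you.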