When the iPhone X came along with its Portrait Lighting effects, a lot of people were very impressed. Apple even claimed that you no longer need studio lighting or any other fancy equipment; you just need your phone. And while the iPhone hasn’t taken over as the portrait photographer’s camera of choice, it’s an intriguing concept. So intriguing, in fact, that researchers and engineers at UC San Diego and Google have taken it a few steps further. They’ve trained neural networks to relight portraits after the fact, without requiring any 3D depth data and with far more control than a handful of Apple presets.
The technique was documented in a research paper the team submitted to SIGGRAPH 2019. The AI was trained on photographs of just 18 people, each captured in studio conditions under a range of directional light sources. Every subject was photographed from multiple angles simultaneously, to record how light falls on different parts of the face from different directions. The results aren’t quite perfect yet, but they are rather good.
Once trained on these images, the AI could take ordinary smartphone photos and dynamically “relight” them using various environment maps – a process similar to lighting a scene with an environment map in 3D software.
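If you’re wondering how a dataset like that enables relighting at all, the core trick is that light is additive: photograph a subject once per light (so-called “one light at a time”, or OLAT, captures) and you can simulate any environment map by summing those photos, weighted by how bright the environment is in each light’s direction. Here’s a minimal sketch of that idea in Python – the function and array names are my own assumptions for illustration, not code from the paper:

```python
import numpy as np

def relight(olat_images, env_weights):
    """Synthesize a relit portrait from OLAT captures (illustrative only).

    olat_images: (n_lights, H, W, 3) float array, one photo per studio light
    env_weights: (n_lights, 3) float array, RGB intensity of the target
                 environment map sampled at each light's direction
    """
    # Weighted sum over the light axis: each capture contributes in
    # proportion to how bright the environment is from that direction.
    return np.einsum('nhwc,nc->hwc', olat_images, env_weights)

# Toy usage: 4 lights and a tiny 2x2 "portrait".
olat = np.random.rand(4, 2, 2, 3)
env = np.array([[1.0, 0.8, 0.6],   # warm key light
                [0.1, 0.1, 0.1],   # dim fill
                [0.0, 0.0, 0.0],   # this light is off
                [0.2, 0.2, 0.3]])  # cool rim light
relit = relight(olat, env)
print(relit.shape)  # (2, 2, 3)
```

That weighted sum is how ground-truth relit images can be generated from a light-stage capture; the neural network’s job is then to approximate the same result from a single ordinary photo, with no studio rig in sight.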
It’s certainly an improvement over many prior techniques, and it offers a lot more options than Apple’s Portrait Lighting effects. Besides changing the lighting direction and pattern, it can also adjust the colour temperature, and it manages all of this without any kind of depth map. It still has some issues with certain elements, such as the nose not casting a shadow on the cheek as the environment map light revolves around the subject, but I’m sure development won’t stop here. I still don’t think this will ever replace studio lighting completely, but if it eventually makes its way into Android devices (probably starting with Google’s own Pixel phones), it will let users make their selfies and photos of friends a little more flattering. Naturally, though, there’s no word on whether or when this technology might come to our pockets. You can read the complete research paper here. [via Reddit]