An Adobe Research paper titled Deep Image Matting might just put an end to green and blue screen techniques. Adobe collaborated with the Beckman Institute for Advanced Science and Technology to develop a new system based on deep convolutional neural networks. The system accurately and intelligently extracts foreground content from its background without any kind of blue or green screen. Eliminating the green screen isn't a completely new idea. Lytro's cinema cameras can already do this based on depth perception. But this solution is 100% software based. The paper outlines the process used to evaluate an image, then determine what needs to be cut from the background, and how.
Typically, to cut out actors and subjects from their environment, one films them in front of a solid colour background. For Hollywood, this generally means green or blue. For photographers, it’s often white, grey or black. Now, though, Adobe can do it with virtually any background.
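At its core, what the paper tackles is the classic matting problem: every pixel in an image is modelled as a blend of foreground and background, and the system has to recover the per-pixel blend amount (the alpha matte) without a solid-colour screen to help. A minimal NumPy sketch of that compositing equation, C = αF + (1 − α)B, might look like this (the function name and toy values are illustrative, not from the paper):

```python
import numpy as np

def composite(alpha, foreground, background):
    """Blend a foreground onto a background using an alpha matte.

    alpha:      H x W array of values in [0, 1] (the matte)
    foreground: H x W x 3 colour image
    background: H x W x 3 colour image
    """
    a = alpha[..., np.newaxis]  # broadcast the matte over the colour channels
    return a * foreground + (1.0 - a) * background

# Toy 1x2-pixel example: left pixel fully foreground, right pixel a 50/50 blend.
alpha = np.array([[1.0, 0.5]])
fg = np.ones((1, 2, 3))           # white foreground
bg = np.zeros((1, 2, 3))          # black background
print(composite(alpha, fg, bg))   # left pixel 1.0, right pixel 0.5
```

Matting research, including this paper, works the equation in reverse: given only the composite image, estimate alpha (and the foreground colours) so the subject can be dropped onto any new background.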
The images above show a human-created matte (a) and the computer-generated matte (b). The computed foreground colours (c) are used to create more realistic transitions when compositing the subject onto other backgrounds (d-f). It's quite mind-boggling just how far deep neural networks have come with imaging in the last few years. And while the phrase is often overused, this could very well be a game changer. Not only for video & cinema, but for the photography world, too. Especially given some of the criticism Photoshop's selection tools seem to receive. No word yet on when this technology will come to Photoshop, After Effects or Premiere Pro. But if and when it does, there are going to be a lot of happy people out there. If you want to find out more about the process, you can read the full paper here. [via The Stack]