Lytro Cinema Brings Light Field Capture To Film & TV Production
Lytro Cinema is one of the first practical applications of Light Field capture to be developed for film and television, virtualising most aspects of the live action camera. In effect, because traditional creative camera controls are deferred from fixed on-set decisions to computational post-production processes, the DP has many more options for shot design and composition from a single shoot than has been possible in the past.
Furthermore, because it captures all rays of light within a scene, imagery from Lytro Cinema potentially changes the relationship between live action footage and computer generated visual effects, integrating the two much more closely. The dense dataset captured by the system produces a Light Field master, comprising positions in space and information about those positions, that can be rendered in any format in post-production and enables creative possibilities that may never have been considered before.
According to Jon Karafin, Head of Light Field Video at Lytro, virtualising creative camera controls implies that decisions that have traditionally been made on set, like focus position, camera angle and depth of field, can now be made computationally – later on. In other words, one dataset could yield several different productions, depending on the DP but also on the post production team. From the Light Field Master, content can also be rendered out in multiple formats including IMAX, RealD and traditional cinema and broadcast at variable frame rates and shutter angles.
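To make the idea of deferring focus to post-production concrete, the sketch below shows the classic shift-and-sum approach to light field refocusing. This is illustrative only: Lytro has not published its rendering algorithms, and the array layout, the `refocus` function and the `alpha` focal-plane parameter are assumptions for the example.

```python
import numpy as np

def refocus(light_field, alpha):
    """Shift-and-sum refocusing over a 4D light field.

    light_field: array of shape (U, V, H, W), one sub-aperture image
    per (u, v) lens position. alpha picks the virtual focal plane;
    alpha = 1 reproduces the focus as captured.
    """
    U, V, H, W = light_field.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            # Shift each sub-aperture view in proportion to its offset
            # from the lens centre, then average all views together.
            du = int(round((u - U // 2) * (1 - 1 / alpha)))
            dv = int(round((v - V // 2) * (1 - 1 / alpha)))
            out += np.roll(light_field[u, v], (du, dv), axis=(0, 1))
    return out / (U * V)
```

Because the choice of `alpha` happens after capture, the same dataset can yield differently focused renders, which is the essence of moving these decisions into post.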
Using Lytro Cinema output, every frame of a live action scene becomes a 3D model in which each pixel records colour, direction and depth, in some ways making the image as controllable as computer generated VFX. The system includes several methods of integrating live action footage and visual effects, with functionality like Light Field Camera Tracking and the Lytro Depth Screen - the ability to accurately key every object and space in the scene without using a green screen.
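As an illustration of how a depth channel can replace chroma keying, the sketch below builds an alpha matte by selecting pixels within a depth range. The function name and the near/far parameters are hypothetical, not part of any published Lytro API; it is a minimal sketch of the general technique.

```python
import numpy as np

def depth_key(rgb, depth, near, far):
    """Build an alpha matte from per-pixel depth instead of a green
    screen: pixels whose depth falls inside [near, far] are kept
    opaque, everything else becomes transparent.

    rgb:   (H, W, 3) float image
    depth: (H, W) per-pixel depth, e.g. in metres
    """
    matte = ((depth >= near) & (depth <= far)).astype(float)
    # Append the matte as an alpha channel, giving an (H, W, 4) image.
    return np.dstack([rgb, matte])
```

A real matte would need soft edges and sub-pixel accuracy, but the point stands: once every pixel carries depth, separating foreground from background no longer depends on a physical green screen.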
Lytro Cinema’s specifications for raw data capture and optical performance are very high, beyond the scope of cameras currently used to shoot movies. It has a 755 RAW megapixel sensor, captures up to 300 fps with up to 16 stops of dynamic range and wide colour gamut, and integrates high resolution active scanning.
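Some quick arithmetic shows what those headline numbers imply. Sixteen stops of dynamic range is a linear contrast ratio of 2^16, and the sensor resolution times the frame rate gives a rough raw data rate; the bit depth per sample below is an assumption for illustration only, as it is not part of the published figures.

```python
megapixels = 755
fps = 300
stops = 16
bits_per_sample = 10  # assumed for illustration; not a published spec

# Each stop doubles the light, so 16 stops is a 2^16:1 linear ratio.
contrast_ratio = 2 ** stops

# Rough uncompressed data rate at the assumed bit depth.
raw_gbps = megapixels * 1e6 * fps * bits_per_sample / 1e9

print(f"{contrast_ratio}:1")         # 65536:1
print(f"{raw_gbps:.0f} Gbit/s raw")  # 2265 Gbit/s
```

Whatever the actual bit depth, the result is terabit-scale raw throughput, which is why the system pairs the camera with a dedicated server array, as described below.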
Lytro Cinema comprises a camera, a server array for storage and processing, which can also run in the cloud, and software for editing Light Field data. The entire system can be integrated into existing production and post-production workflows and used with standard software. You can see a video about Lytro Cinema here.
A filmmaker could wonder whether such a device will make the DP’s job less important, or easier, but this seems unlikely. Cinematography is an art, and movies are works of art; datasets will only ever be datasets. How effective moving cinematography from set to post really is will depend on many factors, such as the people involved and their skills and expectations. Nevertheless, the prospects are interesting. For example, how would using this camera change the data wrangler’s role, or the work of the lighting crew?
‘Life’, the first short produced with Lytro Cinema, made in association with The Virtual Reality Company, VRC, will premiere at the 2016 NAB Show. ‘Life’ was directed by Robert Stromberg, Chief Creative Officer at VRC and shot by David Stump, Chief Imaging Scientist at VRC.
Lytro Cinema activities during the 2016 NAB Show and a behind-the-scenes look at the set of ‘Life’ can be found here. Lytro Cinema will be available for production later in 2016 to partners on a subscription basis. www.lytro.com/cinema