Our recently announced Lytro Volume Tracer (Lytro VT) is a set of tools and a methodology for creating a Light Field volume from CG 3D assets to deliver a fully immersive volumetric VR experience with the highest levels of visual quality. To read the Lytro VT announcement, check out this article on our blog.
Lytro VT can use any DCC and render engine (Maya and V-Ray, for example) to generate a set of 2D volume samples of a 3D scene. Lytro VT starts by placing a virtual camera rig into the CG scene. The camera rig contains every possible viewpoint of the scene within a defined volume and can be animated or scripted to maximize quality and performance. The render engine is used to trace virtual rays of light in the scene and capture a volume of 2D sample images from each of the cameras in the rig. Lytro VT reconstructs a View Volume from those sample images by tracing rays of light from each rendered pixel back to the origin point of those cameras (Volume Tracing) to create an immersive Light Field VR experience.
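To make the rig concrete, here is a minimal sketch (not Lytro's actual tooling, and the function name and parameters are hypothetical) of how a dense set of camera positions could be laid out on a regular grid inside a cubic View Volume before each position is rendered:

```python
import itertools

def camera_grid(center, side, n):
    """Generate n*n*n camera positions evenly spaced inside a cubic
    View Volume of side length `side`, centered at `center`.
    Each position would then be rendered as one 2D volume sample."""
    half = side / 2.0
    step = side / (n - 1) if n > 1 else 0.0
    positions = []
    for i, j, k in itertools.product(range(n), repeat=3):
        positions.append((center[0] - half + i * step,
                          center[1] - half + j * step,
                          center[2] - half + k * step))
    return positions

# A 10 x 10 x 10 rig (1,000 viewpoints) inside a unit cube at the origin
rig = camera_grid((0.0, 0.0, 0.0), 1.0, 10)
print(len(rig))  # 1000
```

A real rig need not be a uniform grid; as the article notes, the camera configuration can be animated or scripted to concentrate samples where the scene is visually complex.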
Above is an example of a View Volume comprising 1,000 viewpoints. Within the volume, a viewer in a VR HMD can experience the reconstructed CG scene with the highest levels of ray traced optical effects, perfect stereo in every direction, full parallax and six degrees of freedom (6DOF).
The ray traced samples include both color and depth information (RGBZ.exr data). The number of cameras and their configuration is determined by the scene’s visual complexity, and the predetermined size of the View Volume needed during playback. Lytro Volume Tracer processes both the color and depth data from the samples to create a Light Field volume for playback in VR via our Lytro Player.
The View Volume in this 3D scene is represented by the white cube. A single camera with a viewpoint of the scene is represented by the green sphere. A virtual Lytro VT camera rig can include hundreds or even thousands of individual cameras. The 2D scene sample renders are ray traced using each camera in the virtual rig.
This close-up of the same camera viewpoint from above shows five colored rays of light from the ray traced pixels in the sample image being Volume Traced inwards to the camera's origin. By tracing the scene from every camera viewpoint that was rendered, a Light Field can be reconstructed.
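The geometric core of that inward trace is standard pinhole-camera back-projection: a pixel plus its depth value fixes a point in the scene, and the ray runs from that point back to the camera origin. A minimal sketch under that assumption (the function name and intrinsics parameters are illustrative, not Lytro's API):

```python
import numpy as np

def unproject(u, v, z, fx, fy, cx, cy, cam_to_world):
    """Back-project pixel (u, v) with depth z (distance along the camera's
    viewing axis, as stored in an RGBZ sample) to a world-space point,
    and return the ray from that point back to the camera origin.
    cam_to_world is a 4x4 camera-to-world transform."""
    # Pixel -> camera-space point on the plane at depth z
    p_cam = np.array([(u - cx) * z / fx, (v - cy) * z / fy, z, 1.0])
    p_world = cam_to_world @ p_cam
    origin = cam_to_world @ np.array([0.0, 0.0, 0.0, 1.0])
    direction = origin[:3] - p_world[:3]  # points back toward the camera
    direction /= np.linalg.norm(direction)
    return p_world[:3], direction

# Pixel at the principal point, 2 m deep, identity camera pose:
point, ray = unproject(320, 240, 2.0, 500.0, 500.0, 320.0, 240.0, np.eye(4))
```

Repeating this for every pixel of every rendered sample yields the dense set of rays that makes up the reconstructed Light Field.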
In the future, Lytro VT and rendering may be merged into a single seamless process that allows a Light Field to be ray traced directly, without the intermediary step of 2D sample images. However, that process would require deep renderer integration and would surrender the flexibility of the renderer-agnostic system that Lytro VT is today.
As a rendering technique for creating realistic 2D images from a virtual 3D scene, ray tracing is capable of producing extremely high quality images. In the simplest terms, colored pixels are rendered on a 2D image plane based on how simulated light interacts with object surfaces in the 3D scene. Ray tracing is suitable for accurately rendering optical effects such as reflections, refractions and scattering, but these effects require significant time to calculate. Ray tracing with full optical effects is too slow for real-time frame rates. However, ray tracing is well suited for applications that require the highest levels of image quality and can be rendered offline, such as cinematic visual effects.
The ray tracing process: Through a virtual camera's field of view, the paths of light rays are traced from the camera as they reflect from object to object, until they reach the light source. At each intersection, a shadow ray is cast toward the light source; if another object occludes that path, the point is in shadow. This technique is computationally efficient as it only traces the paths of light that the camera can see through its virtual lens.
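The process above can be sketched in a few lines. This is a deliberately minimal example (one sphere, one light, no reflections or occluders), just to show the two steps the paragraph describes: a primary ray tested against scene geometry, then a shadow ray cast from the hit point toward the light:

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance t along a unit-length ray,
    or None if the ray misses the sphere."""
    oc = [origin[i] - center[i] for i in range(3)]
    b = 2.0 * sum(direction[i] * oc[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is unit length, so the quadratic's a == 1
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-6 else None

def shade(origin, direction, sphere, light_pos):
    """Trace one primary ray; on a hit, cast a shadow ray toward the light
    and return a simple Lambertian brightness (0.0 for a miss)."""
    center, radius = sphere
    t = ray_sphere(origin, direction, center, radius)
    if t is None:
        return 0.0  # ray escapes to the background
    hit = [origin[i] + t * direction[i] for i in range(3)]
    # Shadow ray: from the hit point toward the light source
    # (with a single sphere, nothing can occlude it, so the point is lit)
    to_light = [light_pos[i] - hit[i] for i in range(3)]
    dist = math.sqrt(sum(x * x for x in to_light))
    to_light = [x / dist for x in to_light]
    normal = [(hit[i] - center[i]) / radius for i in range(3)]
    # Lambertian term: cosine between surface normal and light direction
    return max(0.0, sum(normal[i] * to_light[i] for i in range(3)))
```

Production renderers add recursive reflection and refraction rays, area lights, and importance sampling on top of this same skeleton, which is where the heavy computation comes from.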
Lytro VT and ray tracing are complementary, yet contrast in the conceptual direction the rays of light are traced. As shown above, ray tracing renders color pixels in an image by following the paths of light outward from a fixed camera with a view of the scene. Conversely, Lytro VT reconstructs a Light Field volume by tracing paths of light from each rendered pixel inwards towards the viewer from every viewpoint inside a volume. In the Lytro Player, the viewer moves through this dense set of light rays, immersed in the reconstructed CG scene with the highest levels of visual quality, perfect stereo in every direction, full parallax and six degrees of freedom. In this experience, the rays of light are not being rendered in real-time, but retrieved in real-time from a very large set of pre-rendered rays, to compose an image for each eye in every position within the View Volume.
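One piece of that retrieval step can be illustrated simply: given the viewer's eye position inside the View Volume, find the nearest pre-rendered viewpoints whose rays can be blended for that eye. This is a hypothetical sketch of the concept only, not the Lytro Player's actual lookup, which must select and composite individual rays, not whole viewpoints:

```python
import numpy as np

def nearest_viewpoints(eye, camera_positions, k=4):
    """Return the indices of the k pre-rendered camera viewpoints closest
    to the viewer's current eye position; a playback engine would draw on
    rays from these samples rather than render anything new."""
    cams = np.asarray(camera_positions, dtype=float)
    dists = np.linalg.norm(cams - np.asarray(eye, dtype=float), axis=1)
    return np.argsort(dists)[:k]

# Four viewpoints from a rig; the eye sits just off the first one
cams = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (2, 2, 2)]
idx = nearest_viewpoints((0.1, 0.0, 0.0), cams, k=2)
```

Because every ray was traced offline at full quality, this lookup is what lets playback stay at VR frame rates while preserving the offline render's optical effects.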