SIGGRAPH 2017 – Observations from a VFX veteran

SIGGRAPH has always been a conference about ideas, specifically the ideas that emerge from the most advanced thinking about how computers can make images. In previous years many of those ideas dealt with models based on our physical reality, and sometimes they were about visual effects, the flights of fantasy and illusion that put dinosaurs, aliens, and big explosions on the silver screen. More recently they have been about interactivity and bigger and better computer gaming. In the 1990s some of those ideas became the fundamental concepts of Virtual Reality (VR).

Two years ago, we saw a resurgence of interest in VR with the advent of widely available head-mounted displays, but some wondered if the technology was a fad. VR had come and gone at least once before, as had stereoscopic 3D cinema, and even computer graphics (CG) faded briefly after the dismal box office performance of Tron in 1982. But CG and cinematic 3D revived, and VR is now clearly inspiring a new generation to explore and enhance the techniques of this revolutionary medium.

Usually the center of SIGGRAPH is the show floor, but this year it was the VR Village. Lines started forming at 6am to get tickets for the VR Theatre, and by Tuesday a sign was posted at the ticket desk: “All tickets for this year’s VR Theatre are gone. See you next year.” Popular booths included the abstract computer art of VFX Supervisor Kevin Mack; a headset that reads your brain activity and adapts the experience according to what it “knows” you like; a Disney Research project that puts you inside a multi-sensory environment; and Lytro’s own co-production with Within Studios, Hallelujah, a live action 6DoF experience that immerses the viewer in beautiful music and visuals without the constraints of traditional cinematic VR. In fact, there were lines at almost every booth in the VR Village.

Getting on the wait list to see Hallelujah in the VR Village.

On the exhibit floor there was a lot of emphasis on GPU rendering and various forms of motion capture. There were more rendering solutions than we have seen in many years, with Clarisse and Redshift joining Arnold and V-Ray as attractive options, while mainstream renderers such as RenderMan continue to improve. The big game engines, Unity and Unreal, move closer every day to equaling the image quality of non-real-time algorithms, and new volumetric renderers such as AtomView allow large-scale 3D datasets to be viewed in VR. Outside the VR Theatre, in the MeetMike installation, Mike Seymour and Epic allowed local and remote reporters to interview Mike while his actions and speech were captured and then rendered in real time as a photo-real CGI reproduction of his face. In the next space, The Mill’s remarkable VFX car Blackbird was displayed. It can be driven around by stunt drivers in the real world and then accurately replaced by any CGI car you happen to have.

The Mill’s awesome Blackbird platform for automotive CG.

SIGGRAPH technical papers and panels draw together artists, engineers, educators and scientists. This year, journalists presented new ways to use 360 cameras to report stories in 360 video and/or VR. In one panel, Graham Roberts of The New York Times VR Team discussed how his team had to rethink their approach to covering breaking news because of the all-encompassing nature of the format. Elsewhere there were panels on how to better understand the human visual system and how colors and contrast are perceived by the human eye. If parts of the system, such as peripheral vision, see less color and more contrast, then we can render those regions with less information to optimize rendering for VR headsets (assuming good eye tracking). There is also work being done to build headsets that change the display so that your natural focus can operate in VR the way it does in the outside world. For older people, however, current VR headsets work just fine: as we age, our lenses become less flexible and we lose our ability to focus across a wide range anyway!
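The idea behind that eye-tracking optimization, usually called foveated rendering, is easy to sketch: spend full shading effort near the tracked gaze point and progressively less toward the periphery. Here is a minimal illustrative sketch in Python; the falloff shape and all numeric parameters (fovea size, outer eccentricity, minimum rate) are my own assumptions for demonstration, not values from any shipping headset.

```python
def shading_rate(eccentricity_deg,
                 fovea_deg=5.0,      # assumed angular radius of the full-detail zone
                 outer_deg=60.0,     # assumed eccentricity where detail bottoms out
                 max_rate=1.0,       # full shading effort at the gaze point
                 min_rate=0.25):     # fraction of effort spent in the far periphery
    """Relative shading effort (1.0 = full detail) for a pixel at the
    given angular distance, in degrees, from the tracked gaze point.

    Full rate inside the fovea, then a linear falloff out to outer_deg,
    clamped at min_rate beyond that. A real renderer would map this
    value to a variable shading rate or a lower-resolution render layer.
    """
    if eccentricity_deg <= fovea_deg:
        return max_rate
    t = min((eccentricity_deg - fovea_deg) / (outer_deg - fovea_deg), 1.0)
    return max_rate - t * (max_rate - min_rate)
```

For example, `shading_rate(0.0)` returns 1.0 (full detail at the gaze point) while `shading_rate(60.0)` returns 0.25, i.e. a quarter of the shading effort in the far periphery, which is where the rendering savings come from.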

Lytro put on a great panel about the making of Hallelujah showing off the creative and technical acumen of the team behind our Lytro Immerge. Tim, Chrissy, Orin, and Nikhil did us all proud! Throughout the conference there was significant buzz about Lytro as more and more people saw Hallelujah and began to see the possibilities of 6DoF live action VR. It was fun to see people’s eyes get wide when they realized that I work for Lytro and to answer their questions about Lytro and Light Field VR.

Chrissy talking creative at the Breakthrough VR: Hallelujah & Lytro Immerge panel.

No SIGGRAPH experience is complete without the Electronic Theater, and this year’s version showcased a wide range of animations from schools and from professionals. There was a lot of dark humor as well as some spectacular animation and rendering, not just in the algorithms, but in the artistic choices facilitated by those systems. The big awards went to “Song of Toad” and “Garden Party”, but I was also quite fond of “Happy Valentines Day.” There were also reels from this year’s major work in visual effects, including Guardians of the Galaxy: Vol. 2, War for the Planet of the Apes and Rogue One: A Star Wars Story, showing what’s possible at the very tip top of the rocket. Still amazing after all these years.

Last and maybe most important were the hallway discussions and after-hours sessions with friends and colleagues where the real work is done, hashing over the events of the day, from GPUs to giraffes, and sharing the excitement of new discoveries and inventions which beget more discoveries and more excitement. Everyone I met, from the CG pioneers of the twentieth century to high-school hackers, spoke about how a week of SIGGRAPH always inspires new ideas and endeavors, as well as eagerness to get back to the work of creating even better images to show next year.
