Creating The Lion King with a groundbreaking approach…

The Lion King Virtual Production System is a multiplayer real-time collaborative platform that uses VR to put filmmakers inside their computer-generated films with live-action capabilities.

Why, what, and how we took the filmmakers to Africa virtually.

As visual effects and computer-generated imagery grow to represent more and more of the final picture, traditional filmmakers find themselves losing touch with the creation of their films. Jon Favreau gave us the mandate to do anything necessary to make The Lion King feel like a live-action film. Though the imagery itself could look photo-real through modern visual effects, the creative process and the “presence of the film crew” needed to be evident. That was the challenge we had to solve.


Traditionally in visual effects work, the Director, Cinematographer, and other department heads find themselves giving notes while looking “over the shoulder” of a technical operator. The direction they give can take days, weeks, or even months to be addressed, and the accuracy and quality of those revisions are subject to the creative interpretation of the artists and technicians at the desk, who might not have received the direction first-hand. The filmmakers lose the ability to be hands-on during the creative process, to react instinctively to the environment, and to respond to the performances of other crew members, as they do on a live-action set.

To solve this, we knew we needed to put the filmmakers back inside their movies, and put the traditional tools of cinema back in their hands. By building the world of the film in a real-time game engine, we could put the filmmakers inside that world through virtual reality, and by connecting interfaces to their traditional live-action cinematography equipment, we could give them and the crew an experience similar to live-action filmmaking. Instead of standing over someone’s shoulder looking at a monitor, we could put them “inside the monitor.” This approach could be used by the Director, Assistant Director, and all members of the camera department, art department, animation, and visual effects, and would even allow traditional live-action crew, like grips and gaffers, to contribute their expertise.

The filmmakers would start their day “scouting the set,” which had been prepared by the Virtual Art Department (VAD), in multiplayer VR using the virtual production tools. From this, shot lists, lighting notes, animation feedback, and design changes could be made live or, in some cases, requested as revisions from the VAD or Animation teams. When a scene was rehearsed or blocked and ready to shoot, the departments would reassemble in the scene and shoot it using a traditional live-action workflow. As each shot was captured, every detail and change was precisely recorded and sent to the appropriate departments. Video streams were sent to editorial, where they could be cut into the edit within minutes of being shot. Camera animations (as well as the movements of every piece of film equipment, like dollies, gear heads, fluid heads, and cranes) were saved as keyframes with each take and sent to visual effects, along with any changes made to lighting by the cinematographer. Changes to the assets in the scene made by the Art Department could happen on a per-shot basis, with full continuity tracking to post-production, so that no detail was lost. In addition to providing clear creative direction to post-production, the keyframes of the live-action camera operation were used as the final rendered keyframes in visual effects, providing the human touch that the filmmakers required.
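To make that handoff concrete, here is a minimal sketch of what a per-take record might bundle together: the operator’s camera keyframes, per-device rig motion, the cinematographer’s lighting changes, and the Art Department’s set-dressing changes, serialized for the downstream departments. The field names, structure, and JSON serialization are illustrative assumptions, not the production schema.

```python
from dataclasses import dataclass, field, asdict
import json

@dataclass
class Keyframe:
    frame: int        # frame number within the take
    position: tuple   # (x, y, z) in scene units
    rotation: tuple   # (pitch, yaw, roll) in degrees

@dataclass
class TakeRecord:
    scene: str
    shot: str
    take: int
    camera_keyframes: list = field(default_factory=list)    # operator's camera motion
    rig_keyframes: dict = field(default_factory=dict)        # per-device motion (dolly, crane, head...)
    lighting_overrides: dict = field(default_factory=dict)   # cinematographer's lighting changes on this take
    set_dressing_deltas: dict = field(default_factory=dict)  # Art Department changes, tracked per shot

    def package_for_vfx(self) -> str:
        """Serialize everything captured on this take for the downstream departments."""
        return json.dumps(asdict(self), indent=2)

# Example: one take with a short camera move, a dolly track, and a lighting tweak
take = TakeRecord(scene="Pride Rock", shot="012", take=3)
take.camera_keyframes.append(Keyframe(frame=1001, position=(0.0, 1.6, 5.0), rotation=(0.0, 180.0, 0.0)))
take.rig_keyframes["dolly"] = [Keyframe(frame=1001, position=(0.0, 0.0, 5.0), rotation=(0.0, 0.0, 0.0))]
take.lighting_overrides["key_light_intensity"] = 1.2
print(take.package_for_vfx())
```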

During about a year of development, more than a dozen engineers and artists built new user interfaces, back-end networking protocols, and cutting-edge capture systems into a heavily modified version of the Unity game engine. Seventeen virtual production stage computers were simultaneously connected to the stage system, each one responsible for VR access to the digital world, hardware data input from equipment on stage, or video output to monitors and recorders. Data input from hardware on stage (dollies, a crane, a Steadicam, camera operating wheels, FIZ units, drones, joysticks, etc.) would move the virtual camera inside the virtual environment in the same relation as its physical counterpart. The film crew, or even visitors to the stage, could play their roles and use their equipment in VR, or on any external monitor or touchscreen. Several next-generation active motion capture systems were used to get high-precision data from a larger physical capture volume. The workflow was designed with traditional departments in mind: storyboard artists, the art department, production, editorial, grips, ADs, sound, and VFX were all utilized as they are on live-action productions.
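As a rough illustration of that hardware mapping, the sketch below applies encoder input from a set of camera wheels, a dolly, and a FIZ unit to a virtual camera each frame, so that the virtual camera moves in the same relation as its physical counterpart. The device names, scale factors, and standalone structure are assumptions for illustration; the production system lived inside the modified Unity engine rather than code like this.

```python
import math

class VirtualCamera:
    """Minimal virtual camera: position in scene units, orientation in degrees, lens in mm."""
    def __init__(self):
        self.position = [0.0, 1.6, 0.0]   # x, y, z
        self.pan = 0.0                     # degrees
        self.tilt = 0.0                    # degrees
        self.focal_length = 35.0           # mm, driven by the FIZ unit

# Assumed per-device scale factors: encoder ticks -> physical units
WHEEL_DEGREES_PER_TICK = 0.01     # gear-head wheel resolution
DOLLY_METERS_PER_TICK = 0.001     # dolly track encoder resolution

def apply_hardware_frame(camera, wheel_pan_ticks, wheel_tilt_ticks, dolly_ticks, fiz_focal_mm):
    """Move the virtual camera for one frame using the latest readings from the stage hardware."""
    camera.pan += wheel_pan_ticks * WHEEL_DEGREES_PER_TICK
    camera.tilt += wheel_tilt_ticks * WHEEL_DEGREES_PER_TICK
    # Push the camera along its current pan heading, the way a dolly moves the physical head
    heading = math.radians(camera.pan)
    camera.position[0] += math.sin(heading) * dolly_ticks * DOLLY_METERS_PER_TICK
    camera.position[2] += math.cos(heading) * dolly_ticks * DOLLY_METERS_PER_TICK
    camera.focal_length = fiz_focal_mm     # the FIZ unit reports an absolute lens value

# One simulated frame of input from the stage hardware
cam = VirtualCamera()
apply_hardware_frame(cam, wheel_pan_ticks=250, wheel_tilt_ticks=-80, dolly_ticks=400, fiz_focal_mm=50.0)
print(cam.position, cam.pan, cam.tilt, cam.focal_length)
```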


The characters within these virtual scenes could even be puppeteered when necessary, using a variety of controllers. Head and mouth movements could be mapped to the VR user’s own head and mouth movements, enabling a performer to “become” an actor in the scene. When necessary, individual pieces of equipment or performances could be isolated and re-recorded in sequence, without requiring the crew to nail a shot in concert. An Avid editorial station on stage captured material live into the sequence, enabling editorial to suggest new coverage on the fly to flesh out important storytelling beats. Circled takes from the day were captured, rendered, edited, and turned over to VFX as a package including reference videos, camera data, lighting setups, and set-dressing/layout, all of which were created or guided by the filmmakers’ own hands. The system was designed to carry all the on-set creative decisions through the entirety of the VFX pipeline.
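A simple way to picture that layered re-recording is a take made of named tracks, where one track can be re-performed and swapped in while every other recorded element is preserved. The track names and data layout below are hypothetical, shown only to illustrate the idea.

```python
# A hedged sketch of layered re-recording: one element (here, the dolly) is
# re-captured over an existing take while every other recorded track is kept.

def rerecord_layer(existing_take: dict, track_name: str, new_keyframes: list) -> dict:
    """Return a new take that replaces a single track, leaving all other tracks untouched."""
    updated = dict(existing_take)                 # shallow copy: other tracks are reused as-is
    updated[track_name] = list(new_keyframes)     # only the isolated layer is replaced
    return updated

take_7 = {
    "camera_wheels":  [{"frame": 1001, "pan": 0.0, "tilt": 0.0}],
    "dolly":          [{"frame": 1001, "z": 5.0}],
    "puppeteer_head": [{"frame": 1001, "yaw": 2.0, "pitch": -1.0}],
}

# The dolly move is re-performed in isolation; camera operation and puppeteering are preserved
take_7a = rerecord_layer(take_7, "dolly", [{"frame": 1001, "z": 5.2}, {"frame": 1013, "z": 4.1}])
print(sorted(take_7a.keys()))
```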


Real-time engines allowed us to bridge advanced technologies with traditional live-action film equipment, and when combined with immersive technologies like VR, we succeeded in putting the filmmakers back inside their films. We gave them the creative freedom to explore, while ensuring that their intentions and actions flowed through to post-production, allowing those artists and technicians to focus on making that vision exceptional rather than trying to figure out what it should be. As this approach and the technology continue to advance, we’ll move towards more realistic rendering in real time, which can power video screens that turn a movie set into the Holodeck. Having put a film crew inside the world of their story, we can soon put audiences inside the films as well.

For more information:



Public Relations: pr@magnopus.com
