
Virtual Production with Industrial Light & Magic’s StageCraft

Among Friday’s events at the gaming fest, this Virtual Production panel stole the show. Panelists Dan Lobi (Supervisor for Global Content Strategy, StageCraft Virtual Production at Industrial Light & Magic) and Robb Gardner (Virtual Cinematography Lead at NetEase Games) went over the magic of Virtual Production in film and television. 

Starting with ILM’s demo reel, the panelists introduced how they began the creative journeys that led to their ILM experience. Robb shared that seeing the first Star Wars inspired him to work at this big company, a goal that took him three decades to reach. He started by making his own films with his mom’s camera in high school and kept at it all the way through college. Robb confessed that he was insistent and sent ILM several letters expressing his interest in working there, but they turned him down each time. His first job on a feature film was Space Jam.

Dan Lobi, on the other hand, shared that his journey started while studying computer science in college, which led to his interest in computer graphics. He began his career as a computer engineering assistant at Silicon Graphics. It wasn’t until Jurassic Park revolutionized the industry that Dan realized he wanted to be part of that team. He knew he had to work there, but unfortunately, nobody was teaching those skills yet; to succeed at ILM, you have to have a curious brain. Everything before Jurassic Park was optical and camera-based, and the industry was going through a big photochemical-to-digital shift. 

Today, virtual production is revolutionizing the industry the same way digital media did back in the 90s. It is another moment of significant change. The elements of Virtual Production are the following (a rough sketch of how they fit together appears after the list): 

  1. Real-Time Visualization
  2. Virtual Cameras
  3. Performance Capture
  4. Live-Action and Virtual Integration (no more waiting for post-production) 
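To make these elements concrete, here is a minimal, hypothetical sketch of the real-time loop that ties them together. The function names are illustrative stubs, not ILM’s actual software; real stages run this kind of loop inside a game engine. The tracked physical camera drives a virtual camera, the engine renders the background live, and the result goes straight to the LED wall in-camera.

```python
import time

def read_camera_tracker():
    # Stub: a real stage streams position/rotation from an optical tracking system.
    return {"position": (0.0, 1.8, -4.0), "rotation": (0.0, 0.0, 0.0)}

def render_virtual_background(pose):
    # Stub: a real-time engine renders the 3D set from this exact pose, so the
    # parallax on the LED wall matches the physical camera move.
    return f"background rendered from {pose['position']}"

def display_on_led_wall(frame):
    # Stub: the frame appears behind the actors, so live action and virtual set
    # are captured together in-camera; no waiting for post-production.
    print(frame)

for _ in range(3):  # a real system runs continuously at the production frame rate
    pose = read_camera_tracker()
    frame = render_virtual_background(pose)
    display_on_led_wall(frame)
    time.sleep(1 / 24)
```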

The panelists shared that the industry has been building toward this virtual production shift for decades. For example, Rango was one of the first movies to use real-time camera movement instead of fully virtual, animated camera moves; photogrammetry was vital in creating the movie’s environments. Another approach that led up to virtual production was seen in Solo, where rear projection (projecting images from behind the screen) made it possible to get interactive lighting on the actors. Jon Favreau iterated on that approach by bringing the Virtual Production process to The Mandalorian, the famous series where audiences saw Virtual Production used at this scale for the first time. The LED screen is 75′ across, and the stage was the primary light source of the set. 

The panelists mentioned that this screen is psychologically interesting. “It plays tricks on your brain. You believe that you are outside when you are actually indoors, surrounded by LED walls.” Dan shared a funny story about a director giving directions to a puppet displayed in the wall, not realizing it was not an actual person on set. 

The pipeline of this new technology differs slightly from the traditional approach. It starts with pre-production, where references and concepts are created. In this stage of the pipeline, however, the team known as the VR scouts has to develop 3D environments and decide what will be CG (projected on the wall) and what will be practical. They use Stage Plans, virtual layouts of the stage for different scenes: in these plans, white areas will be virtual, and yellow areas will be the practical assets built on the stage. 
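As a toy illustration of that white-versus-yellow convention (this is not ILM’s actual format), a Stage Plan entry could be modeled as an asset tagged virtual or practical, where the tag determines the color it is drawn in:

```python
from dataclasses import dataclass

@dataclass
class StagePlanAsset:
    name: str
    kind: str  # "virtual" (rendered on the LED wall) or "practical" (built on stage)

    @property
    def plan_color(self) -> str:
        # White areas of the plan are CG on the wall; yellow areas are physical builds.
        return "white" if self.kind == "virtual" else "yellow"

plan = [
    StagePlanAsset("desert horizon", "virtual"),
    StagePlanAsset("crashed speeder", "practical"),
    StagePlanAsset("distant mountains", "virtual"),
]

for asset in plan:
    print(f"{asset.name}: {asset.kind} -> drawn in {asset.plan_color}")
```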

Much of the work in the LED volume is lighting work. The panelists described it as an extension of the DP (Director of Photography). They developed their own iPad app that connects to the wall, where you can control multiple aspects in real time without having to wait until post to fix them. For example, they can move white cards, flags, and bounce cards, and change a light’s size, intensity, and softness in real time. Color correction, garbage mattes, and color and exposure blending can also be done in real time with this approach. 
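Here is a hypothetical sketch of the kind of message such an app might send to the wall’s render engine; the function and parameter names are illustrative assumptions, not ILM’s real API:

```python
import json

def set_light_card(card_id: str, size: float, intensity: float, softness: float) -> str:
    """Build a real-time update for one virtual light card on the LED wall."""
    message = {
        "type": "light_card_update",
        "card_id": card_id,
        "size_m": size,          # size of the card on the wall, in meters
        "intensity": intensity,  # relative brightness, 0.0 to 1.0
        "softness": softness,    # edge falloff, 0.0 (hard) to 1.0 (soft)
    }
    return json.dumps(message)  # a real app would send this over the network

# Brighten and soften a bounce card between takes, with no post work needed.
print(set_light_card("bounce_card_03", size=2.0, intensity=0.8, softness=0.6))
```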

One of the most important things the panelists mentioned about Virtual Production is the harmony in integrating virtual and physical objects: when it is done well, the blend is outstanding and invisible. On The Mandalorian, they shared, there was a constant back and forth between practical and digital assets. They would create models digitally and print them at a smaller scale; artists would then paint them as needed to match the digital environment. This approach was easier than modeling, lighting, and texturing everything in CG to make it look natural. There is nothing more real than reality itself. Another great integration of digital and physical assets was the background reflection on the water, which is almost impossible to achieve with a green screen. When the crew flooded the stage with water, it provided the most accurate interactive light within the scene. 

To explain how VSFX fits into the Virtual Production pipeline, the panelists shared the following outline:

  1. Pre-Production
  • Virtual Art Department (VAD)
  • VR Scouts
  • Virtual Set Construction
  • Light Baking vs. Live Lighting
  • Real-Time Virtual Sets
  • Animation and Optimization

  2. Production
  • Virtual Set Operations (virtual environments are ready for shooting)
  • The “Brain Bar” controls the 3D scene, mocap, color, and lighting

  3. Post-Production
  • Final VFX Shot Work (final composite, creatures, final effects, cleanup)
  • The wall can also be used as a green screen to key in post-production; it works as a backup (a toy example follows this outline).
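As a toy example of that backup workflow, here is a minimal NumPy green-screen key; real pipelines use far more sophisticated keyers, and the threshold here is an illustrative assumption:

```python
import numpy as np

def green_screen_key(frame: np.ndarray, threshold: float = 1.3) -> np.ndarray:
    """Return an alpha matte: 0 where the pixel reads as green screen, 1 elsewhere."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    # A pixel is keyed out when green clearly dominates both red and blue.
    is_green = (g > threshold * r) & (g > threshold * b)
    return np.where(is_green, 0.0, 1.0)

# Composite the keyed foreground over a new background.
fg = np.random.rand(4, 4, 3)              # stand-in for a filmed frame
bg = np.zeros_like(fg)                    # stand-in for a CG background
alpha = green_screen_key(fg)[..., None]   # broadcast the matte over RGB channels
composite = alpha * fg + (1 - alpha) * bg
print(composite.shape)
```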

The panelists wrapped up the presentation by discussing expectations for the future of Virtual Production, followed by a Q&A session with the audience. They hope to have environment captures (scanned environments) for reshoots or sequels; it is nice to have a library of settings to reuse. They are also looking forward to performance capture driving digital actors in real time, rather than splitting body performance and facial capture. They would also like to build large digital backlots of environments and incorporate them into metaverses.

During the Q&A session, the panelists answered audience questions and shared some advice about Virtual Production. They mentioned that there is no real-time creature-and-actor interaction yet; it is too hard because of the camera’s perspective and composition. They also recommended staying away from the wall when using an XR stage, keeping in mind that it is an extension of the set. Light cross-contamination between the set and the wall is challenging, so they advised designing sets to avoid it. Careful set design also helps arrange the perspective to “hide the wall” and avoid unwanted reflected light. 

In the end, Robb and Dan mentioned that there are plenty of jobs out there. They recommended that we students take advantage of SCAD’s opportunities to learn, grow, and understand this process. Virtual production is the future of film… and the future is starting to unfold now.