The Mandalorian: The Future of Filmmaking

Visual effects are an evolving art that only gets better as technology improves. Practical effects and in-camera tricks can still be seen in major productions today, but most movies and TV shows rely on digital effects created in post-production. On almost any production that uses compositing you will find a green screen of some size, sometimes along with props painted the same green, depending on the scene. Afterward, the visual effects artists create whatever effect that particular shot needs and hand it off to a compositor, who combines the live-action footage and the post-production effects using software like Nuke.
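To make that final compositing step concrete, here is a minimal sketch of the standard "over" operation that places a keyed foreground on top of a rendered background. This is plain NumPy, not Nuke, and the array names and plate sizes are assumptions chosen just for illustration.

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Standard 'over' composite: keyed foreground placed over a background.

    fg_rgb, bg_rgb: float arrays of shape (H, W, 3), values in [0, 1]
    fg_alpha:       float array of shape (H, W, 1); 1 = foreground, 0 = background
    """
    # Wherever the matte is opaque we keep the actor; elsewhere the new background shows through.
    return fg_rgb * fg_alpha + bg_rgb * (1.0 - fg_alpha)

# Hypothetical 1080p plates standing in for a real keyed shot and a rendered environment.
h, w = 1080, 1920
keyed_plate   = np.zeros((h, w, 3))        # live-action foreground, already keyed
matte         = np.zeros((h, w, 1))        # alpha matte produced by the keyer
cg_background = np.full((h, w, 3), 0.5)    # background handed over by the VFX artists
final_frame = over(keyed_plate, matte, cg_background)
```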

This is how studios have been making movies since the birth of digital effects. Even though it has been the industry standard for decades, creating effects this way has some obvious flaws. The first is that the actors cannot see what they are supposed to interact with and have to imagine it. The second is green screen spill: when you key out the background, you may still see a little green on the subject's edges. Another issue is that the filmmakers won't have a good idea of what the final product will look like until it is nearly done.
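For readers unfamiliar with "keying out the background," here is a rough chroma-key sketch using OpenCV. The HSV thresholds are assumptions that would be tuned per shot, and the last step is a crude stand-in for the spill suppression a real keyer performs on those green edges.

```python
import cv2
import numpy as np

def green_screen_key(frame_bgr):
    """Very rough chroma key: returns an alpha matte and a despilled foreground."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)

    # Assumed green range (OpenCV hue runs 0-179); real shots need per-scene tuning.
    lower_green = np.array([35, 80, 80], dtype=np.uint8)
    upper_green = np.array([85, 255, 255], dtype=np.uint8)
    green_mask = cv2.inRange(hsv, lower_green, upper_green)

    # Alpha matte: 255 where the subject is, 0 where the screen was.
    alpha = cv2.bitwise_not(green_mask)

    # Crude spill suppression: clamp the green channel where it dominates,
    # which roughly addresses the green fringe on the subject's edges.
    b, g, r = cv2.split(frame_bgr)
    g = np.minimum(g, cv2.max(b, r))
    despilled = cv2.merge([b, g, r])

    return alpha, despilled
```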

Just as 1964's Mary Poppins revolutionized compositing with the sodium vapor process, we had all been waiting for the next project to change compositing effects forever. The Mandalorian premiered on Disney's streaming service Disney+ on November 12, 2019, and the way the show composites its backgrounds will forever change how it is done.

Via TechCrunch

Rather than filming the Mandalorian against a green screen and creating the background environments in post-production, the show uses what it calls "the Volume," a massive LED screen that is 20 ft tall, 75 ft across, and wraps 270 degrees around the set. The background is displayed directly on the screen. Rear-screen projection is not new to cinema; it has been used since the golden age of Hollywood. What makes the Volume unique is that the background is rendered in real time and synced to the camera: in plain terms, when the camera moves, the background shifts perspective to match.
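As a hedged sketch of what "synced to the camera" means: if the tracking system reports the physical camera's position each frame, the renderer rebuilds its view matrix from that pose so the wall shows the correct parallax. The look-at math below is standard; the tracking input and the specific values are assumptions, since the show's actual pipeline runs inside Unreal Engine.

```python
import numpy as np

def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    """Build a right-handed view matrix from a tracked camera position.

    eye:    3-vector, physical camera position reported by the tracking system
    target: 3-vector, point in the virtual set the camera is aimed at
    """
    forward = target - eye
    forward = forward / np.linalg.norm(forward)
    right = np.cross(forward, up)
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, forward)

    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = right, true_up, -forward
    view[:3, 3] = -view[:3, :3] @ eye
    return view

# Each frame: read the tracked pose, rebuild the view, re-render the wall.
# These pose values are placeholders standing in for real tracking data.
tracked_position = np.array([0.5, 1.8, 4.0])   # hypothetical camera position (metres)
aim_point = np.array([0.0, 1.5, 0.0])          # hypothetical point of interest on set
view_matrix = look_at(tracked_position, aim_point)
```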

The Volume runs on Unreal Engine, making The Mandalorian one of the first major productions to use virtual production to this extent. The advantage of filming this way is that it is much easier to make it appear as if you are shooting on location. In addition, green spill is no longer an issue with an LED wall, and the wall itself lights the scene naturally. The show was able to take full advantage of this because the main character wears highly reflective armor, and the real reflections cast by the Volume make the image even more convincing for the audience.

Via TechCrunch

Now that the world has seen what is possible with virtual production on a live-action shoot, the question is: where do we go from here? Even though Mary Poppins revolutionized compositing with the sodium vapor process, the technique couldn't be replicated easily because only one camera was ever built that could do it. The difference with virtual production is that Unreal Engine has been around for decades and is highly accessible to the general public. Even low-budget filmmakers can create the same effects used in The Mandalorian if they have the proper equipment and access to an XR stage. Every generation sees one film that changes visual effects forever, with titles such as Mary Poppins, Star Wars, and Jurassic Park. Arguably, it's too soon to say that The Mandalorian belongs in that same category, but given how revolutionary the show's effects were, I think it's safe to say this is going to be the way it's done for many years to come.