How virtual production stages are revolutionizing filmmaking

Visual effects studio Industrial Light & Magic generated some buzz earlier this year with a video showcasing the innovative “virtual production” technology used in The Mandalorian, Disney’s wildly popular Star Wars spinoff series. The behind-the-scenes video revealed how the series used a stage surrounded by massive LED screens to blend practical elements with digital video environments that could be controlled and manipulated in real time by the show’s filmmakers and visual effects team.

In recent years, movie and television studios have increasingly embraced virtual production, due in no small part to the success of projects like The Mandalorian and 2018’s First Man, which made extensive use of virtual production in re-creating the Apollo 11 mission and won an Academy Award for its visual effects. As it’s grown more popular, the technique has also become more accessible, with production studios of all sizes investing in immersive environments that make virtual production possible.

One such studio is the Montreal-based MELS, which recently announced it had completed work on a new virtual production stage that — like the stage used in The Mandalorian — will be surrounded by LED screens and powered by Unreal Engine, the groundbreaking 3D creation software developed by Epic Games (the creator of Fortnite).

Digital Trends spoke to MELS Studio President Martin Carrier and Head of VFX Christina Wise about the studio’s decision to embrace virtual production, and how this relatively new production technique is not only changing how movies are made, but also offering a safer, more efficient filmmaking environment during the coronavirus pandemic.

Digital Trends: What is the virtual production stage like at MELS?

Martin Carrier: The space we built at MELS is kind of like a starter studio. We’ve all seen how The Mandalorian uses virtual production, and that’s sort of the benchmark with all of this. We didn’t have as much space [in MELS] as they did for The Mandalorian, but our stage is about 30 feet wide by 15 deep with a curved wall, and then there’s a fully tiled ceiling on top of that.

Many people associate the idea of filming actors in front of a large screen with more traditional green-screen techniques. What sort of advantages does virtual production have over more traditional filmmaking techniques?

Christina Wise: One of the big advantages is that you have much more control of your time frame. It can be magic hour all day. You can have 12 straight hours of the perfect conditions. You don’t get that when using a green screen or a blue screen. You’re also working in a controlled environment.

Another advantage is that you don’t have to deal with issues like green-screen spill, or keep track of reflections in and out of glasses, or things like that. And for the actors and directors, you’re actually in the space — which is important. Actors might not like looking at a tennis ball or green screen that represents some character or element while they’re performing, and their eyelines can be off when they can’t see what it is they’re reacting to or interacting with. So it really helps us create the whole set in front of them. It’s more complete that way.

What prompted you to invest in a virtual production stage for MELS?

Carrier: Well, we wanted to make sure we would actually be able to back up our claims as far as, “We can do that.” My background is in video games and Christina comes from visual effects, so we know about some of this stuff, but once you actually do something like this, you find out there’s so much more to learn.

Wise: We also had all the departments together here [at MELS] already. We have the camera, we have the stages, we have the VFX, and we have the DI [aka the Digital Intermediate, which is responsible for digitizing a film and adjusting the color]. So we thought, “Let’s put it all in the petri dish and go for it.”

Virtual production has created a partnership between gaming and filmmaking in some ways, particularly when it comes to the Unreal Engine. What is that partnership like on your side of things?

Carrier: When you look at Montreal, where MELS is located, virtual production is a manifestation of three or four industries that have come together here. Montreal is one of the world’s hubs for video games. Over 10,000 people work in that space here. Thousands of people also work in visual effects here, as well as in the movie and television industries, and advertising. There’s also a fourth pillar: companies like our partner, Solotech, which manages stage shows and builds the infrastructure for shows, for artists like Taylor Swift or Lady Gaga. There is a lot going on in Montreal.

So, all of these people come together in this idea of virtual production. You get to put the best of all these worlds together and have people collaborate in a new way. We saw that collaboration manifest in the way that the vocabulary evolved on sets. Suddenly, we had game programmers and filmmakers on set at the same time, and it took a while for everyone to be using the same terms for what they wanted to do, but we got there. And now the whole is definitely greater than the sum of its parts — and I think that, in a sense, is the promise of virtual production.

What about visual effects? How has virtual production changed VFX for you, Christina?

Wise: A lot of VFX artists love to go on set, and they don’t often get the opportunity to do so. With the Unreal Engine making it possible to create visual assets live and do modifications in real time, now they get the chance to do that. So it’s a great advantage for them and something they might not necessarily have experienced in the traditional pipeline for visual effects. It’s also a bridge for our artists to actually go to the set, interact with the director, and have a direct relationship — which is great, because often the director doesn’t get to meet the person who created that beautiful visual asset for them.

In terms of visual effects in general, I just think this is the next stage of evolution. It’s just such a creative and powerful tool for our clients and our producers and directors. Someone can say, “I need the show to be in Mumbai tomorrow,” and we can tell them, “OK, we’ll see what we can do for you.” That’s something we couldn’t have done three or four years ago without the Unreal Engine and the real-time rendering we have now. If I were making a soap opera, I’d absolutely have one of these stages, so I could send my characters anywhere in the world from one episode to the next.

The coronavirus pandemic has had such a huge impact on filmmaking, but virtual production seems to offer some unique ways to make movies a bit safer. What are some of the advantages there?

Carrier: Using the virtual stage means a lot of the prep work has to be done ahead of time. So, once you get to the set, you’re pretty much ready to go. You can keep numbers down that way.

[Using virtual production] also means that you have to go to a particular location fewer times. You can shoot the environment once, then bring it back into the studio and continue shooting there. That’s a controlled environment, so that’s another factor that should help in this COVID-affected world.

So what has the response to the new studio been like so far? What are you hearing from filmmakers?

Wise: We get lots of phone calls asking for the specs of the stage and the availability, and they also want to know who’s going to run it. They want to know who the team is at MELS. Like I mentioned, we had multiple departments here working on the making-of studio reel we created, from the stage and camera, to effects and the DI. We not only had an on-set digital imaging technician, but we had a colorist in a suite here adjusting the image live. That’s the MELS method of using virtual production techniques.

They also want to know if other teams can come in, too. For example, if you’re working on a large, Game of Thrones-type show where there’s more than one VFX vendor, all of the vendors need access to the stage. That’s something we’re open to and invite. We want other collaborators to come in and say, “I’d like to use the stage for a client, and we can all work together.” We’re basically Switzerland. We’re neutral in how we use our pipeline, the stage, and manage it, and try to work with everyone.

You can find more information about MELS and its virtual production stage on the studio’s website.

Rick Marshall
A veteran journalist with more than two decades of experience covering local and national news, arts and entertainment, and…