Already one of the biggest movies of all time, Marvel Studios’ massive team-up feature Avengers: Infinity War surprised audiences by turning Thanos, the film’s purple-skinned, digital villain, into one of its breakout stars.
Portrayed by Academy Award nominee Josh Brolin through a mix of performance-capture technology and digital animation, Thanos managed to be more than just a mad titan on a mission to eradicate half the population of the universe. The teams at visual effects studios Digital Domain and Peter Jackson's Weta shared duties in bringing Thanos to life on screen, and the end result was an impressively detailed, complex character who held his own — both physically and theatrically — against the combined might of Marvel's (mostly) live-action heroes.
Digital Trends spoke to two members of the Digital Domain team — visual effects supervisor Kelly Port, and the studio’s head of digital humans, Darren Hendler — about the process of bringing Thanos to the screen, as well as their role in another character’s surprising appearance in the film. (Consider this a spoiler warning for anyone who hasn’t yet seen Infinity War.)
Digital Trends: When you were conceptualizing the look and feel of Thanos, what were the directives you were given by Marvel? What guidance did you get from the studio and the rest of the Infinity War team?
Kelly Port: In terms of the performance aspect, it was to make absolutely sure that, as much as was technically and aesthetically feasible, whatever we did on the technological side let Josh's performance come through in the Thanos character. That was our goal from the very beginning, so for the initial tests, we set up the technology as it would be used in the actual shoot and presented the results around the start of production. Josh saw that, and what was really nice was that he understood he could play the character in a much more subtle way, as he would have preferred.
So in some ways, the technology helped to shape his performance?
Well, it wasn’t us doing the acting, it was Josh Brolin, but I think he really liked the idea that he didn’t necessarily have to do anything above and beyond because it was a computer-generated character on the screen, or play the character bigger in an effort to make his performance come through a CG filtering process. Having seen the results of the test and that all of his subtle facial performance came through with high fidelity, that gave him the confidence moving forward that he could do it in a much more subtle, understated way — and that’s what he did, and what the directors were really happy with.
“It wasn’t us doing the acting, it was Josh Brolin.”
We’ve mentioned the role the technology played in the performance, so what can you tell us about that technology, Darren?
Darren Hendler: Digital Domain created a new, two-step system to handle the facial work: Masquerade and Direct Drive. These two processes work in unison to create the highest-quality creature facial animation from an actor's live on-set performance.
Masquerade takes frames from a helmet-mounted camera system and creates a high-resolution scan of the actor's face. It then uses machine learning, trained on previously collected high-resolution tracking data, to turn the roughly 150 facial data points captured in a motion capture session into roughly 40,000 points of high-resolution, 3D face-motion data.
How does it extrapolate from those 150 facial data points to the larger catalog of high-res facial data?
Training data collected from high-resolution scans shows us what the actor's face is capable of doing through a regimen of facial movements. This lets the computer learn many details, including how the actor's face moves from expression to expression, the limits of the actor's facial range, and how the actor's skin wrinkles, for example.

We then do a motion capture session with the actor during their performance. They wear a motion capture suit with a helmet-mounted camera system and perform live on set with the film's cast, which lets us capture the actor's body and face at the same time. Prior to Masquerade, getting that level of high-resolution data for the actor's face from their live performance alone was impossible.
“We made some modifications due to how well Brolin’s performance came through.”
So how do you get all of that data you collected on the actor to show through on the computer-generated character?
The second process in the system, Direct Drive, takes that data from Masquerade and transfers it to the creature — in the case of Avengers: Infinity War, from Josh Brolin to Thanos — by creating a mapping algorithm between the actor and the creature.
The mapping includes defining the correspondence between the actor and the character, including how different elements of each individual’s unique anatomy align. Direct Drive then figures out the best way to transfer Brolin’s unique facial performance to Thanos’ unique face. During the Direct Drive stage, we transfer a range of performances and facial exercises from the actor’s face to the creature’s face and have the opportunity to modify how it transfers. This is a huge part of the process because it allows us to add a level of additional control to ensure that the performance is portrayed as accurately as possible on the character.
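As a rough illustration of what a correspondence-based transfer like the one described above involves, the toy sketch below moves each creature vertex by the displacement of a corresponding actor vertex, scaled by a per-vertex gain that stands in for the artist controls Hendler mentions. The random correspondence and uniform gain are illustrative assumptions; the real mapping is anatomy-aware and far more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(1)
N_ACTOR, N_CREATURE = 500, 800  # toy vertex counts

# Neutral (rest) meshes for actor and creature, plus one captured expression
actor_neutral = rng.normal(size=(N_ACTOR, 3))
actor_expr = actor_neutral + 0.05 * rng.normal(size=(N_ACTOR, 3))
creature_neutral = rng.normal(size=(N_CREATURE, 3))

# Correspondence map: each creature vertex follows one actor vertex
# (randomly assigned here; a real system defines this anatomically)
corr = rng.integers(0, N_ACTOR, size=N_CREATURE)

# Per-vertex gain lets artists amplify or damp the transfer region by region
gain = np.ones((N_CREATURE, 1))

# Transfer: add the actor's motion-from-neutral onto the creature's neutral
delta = actor_expr - actor_neutral
creature_expr = creature_neutral + gain * delta[corr]
print(creature_expr.shape)  # (800, 3)
```

The per-vertex gain is where the "additional control" described above would live: raising or lowering it in a region changes how strongly the actor's motion drives that part of the creature's face.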
Computer-generated characters like Thanos tend to evolve quite a bit from the earliest conceptual stages to the final product we see on the screen. What were some of the ways the character changed over the time you were working on him?
Kelly Port: We started with the character's previous visual history in the comics and films, which established the initial design. For Infinity War, Marvel provided a slightly updated maquette and digital sculpture of Thanos to both us and Weta.
I think we made some modifications due to how well Brolin's performance came through. We did some tweaks, for example, to get Thanos' eyes proportionally closer to Brolin's. It was always about striking a balance between how much of Brolin is in Thanos and how much of the established design of Thanos was there. We didn't want to go too far with replicating Josh Brolin, of course, but it was nice to see a little of him and his physical characteristics in the design for Thanos. That helps tie the performance to the character a little more tightly.
Digital Domain worked on the scene in which the Red Skull makes a return appearance, which was a big surprise for Marvel movie fans. What were the conversations like when you were given that particular scene, and how did you go about bringing that character back to the screen authentically?
We had fun with that scene, because the last time we’d seen Red Skull was after he grabbed The Tesseract [in Captain America: The First Avenger], and the idea was that he was very much affected by it. We had to reference how the character looked in previous films and artwork, but at the same time, we played around with different ideas.
“The effect, which we called his ‘Quantum Cloak,’ ended up looking like a cool smoke effect.”
We considered that [as a result of his interaction with The Tesseract] he would be caught between dimensions, and we played with this “phasing” idea — that he would be constantly phasing between dimensions.
Ultimately, the effect, which we called his “Quantum Cloak,” ended up looking like a cool smoke effect. Pieces of his cloak came off of him in a nice, wraith-like effect. It was really fun to work on that reveal, and it was really rewarding to be in the theater when he comes out of the darkness and reveals that he’s Red Skull. To hear the audience’s reaction and experience that moment with them was fun.
Is there a particular scene that really encapsulates your experience working on Infinity War and what you’re most proud of in this particular film?
I think what we're most proud of are the really important scenes with Gamora — not only with the adult Gamora on Vormir, but the young Gamora scene after Thanos snaps his fingers. I think those are some of the more subtle performances, and I felt really proud of the work we were able to do there, nailing the subtleties of the performance in those scenes. It was so important to do that, and those were longer shots that had to convey a lot of emotion without much facial movement at all.
Those scenes relied on subtle micro-expressions, and when you think of the scope of all the visual effects, those scenes, especially on Vormir, were so rich on a number of levels. They had different characters — Thanos, Red Skull, Gamora — and so much environment with the beautiful mountaintops and the behavior of the clouds and everything else below. I think those scenes really came together in a nice way.
Avengers: Infinity War is in theaters now.