Ang Lee’s Gemini Man is a big-screen spectacle in every sense of the word. Not only was Gemini Man designed to be watched at 4K resolution and a whopping 120 frames per second — producing a picture so detailed that the film’s actors couldn’t even wear makeup — but it features superstar Will Smith in two separate roles: Henry Brogan, a 51-year-old hit man, and Junior, Henry’s 23-year-old clone.
In order to transform the 50-year-old Smith into Junior, Guy Williams and the team at visual effects studio Weta Digital created a fully digital version of young Will Smith that resembled the one audiences remember from ’90s hits like The Fresh Prince of Bel-Air, Independence Day, and Men in Black. Smith provided the performance via motion-capture acting, but Junior’s appearance on screen was all Weta. The result is one of the most convincing CGI humans ever put on film.
Digital Trends spoke to Williams and Gemini Man visual effects supervisor Bill Westenhofer about the challenges of bringing Junior to life, as well as what it took to resurrect one of the most famous young faces in Hollywood history.
Digital Trends: Various studios have been trying to make Gemini Man since 1997, when Tony Scott was signed on to direct and actors like Harrison Ford were being eyed to star, but over the years, nobody could get the effects right. What changed?
Bill Westenhofer: There’s no single thing that suddenly said, “Boom, you can make a digital human now!” It’s been a lot of incremental steps to the point where we were at with Rogue One and Blade Runner. Then, it was just analyzing the things that got close enough and pushing it the rest of the way there.
I think what was awesome about Weta is that they really delved into the science of it all. Without really letting the computers have the whole model of the skin and the face and the melanin and the textures, you’re not going to get there. You can look at a picture and say it looks wrong, but you’ll be at a loss to figure out what’s off.
Can you talk a little bit about what the rest of the Gemini Man team brought to the process?
Guy Williams: As much as we believed in the technology and we believed in our ability to do it, the other thing that really clicked into place was the film itself and the people behind the film. You have Ang Lee, who’s going to do it right [and is] going to be committed to doing it right.
Westenhofer: That part’s important. We thought the technology could get there by the end and that we could pull it off, and then it was just the raw determination, and knowing we’re going to make mistakes along the way. We just have to have time to make those mistakes and repair them before we get to the end.
Williams: One of the other things we said is that our digital human will only ever be as good as the performance that the actor gives. Something we said at the beginning was that you can’t treat your mo-cap (motion capture) like it’s a scientific step. Your mo-cap is just as important as your principal day on set. To that end, you have to have a good actor, and lo and behold, we get Will Smith in the role.
Just to be clear, this isn’t digital de-aging, like what you see in the Marvel movies. He’s a fully CGI human. Can you talk a little bit about the differences between the process used in films like Captain Marvel and what you did for Gemini Man, and why you chose your approach?
Westenhofer: Well, we had to because we had scenes where Henry and Junior are fighting each other. They were grappling with each other. … The other thing: The high frame rate and the high resolution. The actors can’t wear makeup. The Marvel de-aging technique required Samuel L. Jackson to be on set with a costume, and they put on as much makeup as they could. That’s the starting point [with that technique]. In Gemini Man, we’re seeing pores and things like that. You couldn’t do it with makeup, so we had to have something that was digital just for that reason alone.
Junior is a little different from digital characters like Gollum or Thanos because he’s not a fantasy creature. He’s a person — and a very famous person, too.
Williams: Therein lies the hardest part, right? One of the single biggest challenges we had on this show was that we didn’t have access to the person we were creating. I don’t mean that we didn’t have access to Will. He was incredibly generous. We just didn’t have access to Will 30 years ago. As much as we did digital effects 30 years ago, we didn’t do the same kind as we do today. There’s no photo turntable of Will Smith from 30 years ago to work off of.
So, in this case, we start with 50-year-old Will Smith. We photographed him three separate times over the course of the shoot. We did Clear Angle scans to get high-resolution models of him twice over the course of the shoot. We gathered the absolute maximum amount of visual resources we could on Will. We then built the 50-year-old model of Will and got that to the point where we’re comfortable with it and ready to start on the young one. It doesn’t mean that it was totally perfect, but it meant that we could take that turn down the street to Junior.
Was it difficult to get Will Smith’s likeness just right?
Westenhofer: Junior is a very brooding, serious character. After we did these shots and put all of the motion capture in there, something didn’t quite feel right. We were asking ourselves, “Why is this happening? What’s wrong?” Then we went and looked at his old films, and because Will was a very bon vivant, sarcastic, lighthearted guy, we had to find frames from those rare scowling moments he had back then and line them up — and that’s when we realized we’d actually captured him perfectly. We just weren’t used to seeing Will Smith act that way.
Williams: The next biggest challenge came from the fact that Will Smith is a fantastically well-kept 50-year-old man. When we lined up old photos and stills from movies alongside the Junior we created, we were like, “Okay, this is good.” But then we looked at it next to Henry and realized the haircut was different, but that’s all. We knew we had to go even further at that point, and that’s where the science comes in.
You’ve alluded to the role that biological science played in creating Junior. How did science contribute to what we see on the screen?
Williams: We’re fortunate to be surrounded by brilliant people. You’d be surprised how important pores on the face are, especially when you’re talking about 4K resolution seen at 120 frames per second. Typically, we would just take a cast of an actor’s face and then create small little patches of latex skin that we then flatbed scan to get their exact pores.
So one particular artist on the team said, “What if I grow the pores procedurally?” So, he wrote this insanely complex system — it took him about six months to even get it off the ground — that grows every single pore using flow fields to define the grain of the face. Your pores aren’t all just horizontal. They flow along the lines of your face, and that’s why wrinkles aren’t all parallel.
He created this system and we dialed it in over another three months, just iterating and iterating and iterating to make sure all of the pores lined up to where they actually were on Will’s face. Everybody’s face has a different texture.
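The core idea Williams describes — pores that follow the “grain” of the face rather than sitting on a fixed axis — can be illustrated with a toy procedural sketch. The code below is a hypothetical simplification, not Weta’s system: it scatters pores on a flat 2D patch and orients each one along a simple analytic flow field, where a real pipeline would derive the flow from facial anatomy and scan data.

```python
# Minimal sketch (hypothetical, not Weta's pipeline): procedurally place
# skin pores on a 2D patch and orient each pore along a flow field, so
# the pore "grain" follows the surface contours instead of a fixed axis.
import numpy as np

rng = np.random.default_rng(42)

def flow_field(x, y):
    """Unit direction of the skin grain at (x, y).

    Here the grain runs along the level curves of a toy scalar function
    f(x, y) = sin(x) + cos(y); a real system would build this field from
    anatomy and scan data, not an analytic formula.
    """
    gx, gy = np.cos(x), -np.sin(y)              # gradient of f
    tangent = np.stack([-gy, gx], axis=-1)      # rotate gradient 90 deg
    norm = np.linalg.norm(tangent, axis=-1, keepdims=True)
    return tangent / np.maximum(norm, 1e-9)

def grow_pores(n, patch=(0.0, 6.0)):
    """Scatter n pores and align each with the local flow direction."""
    xy = rng.uniform(*patch, size=(n, 2))
    dirs = flow_field(xy[:, 0], xy[:, 1])
    # Each pore gets a position, an orientation angle, and a jittered
    # radius, since no two pores on a real face are identical.
    angles = np.arctan2(dirs[:, 1], dirs[:, 0])
    radii = rng.normal(loc=0.02, scale=0.004, size=n).clip(min=0.005)
    return xy, angles, radii

positions, angles, radii = grow_pores(5000)
print(positions.shape, angles.shape, radii.shape)
```

Because every quantity is driven by the field and the random seed, the whole pore layout can be regenerated and iterated on — which is the property that let the team “iterate and iterate” until the pores lined up with the reference photography.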
Finally, is there any shot or sequence that you’re especially proud of or that epitomizes the project for you?
Williams: I’d say there are two. There’s a scene when Henry’s confronting Junior at gunpoint and trying to explain to him that he’s a clone. Then there’s a scene very shortly thereafter where Junior finds Clive [Owen] and says, “Hey, by the way, I think I’m a clone.”
It’s straightforward to get the scenes where he’s angry or he’s talking just right, because you have a lot to anchor those scenes to. But the moments where he’s just sitting there being talked to and he’s internalizing it, those are just super-rich moments. You can see every emotion going through his face, and you can see him taking things in, discounting certain things, and formulating arguments. He’s not talking, but he’s performing. For me, those shots were really fun.
Westenhofer: Those two scenes were it for me, too, and I have one other element. In the fighting sequences, when I — as a supervisor who was there on the shoot and should know exactly what’s happening — had to check to see whether Will Smith’s head was really Will Smith or whether that was a digital replacement, that was a really good moment.