Disney is no stranger to reinventing its most popular properties, and that’s exactly what the studio did with 2019’s The Lion King, a remake of the 1994 feature of the same name that swapped traditional animation for a photo-realistic, computer-generated environment and animal characters.
Directed by Jon Favreau, the film follows a young lion cub named Simba who must embrace his destiny as ruler of the land and avenge the murder of his father. The film’s visual effects team was tasked with making an entire cast of photo-realistic CG animals talk — and occasionally sing — their way through Simba’s story, and was led by three-time Academy Award winner Robert Legato.
Digital Trends spoke to Legato about his work on The Lion King, which not only brought him back to an animal-centric feature after winning an Oscar for 2017’s The Jungle Book, but also had the filmmakers and cast rely heavily on an immersive, virtual-reality environment to make a familiar tale feel new again. The Lion King is one of five films contending for an Academy Award in the “Best Visual Effects” category this year.
Digital Trends: It’s been a few years since you worked on The Jungle Book. What have been some of the biggest changes in your approach and the tools available to you since working on that film?
Robert Legato: So many things were perfected that we had just started on with Jungle Book. The trouble with doing this type of work is that when you start on a film, it’s a couple years’ process, and by the time you finish it, there’s all new stuff that’s been improved or become the new standard, but you’re stuck with what you started with.
It can work to your benefit, though, like it did with this one. In this case, the people who worked on Jungle Book also worked on Lion King — so they got better at what they do. Once you have some experience doing something, you get better at it. You get to see what your problems were in your previous movie, and that gets improved.
How did the technology change?
Well, we completely changed how we did landscapes in this movie, because we knew this movie really depended on them. One of the selling points of the film is that it looks like an epic movie shot in Africa.
We used hair-shading techniques — simulating how individual hairs are hit by sunlight and how light bounces off them — to create grass, and then turned that into miles and miles of African landscape. Between that and the way we used it for the animals, it was more computationally heavy, and we paid the price in how long it takes to render, but you get something really great out of it.
The other thing we did came out of wanting this particular type of photographic style that would make it look like a beautifully shot, live-action film. We hired cinematographer Caleb Deschanel and built tools so he could do what he had done for years with his photography. That way, he could bring his artistic sensibilities to the film without it being overly technical or computer-ish.
How did you go about doing that? Filmmaking in a computer-generated environment doesn’t typically lend itself to traditional cinematography.
We basically created a version of our interface with the environment that was a lot like how one would photograph it. Instead of using a mouse on a computer and figuring out your camera movement that way, we had an intuitive, live-action orientation with a grip to move the dolly, a crane operator, a focus puller, and so on. We typically think of those as mechanical positions, but they’re not — they’re artistic. If you’re a camera operator, your dolly grip gives you the fluid quality of movement you see in photographs, so we had a dolly grip and a focus puller within that environment.
With a focus puller, you want it to feel like your attention is naturally shifting from one thing to another. How long your eyes linger and when you shift is inspired by what you see. So what we were creating was a way to have a human feel behind all of that in this environment, and let our intuition as filmmakers take over. When you're shooting, you can try different angles or go faster or slower based on your intuitive response to what you're seeing. You can do all of that, see your results, fix one thing or another, and then lo and behold, you're making a movie in that environment just like you would on a live-action stage. And that rubs off on the film. It feels like it could be real because it looks like every movie you've ever seen.
You’ve mentioned the environment the film was shot in quite a bit, and I know virtual reality was a big part of that world. How did VR play into making The Lion King?
When you make a movie, you go location scouting, you have the script, and then you start refining all of it. You bring the actors in, plan everything out, and eventually you start to photograph everything and make the movie. VR gives you the ability to make up a set you haven’t built yet and be able to explore it and light it and put a camera on it. We could walk around the location with people, and put an animal here or there and watch it walk from point A to point B while we went over its dialogue.
Like I mentioned earlier, it lets us access the filmmaker’s intuition, because you can say, “Take two steps back and try it again over here,” or even, “You know, it would be better if we moved the whole set a little to the left, or maybe we can just move the sun over a little bit in order to highlight them at the right time.”
We had five or six people in VR to scout locations, for example. We’d go from point A to point B and scout for miles and pick the right location for the right scene. And for the actors, before they even do their lines, they can see where the scene is and exactly what their character would see. They’re not in a black room with a music stand and a script on it. They’re accessing their intuition.
Did you run into any unique challenges or advantages to using VR like this?
Well, here's one example of how ridiculous it can get: at one point, we had five people in a room, a couple of feet away from each other. Once you're in VR and you start moving around, you fly like Superman from one VR location to another. You can be three miles away [in the virtual world] and say, "Hey, you should take a look at my shot." Early on, the response was usually, "Wait, where are you?" We had no idea how to figure out where someone was or how they got there. So the next night, we had the people who write our software make it so that if you pointed at Caleb and clicked on him, you would pop over to where he was. Then everybody was free to roam around and say, "Okay, I found a good shot from here," and you could click on Jon Favreau's icon, or mine, or Caleb's.
At that point, we started talking like a film crew really talks. We could say, “Well, if we did that from there, let’s move the tree and maybe have the waterfall over here,” and so on. It makes it very accessible to make a live-action-looking movie. It felt like a live-action movie because that was what we intended to produce — something that feels like it could have been actually photographed. So you can lose yourself in that and just watch the story play out.
How did you approach making the animals talk this time around? Did you employ something similar to what you used in Jungle Book?
Part of it is that you have experience doing it before and you just make it better. But that's just part of it. When a parrot talks, it says full sentences, and you can understand it. But it doesn't move its beak on every syllable. When a person talks, they don't enunciate every syllable of every word. The tongue and the rest of the mouth do that for them. So we didn't go for overly articulating everything. We went for subtlety.
When somebody overemphasizes something, you tend to see their mouth dramatically enunciate a word, but when they're talking under their breath, sometimes their lips don't even move at all. So it seems like a daring thing to do, but that's what we ended up doing with the animals. We did that with Jungle Book, too — we just weren't as good at it then as we were by the time we got to Lion King.
We also made sure that we did not alter the physical nature of the animal. The characters were made exactly like real animals, with the same ligaments, the same muscles, and so on. That was an innovation, too, because it wasn’t a facsimile. It was a model that could only do what that actual animal could do. Once we had that established, then we worked within that model.
Disney’s The Lion King is streaming now on Disney+.