
How The Lion King visual effects team used VR to go inside Disney’s CG adventure

Young Simba and Zazu in the Pride Lands

Disney is no stranger to reinventing its most popular properties, and that’s exactly what the studio did with 2019’s The Lion King, a remake of the 1994 feature of the same name that swapped traditional animation for a photo-realistic, computer-generated environment and animal characters.

Directed by Jon Favreau, the film follows a young lion cub named Simba who must embrace his destiny as ruler of the land and avenge the murder of his father. The film’s visual effects team was tasked with making an entire cast of photo-realistic CG animals talk — and occasionally sing — their way through Simba’s story, and was led by three-time Academy Award winner Robert Legato.

Digital Trends spoke to Legato about his work on The Lion King, which not only brought him back to an animal-centric feature after winning an Oscar for 2016’s The Jungle Book, but also had the filmmakers and cast rely heavily on an immersive, virtual-reality environment to make a familiar tale feel new again. The Lion King is one of five films contending for an Academy Award in the “Best Visual Effects” category this year.

Young Simba and Scar in the Pride Lands

Digital Trends: It’s been a few years since you worked on The Jungle Book. What have been some of the biggest changes in your approach and the tools available to you since working on that film?

Robert Legato: So many things were perfected that we had just started on with Jungle Book. The trouble with doing this type of work is that when you start on a film, it’s a couple years’ process, and by the time you finish it, there’s all new stuff that’s been improved or become the new standard, but you’re stuck with what you started with.

Simba, Timon, and Pumbaa sleeping in the jungle

It can work to your benefit, though, like it did with this one. In this case, the people who worked on Jungle Book also worked on Lion King — so they got better at what they do. Once you have some experience doing something, you get better at it. You get to see what your problems were in your previous movie, and that gets improved.

How did the technology change?

Well, we completely changed how we did landscapes in this movie, because we knew this movie really depended on them. One of the selling points of the film is that it looks like an epic movie shot in Africa.


We used hair shading techniques that simulate how individual hairs are hit by sunlight and how light bounces off them to create grass, and then turned that into miles and miles of African landscape. Between that and the way we used it for the animals, it was more computation heavy, and we paid the price in terms of how long it takes to render, but you get something really great out of it.
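MPC’s actual shader isn’t described here, but the idea of lighting grass like hair has a classic reference point: the Kajiya-Kay fiber model, in which a thin blade’s brightness depends on the angle between its tangent and the light direction rather than on a surface normal. A minimal sketch of that diffuse term (the function names are ours, not MPC’s):

```python
import math


def kajiya_kay_diffuse(tangent, light_dir):
    """Kajiya-Kay diffuse term for a thin fiber (a hair or a grass blade).

    A fiber has no single surface normal, so the scattered light is modeled
    from the angle between the fiber's tangent and the light direction:
    diffuse = sin(angle) = sqrt(1 - (T . L)^2).
    """
    def normalize(v):
        mag = math.sqrt(sum(c * c for c in v))
        return tuple(c / mag for c in v)

    t = normalize(tangent)
    l = normalize(light_dir)
    t_dot_l = sum(a * b for a, b in zip(t, l))
    return math.sqrt(max(0.0, 1.0 - t_dot_l * t_dot_l))


# A vertical blade lit from the side catches full fiber-diffuse light;
# light running parallel to the blade contributes none.
side_lit = kajiya_kay_diffuse((0.0, 1.0, 0.0), (1.0, 0.0, 0.0))   # 1.0
end_lit = kajiya_kay_diffuse((0.0, 1.0, 0.0), (0.0, 1.0, 0.0))    # 0.0
```

Evaluating a term like this per blade, across millions of instanced blades, is one way such a landscape becomes “computation heavy” at render time.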

The other thing we did came out of wanting a particular photographic style that would make it look like a beautifully shot, live-action film. We hired cinematographer Caleb Deschanel and built tools so he could do what he had done for years with his photography. That way, he could bring his artistic sensibilities to the film without it being overly technical or computer-ish.

Cinematographer Caleb Deschanel holding a VR camera on the set of The Lion King. Disney

How did you go about doing that? Filmmaking in a computer-generated environment doesn’t typically lend itself to traditional cinematography.

We basically created a version of our interface with the environment that was a lot like how one would photograph it. Instead of using a mouse on a computer and figuring out your camera movement that way, we had an intuitive, live-action orientation with a grip to move the dolly, a crane operator, a focus puller, and so on. We typically think of those as mechanical positions, but they’re not — they’re artistic. If you’re a camera operator, your dolly grip gives you the fluid quality of movement you see in photographs, so we had a dolly grip and a focus puller within that environment.


With a focus puller, you want it to feel like your attention is naturally shifting from one thing to another. How long your eyes linger and when you shift is inspired by what you see. So what we were creating was a way to have a human feel behind all of that in this environment, and let our intuition as filmmakers take over. When you’re shooting, you can try different angles or go faster or slower based on your intuitive response to what you’re seeing. You can do all of that, see your results, fix one thing or another, and then lo and behold, you’re making a movie in that environment just like you would on a live-action stage. And that rubs off on the film. It feels like it could be real because it looks like every movie you’ve ever seen.

You’ve mentioned the environment the film was shot in quite a bit, and I know virtual reality was a big part of that world. How did VR play into making The Lion King?

When you make a movie, you go location scouting, you have the script, and then you start refining all of it. You bring the actors in, plan everything out, and eventually you start to photograph everything and make the movie. VR gives you the ability to make up a set you haven’t built yet and be able to explore it and light it and put a camera on it. We could walk around the location with people, and put an animal here or there and watch it walk from point A to point B while we went over its dialogue.

Director Jon Favreau (far left), cinematographer Caleb Deschanel, production designer James Chinlund (center), Legato (right) and animation supervisor Andy Jones (far right) look around a virtual scene prior to a VR shoot for The Lion King. Disney | Moving Picture Company

Like I mentioned earlier, it lets us access the filmmaker’s intuition, because you can say, “Take two steps back and try it again over here,” or even, “You know, it would be better if we moved the whole set a little to the left, or maybe we can just move the sun over a little bit in order to highlight them at the right time.”

We had five or six people in VR to scout locations, for example. We’d go from point A to point B and scout for miles and pick the right location for the right scene. And for the actors, before they even do their lines, they can see where the scene is and exactly what their character would see. They’re not in a black room with a music stand and a script on it. They’re accessing their intuition.

Did you run into any unique challenges or advantages to using VR like this?

Well, one example of how ridiculous it is, we had five people in a room at one point. They were a couple of feet away from each other. Once you’re in VR and you start moving around, you fly like Superman from one VR location to another. You can be three miles away [in the virtual world] and say, “Hey, you should take a look at my shot.” Early on, the response was usually, “Wait, where are you?” We had no idea how to figure out where someone was or how they got there. So the next night, we had our people who write the software make it so if you point to Caleb and click on him, you will pop over to where he is. Then everybody was free to roam around and say, “Okay, I found a good shot from here,” and you could click on Jon Favreau’s icon or mine or Caleb’s icon.
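The article doesn’t name the software involved, so what follows is only a toy sketch of the feature Legato describes: every collaborator has a tracked position in the shared virtual scene, and clicking someone’s icon pops you over to where they are. All class and method names here are invented:

```python
from dataclasses import dataclass


@dataclass
class Avatar:
    """A collaborator's presence in the shared virtual scout."""
    name: str
    position: tuple  # (x, y, z) in world units


class VirtualScout:
    """Toy model of a shared VR location scout with click-to-teleport."""

    def __init__(self):
        self.avatars = {}

    def join(self, name, position=(0.0, 0.0, 0.0)):
        self.avatars[name] = Avatar(name, position)

    def move(self, name, position):
        self.avatars[name].position = position

    def teleport_to(self, name, target):
        # Clicking a collaborator's icon lands you beside them,
        # slightly offset so you don't appear inside their avatar.
        tx, ty, tz = self.avatars[target].position
        self.avatars[name].position = (tx + 1.0, ty, tz)


scout = VirtualScout()
scout.join("Caleb")
scout.join("Rob")
scout.move("Caleb", (4800.0, 0.0, 0.0))  # Caleb flies off miles away
scout.teleport_to("Rob", "Caleb")        # "Take a look at my shot"
```

The design point is simply that position lookup by name replaces the "wait, where are you?" problem: any avatar becomes a navigable destination.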

Cinematographer Caleb Deschanel films a virtual shot for The Lion King. Disney | Moving Picture Company

At that point, we started talking like a film crew really talks. We could say, “Well, if we did that from there, let’s move the tree and maybe have the waterfall over here,” and so on. It makes it very accessible to make a live-action-looking movie. It felt like a live-action movie because that was what we intended to produce — something that feels like it could have been actually photographed. So you can lose yourself in that and just watch the story play out.

How did you approach making the animals talk this time around? Did you employ something similar to what you used in Jungle Book?

Part of it is that you have experience doing it before and you just make it better. But that’s just part of it. When a parrot talks, it says full sentences, and you can understand it. But it doesn’t move its beak on every syllable. When a person talks, you don’t enunciate every syllable of every word. Your tongue or other parts of your body do that for you. So we didn’t go for overly articulating everything. We went for subtlety.

Young Simba and Mufasa in the Pride Lands

When somebody overemphasizes something, you tend to see their mouth dramatically enunciate a word, but when they’re talking under their breath, sometimes their lips don’t even move at all. So it seems like a daring thing to do, but that’s what we ended up doing with the animals. We did that with Jungle Book, but we just were not as good as we were by the time we got to Lion King.

We also made sure that we did not alter the physical nature of the animal. The characters were made exactly like real animals, with the same ligaments, the same muscles, and so on. That was an innovation, too, because it wasn’t a facsimile. It was a model that could only do what that actual animal could do. Once we had that established, then we worked within that model.
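No rigging details are given beyond the principle that the digital animal “could only do what that actual animal could do.” One simple way to express that constraint in code is clamping each joint’s rotation to anatomical limits; the limit values below are invented for illustration:

```python
def clamp_rotation(angles, limits):
    """Clamp a joint's (x, y, z) Euler angles, in degrees, to anatomical
    limits, so the rig cannot bend in ways the real animal couldn't."""
    return tuple(max(lo, min(hi, a)) for a, (lo, hi) in zip(angles, limits))


# Hypothetical limits for a feline elbow: a hinge with very little twist.
ELBOW_LIMITS = ((-150.0, 0.0), (-10.0, 10.0), (-5.0, 5.0))

# An animator asking for an impossible pose gets the nearest legal one.
pose = clamp_rotation((-200.0, 30.0, 0.0), ELBOW_LIMITS)  # (-150.0, 10.0, 0.0)
```

In a production rig the same idea extends to muscle and ligament simulation, but the principle is identical: the model, not the animator, enforces what the anatomy allows.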

Disney’s The Lion King is streaming now on Disney+. It is one of five films nominated for an Academy Award in the “Best Visual Effects” category this year.

Rick Marshall
A veteran journalist with more than two decades of experience covering local and national news, arts and entertainment, and…