The Lion King
Technicolor’s MPC Film and the filmmakers of The Lion King bring the beloved characters from a Disney classic back to the screen – like you’ve never seen before.
- Seeking to be true to their world-building mission, the team traveled to Africa to experience the continent firsthand during location scouting and data capture expeditions.
- The mission was accomplished through virtual production and the evolution of a methodology that MPC’s Adam Valdez had discussed with Disney and Jon Favreau while The Jungle Book campaign was still wrapping up.
- Technicolor’s MPC Film was also charged with producing all VFX and animation for The Lion King – a total of 1,490 shots – and with delivering all 2-D and 3-D renders.
- Twelve-hundred MPC artists, representing more than 30 different nationalities, worked across studios in Los Angeles, London, and Bangalore.
Even before Disney began releasing the breathtaking trailers for The Lion King, film lovers everywhere were buzzing about this summer’s most hotly anticipated film. After his success with The Jungle Book, how would director Jon Favreau bring the beloved characters from another Disney classic back to the screen, in a whole new way, and in a wholly recreated version of Africa that appears to be the real thing?
As Favreau told Entertainment Weekly while on set: “How do you update [a classic landmark] without changing the personality of it? How do you take advantage of all the new technological breakthroughs but still maintain the soul and the spirit of the original? I thought that this technology would be separate enough from the animated film that it felt fresh and new, yet completely related to the original. And by the time The Jungle Book was done, we had a lot of facility with this technology, so you’re hitting that part of your stride where you’re saying, ‘Now, what can I really do with this?’”
The story behind the making of The Lion King is one of great creative collaboration, leadership, artistry, and innovation. For Technicolor’s MPC Film, the story began as early as October 2016, while still wrapping up work on The Jungle Book campaign, and several months before MPC VFX Supervisor Adam Valdez would win an Oscar for Best Visual Effects on that groundbreaking film. With Favreau and Disney, Valdez began discussing how the pipeline and methodology could continue to evolve from The Jungle Book to take their next project to yet another level.
“The time was ripe to do this, but no one was doing it,” says three-time Oscar-winner Robert Legato, Production VFX Supervisor. “This was a push to make a film that broke the conventions…and if the result happens to be good, you can see how you can apply it to other things, and that’s not as common as you’d think. Everybody does VFX movies, everybody does animated movies, everybody does live-action movies—but to mix all of them together to make something that belies how it was done is, I think, the game-changing portion of all this.”
The unprecedented level of world-building they had in mind would first require much of the team to travel to that world: Africa.
“Jon wanted to create an experience where you forgot you were watching something created,” says Valdez, “a new connection to the story through a documentary-style sense of reality. Accomplishing his vision to make something that looked entirely believable as the real thing – not abstract or stylized – required that we be true in terms of representing Kenya; in giving it enough detail that you fall into that magical sweet spot between ‘I know this isn't real’ but somehow ‘I believe it.’ That was our mission.”
For such a mission, references alone would not suffice. The team would need to see the African Savannah with their own eyes and experience the sense of place for themselves.
“Going to Kenya was about getting what you can’t get from seeing pictures of it,” elaborates Valdez. “From a helicopter, going back out over and over again for a couple of weeks, you really get a meta understanding of it as a real place; the expansive vistas, the diversity of the landscape, the colors and variety of tones, how the vegetation grows and what the light feels like.”
“For me in charge of the sets, the main goal was to replicate the natural environment in such a way that the audience believes they are in Africa on the savannah,” adds MPC’s Audrey Ferrara, DFX Supervisor. “Traveling to Africa was important to that goal. It created a common experience that enabled me to understand exactly what James Chinlund (the production designer) wanted to achieve in terms of sets and environments. It built a common memory of spaces, so that when Caleb Deschanel (the cinematographer) references the light halfway up Mount Kenya, everybody remembers exactly what he’s talking about.”
As on The Jungle Book, Favreau had assembled a team of traditional filmmakers who understood that, as Ferrara says, “the perfect shot doesn’t sell reality.”
“The early brief from Jon was that he didn't want it too perfect,” explains MPC VFX Supervisor Elliot Newman. “The too perfect sky or dappled sunlight tends to make you not believe it. He appreciated the fact that sometimes in nature you get imperfect situations. It’s the reality [not perfection] that's important.”
“All of a sudden you are converting what would be perfect in the CG world to be imperfect in our world when we photograph,” elaborates Legato. “And we found that is so much better, so that conversion became something that you sought after as opposed to trying to eliminate, or try to make perfect. You don’t want to make it perfect – you want to make it feel like a human is behind the lens.”
The process of capturing reality would commence again on a second trip to Africa, where Technicolor MPC artists would document the diverse landscape, and build a preliminary set of locations.
“A tremendous amount of research goes into creating these digital environments,” offers Favreau, “things like, recording everything that you can; research through still photography; and also high dynamic range photography, to understand how different surfaces react to light.”
“We had a team of data capture people there to photograph foliage, the different species of plants and trees, and to capture various lighting environments,” says Newman. “We created big 360-degree HDR images of the sky and the sun – and even figured out exactly how to capture the sun itself, which is quite tricky as you can imagine – and built up a huge library of these high-resolution images.”
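Newman’s bracketed-sky capture relies on a standard HDR idea: the same scene is photographed at several shutter speeds, and the well-exposed samples from each bracket are combined into a single radiance estimate. A minimal sketch, assuming a linear camera response and an illustrative hat-shaped weighting – this is not MPC’s actual pipeline, and `merge_bracket` and its values are hypothetical:

```python
# Sketch only, not MPC's pipeline: estimate scene radiance for one pixel
# from a bracket of 8-bit exposures, assuming a linear camera response.

def merge_bracket(pixels, times, max_val=255):
    """pixels: 8-bit values of the same scene point at each exposure.
    times: matching shutter times in seconds.
    Returns estimated radiance in (pixel value) per second."""
    def weight(z):
        # Hat weighting: trust mid-tones, distrust near-clipped values.
        return min(z, max_val - z) + 1e-6

    num = sum(weight(z) * (z / t) for z, t in zip(pixels, times))
    den = sum(weight(z) for z in pixels)
    return num / den

# A bright sky pixel: blown out at 1/60s, well exposed at faster shutters.
# Both unclipped samples imply the same radiance (240*250 == 60*1000).
radiance = merge_bracket([255, 240, 60], [1 / 60, 1 / 250, 1 / 1000])
print(round(radiance))  # ≈ 60000
```

In production HDRI work the camera response curve is calibrated rather than assumed linear, and the merge runs per pixel across full 360-degree panoramas; the sketch shows only the weighting principle that lets clipped exposures of the sun still yield a finite radiance value.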
From here, the entire project – the development of the environments and the characters – would be informed by these initial trips to Africa, where the team experienced firsthand the landscape, the light, and the continent’s stark beauty.
“The safari we went on at the onset of this project is something I will never forget,” recalls Animation Supervisor Andy Jones. “There is nothing like seeing these animals in the wild to make one truly appreciate the beautiful harmony of life that is going on there. We had a very high bar to meet – to both do justice to the original film and bring something new to the story that pays respect to its roots in Africa.”
“Because everybody was part of this trip, it really unified the team,” concludes Ferrara. “We knew from the start we were all in this together, and you couldn’t have had a more invested team. For many of us, The Lion King was already a beautiful memory from childhood. Now we had the new shared memory of Africa that would inform all of our work going forward.”
The methodology MPC VFX Supervisor Adam Valdez had discussed with Disney and director Jon Favreau came into play as virtual production got underway in June 2017. It would be further developed by Technicolor’s MPC, Production VFX Supervisor Rob Legato, Production Designer James Chinlund, Magnopus, Unity, and other game engine companies as they all came together to work on The Lion King.
“When we talk about representing the true Africa, that encompasses both world-building and character-building, and that all started in virtual production,” says Valdez. “After we all went as a crew to Kenya together, we continued to do the virtual production together, and we did the final animation and visual effects together. We were collaborative partners from beginning to end, and the technology and methodology we used enabled us to capture the creative choices made together on the virtual set – cameras, lighting, animation, etc. – and carry them through to the end. Everything lived on in this way, enabling us to learn the rules together, what worked and what didn’t. For example, what’s the best way to light a lion from a certain angle? If we figured that out in post, we could relight/ reshoot it, with the real subject on the real (virtual) set.”
MPC helped build the tools for virtual production, which is a technique that we innovated for The Lion King, using a game engine platform to emulate live action film production in a VR space – even though the film is completely digitally rendered, every environment is made digitally by the artists at MPC, and every character is keyframe animated. The tools were being refined constantly; it was a real learning process all the way through. And now MPC has a suite of tools that are available to any filmmaker based on the innovations that we made on The Lion King.
Jon Favreau, Director, The Lion King
The pipeline that evolved on The Lion King served as a “translational” system, with back and forth communication between planning, visualization, art department, production design, and virtual production at one end – and visual effects and animation on the other end.
“Evolving the workflow was crucial,” says Francesco Giordana, Realtime Software Architect at MPC. “How do you get people from different parts of the world working together seamlessly on a huge production where there isn't a real stage? How do you capture every decision made and track everything that you’ve done? How do you define what composes your shots and carry that all the way through to post-production? The pipeline and tracking system are really the backbone of virtual production.”
“It’s very beneficial to be included from day one, because I could work directly with the art department and the production designer,” says Ferrara, who, as guardian of the visual rules of the environment (rocks, plants, trees, etc., and where to place them), could pass those rules on to the post-production team so they could learn them too. “The methodology we used enabled all of us to become experts on the environment, and in turn the filmmakers were able to focus more on the storytelling, the performances, and the action,” adds Ferrara.
“This was the first time we had filmmakers walking around as if on a real set – in sync and in real-time using VR – communicating to each other, pointing things out and manipulating things together,” relates Giordana. “It was a real milestone to be able to put multiple people into the same space at the same time collaborating this way – where new multi-user workflows meet old school cinematography and filmmaking.”
“Everything is designed to be done in real time, including location scouting, where you can have multiple people talking about a shot or a scene, and it becomes a live collaborative input arena,” adds Legato. “You need live input to do it, and that's the virtual part of it. When we're in VR, [it] gives you the visceral feeling of being there, of being up high or at a faraway distance. And Jon, Caleb, James Chinlund, Andy Jones and I could all walk in together as a group of filmmakers, a group of collaborators, and see things for the first time, look at things from different angles…bounce [ideas] off of each other. When you're done it’s a very successful location scout that otherwise you could never have done together in a CG world. But in the real world, or in our case a virtual world, we all can share in it and we all can respond to what we're looking at – at the same time.”
“It was really cool to do virtual set scouts with the core creative team all in one VR session walking – and flying – around the set and discussing where we wanted the action to take place,” says Jones. “VR is such a powerful tool to help everyone get on the same page for a movie of this scale. And then, my previs team would whip up some amazing blocking that we all could discuss and reposition in the game engine in real time.”
“Virtual reality is more a tool than the end result,” says Deschanel. “You have to live in it and feel what it’s like to be at Pride Rock in order to decide what you want to do and where you want to do it. But other than that, you really are doing exactly what you do when you make a movie.”
Beyond virtual production, the film was also a milestone in lighting.
Filmic lighting is completely different from game lighting, so the team had to find a way for the DP to light a shot just as he would on a real set.
“It was an interesting challenge to apply the rules of traditional filmmaking to virtual production,” says Giordana. “Similar to creating visual effects, you need to give some form of close approximation that is good enough to make creative decisions. If you want an African sky or sunrise, we can show you some reference images and speak that language. But it’s a different language from what your game engine offers you, so we need to build on top of the engines to implement the language and toolset you’d have on a real stage conversing with the DP.”
“No physical stage means no limitations, and that really applied to lighting,” explains Newman. “In one of the earliest tests we did – with Rafiki – we put him through five or six different lighting designs from the library of images we had built. We wanted to keep the naturalism as close as possible to a real location shoot – and to what the team had seen and captured in Africa. Seeing Rafiki was the first moment that Jon really responded to and could see the movie becoming what he had envisioned.”
While Valdez was in Los Angeles working with the virtual production team, Technicolor MPC VFX Supervisor Elliot Newman was back in London preparing for shots that would be coming from the virtual production stage. The volume alone was staggering: all told, they would deliver 1,490 shots. But first, they would need to extend their toolset in order to push the realism of the characters even further than they had before.
“Virtual production was the real new territory on The Lion King, as we had done most everything else before, with the same team,” says Newman. “Now it was: how do we do it better and continue to refine it – everything from skin/fur shading and muscle simulation, to water, grass and trees? There would be a lot of handcrafting involved in every shot, and we put a lot of research into things like: how does hair respond to natural light, and how do we shade that correctly; how do we mimic light scattered/transported through skin layers in a physically plausible way?”
“Because we committed to doing all of the work with one vendor,” says Favreau, “we were able to dedicate resources towards research and development for different simulators, for different fur tools, etc. So a lot of the things that inform the photo-realism were innovations that were developed through the research of MPC.”
Adds Legato: “They have this ability to bring all these various departments together to create all these different assets that, when gelled together, create something that is photoreal. They have incredible technical creative knowledge on how to do things and how it’s going to look when rendered. It's very difficult in CG to make up something that looks like it could be real, to keep scraping the surface until you get something that has the patina of real life – and they are gifted at doing that.”
When you see this movie, the animals that MPC created, you'll see a pretty magnificent translation of what that animal really looks like, all the different types of hair, the different lengths of hair, the coarseness – everything is just so, and when that's all applied correctly, with the muscles and everything else, it just comes to life. What MPC has been able to do is a quantum leap even over what we did on The Jungle Book.
Robert Legato, Production VFX Supervisor, The Lion King
Not only are the animals in this film photoreal, but they can act – and they give some great performances.
“Making animals act and talk in a way that doesn’t break the reality is the most challenging aspect for animators,” explains Jones. “When we hit the sweet spot where the animals’ movements do not betray the real thing – and it looks like the animal is thinking and feeling those thoughts – it really is a thing of beauty.”
He continues: “The way Jon works is to surround himself with very talented people in their respective disciplines – [who] contribute creatively and technically to help him deliver a truly remarkable film. I work in a very similar way, trusting my [team] to execute at the highest level. MPC did an amazing job pushing through an insane amount of animation work in a short amount of time. My hat is off to the entire animation department at MPC; they are real troopers and all the love and passion for this film is up on that screen for the world to see.”
“Not only are the characters more refined, but so are their movements and performances,” Giordana adds. “Because of the system that we used, we could do reshoots after virtual production and final animation, with close to final assets – characters with full muscles, tendons, and even fur. For example, playing it back in the engine, you could see Simba through his final animation on the final set, and the director could actually direct the camera vis-à-vis his performance. It’s very powerful because now you're shooting with your final performance.”
“It’s very dynamic in that respect and really opens up the creative scope because you’re not locked into anything you don’t feel is working, and it’s not going to require a big, expensive reshoot,” says Newman. “We can relight it, we can move the camera, we can make changes at any point and the performance doesn’t change. It gives filmmakers more creative freedom and that gives them the leeway to make better choices.”
One of the choices they made was to make 3-D part of the storytelling process, which is not the norm on most films.
“Normally you’d be restricted by what was shot on set,” explains Holly Aldersley, MPC Stereo Supervisor. “But with Rob, they were setting up shots with 3-D in mind, and making creative decisions around it as a storytelling tool – to help shape the emotion in a shot. To make characters seem more (Scar) or less (young Simba) imposing. Or give that same sense of ‘imposing’ to an expansive place like Pride Rock.”
“We did this in native 3-D, which quite frankly – and I've done this before – is the best 3-D film I've ever seen or been involved in,” says Legato. “The optimal viewing of this is a laser projection, high definition, high dynamic range, and in 3-D – that’s the full experience and it’s really quite something.”
Technology is a great means to a great end, but you need people to bring things to life, to add the magic and spark, to humanize the technology.
“We had to get extra creative with a few shots, such as the iconic scene from the original where Simba sees Mufasa in the clouds,” recalls Aldersley. “It’s a huge moment when he appears and tells Simba to remember who you are. And you need to spend the time to get it looking just right – the huge effects, the cloud simulations, the appearance of Mufasa. Meanwhile, you’re still trying to make it look realistic. It's one of the more magical sequences, and you need that human touch to bring the magic – to walk that fine line and find the perfect blend between fantasy and documentary.”
1490 Final Shots
1,490 shots – 170,668 frames, or 7,111 seconds (119 minutes) of final images – were delivered to Disney. 145 shots that were started were omitted along the way.
17 Heroes, 63 Unique Species
We built 17 hero characters and 63 unique species in 365 variations, from aardvarks to zebras. Across 1,490 shots, we animated 9,063 characters and 31,421 crowd agents.
30,807 Dailies
30,807 dailies – only 3.6% of all dailies made – were sent to Disney to review with Andy, Rob or Jon. At an average of 100 frames per daily, that equals 46 days of footage.
7975 Animation Submissions
We sent 7,975 animation submissions for approval, meaning every shot was reviewed 5.4 times on average to receive animation approval.
6182 Final Comp Submissions
We sent 6,182 compositing submissions for approval, meaning every shot was reviewed 4.1 times on average to receive final approval.
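The per-shot review averages quoted above follow directly from the totals, as does the running time of the final frames. A quick arithmetic check – the 24 fps frame rate is an assumption based on standard theatrical projection, not stated in the text:

```python
# Reproducing the quoted delivery figures from the totals in the text.
# 24 fps is assumed (standard theatrical frame rate, not stated above).
FPS = 24
SHOTS = 1_490

seconds = 170_668 / FPS        # final frames -> seconds of footage
minutes = seconds / 60         # -> the quoted "119 Minutes"
anim_rounds = 7_975 / SHOTS    # animation submissions per shot
comp_rounds = 6_182 / SHOTS    # comp submissions per shot

print(round(seconds), round(minutes))                # 7111 119
print(round(anim_rounds, 1), round(comp_rounds, 1))  # 5.4 4.1
```

Each quoted figure matches the underlying totals to within rounding.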
18,000 KM traveled
We traveled 18,000 km around North and South Kenya with 6 photographers and 4 videographers.
240,000 Photos captured
On those shoots, we captured approx. 240,000 photos (10 TB of data) and 7,000 videos (11.7 TB).
Highest-Ever Photogrammetry Shoot
The highest-altitude photogrammetry MPC has ever shot: at 17,000 feet, from a helicopter, at -10 degrees Celsius, for Mount Kenya.
66 Sets, covering 150 Square KM
We built the entire world in CG – an area of 150 square kilometers. That’s 11 times as big as LAX, or 28,000 NFL football fields.
921 Foliage Assets
Our foliage library counted 921 assets: 25 unique tree species (238 variations), 49 unique plant species (494 variations), 18 unique grass species (148 variations), and 41 hero trees.
130 Animators from 30 Nations
Over the course of the show, 130 animators from 30 nations worked in London, Los Angeles and Bangalore.
42+ Hours of Reference Footage
With a total of 42 hours and 53 minutes of reference footage collected at Animal Kingdom, in the Serengeti, and from nature documentaries, this was the biggest use of animal reference on an MPC show.
676,578 Bugs in the Movie
Bugs appear in 628 shots, at an average of 1,085 per shot; in total, we simulated 676,578 bugs.
100 Billion Blades of Grass
We simulated over 100 billion blades of grass using our in-house technology, PaX. The environment dynamics of one particular jungle shot took one week to simulate.
1250 People on the Movie
During the course of the show, around 1,250 artists, production and support staff worked on the movie: 650 in London, 550 in Bangalore, and 50 in LA.