The Lion King

Q&A with Francesco Giordana, MPC Realtime Software Architect

Technicolor’s MPC Film was a creative partner to the filmmakers of The Lion King from beginning to end – from October 2016, when VFX Supervisor Adam Valdez first pitched a methodology to Disney and director Jon Favreau while they were still wrapping up The Jungle Book – through location scouting and virtual production, and on to responsibility for all of the VFX and animation.

MPC’s Francesco Giordana talks about the technology – and the milestones achieved on The Lion King.

 

Q - The Lion King has been called a milestone in virtual production. Can you talk about what that means?

A - With The Lion King, we had our first prototype of a multi-user interaction system, where multiple people could work at the same time, on the same stage, sharing the same scene and manipulating it in real time together. Each of them could see what the others were doing, looking at the same setup in VR and moving things around together.

It was a way of scouting and replicating a real location, but virtually. Usually this is a lone experience where you have one person on a monitor seeing only a little slice of the scene. There’s another person on another monitor seeing another slice of the scene, but they're not in sync.

For us, it was the first time we had people walking around the set in real time using VR, as if it were a real set – talking to each other, showing things to each other, and moving things together. It was the first time all of these things came together in a real application: a new technology, VR, combined with multi-user workflows and old-school cinematography. Everybody was moving around, talking to each other, and sharing a point of view, as if the set existed in reality. It was a real milestone to be able to put multiple people into the same space at the same time, collaborating this way.
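To make that idea concrete, here is a minimal, hypothetical sketch of the core mechanic he describes: one authoritative scene state whose edits are broadcast to every participant, so everyone sees the same thing at the same time. All class and method names are invented for illustration; the real system runs inside a game engine and over a network.

```python
from dataclasses import dataclass

@dataclass
class Transform:
    position: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0)

class Participant:
    def __init__(self, name):
        self.name = name

    def on_scene_changed(self, editor, obj_name, transform):
        # In a real system this would update the participant's headset view.
        print(f"[{self.name}] sees {editor.name} move {obj_name} to {transform.position}")

class SharedScene:
    """One authoritative scene state that every participant edits and observes."""
    def __init__(self):
        self.objects = {}        # object name -> Transform
        self.participants = []

    def join(self, participant):
        self.participants.append(participant)

    def move_object(self, editor, obj_name, transform):
        # Apply the edit once, then broadcast it so every headset
        # shows the same scene at the same time.
        self.objects[obj_name] = transform
        for p in self.participants:
            p.on_scene_changed(editor, obj_name, transform)

scene = SharedScene()
dp, director = Participant("DP"), Participant("Director")
scene.join(dp)
scene.join(director)
scene.move_object(dp, "camera_A", Transform(position=(3.0, 1.5, -2.0)))
```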

 

Q - The Lion King is also a milestone in lighting. How did working with game engines affect that aspect of the filmmaking?

A - Filmic lighting is completely different from game lighting, so lighting a shot the way you would on a real set was a challenge. If you want a sunrise, or an African sky, or partly cloudy conditions, we can show you some reference images – we speak that language. But it’s a different language from what your game engine offers you. You need to build on top of the engine to implement that language.

Building a lighting toolkit is very important for having a conversation with the DP: to understand what he wants, and to show him something concrete enough to make decisions on – not just a vague representation. We saw that those lighting choices survived to the final pixel, so it was very important to tackle it that early.

If you look at some of the final renders now, you see a completely different quality, but the light direction is the same – some of the choices were preserved. But because of the technology, we had to implement custom lighting techniques to represent the ones you would have on a real stage.

You have a very specific language and toolset to implement, and it was about marrying the technology with the language. Finding a representation good enough for the DP to light the shot was an interesting challenge: applying the rules of traditional filmmaking to something almost at the opposite end of the spectrum. As with creating the visual effects, you need to give some form of close approximation that is good enough to make creative decisions.
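As one illustration of that "translation layer" idea, a hypothetical sketch might map the DP's vocabulary (sunrise, partly cloudy) onto the parameters a game engine actually exposes for a directional sun light. The presets, values, and function names here are invented for illustration; they are not MPC's toolkit.

```python
import math

# DP vocabulary -> engine-facing parameters (all values illustrative).
# (sun elevation in degrees, color temperature in kelvin, relative intensity)
LIGHTING_PRESETS = {
    "sunrise":       (5.0,  2800, 0.4),
    "midday_clear":  (70.0, 5600, 1.0),
    "partly_cloudy": (45.0, 6000, 0.7),
}

def kelvin_to_rgb(kelvin):
    """Rough blackbody approximation, good enough for a preview light."""
    t = kelvin / 100.0
    r = 255.0 if t <= 66 else min(255.0, 329.7 * (t - 60) ** -0.1332)
    g = (min(255.0, 99.47 * math.log(t) - 161.1) if t <= 66
         else min(255.0, 288.1 * (t - 60) ** -0.0755))
    b = (255.0 if t >= 66
         else 0.0 if t <= 19
         else min(255.0, 138.5 * math.log(t - 10) - 305.0))
    return (r / 255.0, g / 255.0, b / 255.0)

def build_sun_light(preset_name):
    """Translate a preset the DP can ask for into directional-light settings."""
    elevation, kelvin, intensity = LIGHTING_PRESETS[preset_name]
    rad = math.radians(elevation)
    direction = (0.0, -math.sin(rad), -math.cos(rad))  # light aims down at `elevation`
    return {"direction": direction, "color": kelvin_to_rgb(kelvin), "intensity": intensity}

print(build_sun_light("sunrise"))
```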

 

Q - What other technology development did you do in terms of the game engine?

A - Originally we were mostly handling the pipeline side. As the pipeline became more fluid, we started handling the rendering and a lot more of the user-facing tools and technology. At that point, Jon Favreau [the director], Caleb Deschanel [cinematographer], and Andy Jones [animation supervisor] were discovering what they could do and coming up with new ideas, new things to try.

For example: if you really stretch, you can have 40 characters on screen. But there was a crowd of wildebeests and they wanted 400. So you have to spend some time coming up with a crowd system that works in a game engine. Then you have a flock of birds and they want a few hundred of those. So it's always going one step further – and seeing something new that you haven't seen before. And suddenly you’re opening up new windows.
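For a sense of what a crowd system does, here is a toy flocking sketch in that spirit: hundreds of agents steered by a few local rules (cohesion, alignment, separation) rather than hundreds of hand-animated characters. It is purely illustrative, under the classic "boids" model; MPC's actual crowd system is not public.

```python
import random

class Agent:
    def __init__(self):
        self.pos = [random.uniform(0.0, 100.0), random.uniform(0.0, 100.0)]
        self.vel = [random.uniform(-1.0, 1.0), random.uniform(-1.0, 1.0)]

def step(agents, radius=10.0, too_close=2.0):
    """One simulation tick: each agent reacts only to its local neighbors."""
    for a in agents:
        neighbors = [b for b in agents if b is not a
                     and abs(b.pos[0] - a.pos[0]) < radius
                     and abs(b.pos[1] - a.pos[1]) < radius]
        if not neighbors:
            continue
        n = len(neighbors)
        for axis in (0, 1):
            center = sum(b.pos[axis] for b in neighbors) / n
            heading = sum(b.vel[axis] for b in neighbors) / n
            a.vel[axis] += 0.01 * (center - a.pos[axis])   # cohesion: drift toward the herd
            a.vel[axis] += 0.05 * (heading - a.vel[axis])  # alignment: match neighbors
            for b in neighbors:                            # separation: avoid crowding
                if abs(b.pos[axis] - a.pos[axis]) < too_close:
                    a.vel[axis] -= 0.02 * (b.pos[axis] - a.pos[axis])
    for a in agents:
        a.pos[0] += a.vel[0]
        a.pos[1] += a.vel[1]

herd = [Agent() for _ in range(400)]  # the 400 wildebeests mentioned above
for _ in range(10):
    step(herd)
print(herd[0].pos)
```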

There's a lot of extra development that has to happen; instead of having a toolset ready on day one, you have a shell that you gradually start filling in with everything they discover they want to use. It was very productive because we had something driving it and we got a very clear view of what they wanted to use. This is the good thing about having somebody like Jon who's tech savvy and wants to use technology. But you have to respond very quickly because you can't hold up the shooting because of it.

 

Q - How important was it to adapt the technology for traditional filmmakers?

A - If you take a traditional cinematographer and put them in a VR headset, chances are they will not like it at first. But in this case, you had someone like Caleb willing to try it for the first time, and he ended up spending hours in it – accepting the technology and then starting to use it more.

The pipeline, and the rendering technology that we added on top, are what enabled them to manipulate things with these tools. [Game engine technology] out of the box does not offer what is needed for large-scale productions – the huge environments, lots of animals, challenging lighting scenarios, all sorts of animation edits. A lot of the value that we brought to the table was enhancing the technology so that you could render more things and get better results.

 

Q - Can you talk more about the pipeline and the methodology that evolved on The Lion King?

A - A big part of it was figuring out the workflow. How do you have people sitting in different parts of the world, working together on a big feature where there isn't a real stage? How do you make sure that when the cinematographer goes onto this ‘virtual’ stage, they have a shot to work with? How do you make sure you can take what they've done, come back tomorrow, and pick up any portion of the work from before – because you’ve tracked everything that happened? How do you define what composes your shots? And then how do you take all of that and turn it over to post-production without having to start from scratch every time?

The original challenge was figuring out a workflow where we define how to bring assets in – and when we’re done, be able to take them out and understand what happened to them. We have the set, the lighting, the cameras, the lenses – and we have to track all of that, handle all of the shots in a reliable way, and not miss anything that’s been done.

As we built this pipeline and tracking system, the backbone of the virtual production, we then had all of the information that we needed to build our layouts for post-production. We also discovered other things in production, such as a particular way of recording and tracking data for creating animation; our own implementation was completely rewritten and went in a different direction, one that gave us smooth animations and reliable animation data.
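As an illustration of the kind of bookkeeping described here, a hypothetical tracking model might record, for every take, which versions of the set, lights, camera, and lens were used, so any shot can be reopened later or handed to post-production without starting over. All names and fields below are invented for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AssetRef:
    name: str      # e.g. "pride_rock_set" (hypothetical)
    version: int   # immutable versions make a take reproducible later

@dataclass
class Take:
    shot: str                 # e.g. "SHOT_0420" (hypothetical naming)
    set_ref: AssetRef
    lights: tuple             # AssetRefs for every light in the setup
    camera: AssetRef
    lens_mm: float
    recorded_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

class ShotLog:
    """Append-only record: nothing done on the virtual stage is lost."""
    def __init__(self):
        self.takes = []

    def record(self, take):
        self.takes.append(take)

    def latest(self, shot):
        matches = [t for t in self.takes if t.shot == shot]
        return matches[-1] if matches else None

log = ShotLog()
log.record(Take("SHOT_0420",
                set_ref=AssetRef("pride_rock_set", 7),
                lights=(AssetRef("sunrise_key", 2),),
                camera=AssetRef("cam_A", 3),
                lens_mm=50.0))
print(log.latest("SHOT_0420"))
```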

 

Q - How did that benefit the filmmakers, and especially the final VFX and animation?

A - As the animated characters were developed, you were shooting more and more refined versions of them – more refinement in their movement and their performances.

After virtual production and then final animation, they had some VFX shots where they realized the cameras were not quite right. At that point you have two options: get an animator with a mouse to redo the camera; or go back to the system that we used and reshoot.

The difference is that now you don't just have a previs representation anymore. You have closer-to-final assets and characters, with full muscles, tendons, maybe even fur, that you can play back in the engine. For example, you can see Simba through his final animation on the final set, and the director can actually direct the camera vis-à-vis his performance. As a tool for filmmaking, it’s very powerful, because now you're shooting with your final performance.