April 25, 2017

MPC’s Damien Fagnou: Meeting the Technology Infrastructure Challenge of Bringing the Alien: Covenant In Utero 360-Degree VR Experience to Market

  • Twentieth Century Fox, FoxNext VR Studio, RSA VR, MPC VR, a Technicolor company, Mach1 and technology partners AMD RADEON and Dell Inspiron today announced the release of ALIEN: COVENANT In Utero, a virtual reality experience.
     
  • MPC, a Technicolor company, played a key role in production of the 360-degree VR experience based on the forthcoming “Alien: Covenant” movie.
     
  • MPC was able to meet a very tight production schedule by using teams around the world interconnected by the Technicolor private network.
     
  • The two-minute VR experience would have taken a single processor 50 years to render, but the task was completed in a few days by using thousands of processors in MPC’s cloud-based rendering farm.
     
  • Throughout the development of this virtual reality experience, MPC utilized both AMD RYZEN and RADEON technologies within DELL Inspiron systems.

Damien Fagnou, Chief Technology Officer of MPC, explains some of the challenges faced in creating “Alien: Covenant In Utero”, a 360-degree VR experience that has been distributed ahead of the theatrical release. In this interview, he describes the role of cloud-based computing resources that enabled half a million hours of rendering to be completed in a few days.

 

From a CTO’s perspective what were the challenges and opportunities associated with developing the VFX portion of this project in which MPC took a leadership role, while also producing a 360-degree video VR deliverable?

Fagnou: That was certainly a great opportunity for us, and when we created the MPC Film VR group about a year ago it was really to provide services of this type for our clients. We have been very fortunate to work with Ridley Scott Associates on this type of movie before, so we were in a good position to support the VR project and help them realize their vision.

We were able to re-use the pipeline and the software that had been developed for the Alien films and connect with the right creatives and members of the asset team to build the extra assets that would form part of that special experience.

 

How were you able to re-use existing things you had in place to support the VFX? Was there anything additional from a technology standpoint, any additional assets or technology infrastructure that had to be allocated in order to support not only the VFX of the theatrical production but the parallel VR initiative?

Fagnou: We have a large global software team of over 120 people, and they were able to jump right in and create special rendering techniques for cinematic VR.

For this project we had to develop very specific ray tracing techniques to cope with the stereoscopic nature of the camera, and with special treatment at the top and bottom so the experience is very comfortable for people using VR headsets.

And we had to leverage the pipeline to do some special effects that you will be able to see in the video and in the audio translation of that experience. It was great to be able to tap into the large existing R&D resources and infrastructure at Technicolor to provide those services to our clients very rapidly.

 

The 360-degree VR experience beat the theatrical release to the market. It was meant to be a bit of a teaser for the movie itself.  What kind of time pressures did that put on the MPC team and what did you have to do from a technological resource standpoint to meet what must have been a tight schedule?

Fagnou: In the visual effects world we have seen the timeframes become very tight because the client wants to iterate more, or they want to really polish their films. So we have seen visual effects timeframes go from being almost a year to nine or even six months with the final push window becoming just 16 weeks.

There was maybe a three-month production window for us to complete a piece of quite complex work. We are talking about very high resolution frames and about R&D that needs to be activated very quickly. And of course it still has to meet visual effects photo-real standards.

 

Can you give me an idea of the global infrastructure and the collaborative platforms you put in place to support this particular initiative?

Fagnou: We were able to leverage the Technicolor Production Network (TPN) that connects all the global Technicolor locations, including those of MPC, to build truly global teams. We had the Bangalore team creating some of the custom assets that were in the VR experience but are not in the films.

We were able to create fully digital environments for this experience, and custom characters that also were not present in the films.

The team in London was able to create some of the character finishing to make the experience truly beautiful. In Montreal we did the bulk of the shot work for this two-minute experience: all the effects and all the lighting, including the compositing. In Los Angeles we had the supervision for the show, and the director driving all that global effort.

By leveraging the TPN and the proprietary MPC syncing toolkit we were able to have one global team working on that project following the sun and meeting a really tight deadline while allocating the right resources in the right locations.

 

My understanding is that some of the motion capture work, and even some of the compression and encoding, was done at the Technicolor Experience Center. Can you talk about how all those pieces of the pie were managed by you as the CTO?

Fagnou: The Technicolor Experience Center (TEC) is a great place that Technicolor provides, where clients can come and experience all the latest technology in VR. It is also an incubator for some of our VR experiences, both for MPC Film VR and for some of the other business units within Technicolor that are creating their own motion capture environments.

We were able to connect with the TEC so they could be part of the story and help provide an all-round service, not only the special effects but also expertise in VR encoding.

All of this plays a role in how you plan the rendering process.

We talked before about this being a cinematic VR experience, and the beauty of doing it in a fully computer-generated environment is that the rendering can be pixel perfect.

360-degree VR is experienced inside the headset, where you can look around in any direction in stereoscopic vision and have a great sense of immersion and depth. To produce those pictures in a pixel-perfect manner you need a very customized rendering pipeline in which every single pixel of the picture is rendered from exactly the right angle.

This posed some interesting challenges, first to integrate it into the pipeline and review it in compositing, and also in terms of what happens at the top and bottom, where you are looking up at the sky or down at the floor.

So we developed a very customized rendering plug-in. We decided to create large 4K x 4K, photographically accurate 360-degree frames to enable a 3D experience that is very immersive, very real, and that provides the sense of presence that is so important in VR.
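
MPC’s actual plug-in is proprietary, but a minimal sketch of the general idea, written here in Python with assumed names and an assumed 64 mm eye separation, shows how each pixel of a lat-long stereo frame can be given its own ray origin and direction, with the eye separation faded out towards the top and bottom so that looking straight up or down stays comfortable:

    import numpy as np

    def ods_ray(u, v, eye, width=4096, height=4096, ipd=0.064):
        """Camera ray for one pixel of an omnidirectional-stereo lat-long frame.

        u, v : pixel coordinates
        eye  : -1.0 for the left eye, +1.0 for the right eye
        ipd  : interpupillary distance in metres (assumed value)
        Returns (origin, direction) in camera space.
        """
        # Map the pixel to longitude (full 360 degrees) and latitude (+/-90 degrees).
        theta = (u + 0.5) / width * 2.0 * np.pi - np.pi      # longitude
        phi = np.pi / 2.0 - (v + 0.5) / height * np.pi       # latitude

        # Viewing direction for this pixel of the panorama.
        direction = np.array([np.cos(phi) * np.sin(theta),
                              np.sin(phi),
                              np.cos(phi) * np.cos(theta)])

        # Fade the eye separation to zero towards the poles, so that looking
        # straight up at the sky or down at the floor stays comfortable.
        offset = eye * 0.5 * ipd * np.cos(phi)

        # Offset the ray origin sideways along the interpupillary circle,
        # perpendicular to the horizontal view direction.
        origin = offset * np.array([np.cos(theta), 0.0, -np.sin(theta)])
        return origin, direction

    # Example: the left-eye ray for the pixel at the centre of a 4K x 4K frame.
    origin, direction = ods_ray(2048, 2048, eye=-1.0)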

 

It sounds like you are working with a tremendous amount of data. What kinds of capacity and performance did you need to put in place to make sure the hardware, software, and network requirements were met to support what you have just described?

Fagnou: One of the big strains a cinematic VR experience puts on the pipeline, even a triple-A visual effects pipeline like ours, is the size of the frames and the timeframe.

With 4K x 4K frames you are talking about eight times the standard visual effects frame size. So of course those require very long render times. We are looking at about 100 hours for a single frame of that VR experience and it has to be done for two minutes of video at 30 frames per second.

That represented a very large amount of rendering: just shy of half a million hours. To put that in perspective, on a single computer it would take 50 years to render the experience that people see, but of course we had to deliver it in a very short timeframe. We had to turn around this entire sequence in three or four days, so we could keep working right up to the deadline and provide all the creative feedback to our directors.

So it requires four to five thousand computers put together into a farm to render that experience in that short a time. That puts huge pressure on the render farm.
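
A simple back-of-envelope calculation shows how those round figures hang together; the sketch below assumes a 2K (2048 x 1080) frame as the standard comparison point and ignores stereo views, retakes and scheduling overhead, which push the raw total towards the quoted half a million hours:

    # Back-of-envelope check of the round figures quoted above (illustrative only).
    pixels_vr = 4096 * 4096          # one 4K x 4K lat-long frame
    pixels_std = 2048 * 1080         # a typical 2K visual effects frame
    print(pixels_vr / pixels_std)    # ~7.6, i.e. roughly eight times larger

    frames = 2 * 60 * 30             # two minutes at 30 frames per second -> 3,600 frames
    hours_per_frame = 100            # approximate render time per frame
    raw_hours = frames * hours_per_frame   # 360,000 hours before iterations and overhead

    print(raw_hours / (24 * 365))    # ~41 years on a single computer
    print(raw_hours / (4 * 24))      # ~3,750 machines to finish in about four days

The roughly 3,750 machines implied by a four-day window is consistent with the four-to-five-thousand-computer farm described above.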

With MPC providing secure rendering in the cloud, we were able to increase our capacity in the Montreal farm by a large amount, while still providing the secure environment that all these frames require and continuing to deliver rendering power to all the other productions we are working on.

So cloud rendering really represents a great enabler for this type of experience, which requires very large rendering resolutions that will only increase with better headsets and higher frame rates.

By leveraging MPC’s and Technicolor’s leading secure rendering processes in the cloud we will be able to provide the levels of iteration and quality that our clients require, and that their audiences want to experience in their productions.