April 24, 2017

Technicolor’s Bob Eicholz: Successful Immersive Experiences Require Entertainment Value Chain to Rethink All Key Disciplines

  • New immersive entertainment capabilities like virtual reality (VR) and augmented reality (AR) will enable more intimate connections with end users by placing the audience virtually inside the experience, rather than simply in front of it.
     
  • Despite recent gains in the speed and power of processing and storage equipment, immersive experiences will require the industry to rethink the entire technical infrastructure.
     
  • The lifecycle and workflow of the production process need to change substantially in order to move these immersive experiences forward in a commercially viable manner.

Behind all the excitement and buzzwords surrounding virtual reality (VR) and augmented reality (AR), it is easy to miss the fact that immersive reality-based entertainment is the birth of an entirely new medium.

The opportunities throughout the entertainment value chain are clear and present. Analyst firms like Digi-Capital are bullish on the AR/VR market, forecasting that revenue will hit $120 billion by 2020. A new Juniper Research forecast expects the VR hardware market -- including VR headsets, peripherals, and 360-degree cameras -- to grow to more than $50 billion by 2021.

But one of the most significant challenges is that there is no single, plug-and-play solution today that meets all the requirements for a fully immersive consumer experience while remaining true in all respects to the creative vision. The entertainment industry must rethink all of the key disciplines, from content creation to workflows and beyond.

To better understand the issues that need to be addressed to create an effective end-to-end ecosystem, we caught up with Bob Eicholz, Chief Technology Officer, Production Services at Technicolor. Eicholz is taking a leadership role in exploring and resolving the technical aspects of immersive experiences, and he shared with us the challenges and opportunities surrounding the next generation of digital entertainment.

 

Bob, can you share with us your perspective on the rapidly evolving immersive experience technology landscape?

Eicholz:  This is an incredibly exciting time.  Those of us who have been around the industry for a while – back in the 2000s when we transitioned from film and photochemical to digital – kind of feel this happening again – only more so with AR and VR.  These new immersive experiences are going to affect every aspect of our pipeline and require us to rethink the entire technical infrastructure – just like we did when we went from film to digital.

We had to design whole environments of computer systems and ways of manipulating images.  Now we have to do that again.  So this is really the birth of a new medium – a new way of telling stories, a new way of having interactive experiences that makes us go back to the drawing board and re-think pretty much everything we’re doing.

 

What are the image and pipeline issues that you see as we move into this new, immersive environment for entertainment?

Eicholz:  That’s a huge issue and core to what Technicolor does.  If you look at some of the existing VR and AR experiences, they’re wonderful; they give you a sense of where it’s going.  But you see a lot of pixels, you see people that are kind of blurred and don’t really look like people, the skin texture is not there yet.

Those are all because of current technical limitations.  As those limitations go away, as we get better processing speeds, all of a sudden the image is going to become more important.   It won’t be good enough just to see a solid color.  People are going to expect the images that they see in their AR and VR experiences to look real, to actually look like a real person sitting in front of you.

This goes to the core of what Technicolor does. We work on movies like The Jungle Book where we make animals that talk look real.  And we are now doing this for emerging projects within the VR pipeline.

From a color standpoint, the current issue is that different cameras used for real-time capture record images in different ways. The color looks different from camera to camera. We need to learn about that. We also need to learn how to preserve that color as you work on the image, as you make it interactive, as you put a character into a new environment. We need to appreciate how to maintain the color so that it looks real.

If you just drop it in and don't think about that, it's okay for now, but that's not going to be good enough in the long run. And finally, how do all the colors in the pipeline and the look of the images translate onto the different devices – whether it's an iPad or a head-mounted display (HMD)?

There are six or seven different types of headsets now. So how do you create an experience that starts with the camera, preserves the image all the way through the pipeline, and gives the user a reasonably consistent experience at the end of the day when they view it? How can we ensure that viewers are seeing what the content creator envisioned?
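To make the device-consistency problem concrete, here is a minimal sketch of the kind of color management Eicholz describes: camera RGB is converted once into a shared working space, and each viewing device then gets its own output transform from that same working-space value. The matrices and device profiles below are placeholder assumptions for illustration only, not Technicolor's actual pipeline.

```python
import numpy as np

# Illustrative only: placeholder 3x3 matrices standing in for a camera's
# native-to-working-space transform and per-device output transforms.
CAMERA_TO_WORKING = np.array([
    [1.02, -0.01, -0.01],
    [-0.02, 1.03, -0.01],
    [0.00, -0.02, 1.02],
])

WORKING_TO_DISPLAY = {
    "hmd_a": np.eye(3),                     # hypothetical headset profile
    "tablet": np.diag([0.98, 1.00, 0.97]),  # hypothetical tablet profile
}

def render_for_device(rgb_pixels: np.ndarray, device: str) -> np.ndarray:
    """Carry camera RGB into the working space, then out to a device profile."""
    working = rgb_pixels @ CAMERA_TO_WORKING.T
    return np.clip(working @ WORKING_TO_DISPLAY[device].T, 0.0, 1.0)

# The same working-space value rendered for two different devices.
pixel = np.array([[0.25, 0.40, 0.60]])
print(render_for_device(pixel, "hmd_a"))
print(render_for_device(pixel, "tablet"))
```

The point of the shared working space is that creative decisions are made once; only the final device transform differs, which is what keeps the experience "reasonably consistent" across headsets and tablets.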

That's where Technicolor plays in the movie business and the game business today, and that's where we are now playing in the VR/AR/MR space.

 

What are the processing and storage issues as you see them immediately evolving?

Eicholz:  Let’s take this back to the early film days again. When we went to digital, the processors simply weren’t fast enough for everything that we wanted to do.  Now they’re getting close to being there.

In AR and VR, the processors have to live in a headset, and the render pipelines that deal with the images as they’re being manipulated by artists require incredibly fast processors that generate tons of heat.

We need processors that are on a whole new universe of steroids to do what we want to do. The same thing is true for storage. When you look at light field technology and the amount of data that's generated, technical people raise an eyebrow when you tell them how much data we have to deal with.

It’s not only about storing that data but also transmitting that data over the limited bandwidth that we have on today’s network infrastructures.
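For a rough sense of the scale involved, here is a back-of-the-envelope estimate of an uncompressed light-field-style capture. Every number in it – view count, resolution, bit depth, frame rate – is an illustrative assumption, not a figure from the interview.

```python
# Back-of-the-envelope data-rate estimate for a light-field-style capture.
# All numbers below are illustrative assumptions.
views = 16            # number of sub-views / camera positions
width, height = 4096, 2048
bits_per_pixel = 30   # 10-bit RGB
fps = 60

bits_per_second = views * width * height * bits_per_pixel * fps
gigabytes_per_second = bits_per_second / 8 / 1e9
print(f"{gigabytes_per_second:.1f} GB/s uncompressed")        # ~30.2 GB/s

# A two-minute take at this rate:
print(f"{gigabytes_per_second * 120 / 1000:.1f} TB for 2 minutes")
```

Even with these modest assumptions, a single short take lands in the multi-terabyte range, which is why both storage and transmission over today's network infrastructures become bottlenecks.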

The whole AR/VR experience requires a level of processing and storage we've never seen before in the consumer market. If you have any doubt about this, download some of the free VR experiences on the Internet to your iPhone and play with them for 10 minutes. Your battery probably goes to 50 percent almost immediately and your iPhone gets really hot, because even though the processor in the iPhone is extremely powerful, it's working really, really hard.

And that's just today's experience. You can imagine the types of speed, heat and electricity issues that we're going to be dealing with in the next few years to take the experience to a new level.

 

How do the technical limitations you are describing affect the actual creation process – the tools that must be used to create these stories that you’re talking about?

Eicholz:  The state is pretty much the same; the technologies are in their infancy.  If you take someone who works on visual effects in movies today and just orient them a little bit to what we're going to be doing with VR and AR, that creative person is suddenly going to get it.

Within the first hour, they're going to understand what this is. Then they're going to be asking for all kinds of tools that don't exist today. Let's say you have an image captured with a 360-degree camera and you bring it into an environment where they can see it through a headset or on their screen. The first thing they're going to say is, how do I manipulate the color on that?

They’re going to say this part of the scene is too dark and I want to lighten that up.  Or I want to shift the user’s focus to this part of the scene.  How do I do that and what tool do I use to do that and what does the interface look like?

And once I do it, how do I play it back so I can see what I just did?

These tools are still only just being developed. Pieces of the tools exist, but they all have to be modified for this new environment – to provide a 360-degree or virtual creative environment that artists can actually create within.

 

What about sound?

Eicholz: The same goes for sound. Today, if you're looking at a flat screen, sound can move back and forth and front to back. But with a headset – or a VR or AR experience – the object is moving around in space, and the sound needs to follow that object. So what are the tools that enable us not just to manipulate where the sound is in the consumer's head, but to manipulate those sounds as the consumer moves around and marry them to the location of what they're looking at?

For example, if a user is experiencing a character in a VR environment and the character walks across the room, you have to be able to track that sound and you also have to be able to track the fact that the user may not look at the character as the character is walking.

You want the sound to follow, but you also want to – as the sound moves across the room – change the nature and quality of that sound.
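As a rough illustration of what "the sound follows the object" means in practice, here is a minimal sketch that computes a distance-based gain and an azimuth for a mono source relative to the listener's head orientation, so the character keeps sounding like it is in the right place even when the user looks elsewhere. The 1/r falloff, the 2D geometry, and the numbers are simplifying assumptions, not a production spatial-audio engine.

```python
import math

def spatialize(source_xy, listener_xy, listener_yaw_rad):
    """Return (gain, azimuth) for a mono source relative to the listener.

    Inverse-distance gain plus the source's angle relative to where the
    listener's head is pointing, so the sound tracks the character even
    when the user is looking somewhere else.
    """
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    distance = max(math.hypot(dx, dy), 0.1)   # avoid divide-by-zero up close
    gain = min(1.0, 1.0 / distance)           # simple 1/r falloff
    azimuth = math.atan2(dy, dx) - listener_yaw_rad
    # Wrap the relative angle back into [-pi, pi].
    return gain, math.atan2(math.sin(azimuth), math.cos(azimuth))

# A character walking across the room while the listener keeps facing forward (yaw = 0).
for x in (-3.0, 0.0, 3.0):
    gain, az = spatialize((x, 2.0), (0.0, 0.0), 0.0)
    print(f"x={x:+.1f}  gain={gain:.2f}  azimuth={math.degrees(az):+.0f} deg")
```

Changing the "nature and quality" of the sound as it moves – occlusion, reverberation, filtering – would sit on top of this basic position tracking, which is exactly the tooling gap Eicholz is describing.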

We need a whole lot of tools to do things a whole lot better – and we need them to work in real time, which is the other issue we're dealing with here.

We need the ability to manipulate images, then play them back right away to see: what did I just do, and how does it look? Does it parallel the original intent that I had when I made that change? There are pieces of these tools out there today, but nothing that brings them all together, and all of them have a long way to go.

 

What does the lifecycle and workflow of the production process look like, and what needs to happen to move this experience forward in a commercially viable manner?

Eicholz:  It’s kind of the same thing [that happened] with film to digital.  When that happened, we had to completely rethink the workflow.  With film, before we had digital, it used to be very linear – you were cutting pieces of film.  Then all of a sudden, we had a scenario where you’re capturing images digitally, you’re doing some manipulation there and you’re taking that sequence of images through an entire pipeline right on set.

Over the years, we have developed very good workflow tools, and people are still working on them. These tools enable collaboration and the assignment of individual shots to individual artists – you can parse out shots, you can give instructions, and we have tools that allow for review and comment on different scenes and images within the moviemaking process.

That’s going to be the same thing with VR.  If you have live capture you’re going to have some live footage, you’re probably going to have some 3D objects in that, and all of that then needs to be distributed to the creatives who are going to make it look better.

Managing that entire flow all the way to the end – where the final rendering is done, and ultimately the quality control (QC) and distribution – is an enormously complicated workflow.
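One way to picture the kind of shot-level tracking Eicholz describes – parsing out shots to individual artists, attaching review notes, and moving each shot toward final render, QC, and distribution – is a small data model like the sketch below. The stages, field names, and shot names are hypothetical, chosen only to illustrate the shape of the workflow.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Stage(Enum):
    CAPTURE = auto()
    COMP = auto()
    FINAL_RENDER = auto()
    QC = auto()
    DISTRIBUTION = auto()

@dataclass
class Shot:
    name: str
    assignee: str
    stage: Stage = Stage.CAPTURE
    notes: list[str] = field(default_factory=list)

    def advance(self, note: str = "") -> None:
        """Move the shot to the next stage, recording an optional review note."""
        stages = list(Stage)
        idx = stages.index(self.stage)
        if idx < len(stages) - 1:
            self.stage = stages[idx + 1]
        if note:
            self.notes.append(note)

# Parse out shots to individual artists and track them toward QC and distribution.
shots = [Shot("vr_scene01_shot010", "artist_a"), Shot("vr_scene01_shot020", "artist_b")]
shots[0].advance("360 plate ingested; 3D character placed")
shots[0].advance("Color matched to working space")
print(shots[0].stage, shots[0].notes)
```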