A Look at How ILMxLab Is Developing the Future of Entertainment

There’s a technology cabal located deep within the confines of Lucasfilm’s Letterman Digital Arts Center in San Francisco. It’s here that one of Disney’s top R&D teams is experimenting with the future of storytelling through virtual reality, augmented reality, and mixed reality technology. The makeup of this division, dubbed ILMxLab, is a who’s who of top digital magicians from Industrial Light & Magic (ILM), Skywalker Sound, LucasArts and Lucasfilm.

Consumers have already gotten a glimpse of what the future holds when it comes to these emerging platforms. ILMxLab released the free Trials on Tatooine interactive experience on HTC Vive in 2016, and there are several 360-degree Star Wars experiences from The Force Awakens and Rogue One available to anyone. There’s even a demo of a Magic Leap mixed reality experience on YouTube that shows R2-D2 and C-3PO interacting in a living room with real people. And that’s just the tip of the iceberg. Behind the scenes, ILMxLab technology is helping directors bring new Star Wars films to life, both early in pre-production and on set.

Rob Bredow, Chief Technology Officer at Lucasfilm and head of ILMxLab, explains why he’s excited about the future of entertainment in this exclusive interview from the VIEW Conference in Turin, Italy.

I&T Today: Augmented reality is something that’s already out there with Microsoft’s HoloLens. What potential do you see for AR in the future?

Rob Bredow: I like what John Gaeta says about it, which is “augmented reality is going to hit us like a ton of bricks.” And it really is. We see so much opportunity on the storytelling side and on the immersion side for AR, especially when you get to see it in the room around you. That’s why we’re starting our experiments on that front now. Our goal is to be several years ahead of where the technology is available to the public, to be looking at what storytelling looks like on these different platforms, and to be running experiments with world-class creatives to get to the heart of what these new mediums have to offer us from the storytelling perspective.

I&T Today: Is the work you’re doing in VR helping with the AR side?

RB: Yeah, there are a lot of similarities. There are a lot of differences, too. And we’re just scratching the surface of that. But all the character work, all the AI work, all of the first-person immersion, how you interact with the person, a lot of that applies very, very well. The world building seems to be a little different, and of course interacting with the real world has its own host of opportunities and challenges too.

I&T Today: We’ve seen Pokémon Go introduce a lot of people to mixed reality. What opportunities do you see with augmented and mixed reality?

RB: We’ve been doing augmented reality and mixed reality experiments for the last few years. In fact, at ILM we’ve been doing virtual production work, which ties almost directly into that. As we started our mixed reality experiments, we realized how closely our experience with virtual production, from motion capture on stage to putting elements over plates, mapped straight into mixed reality. The thing we’re really interested in right now is: what are the kinds of stories and the kinds of interactions you want to have in that mixed reality environment? Games are going to be fantastic, and Pokémon Go is a great example of that. They can be more casual and they can be more involved. We’re also interested in what storytelling feels like in that environment and how characters need to react to you, when they should and shouldn’t be there. We’ve done some collaboration with Magic Leap, some of which we’ve shared, and you get a sense of characters you might have been familiar with your whole life actually walking around in the room you’re in. It really is a different kind of experience.

Screenshot from Trials on Tatooine | Steam

I&T Today: R2-D2 and C-3PO were featured in one of those Magic Leap experiments running on Unreal Engine 4 technology?

RB: That’s right – standard game engine technology rendered in real time. When you look at that video, with C-3PO and R2 walking around the table and bringing the hologram up, that’s actually shot through Magic Leap’s hardware. We put our camera behind their hardware, and when you get in there and look at some of the details, you’ll actually see there are some rack focuses and things in there, and the computer graphics focus just like the real world. There are some subtle details that, when you’re just watching it on YouTube, could make it look like a visual effects trick, but we promise we put a real camera behind the prototype version of their technology and that’s what we’re getting out of it. It’s much more immersive than we can even express in that YouTube video, so it’s pretty exciting.

I&T Today: Can you talk a little bit about how telling stories in VR, AR, and mixed reality is borrowing from video games as well as Hollywood?

RB: My hunch is that the native stories that flourish in VR or mixed reality are going to be something different. But they’re going to borrow from filmmaking techniques and they’re going to borrow from video game techniques, because we’re looking for that interactivity, we’re looking for whether it’s a true adventure or not, we’re looking for at least a sense of agency in the story. We’re also looking for great storytelling, and that’s one of the reasons we want stories told to us, and have for tens of thousands of years. Sometimes when you’re actually letting someone choose their own adventure, you may not have that compelling a story. You end up on the short branch, your story just ends, and you have to play it again to get what you were looking for. We’re really interested in trying to find which elements to borrow from each of those worlds. My hunch is that we’re actually going to come up with new techniques that borrow from both of those fields.

I&T Today: For a really long time, the term convergence has been used in Hollywood. In some ways do you feel like VR or mixed reality is a realization of that, but it’s different than what people were originally thinking?

RB: Yeah, that’s right. People have been talking about how we’re going to be rendering movies on GPUs for a really long time, and we’re actually starting to do that; some of the technology we’re building in the advanced development group is getting us closer and closer to that goal. So we are seeing some of that convergence, but this is the first really good reason to see that convergence. When you can use the same Millennium Falcon that you’re using in Episode VII, and a few months later you can be playing with it in virtual reality – and it’s the same artist who made the textures for the real thing – you’re experiencing things in VR on a very similar timeframe because of the shared technology and the convergence we’re talking about here. That’s where you’re actually bringing real value to the equation, and then it’s actually going to happen.

Featured image: Rob Bredow’s portrait from Lucasfilm, used under fair use.

By John Gaudiosi
