
Advanced Technology Succeeds with Tighter Budgets on Real Steel

By Glenn Derry

My team created something that we call Simulcam, which provides a sort of augmented reality. We pre-record CG characters and backgrounds, and feed them into a camera playback system on set. The director can then see CG elements, motion-captured characters and live actors composited together in the viewfinder or the monitor, using a real-time chroma key. He can direct the movie as it will be seen in theaters, rather than guessing on set how live action will work with CG and hoping to make it work in post.
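At its simplest, that real-time composite is a chroma key: wherever the live camera feed shows green screen, the pre-rendered CG frame shows through instead. The sketch below, in Python with OpenCV, is only a toy illustration of that idea with assumed threshold values; it is not the actual Simulcam pipeline, which also handles camera tracking, lens data and playback sync.

# Illustrative only: a naive green-screen key, not the actual Simulcam pipeline.
import cv2
import numpy as np

def composite_frame(live_bgr, cg_bgr,
                    lower_green=(35, 80, 80), upper_green=(85, 255, 255)):
    """Replace green-screen pixels in the live frame with the pre-rendered CG frame."""
    hsv = cv2.cvtColor(live_bgr, cv2.COLOR_BGR2HSV)
    # Mask of pixels that look like the green screen (threshold values are guesses).
    key_mask = cv2.inRange(hsv, np.array(lower_green), np.array(upper_green))
    key_mask = cv2.GaussianBlur(key_mask, (5, 5), 0)          # soften the matte edge
    alpha = (key_mask.astype(np.float32) / 255.0)[..., None]  # 1.0 where CG shows through
    out = live_bgr.astype(np.float32) * (1.0 - alpha) + cg_bgr.astype(np.float32) * alpha
    return out.astype(np.uint8)

Run something like that per frame against the synced previz playback, and the result is what the director sees in the monitor.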

This is the next generation of the system that we developed for James Cameron to use on "Avatar." We gave him a tool that he could use in real time to combine pre-viz environments, motion capture, and the typical pieces of a live action movie. It was played back for Jim, but also sent straight into the Avid for on-set editorial. Editing and metadata cataloguing were beginning even as the shots were being recorded, all connected together into what we call a virtual production system.

It was a little different on "Real Steel." With "Avatar," we spent years experimenting with this technology; here, it was ready to go. More important, these tools were used less for visual effects than to support traditional narrative storytelling. Yeah, there are boxing robots in "Real Steel," and I understand why, when you look at the trailer, you might think that it's only boxing robots, but the main part of the story is between a father and a son, played by Hugh Jackman and Dakota Goyo, who's just amazing. The robots are a backdrop to this father-son story. It's a really heartfelt movie, and the approach that the director, Shawn Levy, took very much reflected that. So we were able to use the tools we developed for Jim Cameron on "Avatar" to support more traditional storytelling by someone who isn't a visual effects-oriented director. It actually made things much easier for Shawn. He was able to lean on his animators and effects team to come up with solutions to problems, and focus his attention on the performances. Speaking from a technology perspective, the Simulcam system we designed was a major contribution to Shawn's ability to focus on the actors -- and tools from Blackmagic Design are a major contribution to our ability to make our virtual production system work so well.

Virtual Location Scouting

The fights took place in four or five arenas, and you might say we scouted the locations virtually. One was a field, another was a factory, and another was set in an abandoned zoo in Detroit. We would review each location, do HDRs, and build a mock-up of the area. The art department could use that to build a physical set that we put into our motion-capture environment.


This allowed us to choreograph a fight months ahead of time, so that the actual layout of the physical set was built into the CG model much earlier in the process.

Those decisions help inform the art direction, because while you're choreographing, you see a shot and think, "That would be cool. You know what would be even cooler? If we add a doorway here...or we're going to need more people here." We take care of all of this months ahead of time, and shoot it with our virtual camera. We already know where the CG elements are, and we already know where our live actors need to fit into it -- and we know that the entire shot works.

We can go even further. "Okay, it's going to be a 35mm lens, facing northwest," and a lot of other things that we would have had to work out on set anyway. We can get away with building only half an arena, and know that we only need 600 extras to sit in sections A1 to A6. That way, we know we'll always have a full set of extras in the background of every shot. We used them to populate the stadium, and used 180-degree flips depending on what side we were shooting from.

Then once we start shooting, everything makes sense geographically, within the context of the fight itself. So, you have this fight with really cool choreography. You've predetermined the angles that will work really well for the actual fight between CG characters that aren't even there. And you have your live actors positioned exactly where they need to be. That's how the director gets the extra time to spend with his actors, focusing on their performance. Every aspect of the logistics has been mapped out long ahead of time. It's not, "Let's just shoot now, and figure out later where it fits." There's no sense of place that way.

This way, you know exactly where you are in the fight the whole time. The camera operator, the Steadicam or crane operator -- whatever they aim at, however they frame the shot, it all makes sense, because they can see the previz CG and the live actors together in their viewfinder or on their monitor, in real time. Camera operators can see exactly what they're going to get from a set-up, all the way through post. This is imperative for giving them the freedom to do their work on set, and be confident that it's exactly what the production needs. It's exciting to watch them once they see that they can work in that kind of context. With that information in the hands of the director, the DP and all the camera operators, you're basically walking onto the set with an uncut sequence in hand. "Okay, this is my shot, this is my shot, this is my shot." Breaking it down this way, sort of by the numbers, we were able to jam through these huge fight sequences very quickly. Think about the biggest, nastiest CG effects shots: how long does it take to set those up? Forever, right?

Not for us. We were picking up days on what would normally be the slowest part of a production.

Location Tech

We did some stage work, but the bulk of the work was in several live-action locations in Detroit. The easiest thing for us was to set up a 22-foot production trailer that housed engineering, some of the video assist equipment, an Avid Unity with two Avid editing seats, and a Truelight color station. We were also able to roll our video assist carts and other gear in and out as we moved from location to location.

Otto Nemenz provided our Sony F35s, lenses and camera platforms, and they helped us set up a quad-link network connected over fiber. The cameras had one little box on the back of them, and everything came out of the camera over a single CAT4 fiber that ran back to the truck. From the truck, we would send back out to the set everything they needed for monitoring at the video assist carts, video village and all that good stuff. In certain locations we also had to set up way, way down the street. We brought six miles of fiber in all.

The cool thing about fiber is that it's fiber, man. It's really durable, and the signal integrity is very high if it's set up right.

In addition to the picture mastering, we did a lot of what we call Image-Based Capture. We had a series of cameras -- in this case, six EX3s -- placed around the location, pointing to an area, and we basically did live optical flow motion capture, to capture things that weren't mo-capped during pre-production.
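Optical flow here just means estimating, frame to frame, how each pixel (and therefore each tracked feature) moves in a witness camera's view; motion is then solved from several such views. As a generic single-camera illustration, not the production's actual solver, dense flow can be computed like this in Python with OpenCV:

# Illustrative only: dense optical flow between consecutive frames of one witness camera.
# The input file name is hypothetical.
import cv2

cap = cv2.VideoCapture("ex3_witness_cam.mp4")
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Per-pixel motion vectors (dx, dy) from the previous frame to this one
    # (Farneback parameters: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags).
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)
    # flow[y, x] is the estimated displacement of pixel (x, y); a real capture pipeline
    # would combine fields like this from all six cameras to reconstruct the motion.
    prev_gray = gray
cap.release()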

On a normal shooting day, then, we would have three F35s and six EX3s connected to the truck, a cart for the guy operating the EX3s, and feeds for the video assist guy, the video engineer, the colorist and editorial, plus the other video assist carts around the set for the director and others. All of it was run through Blackmagic Design Videohubs.

Because every single node in the chain was also a networked Videohub -- we had GigE running over fiber everywhere -- we could switch any monitor from anywhere else on the network. We built a cool little custom front-end application so that people could pre-set a bunch of buttons to do whatever they wanted them to do, and then just press one button to switch all the Videohubs into a particular shooting mode.
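The Videohub routers speak a plain-text control protocol over Ethernet, so a preset button in a front end like ours boils down to pushing a batch of output-to-input routes at every router on the network. The sketch below only illustrates that idea, assuming the standard Videohub Ethernet protocol on TCP port 9990; the addresses and route tables are made up, and our actual application worked differently.

# Rough sketch: pushing one routing preset to networked Videohubs.
# Assumes the Videohub Ethernet control protocol (plain text over TCP port 9990);
# device addresses and route tables are made-up examples, and a real client would
# fully parse the router's initial status dump and its ACK responses.
import socket

PRESET_FIGHT_MODE = {
    "10.0.1.21": {0: 3, 1: 3, 2: 7},   # e.g. video village monitors -> Simulcam comp
    "10.0.1.22": {0: 1, 4: 2},         # e.g. editorial and colorist feeds
}

def apply_preset(preset, port=9990):
    for host, routes in preset.items():
        # One "VIDEO OUTPUT ROUTING" block: "<output> <input>" per line, blank line to end.
        body = "".join(f"{out} {inp}\n" for out, inp in routes.items())
        block = f"VIDEO OUTPUT ROUTING:\n{body}\n"
        with socket.create_connection((host, port), timeout=2) as s:
            s.recv(65536)                      # skim the router's initial status dump
            s.sendall(block.encode("ascii"))
            print(host, s.recv(1024).decode("ascii", "ignore").strip())

apply_preset(PRESET_FIGHT_MODE)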

None of this had to be in any one place. That's very different from other routers, in that we didn't need to home-run everything. Everything could be dispersed in the way that made the most sense physically.

Color-Timed Dailies

We used Blackmagic Design Mini Converters for all of the hard copper SDI connections in each of the carts, and then we used the Blackmagic Design HDLink Pro 3D DisplayPort on some of our color-correct monitors. The cool thing about those is that you can play out proper output transforms to the monitor you're working on. Say you've got an HP DreamColor monitor attached. You can build the proper Rec. 709 output, probed so that it's correct, and then build an output transform just for that monitor so that it matches back to the color profiles already set up during pre-production. These will match up perfectly in the DI later on.
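That per-monitor output transform amounts to a correction layered on top of the Rec. 709 target: you measure how the display actually responds with a probe, then build a LUT that pre-compensates so the response lands back on target. The sketch below illustrates the idea with simple per-channel 1D gamma trims; real calibration for a box like the HDLink loads measured 3D LUTs, and the gamma numbers here are invented.

# Illustrative only: a per-monitor "trim" built from probe measurements as 1D LUTs.
# Real monitor calibration uses measured 3D LUTs; the gamma values below are made up.
import numpy as np

TARGET_GAMMA = 2.4                                    # common display target for a grading suite
measured_gamma = {"r": 2.55, "g": 2.38, "b": 2.47}    # hypothetical probe results

def build_trim_lut(target_gamma, actual_gamma, size=1024):
    """LUT that pre-distorts code values so the monitor's actual response lands on target."""
    x = np.linspace(0.0, 1.0, size)
    return np.power(np.power(x, target_gamma), 1.0 / actual_gamma)

luts = {ch: build_trim_lut(TARGET_GAMMA, g) for ch, g in measured_gamma.items()}

def apply_trim(rgb):
    """rgb: float image in [0, 1] with shape (..., 3); returns the trimmed image."""
    idx = np.clip((rgb * 1023).astype(int), 0, 1023)
    return np.stack([luts[c][idx[..., i]] for i, c in enumerate("rgb")], axis=-1)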

We treat that live/CG material from the Simulcam just like you would treat footage coming off the video tap of a camera. For that and the live-action footage, we're cataloguing metadata right into the Avid as it's rolling, while we're shooting. The footage is color-timed properly, with all the appropriate hooks that will allow you to get back to the source media, like an online-offline workflow.
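Those hooks are just metadata that travels with every clip (reel and tape name, source timecode, the color decision applied) so the offline cut can always relink to the original media. As a rough illustration of the kind of sidecar involved, here is how clip records might be written out as an Avid Log Exchange (ALE) file in Python; the columns and values are examples, not the production's actual schema.

# Illustrative only: writing clip metadata as an Avid Log Exchange (ALE) file,
# the kind of sidecar that lets an offline cut relink back to source media.
# Column choices and values are examples, not the production's actual schema.
clips = [
    {"Name": "A015_C003", "Tape": "A015", "Start": "14:22:31:05",
     "End": "14:23:10:17", "ASC_SOP": "(1.02 0.99 1.01)(0.00 0.01 0.00)(1.00 1.00 1.00)",
     "Scene": "37A"},
]

ALE_COLUMNS = ["Name", "Tape", "Start", "End", "ASC_SOP", "Scene"]

with open("dailies_day12.ale", "w") as f:
    f.write("Heading\nFIELD_DELIM\tTABS\nVIDEO_FORMAT\t1080\nFPS\t23.976\n\n")
    f.write("Column\n" + "\t".join(ALE_COLUMNS) + "\n\n")
    f.write("Data\n")
    for clip in clips:
        f.write("\t".join(clip[col] for col in ALE_COLUMNS) + "\n")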

As soon as we hit the stop button, we're done. We have all the footage captured, color-timed and sitting in the Avid, ready for anything. We have an assistant editor on the set to make sure that everything is coming in organized the way the editor wants it. He can start cutting on the spot, or walk away with a hard drive that has everything on it -- color-timed footage, real-time previz comps, all kinds of metadata -- and be ready to go. The material is already prepped, so basically, we're doing finished dailies as we shoot, immediately.

Counting Up The Savings

Working this way, we conservatively cut $2.5 million out of our budget. The money came out of film processing, and it went into hardware on the front end to a certain degree, but it was so much more cost-efficient and so much more elegant.

I'm a proponent of doing things right. Doing so much work in setup means that the production is as efficient as it can be, both in the live action shooting and budgetarily. As a result, I'm becoming more of a producer now. I understand the technology, and how it can be used with the right people operating the equipment so that things are as efficient as they can be. Otherwise, all this tech is a waste of breath. You have to understand where the budgetary efficiencies are.

We've been learning a lot about this over the years. People think that we had an unlimited budget on "Avatar." Let me tell you, everything was budgeted. It was tight. Every single penny spent on making "Avatar" is in the final print.

I look at a film like "Real Steel," and we came in under $80 million. I don't kid myself. This is still a big-budget picture, but it has all the same large-scale visual effects that you would see in a $200 million movie. The reason we were able to do it is because we planned it. We knew what we were going to shoot, and we shot it. Not one day went over -- not one -- so by the end, we actually wound up ahead of the production schedule.


Glenn Derry

Virtual Production Supervisor

