VR, or virtual reality, was the hot topic on the show floor at this year's NAB. But who is making the content, and how is it made? We spoke to New Deal Studios about their recent work shooting and editing 3D 360 virtual reality on features, commercial projects and music videos.
A company with an impressive list of VFX movie credits such as The Dark Knight Rises, The Avengers and Hugo is now pushing the boundaries in the editing, VFX and colour correction of VR. Unlike many production companies, New Deal Studios has gained early expertise in 3D 360 video production, where a different image is sent to each eye to create depth in all directions - a technique known as 'stereoscopic 360 video'.
We had the pleasure of talking to Matthew Gratzner, Co-Founder & Creative Director, and Jeffrey Jasper, Digital FX Supervisor & CTO, of New Deal Studios. You might not have heard of New Deal, but you have definitely seen their work. The re-creation of the Montparnasse train crash in Hugo is one example, but you will also recognise their work in The Avengers, The Dark Knight and Shutter Island, and their Academy Award-winning visual effects work on Interstellar.
Matt: I cut in FCP 7 for a long time, and what I realised with the adoption of FCPX was that if you could get past the general nomenclature, the cutesy stuff, the 'it looks like iMovie' - then in terms of complexity and the material it can handle, it is superior to a lot of cutting tools out there.
I understood why Apple did this: professional filmmakers are a tiny segment of their customer base, so when they changed the names, they just made it more user friendly, with terms like 'scene' or 'event' as opposed to saying it's a bin.
Jeff and I were fortunate to get into virtual reality about a year and a half ago, and a lot of people ask me why Apple hasn't jumped on VR. I say that if you look at the trends, a lot of tech software and hardware companies are all trying to jump in, and they are making promises that don't happen because the technology is still being developed. Apple under-promises and over-delivers, and at some point they will come out with something that will knock everybody out, and everybody will say 'that's what VR should have been.'
As more VR productions came about, we found Final Cut Pro X was the best way to cut the projects. Bottom line, it handled the footage.
New Deal didn't start out on Macs, however; they were using Windows PCs and Unix-based computers before Jeff arrived.
Jeff: We had tested Final Cut Pro X when it first came out and we felt it wasn't great for us, so over the years we have tried everything under the sun.
When we were working on the Fast & Furious ride for Universal Studios, we needed something very fast on set for editorial. I went back and tested it and was blown away by the performance and the footage it could handle, as we were working in 5K RED RAW.
So when VR started, I once again went through testing, and FCPX just sailed through material that other programs chugged on. We're cutting 8K native on an iMac that's a few generations old, and it handles it without a problem.
Pretty amazing stuff, especially when other NLEs can't play back 4K consistently without problems on maxed-out systems. But how did they go about actually cutting the 3D 360 material?
Matt: The first VR project I was involved with was with a company called Jaunt. They came out of 'stealth mode' in 2014 with a camera that was basically 14 GoPros assembled in a prototype housing; using their software, you could create 3D 360 video.
I saw their demo, which I said was really cool, with scenes of the Golden Gate Bridge and a string quartet, but 10 or 12 seconds in I said, 'What else have you got?' A producer, Vicki de Mey, had approached Jaunt with a contact to a World War 2 reenactor named Hans Beerbaum - they were going to shoot the World War 2 enthusiasts having a battle. I said, 'Don't just stick a camera down with a bunch of guys running around with guns, let me come up with something that's narrative.'
So I brought in two terrific writers I work with often, Joe Kanarek and Ryan Gaudet, and the three of us came up with a World War 2 narrative called The Mission. At the time, I was told that in VR you could not cut, you could not move the camera and everything had to be first person - which, as you can imagine, is quite limiting.
Matthew Gratzner directing The Mission in 360 VR.
So we only had about three and a half weeks to prepare for the shoot. We wrote the script as first person, and in the test we realised two things.
I moved the camera around, and as I had blocked the scene, I ended up watching it as third person, not first. Watching it back over the next few days, I realised you didn't need first person; it was almost as if you were actually 'in a Hollywood movie', and it was a new, different way to experience a film.
Shooting The Mission using a Jaunt camera.
The second realisation happened to Matthew by accident. While shooting the test there was a glitch with the camera, and in processing the footage the glitch was cut out, jumping from one established location to a different one. Cutting between scenes, which had previously been thought not to work in VR, actually worked in this case.
Matt: The main shoot for The Mission was over two days and we had a lot of actors, soldiers, a tank, motorcycles - everything you'd expect in a World War 2 piece. When I got the footage back, I turned to Jeff and said, 'How are we going to cut this?'
I thought that I'd have to cut everything as a single eye and then match the cut up with the other eye. Jeff dropped the footage into Final Cut Pro X and it all just worked. So what I thought was going to be weeks of experimentation ended up with me cutting the film in a day. It's only a six minute short, but I couldn't believe it. I thought, 'Am I the only guy doing this?'
A tank arrives in 3D 360 VR on The Mission.
Matt: I was working with proxies in FCPX, as you would with any large format project. The challenge here was the final conform at the end, when you bring in all the high-resolution media. This is because you don't want to have to stitch all the high-resolution footage together before you start cutting.
During the conform, when I had to do tweaks or a layout with the 6K footage, I couldn't believe how smooth it was. This is a 6K version of over/under stereoscopic imagery and it's processing it in real time!
I was just blown away. The things we are doing seem to be so simple, so 'pushbutton', that you would have expected FCPX to be doing VR stuff for 10 years. I really couldn't believe it; it just treated the VR like any other footage.
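The over/under format Matt mentions stacks both eyes in a single frame, one above the other. As a minimal illustration of the idea - not New Deal's actual tooling - splitting such a frame back into its per-eye images is just a crop, assuming the common left-on-top convention:

```python
import numpy as np

def split_over_under(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split an over/under stereoscopic frame into its two eye images.

    Assumes the common convention of the left eye in the top half and
    the right eye in the bottom half; some pipelines flip this.
    """
    height = frame.shape[0]
    left_eye = frame[: height // 2]     # top half
    right_eye = frame[height // 2 :]    # bottom half
    return left_eye, right_eye

# A square over/under frame yields two 2:1 equirectangular images:
frame = np.zeros((4096, 4096, 3), dtype=np.uint8)  # stand-in for real footage
left, right = split_over_under(frame)
print(left.shape, right.shape)  # (2048, 4096, 3) (2048, 4096, 3)
```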
Another example of this is a music video I wrote and directed for the indie rock band Galvanized Souls, in which I used speed ramps extensively. We were shooting at 60fps and, by using optical flow in Final Cut Pro X, the final result looked like we were shooting at 200fps. Once it had rendered, it played in real time. It treated VR just like any other footage.
Shooting 360 VR for the Galvanized Souls music video.
Galvanized Souls - New Generation (watch in Chrome to get the 360 VR)
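The speed ramps Matt describes come down to simple arithmetic: to make 60fps footage read as though it were shot at 200fps, you play it back at 60/200 = 30% speed, and optical flow synthesises the in-between frames that were never captured. A rough sketch of the numbers (the 30fps timeline is an assumption):

```python
def retime_speed(capture_fps: float, apparent_fps: float) -> float:
    """Playback speed that makes capture_fps footage look like it was
    shot at apparent_fps: 60fps played at 30% reads as a 200fps shot."""
    return capture_fps / apparent_fps

def real_frames_per_second(capture_fps: float, speed: float) -> float:
    """Captured frames consumed per second of timeline at a given speed;
    the remaining timeline frames must be optical-flow interpolations."""
    return capture_fps * speed

speed = retime_speed(60, 200)             # 0.3 -> a 30% speed ramp
real = real_frames_per_second(60, speed)  # 18 real frames per timeline second
print(speed, real)  # on a 30fps timeline, 12 of every 30 frames are synthesised
```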
Matt: We are cutting with latlong files, which is like cutting with giant panoramas. The reason is that I directed in 360 knowing where all the action happens and where I'm trying to lead the audience's eye - I really have to see all the action unwrapped. You may catch something in a take that might be better behind you, and knowing where the action happens with the quadrants unfolded is a lot easier. Once I get a cut that I feel is working, I'll watch it in 360: I'll export a movie file, then compress it down to an mp4 to watch on Gear VR.
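A latlong (equirectangular) file simply unwraps the whole sphere onto a rectangle: horizontal position maps to yaw and vertical position to pitch, which is why action behind the camera is still visible at the edges of the frame. A minimal sketch of that mapping (the yaw/pitch conventions are assumptions; stitchers differ):

```python
def latlong_pixel(yaw_deg: float, pitch_deg: float,
                  width: int, height: int) -> tuple[int, int]:
    """Map a viewing direction to a pixel in a 2:1 equirectangular image.

    Assumed convention: yaw runs -180..180 degrees with 0 at frame centre
    (straight ahead) and +/-180 directly behind; pitch runs -90 (down)
    to 90 (up).
    """
    x = int((yaw_deg + 180.0) / 360.0 * width) % width
    y = min(int((90.0 - pitch_deg) / 180.0 * height), height - 1)
    return x, y

print(latlong_pixel(0, 0, 4096, 2048))    # (2048, 1024): straight ahead
print(latlong_pixel(180, 0, 4096, 2048))  # (0, 1024): behind wraps to the edge
```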
I still don't know what is going to track with an audience, but what I have found is this:
Imagine a scene where you would normally get from A to B in 10 cuts; in VR, you have to figure out how to do that with one cut. That doesn't mean things have to be made up of one continuous long shot, but if things are stylistically cut up, it can be hard to follow.
You can be looking in the wrong direction on a cut and it can be very, very disorienting. My idea of VR filmmaking is very much like a John Ford or Howard Hawks approach to shooting a movie, where you have to know at all times where the camera is and where the actors will be. You can't fix that in post.
You have to have a clear understanding of how the script is going to translate to the shoot and how it will be cut together. The more cuts you have, the more disorienting it will be.
Shooting 'Mutiny' using the Nokia OZO camera.
So with Matt doing a proxy cut, what happens afterwards, and what about creating VFX in 3D 360? This is where Jeffrey takes over:
Jeff: Matt would do the proxy cut, and from there it goes to final stitching and I do the VFX and compositing on those. We are working at full resolution in Nuke for compositing, and with Modo, which has a spherical stereoscopic camera, for CG.
We have been fortunate to work alongside The Foundry on their Cara VR toolset, which has just gone into beta. At the beginning I created my own tools in Nuke, but The Foundry's are significantly faster and easier to use. Out of Nuke I'll render a full-resolution EXR image sequence and a ProRes 422 file that goes back to editorial. For example, on our Nokia OZO work, Matt will then replace the 8K Nokia OZO footage with the final 8K VFX footage.
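That hand-off - a full-resolution EXR sequence for finishing plus a ProRes 422 file for editorial - is rendered from Nuke in their pipeline. Purely as an illustration of the same step, here is how an EXR sequence could be flattened to ProRes 422 with ffmpeg (filenames and frame rate are hypothetical):

```python
import subprocess

# Hypothetical filenames and frame rate; in New Deal's actual pipeline
# Nuke writes the editorial file directly.
subprocess.run([
    "ffmpeg",
    "-framerate", "30",             # assumed timeline frame rate
    "-i", "mutiny_vfx.%04d.exr",    # the full-resolution EXR sequence
    "-c:v", "prores_ks",            # FFmpeg's ProRes encoder
    "-profile:v", "2",              # profile 2 = ProRes 422
    "-pix_fmt", "yuv422p10le",      # 10-bit 4:2:2, matching ProRes 422
    "mutiny_vfx_editorial.mov",     # the file that goes back to editorial
], check=True)
# Note: real pipelines also apply a linear-to-display colour transform
# when going from float EXRs to 10-bit ProRes; that is omitted here.
```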
Jeff shot the model ship on an iPhone and composited that into the scene on Mutiny.
Matt: With any type of project, VR or any digital filmmaking, you want to try to keep the project inside the software as much as possible. We were approached early on to do the colour on dedicated colour grading systems, and we tried it out. There was a weird colour shift, though, that I didn't like.
I used the colour board in FCPX for the edit and I found I liked that a lot better. I was also concerned that every time I took the project out of FCPX, there would be some kind of generation or quality loss. So what was nice was not only FCPX handling the format, but it being able to do the colour correction as well.
On the music video I mentioned, I even used the multiscreen transition built into FCPX, and that worked perfectly, just like it was a piece of video - no different! We've also been using the Dashwood VR tools that allow you to reorient the sphere or add graphics - all on the timeline.
It all meant that I could finish the picture without ever taking it out of the FCPX timeline.
A huge thank you to Matt and Jeff from New Deal Studios for taking the time to talk with us.
Take a look at this video from The Foundry where Matt & Jeff talk about using their tools in making 3D 360 VR.
©2016 FCP.co