
Explosions, fire, paint, slow motion & 3D 180 - What more could you want? Daniel Fabelo runs through the Final Cut Pro X workflow he uses when editing for the popular YouTube channel The Slow Mo Guys.

Introduction:

In late December, I was approached to work with The Slow Mo Guys on a new project for the Oculus TV platform.

If you didn’t know already, The Slow Mo Guys has over 13.5 million subscribers on YouTube and some of the coolest slow motion footage on the internet, shot on various types of Phantom cameras.

The Slow Mo Guys is the creation of Gavin Free and Dan Gruchy. I met Gavin more than five years ago at Rooster Teeth Productions and we have worked together on various projects since.

My experience with VR content is limited, having done just a few never-released experiments with 360 content at Rooster Teeth. But it was enough for me to understand some of the fundamental concepts of VR production.

That said, this was a whole different thing altogether. And, the results were well beyond what I imagined: watching 3D fire up close is mesmerizing.

What follows is a broad story about how we created 3D VR180 slow motion content and what we learned along the way. Hopefully, it leads others to experiment and work more with 3D VR. Here’s a trailer that—at least in 2D—shows the scope of the project: 

Why FCPX?:

For the most part, it comes down to speed controls, organization, and ease.

First, Gavin uses FCPX for all his conventional videos. As one might imagine, speed ramping is used often in Slow Mo Guys content. Taking video from 2,000 fps to 29.97 fps at the most satisfying moment is an important part of the process. Even the sound effects are all manipulated. This can be done in the timeline with great ease.

Second, most Slow Mo Guys videos are pretty straightforward, often utilizing only two cameras: a DSLR to address the audience as well as the Phantom camera. However, when you’re doing 8 episodes with multiple cameras, two Phantoms and numerous timelines, organization matters far more than it does on smaller projects.

Lastly, I’m simply a happier editor in FCPX.

Laura James, our stereographer and workflow consultant, is also Apple based and works extensively with Stereo180 and 360 content. Here’s what she had to say about FCPX:

“When Apple switched to FCPX, before it was fully functional again, I had a distinct need for multicam features which were not introduced up front upon release. So I had to make the switch to Adobe Premiere. Over the next year or so I made a genuine effort to learn and appreciate Premiere Pro. The minute Apple added the needed features back into FCPX, I made the jump back and have never looked back.”

"The stability, speed, and intuitiveness of basically everything in FCPX makes it my absolute go to for Flat, 360 and stereo180 content."

“When I first started editing 360, I set up an Oculus DK2 for reviewing stitch lines and overall viewer experience. Then Apple updated to High Sierra and I had to switch to Vive for review, which is driven by Steam.

It is not as stable as the Oculus DK2 was, but it is fully functional and, in my opinion, a requirement for any 360 or stereo editor. The stability, speed, and intuitiveness of basically everything in FCPX makes it my absolute go-to for Flat, 360 and stereo180 content.”

[Image: The Slow Mo Guys fire experiment]

The challenge was not necessarily the creative endeavor itself, but building a workflow for 8 episodes of 3D VR180 content in FCPX.

Eric Cheng, a pioneer in the space and Head of Immersive Media at Facebook, mentioned that Windows-based editing is heavily favored in immersive video production. But I have been using FCPX exclusively since 2015, and other than a few tutorials online (including several from Eric himself), there is not much information in this space. Nonetheless, we had Eric’s full support if we could deliver a high quality end product.

That led us to perform a few tests in all sorts of programs: testing masks, trying stitching software (including Mistika VR and Z Cam VR), and even checking whether my system could handle the mammoth 6K resolutions. In fact, I determined that, for rendering, I needed to purchase an eGPU to take my MacBook Pro the rest of the way. (I am, after all, a mobile editor now.)

The Challenges of Creating 3D VR180 Content:

For this project we used the Z Cam K1Pro and two Phantom Flex 4Ks on a stereo mirror rig. The on-set 3D alignments were done by Keith Driver, whose support and guidance proved invaluable throughout the production.

[Image: the cameras used on the shoot]

Working with a K1Pro is very different from working with regular film cameras. Since the camera can basically “see” itself, we had to hide the crew and position it to optimize Gavin and Dan’s experiments.

We made sure the K1Pro was raised to 5 feet 4 inches every time we repositioned. It also had to be level. Very level. Our subjects typically had to be 5 feet away.

We slated and synced the K1Pro normally, though keep in mind there is no need to jam sync. It is, however, very important when recording field audio that you have a slate, as the K1Pro’s audio is usually 3-4 frames out of sync with the picture. Additionally, since we cannot slate the Phantoms, I kept a log of each file name Gavin saved so that it matched the K1Pro take.
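If you ever need to nudge the audio by hand outside of FCPX, the offset is easy to quantify. A quick Python sketch of the arithmetic, assuming a 29.97 fps timeline (the 3-4 frame figure is simply what our slates showed):

```python
# Back-of-the-envelope arithmetic for the K1Pro audio offset.
# Assumes a 29.97 fps (NTSC) timeline; 3-4 frames is what the slates showed.
NTSC_FPS = 30000 / 1001  # 29.97 fps

for frames in (3, 4):
    offset_ms = frames / NTSC_FPS * 1000
    print(f"{frames} frames ~= {offset_ms:.1f} ms")

# 3 frames ~= 100.1 ms
# 4 frames ~= 133.5 ms
```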

For a closer look at the stereo rig and the on set production, check out Gavin’s BTS video on his second channel:

Ingest and Organization:

The K1 footage could not simply be brought straight into FCPX. First, I batch stitched the K1Pro footage on set in VR180 Creator.

The program takes each lens, or “eye”, from the camera and stitches them together. The output stereo layout is Top/Bottom with a 180 degree field of view at 2880x5760. Though the mask differs from software to software, this was the fastest and most efficient program to use. Besides, we would mask out the K1Pro lenses later anyway.
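If you are batch stitching a lot of takes, it is worth sanity-checking the output files before import. Here is a minimal Python sketch using ffprobe (assumed installed; the filename is a hypothetical example):

```python
import subprocess

# Verify a stitched VR180 Creator output has the expected Top/Bottom frame:
# 2880x5760 overall, i.e. one 2880x2880 square per eye.
out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=width,height", "-of", "csv=p=0",
     "k1pro_take01_stitched.mp4"],  # hypothetical filename
    capture_output=True, text=True, check=True,
).stdout.strip()

width, height = map(int, out.split(","))
assert (width, height) == (2880, 5760), f"unexpected size: {width}x{height}"
print(f"OK: {width}x{height} ({width}x{height // 2} per eye)")
```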

[Screenshot: VR180 Creator]

Once in FCPX, I proxied, synced and organized the Library. Media lived in one Event, sound effects files in another and Projects in their own.

I used Smart Collections and Folders to separate out the episodes for ease. The Phantom clips require a different workflow than a regular Slow Mo Guys project because there are two Phantom clips for each take.

Once Gavin sent me the converted clips, now ProRes, I created a multicam of each set of Phantom clips. That way, once I reached the editing phase, speed ramps and effects would easily ripple to the other camera angle in the multicam. Both angles then need to be exported and sent to Laura for her to create the stereo.

(Note: It’s very possible to have one system do this work, however, at the time, I did not have the knowledge base to take on both the editing and the stereo work.) 

[Screenshot: a Phantom multicam in FCPX]

Now that the multicams are synced, I create a traditional stringout for each episode. The Project timeline is 3840x2160 and Monoscopic.

I open each multicam and flip Angle B using the Flip effect; this is because the mirror rig captures one of the angles flipped on the X-axis. I then duplicate the multicam, place it (in sync) on top, and make sure the bottom clip is Angle A and the top is Angle B.

[Screenshot: the Phantom stringout timeline]

Stereo Alignments: 

As you can see, there is a disparity between each shot. So, I do my stereo alignments to adjust for imperfections from the mirror rig.

This is a multi-step process that includes pinning the corners and using the Distort tool to align the Y-axis. I scale each image to ~102%, drop the opacity of Angle B to 50%, and align the X position if needed.

That is to say, I align the horizons of the two images along the X-axis first. Then I do some cross-corner distorting, aligning objects in the background (e.g. pebbles, trees or bits of gelatin in the air). It looks like you are overlapping the objects, but really you are making sure the top and bottom of each pebble line up horizontally.

If I am off by more than a few pixels, the viewer’s eyes will hurt in headset. Once I feel like my alignments are good, I can adjust the layers to Red and Cyan, respectively, and put on those classic 3D glasses to check my work. Pretty cool!
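For anyone who wants the same Red/Cyan check outside of FCPX, ffmpeg’s stereo3d filter can produce a comparable anaglyph. A sketch called from Python, with hypothetical filenames standing in for the two exported angles:

```python
import subprocess

# Stack the two aligned angles Top/Bottom, then convert to a red/cyan
# anaglyph (Dubois method) you can eyeball with classic 3D glasses.
subprocess.run(
    ["ffmpeg",
     "-i", "angleA_aligned.mov",   # left eye (hypothetical filename)
     "-i", "angleB_aligned.mov",   # right eye (hypothetical filename)
     "-filter_complex", "[0:v][1:v]vstack=inputs=2,stereo3d=abl:arcd[v]",
     "-map", "[v]", "-c:v", "prores_ks",
     "anaglyph_check.mov"],
    check=True,
)
```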

[Image: checking the 3D on set]

Once everything is aligned, I then match the color for each Phantom clip. Why am I doing this so early in the process? Mainly because it is easier.

Plus, once I send it to Laura she has everything she needs. The color corrections are straightforward so I just use the Color Board in X. (Note: It is still possible to do minor color adjustments and sharpening later, once I have the stereo footage.)

Done with color, I copy and paste all these color and alignment settings back into the multicam clips. Remember, we are still in a stringout timeline, so I have to get all this work into each clip.

In my stringout timeline, made of those multicams, I delete Angle B and remove the attributes from Angle A. Instead of two multicams on top of each other, I now have a single clip; switching between the angles reveals all the effects and alignments.

Editing the Phantom Footage:

As with any Slow Mo Guys video, I find my speed ramp moments, and cut points if there are multiple shots together. FCPX makes working with speed ramps so easy, and I use the Blade Speed command all the time.

I edit with Gavin and Dan’s reactions in mind as well, making sure the speed ramps do not interrupt their voice over. This is also where I determine if we will need a mono PIP (Picture in Picture). If so, that file is sent separately to Laura so she can place it “underneath” the Phantom footage.

[Screenshot: editing the Phantom footage]

At this stage, I also sound design the clips, then send them to Gavin for approval. After all, it’s easier for him to review in 2D.

Once approved, I can export for Laura. I do this by sending ProRes HQ clips of each angle and each PIP as needed, very similar to a typical VFX workflow.

Creating Stereo Footage in FCPX:

The workflow for creating beautiful stereo clips out of the Phantom footage is a bit convoluted.

For working in FCPX, stereo footage needs to be in Top/Bottom format. When you turn a “flat” image (the Phantom footage) into stereo, you start by duplicating the same eye Top and Bottom. Then you manually create the second eye and do the same thing. Once you have both eyes created, you use the Crop tool, all within a compound clip, to crop the top half of one “eye” and the bottom half of the other.
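Outside of FCPX, the equivalent operation is simply stacking the two eyes into one Top/Bottom frame. A rough sketch with ffmpeg from Python, filenames hypothetical, as a stand-in for the compound clip method described above:

```python
import subprocess

# Build a Top/Bottom stereo frame by stacking the two Phantom "eyes"
# with ffmpeg's vstack filter.
subprocess.run(
    ["ffmpeg",
     "-i", "phantom_left_eye.mov",    # hypothetical filename
     "-i", "phantom_right_eye.mov",   # hypothetical filename
     "-filter_complex", "[0:v][1:v]vstack=inputs=2[v]",
     "-map", "[v]",
     "-c:v", "prores_ks", "-profile:v", "3",  # profile 3 = ProRes 422 HQ
     "phantom_stereo_tb.mov"],
    check=True,
)
```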

Projection and distance changes have to be done inside the compound clip. Once the compound is in the main timeline, convergence can be adjusted. This is best done with a viewing headset connected to your computer.

[Screenshot: the Phantom PIP setup]

Adding the PIP is not just a matter of dropping a flat image into the mix. Where to place it and what size to make it turn out to be incredibly important, along with convergence, to make sure it “feels” right, or at least tolerable to the viewer, and sits in the right plane for the stereo.

At the end of the day, we felt that placing our PIP where you normally see a menu in VR (beneath the main image and centered) made the most sense. There it was not occluding the main imagery or causing a lot of disparity, as it would if placed on top of an image with a lot of stereo activity.

Once you place it, you must watch it in a headset to confirm stereo comfort, stereo window, convergence and so on.

Editing in the 360 Timeline:

Now, I can work with the K1 footage. Easy enough really. Each Project is 5760x5760 and 360 Stereoscopic at 29.97 fps.

The K1 footage, however, is 5760x2880 because it is 180 degree content. That’s okay. We’ll deal with that later.

Once I have my selects in the Viewer, I dump them all in the timeline, reduce the X Scale to 50% and set the Left Trim to 1% on all clips. This squeezes the stretched footage back to 180 degrees, with a lot of empty space on either side (the rest of the full 360).
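The arithmetic behind that squeeze, as a quick Python sketch:

```python
# Why 50% X Scale works: the 360 timeline is 5760 px wide and 180 degrees
# of content should occupy exactly half of it.
TIMELINE_WIDTH = 5760                            # the full 360 degrees
content_width = TIMELINE_WIDTH // 2              # 2880 px of 180 degree image
margin = (TIMELINE_WIDTH - content_width) // 2   # 1440 px of empty space per side

print(content_width, margin)  # 2880 1440 - matching the crop we do at the end
```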

[Screenshot: the 360 timeline]

Gavin and Dan chat a lot with the audience. And none of it is scripted. So, the takes are often long as they explain their experiments and we watch them work.

Typically, Slow Mo Guys content uses jump cuts to transition between experiments. In this case, that felt a little too jarring in a VR environment. So, except for a few fun moments, we decided to put fades to black between takes. The fades simulate blinking, each less than a second long, and it seems to work well in headset.

[Image: fire grenades]

Once I receive the 360 stereo clips from Laura, I place each Phantom clip into my 360 timeline, and copy and paste my sound design from the other timeline.

I make sure the Roles are extra organized, because almost every Phantom clip is slowed down and manipulated, and that work won’t transfer in the AAF to the sound mixer (made with X2Pro Audio Convert). Each Role essentially becomes a stem for him to work with.

I color the K1 footage, making sure to really amp up the saturation, and add our custom 180 mask.

Exporting and Cropping:

Here’s the part that gave us some trouble at first, even after all our testing: How do we get this 360 timeline back to 180? And how do we maintain quality? Well, I’m still not convinced this is the best strategy given the time it takes, but we made it work.

I export an H.264 Master File out of FCPX, which lands very close to our target bit rate of 80 Mbps.

After 2-3 hours of exporting per episode (I know, right!?), I drop that render into Compressor. I keep the bit rate the same but crop 1440 pixels off either side. This takes another ~2 hours but gets us our 180 video.
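For reference, the same crop can be expressed outside Compressor. A hedged ffmpeg sketch from Python (hypothetical filenames) that trims 1440 pixels per side and holds the 80 Mbps target:

```python
import subprocess

# Crop 1440 px off each side of the 5760 px wide 360 render, leaving the
# 2880 px wide 180 image, at roughly the same 80 Mbps H.264 bit rate.
subprocess.run(
    ["ffmpeg", "-i", "episode_360_master.mp4",   # hypothetical filename
     "-vf", "crop=iw-2880:ih:1440:0",            # new width, full height, x offset
     "-c:v", "libx264", "-b:v", "80M",
     "-c:a", "copy",
     "episode_180.mp4"],
    check=True,
)
```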

[Screenshot: FB360 Encoder]

Finally, we drop this render into FB360 Encoder with the final mix, a 3rd order, 16 channel ambisonic mix with a headlocked stereo track (to keep certain sounds in place when the viewer moves her head), make sure to select the Facebook 180 option, and we’re done!
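If the channel count seems arbitrary, it is not: a full-sphere ambisonic mix of order n carries (n+1)² channels, so 3rd order means 16. In Python terms:

```python
# Ambisonic channel counts: order n -> (n + 1)^2 channels.
for order in range(4):
    print(f"order {order}: {(order + 1) ** 2} channels")

# order 0: 1 channel, order 1: 4, order 2: 9, order 3: 16
```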

How to Watch and Review in Headset:

One thing FCPX users don’t have going for them (at the moment, at least) is straightforward viewing of VR content. But there are a couple of ways to do it. One method, which Laura explains on her website, is using SteamVR and a Vive headset.

The other method I used is called “side loading” and involves manually transferring your exported test clips or full episodes to the Oculus Quest.

You’ll need a program called Android File Transfer. When you connect your headset to your computer via USB, peek into the headset and grant the computer permission to access it. (You may need to do this twice, as the first time you plug it in a “can’t connect, allow permissions” warning will pop up.) Drag the desired video file into the “Movies” folder. Then, in headset, go to Oculus Gallery and select “Internal”. Click on your video file, which will take you to a theater mode with the video on the big screen. Click on the icon next to the timeline and it should expand with some choices for viewing. Select 3D180 and you are good to go.
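If you are comfortable on the command line, adb push is an alternative to Android File Transfer. This assumes Developer Mode is enabled on the Quest and adb is installed; the local filename is a hypothetical example:

```python
import subprocess

# Side load an episode to the Quest's Movies folder via adb instead of
# Android File Transfer. Requires Developer Mode and adb.
subprocess.run(
    ["adb", "push", "episode_180.mp4", "/sdcard/Movies/"],
    check=True,
)
```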

[Image: Dan and Gavin]

Ultimately, I cannot speak to other Windows or Premiere based workflows for VR180 content, as I haven’t tried them.

We definitely made mistakes and learned a lot through trial and error. However, I believe that the natural ease of FCPX, coupled with potentially better render times, makes working with VR content at least manageable, if not more fluid.

That said, a powerful graphics card is a must when creating the stereo, or even laying in the custom lower thirds. 

If nothing else, we’ve proven it can be done!

For more information on the limited series, check out this interview between Gavin and Oculus on their blog; and check out the episodes on Oculus TV.

 

Daniel Fabelo is a writer/director best known for his work with the Austin, Texas based production company Rooster Teeth.

He has directed numerous projects, including the sci-fi feature Lazer Team 2, the digital series Haunter and the popular YouTube series Immersion, all of which were edited in FCPX.

Daniel has also been an editor since graduating from the University of Texas at Austin, and continues his work as a LatinX creator in film and television. Feel free to reach out to Daniel at his website: www.danielfabelo.com

 
