Ever wanted to know the complete workflow for a series, from shooting through editing in Final Cut Pro X to delivery? Sarah M. Deuel and David Payan have documented the entire post-production process of Rooster Teeth’s paranormal reality TV series, Haunter.
INTRO:
This article describes how we designed and implemented the Haunter series post-production workflow using the unique capabilities of FCPX. We provide an outline of the overall workflow and then describe each stage of our process in detail.
Our descriptions span the entire project, from Haunter’s on-set process, NLE organization plan, and assignment of metadata during editorial, through the setup of remote libraries, to our collaboration for final VFX, sound, and color. We also note troubleshooting issues we encountered and features we hope FCPX will improve upon.
Why FCPX?
Our project involved 100+ hours of footage and required a quick turnaround from production to delivery. We needed a post-workflow that would allow us to work collaboratively and expeditiously in conjunction with other post-production departments. FCPX met our needs, in part, because of its metadata capabilities. By opening up the conversation during pre-production, we were able to set up a system that several editors could follow to collaboratively create 16 episodes.
One main advantage that FCPX has over any other NLE is its ability to create and parse massive amounts of metadata.
Although this ends up being an advantage in the long run, many editors still approach FCPX with an unreasonable amount of caution because they’re used to the old way of thinking about NLE user interfaces and workflows: a method that relies on a plethora of bins and timelines to endlessly categorize media. Although this method works, it’s unnecessarily tedious and a bit dated.
Here is a metaphor that the Haunter post team likes to use to illustrate why metadata-based organization triumphs over the old bin structure:
With bins, you’re essentially creating a library bookshelf. Similar to the way librarians organize books, first by genre and then alphabetically by an author’s last name, bin organization is specific, limiting, and often requires more time or a larger post team to help navigate and find media within these limited categorical structures.
With metadata-based organization, you’re creating a search engine. FCPX works like the search engines you’re familiar with, using metadata to sift through seemingly infinite amounts of information to pull the most relevant results for specific inquiries at lightning-fast speeds.
As long as you develop a list of specific search engine optimization techniques upon ingest (keywords, note formatting, roles, subroles, markers, etc.), your editorial team will have the ability to use tools like smart collections, FCPX’s browser search bar, the Timeline Index, or separate database libraries to quickly find any type of clip or bite they may need.
Daniel Fabelo (Showrunner): "Admittedly, I have long-standing experience with FCPX, having edited with it myself on all sorts of projects. Then, as a director, we chose to cut Lazer Team 2 in FCPX. But it proved the most powerful with unscripted content. So, as co-creator of Haunter, I pushed to use FCPX for our 16-episode run. As a director, I find it so comfortable. Gone are the days of waiting on editors to perform simple edits, or searching for footage lost in bins or “selects” timelines. It’s so much faster than other NLEs.
As a showrunner, I love it even more because, though the deadlines do not change, I know I’ll have more iterations in the editing room. Because the Libraries are so well organized on the front end, and editing is generally faster in the timeline, I am able to see more experiments, watch more rough cuts and find our story within a huge amount of footage more efficiently than other NLEs. Furthermore, the team remains small and nimble, and onboarding is still relatively easy. In all aspects, I find FCPX to be the most comfortable platform for editing or directing."
1. OVERVIEW
About the Show
In their search for the paranormal, Geoff Ramsey and the Achievement Hunter crew employ some of the most unorthodox ghost-finding methods ever: luring spirits, antagonizing them, and provoking them – even recreating the spirits’ deaths. Everything and anything you’re not supposed to do!
Haunted by his own ghost encounter, Geoff Ramsey challenges the Achievement Hunter crew to explore a haunted location and draw out the lingering paranormal spirits. The Achievement Hunter crew will travel to the world’s most famous haunted houses, asylums and abandoned locales to recreate elements of local lore and scare the pants off one another.
Because the spirits from beyond feed off of fear, anger, and anxiety, the crew will come up with creative ways to prank and surprise their teammates... if the ghosts don’t do it first!
2. ON-SET EDITORIAL
On-Set Post Production
DIT
When ingesting footage from camera cards, use software such as Hedge to ensure that all of your media copies over without any errors.
For the unscripted portions of our shoots, each character was given a handicam, so we named their individual cards by character name and card number so that we would have that metadata associated with clips upon import into FCPX. (So instead of labeling Geoff’s card A001, we labeled it GEOFF_001.)
After Season 1, we experimented with creating manually transcoded low-res proxies. When time permitted, our DITs would transcode each card into a standard-definition h.264 proxy, making sure to retain each clip’s aspect ratio (a rough sketch of this transcode step follows the list below).
We wanted these low-res h.264 proxies for 2 reasons:
- The h.264 codec at a low resolution kept file sizes at an absolute minimum.
- The small proxy file sizes enabled us to create self-contained “remote libraries” that we could easily pass between editors working out of office. We will discuss our remote library workflow in detail later in this article.
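Our DITs handled this transcode with dedicated tools (and, later, Compressor), but the step is easy to sketch. Here's a minimal, hypothetical Python wrapper around ffmpeg that mirrors the idea; the folder paths and encode settings are assumptions, not our actual preset:

```python
import subprocess
from pathlib import Path

# Hypothetical card/proxy layout on the server -- adjust to your structure.
CARDS = Path("/Volumes/NFS_SERVER/HAUNTER/EP101/CARDS")
PROXIES = Path("/Volumes/NFS_SERVER/HAUNTER/EP101/PROXIES")

for src in sorted(CARDS.rglob("*.mov")):
    dst = PROXIES / src.relative_to(CARDS)
    if dst.exists():
        continue  # already transcoded on a previous pass
    dst.parent.mkdir(parents=True, exist_ok=True)
    subprocess.run([
        "ffmpeg", "-i", str(src),
        "-vf", "scale=-2:480",            # near-SD height; -2 preserves aspect ratio
        "-c:v", "libx264", "-crf", "28",  # low-bitrate h.264 keeps files tiny
        "-preset", "fast",
        "-c:a", "aac", "-b:a", "96k",
        str(dst),
    ], check=True)
```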
Production Sound
Make sure iXML data is recorded into your WAV files so that each microphone track of audio is named after the mic type or the character’s lav. This ensures that each track comes into FCPX pre-labeled (as a subrole) and allows for smoother sound turnovers later.
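If you want to confirm a recorder actually wrote those track names before the footage reaches FCPX, you can inspect the iXML chunk directly. A minimal sketch, assuming a standard RIFF/WAVE layout and the iXML spec's TRACK_LIST/TRACK/NAME tags:

```python
import struct
import sys
import xml.etree.ElementTree as ET

def read_ixml(path):
    """Return the iXML chunk of a WAV file as text, or None if absent."""
    with open(path, "rb") as f:
        riff, _size, wave = struct.unpack("<4sI4s", f.read(12))
        assert riff == b"RIFF" and wave == b"WAVE", "not a RIFF/WAVE file"
        while header := f.read(8):
            if len(header) < 8:
                return None
            chunk_id, chunk_size = struct.unpack("<4sI", header)
            data = f.read(chunk_size + (chunk_size & 1))  # chunks are word-aligned
            if chunk_id == b"iXML":
                return data[:chunk_size].rstrip(b"\x00").decode("utf-8", "replace")
    return None

if xml_text := read_ixml(sys.argv[1]):
    for track in ET.fromstring(xml_text).iter("TRACK"):
        index = track.findtext("CHANNEL_INDEX", "?")
        name = track.findtext("NAME", "(unnamed)")
        print(f"track {index}: {name}")
else:
    print("no iXML chunk found")
```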
Script Supervisor
Because Haunter was a show with a mix of scripted and unscripted portions in its format, we required our Script Supervisors to take notes in two different ways:
For the scripted part of our production, we sent our Script Supervisors a “Shot Log Template” from the third-party application SHOT NOTES X. This makes it possible to quickly apply the notes directly to the footage in the NLE later.
For the unscripted “ghost hunt” portions of our shoots and b-roll shots, the Script Supervisor created a document, broken down by each slated scene, that noted major story beats; memorable sound bites; moments that struck a chord of interest with the director and/or showrunner; or any technical problems encountered while rolling.
Our ghost hunts often contained hour-plus-long takes that were each slated. This is how our Script Supervisor structured that portion of the notes.
Article link for some further insight: Off the Grid.
3. ORGANIZATION
Ingest
In our workflow, our editorial workstations were set up by our tech team to be connected via an NFS server. When we received drives prepped by our DITs, we used Hedge once again to back up the drives to our server. Once the backup was complete, we created ingest libraries for each episode. This is where much of our organization and metadata creation took place before transferring media to “working” editorial libraries.
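Hedge handles the verification for you; under the hood, the idea is just a checksum comparison of source and destination. Purely as an illustration, here's what a manual spot-check could look like (the paths are hypothetical):

```python
import hashlib
from pathlib import Path

def file_hash(path: Path, bufsize: int = 1 << 20) -> str:
    """MD5 of a file, read in 1 MB chunks so large camera files stream."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while chunk := f.read(bufsize):
            h.update(chunk)
    return h.hexdigest()

src_root = Path("/Volumes/SHUTTLE_01/EP101")          # hypothetical DIT drive
dst_root = Path("/Volumes/NFS_SERVER/HAUNTER/EP101")  # hypothetical server copy

for src in src_root.rglob("*"):
    if src.is_file():
        dst = dst_root / src.relative_to(src_root)
        ok = dst.exists() and file_hash(src) == file_hash(dst)
        print(f'{"OK      " if ok else "MISMATCH"}  {dst}')
```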
The Ingest Library
Intro
The ingest library will be the Assistant Editor’s “Master” organizational library, where all of the footage will be ingested and organized into Events that mimic the folder structure from the day of the shoot. This is useful in the event that we ever need to backtrack to recover a missing media clip from our Editorial Library -- in this sense, our Ingest Library will eventually become a search tool for Assistants and Editors.
Setup - Library Preferences
Start off by creating a library and setting up the following preferences in FCPX:
With the “Leave files in place” setting shown above checked, FCPX references your media from your designated NFS server location instead of creating duplicate media (either within the library itself or in another connected location).
Checking “Assign iXML track names” is important to ensure that all of the on-set sound recordist’s media comes into FCPX pre-labeled with a subrole appropriate for turnovers later in the pipeline (as mentioned in the ON-SET POST PRODUCTION section above). This will also help keep your timeline organized throughout the editorial process.
Notice how I don’t have “Create proxy media” selected under the “Transcoding” section. I’ve found that when importing large batches of footage, it is much more efficient to prompt FCPX to begin transcoding proxies AFTER all of your media has been imported. This keeps the NLE’s power focused on one big task at a time, which, in turn, keeps time spent waiting on the Mac’s glorious beachball to a minimum.
Setup - Library Settings
Once your preferences are set up, move on over to the inspector tab to adjust your library properties. Select “Modify Settings” next to the “Storage Locations” setting to make your adjustments:
Set the “Media” location to the designated folder on your NFS server where all of your video source footage lives. This ensures that FCPX generates all proxy media in its own subfolder within the directory you designated for your video media.
Set the “Motion Content” to “In Library” so that the Titles, Effects, generators, and Transitions you use can pass between editors.
Set the “Cache” to the “Movies” folder on your Mac. This will keep your cache files separate and easily accessible for cleaning later down the line.
Set the “Final Cut Backups” to a designated folder in your server folder structure (ours was titled “zz_Old”). This keeps all of your backups in one place for quick revival if/when they are needed.
*Note*: For quick library creation, we suggest that you duplicate this library and use it as a template for any subsequent Ingest libraries.
The Ingest Process
Ingest - Importing Media
Once all of your library settings and FCPX preferences are set up, you’re ready to import your media and generate proxies.
Before navigating to the import window, you should create Events that match the folder structure used by the DIT on-set.
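We created these Events by hand, but since FCPXML can describe empty events, this step is scriptable in principle. A hypothetical sketch that mirrors a day's card folders as an importable FCPXML stub (the path and FCPXML version are assumptions, and we didn't use this on the show):

```python
from pathlib import Path
from xml.sax.saxutils import escape

DAY_FOLDER = Path("/Volumes/NFS_SERVER/HAUNTER/EP101/DAY_01")  # hypothetical path

def attr(value: str) -> str:
    # Escape &, <, > and double quotes for use in an XML attribute.
    return escape(value, {'"': "&quot;"})

# One empty <event> per card folder, named exactly like the folder.
events = "\n".join(
    f'    <event name="{attr(folder.name)}"/>'
    for folder in sorted(DAY_FOLDER.iterdir()) if folder.is_dir()
)

stub = f"""<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE fcpxml>
<fcpxml version="1.8">
  <resources/>
  <library>
{events}
  </library>
</fcpxml>
"""
Path("events_stub.fcpxml").write_text(stub)  # then File > Import > XML in FCPX
```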
Next, you’ll select an event and import the media from its corresponding folder on the server.
*Tip*: When the import window pops up, make sure to keep the “Create optimized media” and “Create proxy media” boxes unselected. This prevents slowdown from FCPX transcoding proxies in the background while you’re importing all of your media.
Once all of your media is imported, highlight all of your Events > Right Click > Select “Transcode Media”. A box will pop up and you’ll have the options to “Create optimized media” and/or to “Create proxy media”. Choose “Create proxy media” and FCPX will transcode proxies for all of your media.
*Note*: For some media formats, FCPX will automatically begin transcoding optimized media upon import. You can pause this in the “Background Tasks” window until you finish importing all of your media.
Ingest - Batch Renaming & Scripty Notes Using Shot Notes X
Using the third-party application SHOT NOTES X, along with the now filled-out “Shot Log Template” we provided our Script Supervisor before the shoot, we were easily able to batch rename our clips, add preliminary metadata, and attach scripty notes to them.
The process for doing this is quickly explained in a 3-minute video here:
*Note*: As noted earlier, for Haunter, we had our Script Supervisors only provide notes for the portions of the shoot that were shot with more of a narrative approach, along with any B-roll shots. For the unscripted ghost hunt portions of the shoot, we didn’t use Shot Notes X for our ingest process. We explain how we renamed and notated our ghost hunt footage later in this article.
Sample of Media in Browser:
BEFORE SHOT NOTES X
AFTER SHOT NOTES X
Sample of Media in the Info Inspector:
BEFORE SHOT NOTES X:
AFTER SHOT NOTES X:
Once all of the metadata from the more scripted portions of the Haunter shoot was added by Shot Notes X, we manually selected our unscripted clips (in relevant batches) and added metadata to the Reel, Scene, Take, Camera Angle, and Camera Name fields of the Info Inspector window.
With these metadata fields filled out, we were now able to batch rename all of our media using a custom naming convention.
To do this, we selected all of our unscripted clips and clicked the “Apply Custom Names” button > “New”. Using the “Naming presets” window, we created a new preset called “Media Naming Convention” with the following settings and hit OK:
The product of the above steps is a neatly filled out browser complete with matching clip naming conventions and metadata:
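The exact token order lived in the naming preset shown above, so treat the following as a purely hypothetical illustration: the preset is effectively a template over the Info Inspector fields you just filled out.

```python
# Hypothetical stand-in for our "Media Naming Convention" preset; the real
# token order was configured in FCPX's Naming presets window shown above.
def clip_name(scene: str, take: str, angle: str, camera: str, reel: str) -> str:
    return f"{scene}_{take}_{angle}_{camera}_{reel}"

print(clip_name(scene="GHOSTHUNT_03", take="01",
                angle="HANDICAM", camera="GEOFF", reel="GEOFF_001"))
# -> GHOSTHUNT_03_01_HANDICAM_GEOFF_GEOFF_001
```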
Ingest - Spatial Conform
Now that all of our media was renamed, we selected all and set the “Spatial Conform” to “Fill” in the video inspector window. *This saved us from a bunch of headaches when prepping turnovers/conforming for color.*
Roles & Subroles
Ingest - Roles & Subroles
Before creating multicams, we set up a series of Roles and Subroles that we intended to use throughout the editorial process. We then batch assigned these roles to all of our media.
This not only added another layer of searchable metadata to our library, but was also paramount in keeping our libraries/timelines organized and ready for quick Color, Audio stem, or VFX turnovers when needed.
Here’s an example of the roles we created for Haunter:
Here are some Audio Role examples that contain subroles imported instantly using the iXML metadata that we had our sound recordists provide:
Ingest - Multicams
Once we finished renaming, adding metadata, and assigning roles to our media, we created multicams. This was fairly simple given that all of our clips were named appropriately using the naming convention defined above.
With all of our multicams generated, we went ahead and created a smart collection to gather them for easy sifting.
Keywords, Note Formatting, & Markers
Because of our tight turnarounds and small team size, we decided to mark up our multicams using a mixture of Keywords, Markers, & formatted Notes.
Keywords
Keywords served as one of the foundations for our metadata building process. In order to keep everything in our event browser clean-looking and simple to navigate, we used a system of emojis to categorize key moments of importance for the show:
- ✨ - strong story beat moments
- ⭐ - fair moments that might be used in the edit or in bonus content.
- 🎥 - good ghost hunt b-roll shots
- 🔈 - good sound bite moments (with or without good video)
- 👀 - any time a character exhibits fear (head turns, screams, etc)
- 😂 - comedic moments + jokes (even if they don’t land)
- 👨 - anything that shows off a person’s characterization or is important to the persona of a character (ex: Jeremy being aggressive, Geoff being a scaredy cat)
- 🔮 - used whenever there’s a moment that can be perceived as physical evidence of paranormal activity
- “ROOMTONE” - moments with clean roomtone audio
- “FOOTSTEPS” - moments with clean footstep audio
- “BONUS” - BTS moments, deleted scenes, etc
- “BLOOPERS” - mishaps, funny BTS, etc
Formatted Notes
After getting through a couple of episodes of Season 1, it became apparent that there were some consistent pulls of moments that our emoji keywords weren’t parsing out efficiently enough. We then created a formatted system of note-taking to include in the Notes section, corresponding with our keywords.
- “Tech - Setup” - noted whenever a Tech device is properly introduced/setup
- “Tech -” - noted when tech is being used to hunt for ghosts
- “Setup -” - noted whenever new objectives are stated, a new location is introduced, a new tactic for making contact with ghosts is introduced, etc.
- “Transition -” - noted whenever teams communicate with one another over radio, another team is mentioned for intercutting, or there are general transitional points during the ghost hunt
Building on this idea of formatted notes, we decided that it would be much more efficient to log soundbites in a similar manner. Later on, this gave us the ability to build a massive soundbite library that we organized per character via smart collections.
Formatted Notes - Soundbites
Aside from using the “🔈” keyword for more general soundbites, I made sure to keep a consistent format for any character soundbites I noted, so that they would be searchable regardless of which emoji keyword they ended up falling under.
Here’s the search format that we created for each character:
| Character | Search for |
| Gavin | Gav: “ |
| Geoff | G: “ |
| Jack | J: “ |
| Jeremy | Jer: “ |
| Lindsay | L: “ |
| Michael | M: “ |
| Ryan | R: “ |
| Trevor | T: “ |
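To make the convention concrete: every Notes entry starts with the character's prefix, open quote included, so a smart collection (or plain text search) matching on, say, Gav: “ only ever returns Gavin's bites. Here's the convention expressed as data in a hypothetical helper (on the show we typed these notes by hand; there was no script):

```python
# The prefix table above, as data. \u201c is the literal curly open quote
# we typed into the FCPX Notes field.
SOUNDBITE_PREFIXES = {
    "Gavin": "Gav: \u201c", "Geoff": "G: \u201c",
    "Jack": "J: \u201c", "Jeremy": "Jer: \u201c",
    "Lindsay": "L: \u201c", "Michael": "M: \u201c",
    "Ryan": "R: \u201c", "Trevor": "T: \u201c",
}

def soundbite_note(character: str, quote: str) -> str:
    """Format a Notes entry so per-character smart collections can match it."""
    return f"{SOUNDBITE_PREFIXES[character]}{quote}\u201d"

print(soundbite_note("Gavin", "Something just touched my leg"))
# -> Gav: “Something just touched my leg”
```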
Formatted Notes - Tech
To make pulling specific tech usage moments easier, I made sure to include the name of the device used in all of my notes where applicable.
Examples:
By doing the above, we effectively created an easily searchable database that allowed us to use FCPX to pull up a variety of specific moments including: key story beats, sound bites, reactions, comedic moments, transitional moments, etc.
Markers
In addition to keywords and notes, we also used markers and to-do markers to note crucial moments in our multicams. Oftentimes these were technical mistakes that we needed to attempt to fix in post, or moments that deserved a second pair of eyes or ears.
Metadata Example:
4. METADATA: A SEARCH ENGINE
The Power of Metadata
The Power of Metadata: Smart Collections
For Haunter, we used FCPX’s smart collection feature extensively throughout the editorial process.
Smart collections allowed us to build quickly accessible search presets that helped us pull up instant lists of very specific things WITHIN our hour-plus-long multiclips.
Below is an example of how we used smart collections to create a database of sound bites organized by character:
In the example above, I was able to pull 254 different sound bites, from all of the footage we had shot with a character named Gavin, in as little as 10 seconds.
This feature is so powerful that I could perform the same feat, in the same amount of time, for virtually anything I had logged using the search engine optimization standard we set up at the beginning of the project.
The Power of Metadata: A Story Producer’s Best Friend
In addition to the above, we also utilized a third-party application called PRODUCER’S BEST FRIEND, which allowed us to share PDF and Excel spreadsheets breaking down each multicam and everything that happened in it, using our emoji system as a guide to point out specific story beats without ever needing to open the NLE.
Example of metadata notes exported using Producer’s Best Friend:
5. LIBRARIES
Remote Libraries
Earlier, we mentioned how we created low-res h.264 proxies with the intent of using them for the purposes of working while out of office. As you probably already know, FCPX is a bit finicky when it comes to proxy creation and it is often advised that you create proxies within the NLE despite the limited control it gives you over things like codec and resolution.
After some experimentation, we figured out a way to utilize manually transcoded proxies within another set of libraries we ended up creating: remote libraries.
Remote libraries became sort of like our shuttle editorial libraries (for lack of a better way of describing it). If we were hard at work within an editorial library and wanted to continue the work without needing to be tied to the server in our editorial office, we created a remote library where the work could continue, and then we shuttled the timeline back into our editorial library with all of the new work when we were back in the office.
The process was pretty straightforward albeit tedious at times.
Creating Remote Libraries
1) Before we did anything, we made sure to create proxies with minimal file sizes using Compressor. We found that a near-standard-definition resolution using the h.264 codec worked best for our specific show. We created a new folder to store these low-res proxies and kept them tucked away on our server until we needed them, making sure to keep the media organized by episode and then by camera card name.
2) Next, if we were working on a specific episode, we would simply duplicate that episode’s editorial library and change where we wanted our media to be stored:
Once this is changed to “In Library”, you’ll want to consolidate your media so that all of your footage becomes packaged within the library itself.
3) When this process is complete, right click on your library > select “Show Package Contents”, and you’ll see a folder breakdown of your project with each of your events displayed as its own individual folder.
Navigate to your media folder and open the “Original Media” folder, then delete all of the video files, making sure to avoid accidentally deleting any of your audio files.
4) Next, go back into your media folder and open the “Transcoded Media” subfolder. Delete anything in it that isn’t the “Proxy Media” folder. Once that’s complete, simply copy the low-res proxies you made earlier and paste them here, allowing Finder to overwrite the existing files.
5) Now, simply open the library like normal, turn on proxy mode, and check to make sure that none of your clips have a yellow warning sign. If they don’t, copy the library to a portable external drive or send it to yourself via the cloud or an online sharing application. You’re now ready to edit from home. *Note*: Ignore the red “missing media” alert you’ll see on all of your clips when in original media mode. The original media has been scrubbed from your remote library to keep the library size at a minimum, but it will relink when you transfer timelines back into your editorial library.
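If you're building remote libraries every week, steps 3 and 4 are mechanical enough to script. A cautious, hypothetical sketch, assuming the usual per-event folder layout inside the .fcpbundle (on the show we did this manually in Finder); quit FCPX first and only ever run something like this on the duplicate:

```python
import shutil
from pathlib import Path

# Hypothetical paths -- only ever run this on the DUPLICATE library,
# with FCPX quit, never on the editorial original.
LIBRARY = Path("/Volumes/Shuttle/EP101_REMOTE.fcpbundle")
LOWRES = Path("/Volumes/NFS_SERVER/HAUNTER/EP101/PROXIES")
VIDEO_EXTS = {".mov", ".mp4", ".mxf"}  # adjust to your camera formats

# Index the low-res h.264 proxies by filename stem for quick lookup.
lowres_by_stem = {p.stem: p for p in LOWRES.rglob("*.mov")}

for event in LIBRARY.iterdir():
    originals = event / "Original Media"
    if not originals.is_dir():
        continue  # not an event folder (e.g. Settings.plist)
    # Step 3: delete original video files, leaving the audio in place.
    for f in originals.iterdir():
        if f.suffix.lower() in VIDEO_EXTS:
            f.unlink()
    # Step 4: overwrite FCPX's proxies with the matching low-res versions.
    proxy_dir = event / "Transcoded Media" / "Proxy Media"
    if proxy_dir.is_dir():
        for proxy in proxy_dir.iterdir():
            replacement = lowres_by_stem.get(proxy.stem)
            if replacement:
                shutil.copy2(replacement, proxy)
```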
Tip: While working in a remote library, do not alter any metadata (this includes: clip names, notes, favorites, rejections, etc). Doing so will duplicate your media when you transfer it back into your editorial library.
The Archival Library
Early into Season 1, we noticed that FCPX libraries that grow beyond ~1.5 GB in size crash and slow down more often than smaller libraries. For this reason, we created a third type of library: the Archival Library. Archival libraries enabled us to move old timelines out of our editorial libraries, reducing their file sizes without any loss of work.
Within each episode’s archival library we created an event to hold all of our media followed by events for each phase of editorial (Assemblies, Rough Cuts, Fine Cuts, Locked Cuts, Finishing, Finals).
To transfer timelines into our archival libraries, we moved old timelines into an event we labeled “Transfer”, copied the transfer event from the editorial library to the archival library, and sorted the media within our archival library. Once this was complete, we deleted the transfer event and its contents from our editorial library.
In the event that we ever needed to refer back to old cuts, we would just open the archival library and transfer the old timeline back into our editorial library via a transfer event.
Transferring Timelines
Now that you’re ready to head back into the office and continue working within your editorial library, you’ll want to prep your timeline to be transferred.
- Inside your remote library, create a new event titled “Transfer”.
- Move your updated timeline into the transfer event and rename it so that its name doesn’t clash with any existing timeline in your editorial library.
- Open the original editorial library that you had duplicated to create the remote library.
- Copy the transfer event from your remote library over to your editorial library and boom! All of your media should relink to the original media automatically. If it doesn’t, restart FCPX and that should force a relink.
6. COLLABORATION FOR FINISHING
Turnovers for DaVinci Resolve
Prepping turnovers for DaVinci Resolve was pretty straightforward. Before we even started our conform process, we duplicated our final locked cut timeline to ensure that if we broke anything while prepping for color, it wouldn’t ever affect our locked cut.
Once we had our timeline duplicated, we overlaid a ProRes 1080p export of our locked cut and disabled its video. This was important to ensure that we still had an accurate reference on the timeline while we underwent the next step of the process: decluttering the timeline to remove any components that weren’t necessary to send off to our colorist.
To do this, we detached all video and audio (eventually scrubbing all of our audio edits), deleted markers that didn’t pertain to color notes, deleted any FCPX-specific items like chapter markers, disabled text, broke apart compound clips, and flattened our secondary storyline where applicable.
Our timeline went from looking like this:
To looking like this:
*Note*: The large areas of gap clips in our color timeline were sections with night vision handicam footage that we decided to not send to Resolve. Instead, we made some minor color enhancements to those portions within our FCPX locked cut timeline.
Now that we had a simplified timeline that only contained elements necessary for our color process, we started our next step: conforming for Resolve.
Spatial Conform - Set It To “fill”
The first, and most important, thing I want to highlight is how absolutely necessary it is to make sure that the spatial conform for all of your clips is set to “fill”. If it’s set to anything else, Resolve interprets those settings differently, and you will end up with color renders that have dramatically mismatched framing.
MAKE SURE ALL OF YOUR MEDIA HAS THE SPATIAL CONFORM SET TO FILL BEFORE THROWING ANYTHING ON A TIMELINE.
Because we were working with multiple editors, some relatively new to FCPX, there were a few occasions where we had to make time-consuming manual adjustments to match framing after a clip’s spatial conform setting was switched back to “fill” from something else an editor had set. This was a headache and made the conform process take two to three times longer. The good thing is that this is a problem that is very easy to avoid if you’re aware of it early on.
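One way to catch offenders before they cost you a conform day: in an exported .fcpxml, spatial conform appears as an adjust-conform element on each clip. A quick audit sketch, under the assumption that a missing element means the clip is still at FCPX's default rather than “fill”:

```python
import sys
import xml.etree.ElementTree as ET

tree = ET.parse(sys.argv[1])  # an exported .fcpxml of your color timeline
CLIP_TAGS = {"asset-clip", "clip", "video", "mc-clip", "sync-clip", "ref-clip"}

for clip in tree.iter():
    if clip.tag not in CLIP_TAGS:
        continue
    conform = clip.find("adjust-conform")
    # A missing adjust-conform element means the clip sits at FCPX's
    # default spatial conform, which is not "fill".
    ctype = conform.get("type") if conform is not None else "(default)"
    if ctype != "fill":
        print(f'{clip.get("name", "(unnamed)")}: spatial conform = {ctype}')
```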
Speed Changes
For Haunter, we decided to treat any kind of speed change (whether constant or variable) similarly to the way we’d treat a VFX shot. We grabbed those clips, exported them as full-resolution ProRes 4444 files with 10-frame handles, and then cut them into our color timeline in place of the clips with speed effects.
Flips, Flops, Filters, Rotation, and Effects
In our conform process, we made sure to remove any kind of image rotation, flips, flops, filters, and/or effects from our media. Most of the time, these effects do not smoothly translate over to a Resolve timeline. We moved these clips with removed effects to the secondary storyline so that both our editorial team and our colorist knew to pay special attention to these clips throughout the finishing process.
Flattening Multicams
For Haunter, we had a massive amount of multicam media cut into our timelines. Sometimes these multicams contained as many as 16+ cameras in an hour-long clip. Unfortunately, FCPX has yet to add a flatten/collapse multicam feature to the NLE. This is definitely something that needs to be improved.
There is, however, a workaround that involves a preliminary roundtrip to Resolve to flatten the multicams and then back to FCPX to finish prepping your project for your colorist. Because our tight turnarounds didn’t allow us much time to spend in the conform process, we decided to keep multicams intact. The main drawback of this was the storage space required when handing off our turnovers to our colorist.
Here’s a link for the workaround.
Sending to Resolve
With our color timeline prepped, our next step was to consolidate all of our media onto a drive for our out-of-house colorist.
To do this, we created a new library with an event titled “PREP FOR COLOR”. We then copied our color prep timeline over to this new library and exported a .fcpxml of the timeline to be included on the drive. This essentially pulled all of the media relevant to the color process into its own library, where it could easily be consolidated onto a drive.
Now that we had an isolated library with our color timeline and a .fcpxml of the project, we moved on to consolidating our media. The process for doing this is pretty straightforward:
Simply create a folder titled “CONSOLIDATED” on the drive that you plan on handing off to your colorist, then go into your library properties and change your storage location settings as follows:
Once you have these settings appropriately set, you can begin consolidating your media. You do this by clicking the “Consolidate” button under the “Media” storage location setting.
After this step, we included the following in the drive that we sent off to our colorist:
- A copy of the FCPX library that we used to consolidate our media
- The .fcpxml of our timeline
- A “track list” that contained a brief blurb about why we had clips in the secondary storyline (rotation changes, flips, flops, etc.)
- A folder with all of the media necessary for the FCPX library within the folder we titled CONSOLIDATED
- A ProRes 422 HQ 1080p export of our locked cut with timecode
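Before the drive leaves the building, it's worth confirming that the .fcpxml really points at files that exist. In recent FCPXML versions each asset carries a media-rep element with a file:// src, so a small verification sketch could look like this (an assumption about your FCPXML version, not a step we ran on the show):

```python
import sys
from pathlib import Path
from urllib.parse import unquote, urlparse
import xml.etree.ElementTree as ET

tree = ET.parse(sys.argv[1])  # the .fcpxml you exported for the colorist
missing = 0
for rep in tree.iter("media-rep"):
    # src looks like file:///Volumes/DRIVE/CONSOLIDATED/...; decode it to a path.
    path = Path(unquote(urlparse(rep.get("src", "")).path))
    if not path.exists():
        print(f"MISSING: {path}")
        missing += 1
print(f"{missing} missing file(s)" if missing else "all referenced media present")
```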
*Tip*: I suggest that you consolidate at the end of the day or on another workstation. It will take a while for your media to be copied over to your drive and you won’t be able to use FCPX too smoothly while this process is running in the background.
Turnovers for Music Composition
Because we made sure to implement assigning Roles early on in our ingest process, creating turnovers for our music composer was fairly simple and straightforward.
Just like with our color turnovers, we duplicated our locked cut timeline and sent a reference export of our locked cut (with timecode) over to the composer. In the duplicated timeline, we then used the Timeline Index to quickly export .wav stems of our various roles.
We did this by simply enabling and disabling specific audio Roles in the Index, exporting, and then repeating the process for each different stem.
Here’s what our Index looked like when we exported a Dialogue stem for our composer:
Turnovers for Audio Mixing (Pro Tools)
We worked with an in-house sound designer/re-recording mixer who used Pro Tools, which required us to deliver AAF/OMF files of our locked cut timelines. Because we were working in-house, we did not have to consolidate media onto a drive as we did for color.
Unfortunately, FCPX lacks the option to export these file types straight from the NLE (another improvement that FCPX should incorporate in later versions), so we had to use the third-party application X2PRO AUDIO CONVERT.
Once we had the application installed, we checked out the preferences and found the following to work best for us:
General:
Media Handling:
Roles:
Once we had our X2Pro settings set up, we went back into FCPX to begin prepping our audio turnovers.
Similar to our process for color turnovers, we worked in a duplicate locked cut project and simplified our timeline to make sure that the only assets remaining were relevant to our re-recording mixer. Essentially, we detached all video and audio, shift+deleted the video to place a large gap clip in our video track, deleted any irrelevant markers, and deleted ALL CHAPTER MARKERS.
*Note*: Pro Tools will not import any file that has FCPX chapter markers associated with it. This includes your reference exports, so make sure chapter markers are removed completely from any asset handed over to your re-recording mixer.
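Given how much time chapter markers cost us (see the Troubleshooting section below), a belt-and-braces check on the .fcpxml headed to X2Pro is cheap. Chapter markers appear as chapter-marker elements in FCPXML; a minimal sketch that lists any stragglers:

```python
import sys
import xml.etree.ElementTree as ET

tree = ET.parse(sys.argv[1])  # the .fcpxml you plan to feed X2Pro
chapters = list(tree.iter("chapter-marker"))
for marker in chapters:
    print(f'chapter marker "{marker.get("value")}" at offset {marker.get("start")}')
print(f"{len(chapters)} chapter marker(s) found -- Pro Tools wants zero")
```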
Our timeline went from looking like this:
To looking like this:
Once we had a simplified timeline ready for our re-recording mixer, we went ahead and exported a .fcpxml and imported it into X2Pro.
To keep things neat for our mixer, once we imported our .fcpxml into X2Pro, we went ahead and organized our roles, bunching them together by type.
Unfortunately, X2Pro’s user interface is a little dated. You have to highlight each role and use the up and down arrows (located on the right) to move your selected role up and down. It’s tedious but only takes around 5 minutes to do.
When you finish organizing your roles/subroles, hit the start button and X2Pro will start creating an .AAF version of your timeline to be used with Pro Tools.
7. TROUBLESHOOTING AND OBSERVATIONS FOR IMPROVEMENT
Troubleshooting
Although there are many aspects of FCPX worthy of praise, we did encounter a couple of setbacks throughout our editorial process. After spending some time troubleshooting these issues, we found solutions that were relatively quick and easy to implement. To save you from the headache that we endured, here are the issues we found and the resolutions we came up with for each:
Media turning black in the Browser and Timeline
The strangest issue we encountered had to do with our timelines and browser suddenly being populated by black, audioless clips midway through the editorial process. When we revealed these clips in the Finder, they seemed to be linking correctly to the source media, so why did it seem like our media was all offline or unlinked? After spending a couple of hours trying to figure out the answer to this question, on a whim I decided to uninstall and reinstall RED’s Apple Workflow Installer. A few minutes and one computer restart later... voila! This fixed the issue, turning all of our black clips back into the original media that we had started cutting with.
Pro Tools Codec Errors
For the first half of Season 1, we kept running into an issue where our re-recording mixer couldn’t import our reference exports into Pro Tools. If I remember correctly, the error message he kept getting said something about the codec being faulty. We spent hours transcoding with various flavors of ProRes at different resolutions, but the issue remained.
We eventually tried deleting all of the FCPX Chapter Markers we had placed on our timeline and re-exporting. That turned out to be the simple solution to hours of lost time waiting on new exports.
Things To Be Improved
- VFX compatibility
- Improved metadata search functionality
- Flattening multicams
ABOUT ROOSTER TEETH
"From humble origins in a spare bedroom, Rooster Teeth created a new form of entertainment for a changing media landscape. The series Red vs. Blue sparked the growth of a passionate global fandom, and Rooster Teeth built on that foundation, creating a dynamic entertainment business built through subscriptions, advertising, consumer products, live events, and more.
Known for our powerful comedy brand and fan-focused approach, Rooster Teeth is a truly unique pioneer of the digital content industry. Rooster Teeth has a massive global footprint of more than 45 million subscribers to its YouTube Network, 5 million unique monthly visitors to its RoosterTeeth.com hub and 3 million registered community members. Operating since 2003, Rooster Teeth is now a subsidiary of Warner Media."
Sarah M. Deuel studied film at the University of Texas at Austin, where she had many opportunities to foster her deep-rooted appreciation for storytelling. She worked on many projects, ranging from large-budget films such as Song to Song to independent documentaries like Becoming Leslie. As a senior editor at Rooster Teeth Productions, she has worked on a wide range of comedic and reality-based projects. From the sci-fi action comedy feature film Lazer Team 2 to the paranormal reality TV series Haunter, her projects have continually demonstrated Sarah’s dedication to her craft, as she has imbued both characters and narrative with genuine interest.