We seem to be so close to coupling two Apple technologies together to achieve something really cool. But at the moment, they are miles apart.
First off, if I knew more about programming, I wouldn't be writing this article; I'd be making a plugin for Motion and/or Final Cut Pro that did what is about to be described!
Secondly (here's the big spoiler): at the moment, to our knowledge, this isn't possible with the iOS/FCP/Motion combination without a huge multi-app workaround.
What are we on about?
Using the tracking data from an iPhone to automatically place objects in 3D space on the accompanying video in Final Cut Pro or Motion. Imagine being able to shoot a piece to camera and then have graphics pop up in 3D around the presenter.
Now, if you are thinking this would be extremely difficult to do, you would have been right, until a few months ago when FXhome released CamTrackAR. This is an iOS camera app that captures tracking data as it shoots, made possible by the clever harnessing of the AR technology Apple has built into iOS and the iPhone.
That data can then be imported into the company's HitFilm NLE, or exported (via an in-app purchase) in tracking formats compatible with third-party apps like Blender and After Effects.
Then in your NLE or compositing app of choice, you can add your 3D models or text floating in space etc.
So if Apple is providing the hardware not only to capture this data, but also to handle the reconstruction and 3D compositing on a Mac, why can't we use this data in Motion? Motion has 3D capabilities, including a controllable camera, 3D objects and more.
The answer is that, right at this moment, you can't import tracking data. You can link to tracks from other trackers within Motion, but there is no access to 'external' tracks. What a disappointment!
We could see hundreds of uses for this, from YouTubers jazzing up their monologues with flying graphics, to tagging items in a tour around a property for sale.
After at least five minutes of serious Googling, we are pretty sure it can't be done. The closest answer we found was Motion expert Fox Mahoney exporting a track out of Motion.
There must be a way to take this tracking data and build a 3D Motion project automatically with the camera footage included. Apple and FXhome have done the majority of the hard work, we just need some programming or XML glue between the two apps.
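To give a rough sense of what that "programming or XML glue" might look like, here is a minimal sketch that turns per-frame camera tracking data (assumed to be a CSV of position and rotation values, similar in spirit to what CamTrackAR exports) into a keyframe XML fragment. To be clear, the CSV columns and every XML element name below are illustrative assumptions, not Motion's actual project schema; a real converter would need to target the genuine .motn or FCPXML structure.

```python
# Hypothetical sketch: convert per-frame camera tracking data (CSV) into a
# keyframe XML fragment. The CSV layout and the <cameraTrack>/<key> element
# names are ASSUMPTIONS for illustration, not Motion's real file format.
import csv
import io
import xml.etree.ElementTree as ET

# Two frames of made-up tracking data: frame number, translation, rotation.
SAMPLE_CSV = """frame,tx,ty,tz,rx,ry,rz
0,0.0,1.6,0.0,0.0,0.0,0.0
1,0.01,1.61,-0.02,0.1,0.5,0.0
"""

def tracks_to_xml(csv_text: str) -> str:
    """Build a <cameraTrack> element with one <key> per tracked frame."""
    root = ET.Element("cameraTrack")
    for row in csv.DictReader(io.StringIO(csv_text)):
        key = ET.SubElement(root, "key", frame=row["frame"])
        # One position and one rotation child per keyframe.
        ET.SubElement(key, "position", x=row["tx"], y=row["ty"], z=row["tz"])
        ET.SubElement(key, "rotation", x=row["rx"], y=row["ry"], z=row["rz"])
    return ET.tostring(root, encoding="unicode")

if __name__ == "__main__":
    print(tracks_to_xml(SAMPLE_CSV))
```

The hard part, of course, is not the conversion loop but knowing the correct target schema; that is exactly the documentation gap this article is complaining about.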
Do you know a way? We would love to hear your possible solutions in the comments below.
In the meantime, just to get you more depressed, here's the demo video of CamTrackAR.