We take a deeper look at the video possibilities of the new iPhone 13. ProRes RAW? Sidecar depth info?
It's no surprise that our most popular article this year is the one preceding this: the new iPhone and its ability to change the focus of footage (after it has been shot) in Final Cut Pro and iMovie.
Not that we had any prior knowledge, but a quick tweet before the event turned out to be reality!
So what if you could access the depth of field control on movies recorded on a new iPhone in Final Cut Pro? If it does computational video that is….#fcpx #finalcutpro #apple #video #videoediting #vlog #vlogging #film #filmmaking
— FCP.co (@FCPdotCO) September 14, 2021
As soon as the event was over, we went hunting through the press release and found confirmation that you will have the ability to 'refocus' (change or move the depth of field, Robin) in post-production.
This is a big deal, changing the pixels in an iPhone movie after it has been recorded. Just think about that for a second.
Final Cut Pro has just had the biggest PR/social media push since its less than successful launch. Media everywhere is reporting that FCP and iMovie will be able to access the depth information from iPhones for focusing.
Not that everybody will be able to do it. The phone has to be an iPhone 13 and you have to record at 1080p 30fps. This is because the phone captures depth information for every frame at the same time as the video, which is what allows the focus changes. There's a lot of maths going on in the process, and as phones get more powerful and the software more developed, we would expect Cinematic Mode to become available in more formats. Let's hope the next one up is 25fps.
The Verge is reporting the depth information is stored as a sidecar file.
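To make the idea concrete, here is a rough sketch (our own illustration, not Apple's actual pipeline or file format) of how a per-frame depth map plus a user-chosen focal plane could drive a focus change in post. The depth values, the focusDistance parameter and the aperture constant are all hypothetical; the point is simply that once depth is stored alongside the pixels, the blur can be recomputed at any time.

```swift
import Foundation

/// A toy 'refocus' pass: given a depth value (in metres) for each pixel,
/// work out how much blur that pixel should receive for a chosen focal plane.
/// Illustration only - the real Cinematic Mode maths is Apple's own.
struct RefocusSettings {
    var focusDistance: Float   // where the editor has dragged focus to, in metres
    var aperture: Float        // larger value = shallower synthetic depth of field
    var maxBlurRadius: Float   // clamp so distant pixels don't blur to mush
}

/// Blur radius (in pixels) for a single depth sample.
func blurRadius(forDepth depth: Float, settings: RefocusSettings) -> Float {
    // Pixels on the focal plane stay sharp; blur grows with distance from it.
    let defocus = abs(depth - settings.focusDistance) / max(depth, 0.001)
    return min(defocus * settings.aperture, settings.maxBlurRadius)
}

/// Map a whole depth frame (one Float per pixel) to a blur-radius map.
/// Re-running this with a different focusDistance is, in effect,
/// "changing focus after the shot".
func blurMap(depthFrame: [Float], settings: RefocusSettings) -> [Float] {
    depthFrame.map { blurRadius(forDepth: $0, settings: settings) }
}

// Example: rack focus from a subject at 1.5 m to one at 4 m,
// using the same recorded depth data both times.
let depths: [Float] = [1.5, 1.5, 2.5, 4.0, 8.0]
let near = RefocusSettings(focusDistance: 1.5, aperture: 12, maxBlurRadius: 20)
let far  = RefocusSettings(focusDistance: 4.0, aperture: 12, maxBlurRadius: 20)
print(blurMap(depthFrame: depths, settings: near)) // subject at 1.5 m stays sharp
print(blurMap(depthFrame: depths, settings: far))  // subject at 4 m stays sharp
```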
The detective demo video looks good when objects have sharp edges, but when it comes to hair, it does look slightly false.
You will also need an 'updated' version of Final Cut Pro or iMovie, which leads us to think that a new version of Final Cut Pro will arrive around the date of the FCP Global Summit. The Mac will also have to run macOS Monterey, which will need to be released before then for everything to work. (iOS 15 was released yesterday BTW, with updated iMovie and Clips apps ready for Cinematic Mode.)
A few other features were not fully explored either. ProRes recording will not only increase quality, it will also allow the shooting, editing and delivery of a ProRes master from just the phone.
iJustine is claiming the iPhone 13 Pro will record ProRes RAW.
The new macro shooting mode will not only trigger a slew of random close-ups in bloggers' daily output, it will also help broadcast shows grab quick close-up shots without having to change lenses on their main camera. It is a crop of the wide-angle lens though.
But wait a minute... If you need macOS Monterey for the clever stuff, will that become available in Motion as well? Sure, we would like to be able to change the depth of field in Motion, but what about getting access to the depth map for compositing? That would open up a whole new range of possibilities... and plugins!
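As a sketch of what that could look like: if a per-frame depth map were exposed to a compositor, it could be turned into a soft matte that keys everything nearer than a chosen distance. The Swift below is purely illustrative (our own names, units and falloff, no Apple API), but it is the kind of building block depth-aware plugins would be built from.

```swift
import Foundation

/// Build a soft alpha matte from a per-pixel depth map:
/// 1.0 for pixels nearer than `cutoff`, 0.0 beyond `cutoff + feather`,
/// with a linear ramp in between so edges don't pop.
/// Illustrative only - names, units and ramp shape are our own assumptions.
func depthMatte(depthFrame: [Float], cutoff: Float, feather: Float) -> [Float] {
    depthFrame.map { (depth: Float) -> Float in
        if depth <= cutoff { return 1.0 }
        if depth >= cutoff + feather { return 0.0 }
        // Linear falloff inside the feather region.
        return 1.0 - (depth - cutoff) / feather
    }
}

// Example: isolate a presenter standing roughly 2 m from the camera
// so a graphic can be composited behind them.
let sampleDepths: [Float] = [1.8, 2.0, 2.4, 3.0, 6.0]
let matte = depthMatte(depthFrame: sampleDepths, cutoff: 2.2, feather: 0.8)
print(matte) // [1.0, 1.0, 0.75, 0.0, 0.0]
```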
This is just the tip of the iceberg when it comes to processing iPhone video in post. Apple has a habit of under-promising and over-delivering. Remember WWDC 2020 and the automatic analysis of video footage to find and tag items? We made a list of possibilities back then that we think still stands. We got the depth information, but we didn't think about changing focus in post! If you want extra proof of how video processing will offer more creative choice, then check this WWDC video about transferring a picture's style onto video.
This also brings us to the question of a possible divergence in abilities between Intel and Apple silicon machines. The neural engines in M1-powered Macs will surely perform the image processing on Cinematic Mode movies far faster than Intel machines, which will have to resort to CPU/GPU number crunching, if they can do it at all.
It has taken me eight years to lose a bet
Back in 2013, Blackmagic were releasing new cameras at a rapid rate and the AJA CION had just been released. I had a bet with Wayne Andrews of Matrox about Apple releasing a pro camera: I said they wouldn't, he said it was inevitable.
AJA CION courtesy of Kevin Fitzgerald on Twitter @BrownEnvelope
On the 14th of this month, he won, because I didn't have the forethought to realise that Apple's plan was to incrementally upgrade their iPhone camera year after year, to the point where it has become a very capable pro camera. Granted, it's not great on sound, but it can now do things a traditional broadcast camera can, plus advanced optical and software processes that a traditional broadcast camera cannot.
It is going to be an interesting few months up to the FCP Creative Summit.
We will leave you with some first impressions of the iPhone 13 from the usual bloggers.