Take five teams of filmmakers, give them iPhones to shoot on, Final Cut Pro X running on shared storage to edit with, and 36 hours to complete a three-minute film. Sam Mestman tells us about We Make Movies' latest community film project.
As many of you guys know, I run a film community called We Make Movies that is, as far as I know, the world's first community-funded production company. If you've ever wondered why I know anything about making content, it's mostly because of the work I've done there. It's my lab, my passion, and the only reason I really care about editing and post production. And here's the thing… At We Make Movies, there are people at all levels who are all trying to make things they care about… and unfortunately for many of our members, there's a huge barrier to actually having the courage to go out and make movies themselves.
We’ve been told for generations that we need a large crew with a “pro” camera and millions of dollars (these days, at least hundreds of thousands) in order to make something that looks “professional”. We’re taught that you need to go to film school, and you need all this “gear”, and all these “things” in order to tell a story an audience will appreciate.
Because of this, most of the people who haven’t gone to film school or gradually worked their way up the ladder are afraid to get out there and shoot something. They’re afraid of what they don’t know, afraid they don’t have enough money, and are generally afraid of embarrassing themselves and falling down a hole of complexity and debt with nothing tangible to show for it that they can be proud of.
The learning curve for a filmmaker until now has been way too steep, and storytelling for the average person has unfortunately become so complex and difficult that the idea of making a movie is a fantasy for most people.
However, a couple of months ago at We Make Movies we did something that I think proves the notion that you need lots of things you can't afford in order to tell your story properly is already starting to change.
The reason that’s the case is because of what’s probably sitting in your pocket right now… and it’s called an iPhone… and the revolution it’s creating for storytellers is called Mobile Filmmaking. The types of images you can get now off an iPhone and some associated mobile filmmaking products are simply incredible.
Just a few years ago, I made a professional documentary (as in, there was a budget and I got paid to do it) on FCP7 with DVX100s in standard definition, and it produced images that weren't even remotely the quality you can get right off your iPhone now without really trying.
Not only that, but with a couple of accessories, you can now attach a tripod, lavs, a mixer, boom, and damn near anything else you thought possible to your iPhone and make a fully functioning, portable narrative rig that allows you to make movies you can screen in a theater and not have anyone really care it wasn’t shot with a “pro” camera.
Your phone is now good enough that you no longer have to be afraid you don't have enough money to tell your story. I think this is going to change the world of narrative storytelling forever, and it opens doors and possibilities that I don't think anyone would have believed possible just a few years ago.
I've been wanting to experiment with mobile filmmaking a lot more because at We Make Movies we have a community of people at all levels of experience, and I'm always trying to find a way to make it easier for them to go out, get better, and make something that isn't going to destroy their life savings. Getting people to go out and shoot something fun together with something that was already sitting in their pocket seemed like a great way to remove a lot of the fear first-time filmmakers have about making something and putting themselves out there.
And so, in October, We Make Movies, with the help of the Santa Monica Apple Store and a host of other manufacturers, had its first-ever Mobile Filmmaking Challenge for WMM members.
Here’s how it worked:
5 teams of 4 filmmakers apiece showed up to the event and got an iPhone (about half got an iPhone 6 and the other half got the new 6s). They were then presented with a host of mobile filmmaking gear options from Manfrotto, Apogee, Rode, MXL, and iOgrapher that they could use to assemble a rig, which they would then use to shoot a 3-page script each team had already written in advance, all in a little under 6 hours.
After they finished shooting, they came back and met up with me and Eric Altman from Lumaforge, where each of the teams offloaded their footage from their phones onto one of our ShareStation mobile units, and then used their remaining time that night to prep their projects' footage in FCPX on a new Retina MacBook Pro.
Then they went home for the night, and had from 10am to 4pm on a Sunday to edit their short, do a bit of color, titles, music and sound work, and then deliver their films to be screened that night as part of the October We Make Movies Screening Series at The Three of Clubs in Hollywood.
Not only did all 5 teams complete their movies in less than the 36 hours allotted to them, but many of the filmmakers were first timers who had never edited (or even made) a movie before, and on their first project were all easily working off shared storage, syncing dailies, learning how to organize their media, and in general, seamlessly picking up the fundamentals of how to make a movie.
So… with all that aside, here are some answers to some questions you might have about all of this:
First off… let’s get this out of the way… how were the movies?
Well, to be perfectly honest, some were better than others, but all of them were finished, and all of them screened in front of a live audience the day after they were shot. Everyone had a blast, and I think what the experiment proved more than anything else is that the iPhone, when combined with the right hardware, software, and peripherals, is more than enough to go out and tell a story you care about in a professional way.
More than anything, people left feeling empowered to do more and make more stories.
What peripherals did teams get the option of using?
We used gear from the following companies:
Lumaforge:
- Shameless self promotion for my company here, but all of the teams edited their movies off of one of Lumaforge's ShareStation Indie (formerly Mobile) units. The ShareStation is the only shared storage solution that is specifically designed and optimized for FCPX performance and libraries.
iOgrapher:
- Dave Basulto from iOgrapher was there with his industry-standard iOgrapher iPhone cases, which are one of the coolest must-have products in the mobile filmmaking world. He also brought some custom iPhone lenses and the Canon lens mount adapter, which allows you to put Canon or Nikon lenses on your iPhone… which is sort of unbelievable.
Apogee:
- Rob Clark from Apogee brought along a bunch of their new ClipMics, which are a revolutionary new way for people to record lav-based audio (more on this below).
MXL:
- The new MM-4000 Mini Mixer - Perry from MXL brought along his new mobile mixer that allows you to mix down four inputs into a single channel for your DSLR or phone.
- MM130 - A handheld microphone for iOS and Android
- MM-CM001 - A mounting bracket for your phone that makes it tripod/mic/monitor mountable
Rode:
- smartLav+ - An iOS-compatible professional lav mic
- Micro Boompole Pro - A really lightweight boom pole
- Sean from Rode also showed off a very cool mic that’s not available yet that’s designed to go on that pole. Those two things together are a bit of a game changer for mobile filmmaking sound.
Manfrotto:
Robert Magness from Manfrotto gave the teams a host of monopods and tripods to use.
What are some things you should look out for when you try and make a movie with your iPhone?
Sound is the most important thing in my opinion… and the real question you’ve got to ask yourself is how you plan on getting everyone’s dialogue recorded properly and then synced easily. We used multiple different rigs across the various teams with varying degrees of success.
If you’re planning on working with FCPX for the edit, here’s what I’d recommend you do with your rig:
- Use the Filmic Pro iOS app to record your video on the iPhone.
- Get some sort of tripod/monopod to use to keep things steady. Manfrotto makes some great solutions for this if you’re looking for a good place to start.
- In order to mount your iPhone on the tripod/monopod, and for a variety of other reasons, you're going to want to get the iOgrapher. Seriously… just do it. In addition to allowing you to easily connect to a tripod, it makes it easy to mount an onboard mic or use different lenses for your iPhone (you might not know this, but with an iOgrapher, you can even mount cinema lenses on your iPhone if you really want to).
Also, you can use a Gimbal for the iPhone for tracking shots, believe it or not. Here’s a cool one you can check out: ikan FLY-X3
In terms of sound… we definitely had a lot of trial and error with this, and if I could go back in time, I’d tell all the teams to do things this way:
- Plug a high-quality boom mic directly into the iPhone you're shooting with, no matter what you choose for your other mics. This can easily be done with an XLR-to-mini adaptor and an app like Filmic Pro to get your audio. Do things this way and, no matter what, you'll be recording usable footage with high-quality sound already baked in, which will save you a lot of time getting up and running in post. Get a good boom, use it correctly, and it's likely all you'll need for the most part. And even if you're going to use some of the additional methods below, having a good mic attached directly to your phone will guarantee the best possible sound to sync your lavs to in FCPX when the time comes to make your sync clips or multicams.
- Keep things simple and try to connect only one mic to the iPhone shooting video. If you need to use more than one mic for a given scene, have a person dedicated to sound, and have them record the extra audio on a secondary iPhone (or iPhones) or a dedicated mixer.
- When it comes to lav mics (or secondary sound in general) with the iPhone, you have 3 viable options if you want to do it right:
  - Hire a sound guy with a mixer and do it the traditional way, then sync your sound with your guide track from either a boom or an in-camera guide track from an attached mic (or, as a last resort, using a slate, but this can be unnecessarily time consuming in this day and age with waveform syncing). This is always the best way if you can afford it and will present the fewest challenges in a synced-sound workflow.
  - Use a second iPhone for your secondary sound (with your boom connected to the iPhone that's shooting video), get something like the MXL mixer to connect up all of your lavs to, have someone mix down your individual lavs that way, and then sync up all your audio automatically using waveforms in FCPX. If you've recorded good sound and have a proper guide track on your primary iPhone, everything should sync up without issue this way.
  - If you really want to get creative, have your actors be in charge of recording their own sound. Apogee makes an amazingly cool product called the ClipMic. If you're smart, you can have your actors use their own iPhones to record their dialogue through a lav while you record from a boom to a primary iPhone. When you're wrapped for the day, just get everyone to send/AirDrop you their audio from the shoot, and if you give each person's mic its own distinct camera angle metadata, everyone's mic will seamlessly sync up in an FCPX multicam. Some words of caution on this, though: in order not to burn yourself later in post, you need to make sure your actors are properly trained for this and are teched properly ahead of time. When done right, this can be a revolutionary way of managing audio… but if you do it wrong, this can quickly turn into a disaster in post.
- Keep things simple - especially on your first go-round with some of this gear, the fewer new things you hook into your iPhone, the fewer things are likely to go wrong. I'd recommend starting with just Filmic Pro, an iOgrapher, a tripod, and a boom mic connected directly to your phone… see how you do, and add additional gear once you've got the basics down.
How was it using Final Cut Pro X with this process?
One word: Awesome. The fact is that this would never have worked with any other NLE. Avid, Premiere, and Resolve would have been too complicated for the filmmakers to pick up in that timeframe; they never would have finished their films. iMovie would not have worked because we had to do some complicated things on the syncing side, and honestly, many of the filmmakers would have found iMovie's capabilities limiting in the long term.
Final Cut Pro X is the only NLE that is approachable enough for everyday people yet powerful enough for filmmakers to really push the boundaries and finish at a pro level on the post side of things.
How were the teams using the ShareStation?
Well, basically, all 5 of the teams edited off of it over gigabit Ethernet using a standard Thunderbolt-to-Ethernet adaptor, with the ShareStation quietly living in the room as they edited. There was no server room or sneakernet necessary. All of the teams worked from the latest MacBook Pros on El Capitan with their libraries living on the ShareStation. Media was consolidated to the same folder outside of each bundle, with a centralized cache on the server, which made for easier backup, archival, and reconnection when teams grabbed their projects to put on their own drives after they finished editing.
In terms of performance with FCPX… it was extremely responsive, without beach balls, and it basically proved beyond a shadow of a doubt that working with FCPX on shared storage is not only possible, but can be a simple, quiet experience that saves time and delivers better performance than users would ever expect from your average LaCie, G-RAID, or Pegasus drive. Pretty much none of the teams really realized or cared they were working with shared storage… it was just a box that sat in the room and made it easier for a group of people to make their movies.
How do you see this sort of storytelling affecting the content creation business moving forward?
Well, I think it's pretty clear what it's already done on the home movie/hobbyist side of the spectrum. Shooting a video on your iPhone is no more difficult than shooting pictures with a disposable camera was 10-20 years ago, and because of this, more people are making videos than ever before. In fact, more footage is currently uploaded to YouTube in a single day than a human being can watch in an entire lifetime.
For pros, the projects that get all the headlines are done with Dragons, Alexas, and Sonys. The next tier is Canon, Blackmagic, and the rest of the DSLR crew, and it’s common knowledge that if you’re a pro, you’re going to want to use a pro camera from one of those manufacturers. And, well, if you know what you’re doing, for most narrative storytelling purposes, I think that’s absolutely still the way to go.
If you've got money, a budget, and the luxury of time, and you're telling a narrative story, the first thing you're going to want to reach for is probably a camera from one of the big guys. If you know what you're doing, you will get a better image, and your work will look more professional. A case can really start to be made, though, for the iPhone as a B camera, or as something to be used in non-standard, hard-to-shoot-in places where time is of the essence and you need to move quickly. The fact is that an iPhone rig is far faster and easier to use than most DSLRs, and I think the camera on the new 6s is more than acceptable to shoot a documentary or news piece with. For one-man bands, preditor types, and students, an iPhone mobile rig is a no-brainer.
Also… one last thing to think about before I wrap this up. I think the biggest difference all of these new mobile filmmaking options make is that they render teaching and learning the fundamentals of filmmaking and storytelling cheaper and easier, and open up visual storytelling to people who simply could never have afforded to do it before.
The hardest thing about making a movie has always been finding the cash to go out and do it… and the price of failure for the indie filmmaker has, up to this point, been staggering… and I think that is the thing that is about to start changing.
When failure becomes less expensive, it becomes easier to get better at something. People can go out and fail cheaply on a 3 minute short… then make a good 3 minute short. Then make a good 10 minute short, and then make a good 22 minute pilot, and then a good 90 minute feature.
Content creators no longer need to put their budgets all in one basket. A typical graduate short film shot on film at NYU used to run you around $20k (probably even higher now). You can pretty much make the same thing now in 4K on an iPhone for a few hundred bucks, and it'll still be just as bad (it's a student film, after all), and you'll still learn just as much from the experience, which will inform your future films.
That’s the biggest thing about all of this to me. It’s now cheaper to fail, and I think that fact alone is going to lead to a whole new generation of inspired, incredibly talented filmmakers who don’t need anyone’s approval whatsoever to go out and make the movies they want to see.
I find that really exciting.
And with that in mind, a huge special thanks goes out to Kim Stecz from the Santa Monica Apple Store, Tim David from Apple Education, and the guys from iOgrapher, Apogee, MXL, Rode, Lumaforge, and Manfrotto, who all collectively made it easier for a bunch of We Make Movies members to go out and make a movie over two days.