I was watching a bunch of YouTube videos on shooting cinematic footage with the iPhone 15 camera, and while it’s certainly nowhere near the actual quality of an Alexa or a RED, it still looks pretty good. There’s a short film shot by Claudio Miranda that looks amazing, but largely because he had a professional production design and lighting team. And he’s Claudio Miranda.
But another video showed off an AI depth-scanner tool (I think it’s called Brice) that can take the footage and add artificial depth of field, to make it look almost like it was shot on a larger sensor.
That, combined with color grading and maybe AI upscaling, made me think: would it be possible to make an all-in-one app that applies those changes “live”, like a super advanced LUT? Essentially using AI enhancements to “fake” the look of professional cameras, so that even though the camera components are inferior, the footage would have the same quality?
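For anyone curious what “applying a LUT” actually means under the hood: it’s just a precomputed lookup table mapping every possible input pixel value to an output value, which is why it’s cheap enough to run live. Here’s a toy 1D version for 8-bit footage (real grading apps use 3D LUTs; the function names here are my own, not from any actual tool):

```python
import numpy as np

def make_s_curve_lut(strength=0.3):
    """Build a gentle contrast S-curve LUT with 256 entries.
    strength=0.0 gives the identity LUT (no change)."""
    x = np.linspace(0.0, 1.0, 256)
    s = x * x * (3 - 2 * x)                 # smoothstep S-curve
    curve = (1 - strength) * x + strength * s  # blend with identity
    return np.round(curve * 255).astype(np.uint8)

def apply_lut(frame_u8, lut):
    """Per-pixel lookup: indexing the LUT with the pixel values
    remaps the whole frame in one vectorized operation."""
    return lut[frame_u8]
```

Because it’s a single table lookup per pixel, this is the one part of the pipeline that’s trivially real-time; the AI pieces are the hard part.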
Basically you’d pick a preset LUT or design your own (for the color grading), and the app would then apply that LUT in real time, along with the “faked” depth-of-field effect, so that when you look at your footage through your phone screen, it looks closer to Alexa/RED footage than iPhone footage, as opposed to shooting first and then manually applying the AI depth scanner to every shot in post. It could also apply an AI upscaler like Topaz in real time for the detail, then re-add some grain or whatever to make it look more filmic.
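The fake depth-of-field step boils down to: get a depth map for the frame (from an AI model or the phone’s depth sensor), then blur each pixel more the farther it sits from your chosen focal plane. A crude sketch of that blend, assuming you already have a normalized depth map (real apps use proper lens-blur kernels, not a box blur; everything here is my own simplification):

```python
import numpy as np

def fake_depth_of_field(frame, depth, focal_depth, strength=4.0):
    """Blend a sharp frame with a blurred copy, weighted per pixel by
    distance from the focal plane — a rough portrait-mode-style fake.
    frame: (H, W, 3) floats in [0, 1]; depth: (H, W) floats in [0, 1]."""
    k = 5                      # box-blur kernel size (stand-in for lens blur)
    pad = k // 2
    padded = np.pad(frame, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    blurred = np.zeros_like(frame)
    for dy in range(k):
        for dx in range(k):
            blurred += padded[dy:dy + frame.shape[0], dx:dx + frame.shape[1]]
    blurred /= k * k
    # Blend weight: 0 at the focal plane, ramping to 1 far from it.
    w = np.clip(np.abs(depth - focal_depth) * strength, 0.0, 1.0)[..., None]
    return frame * (1 - w) + blurred * w
```

The whole “live” pipeline would basically chain this with the LUT and a grain pass per frame; the open question is whether the depth model itself runs fast enough on-device.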
Honestly, whoever invents this app will be my hero, and it’ll let low-budget filmmakers shoot really good-looking films despite not having a professional camera.