Inside the AI Filmmaking Circle
A weekend with nine filmmakers figuring out that blurry thing called ‘AI film’.
The online discourse about AI and filmmaking is often dismissed as ‘AI Cheerleading’. There’s too much optimism and too little context.
And there’s some truth to that. The enthusiasm outpaces actual use cases. We can admit the space resembles, at times, a mutual admiration society.
The many impressive tech demos still make for rather crude moviemaking tools. AI filmmaking is a promise. It’s an exciting outline of something nobody can quite make out. Artists across the world are tracing its shape through YouTube- and Twitter-powered iteration. But it remains a blurry shape.
Today I'd like to add my own details to that blurry shape and write about a weekend spent with nine AI filmmakers. If the term ‘AI filmmaker’ raises a skeptical eyebrow, hopefully this piece offers a clue as to who they are, why they’re excited, and what this noisy birth ward might become.
The Contest
Last month Cinema Synthetica recruited nine filmmakers, split them into three teams, and gave them identical scripts with a mandate to 'make it' in 48 hours. Or, more accurately, to ‘generate it’, either by prompting image models or by AI-rotoscoping real footage.
Most contestants had traditional film backgrounds and typical stories of the thankless Hollywood grind: success that left them working steadily, but in service of someone else’s vision. There are, it turns out, very few director’s chairs available for normal folks. Some contestants owned small production companies, some were waiting for projects to be greenlit, and still others had no industry experience at all.
The common thread among them was a desire to get ideas out of their heads and into shareable video formats. And a belief that this process need not cost millions or be permissioned by a suit.
The filmmakers gathered at 6am at the house of Todd Terrazas (a sort of mayor of the LA AI scene), where Max Einhorn handed out scripts. When I arrived a few hours later, all three teams were already working.
So what does AI filmmaking look like? It looks like people at computers.
The Zom-Com
Upstairs, the first team adapted the script into a Zom-Com, a love story between two zombies in undead paradise. They had filmed live footage on the beach and were rotoscoping it into animation using style transfer, a process that applies an art style to a video frame by frame. They simultaneously praised and ridiculed the results (a common situation with AI). ‘Undead flesh’, it turns out, is a hard skin tone to achieve and then hold across multiple frames.
Explaining this to me was Nem Perez. Nem’s a commercial and music video director. Not so long ago he was building websites at an ad agency, so he’s quite comfortable with digital-first image-making. He’s also something of an impresario in the AI film world. He masterminded the T2 Remake, a feature-length film consisting of 50 AI-generated scenes directed by 50 different filmmakers. It’s quickly becoming a landmark achievement in the space.
Next to him, Jagger Waters groaned about flickering zombie skin. Jagger’s a writer who’s been through the normal highs and lows of screenwriting: pilots picked up, dropped, and flung around according to the whims of studio executives. Jagger has the sharpened wit you’d expect from an LA comedy writer, and it’s taken her comparatively little time to be appreciated in the AI space. Only one month prior, she won a big AI contest in Las Vegas. She’s excited about generative AI as a way to get her writing produced with less third-party interference.
Meanwhile, director and actress Adriana Vecchioli reworded her prompt to coax out a better zombie complexion. Adriana left France, frustrated by the lack of ambition in its largely government-subsidized film industry. She likes the optimism of American filmmakers, but is adjusting to the commercial interests that determine projects here. She hopes AI will allow her to pursue the $100-million ideas that are too big for French cinema and too niche for Hollywood.
The Facelift Ward
Downstairs was a different vibe. Three men sat at separate computers at separate desks, performing what we might call Avatar Reconstructive Surgery. Their film followed two characters across 20,000 years of historical settings. They were training a model in Leonardo.ai on photos of themselves, trading jokes about the crappy quality.
Reza Safai, an actor and producer, explained the process to me while judging AI-generated images of himself. Reza produced indie darlings like A Girl Walks Home Alone at Night. He also directed a scene from the T2 Remake. And you may have seen him go viral after he ‘fixed’ the recent Apple iPad commercial. At that moment he laughed at an image of himself with a particularly bad AI face warp.
Across the room, Matthew Wernhardt conducted an AI facelift, swapping Reza’s face onto a new body. Matt has a pedigree of big-brand video work. His day job is in healthcare, but he moonlights as an AI filmmaker on Twitter, where he became one of the 50 directors behind the T2 Remake.
The final surgeon was Jonathan Muller, a character animator who’s worked on films like Wish and Ice Age. He was smoothing the seams in a rather rough face transplant of Reza onto a caveman. Jonathan also directs. His last short film (made in his spare time using Unreal Engine) took over two years to finish. With generative AI, his subsequent efforts have taken only a few weeks. He’s astonished at what 3D artists and animators can now do with AI.
The Garage Affair
Exiled to the garage at a plastic folding table, the third team worked on a love story between a man and an AI recreation of his lover.
Ryan Reeb, a musician turned VFX artist with Marvel blockbusters on his resume, gushed about Wonder Dynamics’ trackerless motion-capture software. He was using it to replace a real person with a CGI character that effortlessly tracked their movements. In another browser window, he was turning thirty photographs into a NeRF (a neural radiance field, essentially a 3D representation of a scene). He wanted to direct a crane shot inside the generated 3D space (and save thousands on a jib rental).
Kiri Margaros sat next to him, prompting in Midjourney using a homemade technique for keeping characters consistent across images. Kiri’s day job involves building AI agents with LLMs, though she’s no stranger to image models. She recently created some of the AI visuals for Grimes’ Coachella show (she was not responsible for the live syncing).
Stooped at the table’s end, the 6’7” director Jared Cotton edited the film together. Jared founded a production company a year after graduating college and has been producing commercials and series for almost two decades since.
48 Hours Later…
In the wee hours of the morning, the teams delivered their final films. A few days later, the films premiered at AI on the Lot before an audience of over 850 people, and the Zom-Com won. Interestingly, the winning film’s shots were less technically accomplished than the other teams’, but it had an actress and a writer on board. A small footnote: those skills remain, and will remain, very relevant.
These nine artists represent the people figuring out AI filmmaking. It’s not cost-cutting producers. Or money-hungry studios. It’s people who really, really want to make stuff.
Now the only thing standing between an artist and a film is their willingness to sit down and make it. This final statement is, of course, optimistic. The future belongs to the people who determine it. So if you’re interested in AI filmmaking, get in. It’s a blurry shape being defined by the people practicing it. And no invitation is necessary.