CyberFilm AI is rewriting the Hollywood script 🎥

Plus: CEO Russell on AI x creativity and the future of Hollywood...

CV Deep Dive

This week, we’re featuring SAGA - an AI creative tool developed by CyberFilm AI.

CyberFilm is a three-person startup co-founded by Russell and Andrew Palmer, two brothers with backgrounds spanning tech and film. In 2021, right before the GenAI explosion, they teamed up to create SAGA - an AI-powered tool that enables storytellers to maximize their creativity whilst writing screenplays, creating storyboards and pre-viz animations.

In just 10 months since launching their V1, SAGA is already being used by prominent filmmakers across Hollywood - including Stewart Lyons of Breaking Bad and Better Call Saul, as well as thousands of screenwriters around the world. It aims to democratize the creative process for filmmakers everywhere, and has received backing from Jason Calacanis, via his LAUNCH fund.

Russell walks us through all things SAGA, including the company’s origins, his vision for Hollywood, and the challenges posed by the 2023 Writers Guild of America strike.

Let’s dive in ⚡️

Read time: 7 mins

Our Chat with Russell 💬

Russell! Welcome to CV AI. Tell us about your background and what led you to start SAGA?

Hey! I’m Russell, the co-founder of SAGA. My background is in EECS, and I've been a Product Manager for the past 15 years at companies like Microsoft, Viv Labs AI with the Siri founders, and JPMorgan's AI Lab. My co-founder is my brother, Andrew, who’s an Assistant Director on movie sets in Toronto, Vancouver and LA. 

Our passion for film started as kids - we grew up using our dad's camcorder to make claymation stop-motion films with our toys. In high school, we were always making short films and music videos with friends. My brother went back to film school after finishing CS to follow his dream as an artist - he performs music, paints, writes and self-publishes novels, and founded an award-winning indie Production Studio. And I actually had a stint as a child actor too!

In 2021, we founded SAGA - an AI-powered tool to help screenwriters and storytellers create blockbuster films with indie teams and budgets. You can think of SAGA as your writing room partner or “script doctor” - it helps you craft scripts, develop characters and visualize your ideas in real-time, and is available at writeonsaga.com.

What led to your decision to become a founder, especially coming from the world of Big Tech?

My dream was always to move to Silicon Valley and start a company, after learning as much as I could at Big Tech and other startups. During the pandemic, I took these online Stanford AI Product Management courses and we had to write a paper on AI disrupting an industry. I wrote it on Hollywood with my brother’s help - examining both the negatives and the positives and how it could help grow the industry. And my professor was like, ‘A+, you’ve got to quit your job and go do this. This is it’. 

We took YC’s Startup School course and began customer interviews with Andrew’s colleagues, building a prototype when the GPT-3 API was first released. By the end, I was all in - I quit my job, built a small team, launched our first product in April 2023, and then a month later the writers’ strike happened. But we'll get into that later I’m sure…

GPT-3 was a watershed moment for a lot of AI startups. How did that impact your trajectory early on?

It was everything. In summer 2020, our professor got access and showed us a demo. I remember asking him to make it tell a story, and prompt it with 'it was the best of times, it was the worst of times' - to see whether it would come up with a Dickensian masterpiece or something lame. And it told this amazing original story, which blew me away.

From the beginning, it was clear GPT-3 knew the beats of Save The Cat and The Hero With A Thousand Faces, but was trying to mash up several techniques with no opinionated structure, resulting in unclear, messy stories. So we just started experimenting with fine-tuning when it came out in Beta, doing prompt-engineering and other new techniques to make it work better. Finally, at the end of 2021 we produced a working prototype of SAGA.
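To make that concrete, here is a minimal sketch of the kind of prompt engineering Russell describes - pinning the model to one opinionated beat structure instead of letting it blend frameworks. It assumes the current OpenAI Python client and chat API (the original prototype was built on the first GPT-3 completions endpoint), and the system prompt, model choice and beat list are illustrative, not SAGA’s actual pipeline.

```python
# Illustrative sketch only - not SAGA's actual prompts or pipeline.
# Assumes the current OpenAI Python client (`pip install openai`) and an
# OPENAI_API_KEY in the environment; the original prototype used the
# first GPT-3 completions API instead.
from openai import OpenAI

client = OpenAI()

# An "opinionated structure": pin the model to one beat framework
# instead of letting it mash several techniques together.
SYSTEM_PROMPT = (
    "You are a screenwriting assistant. Expand the user's logline into a "
    "Save the Cat style beat sheet: Opening Image, Theme Stated, Set-Up, "
    "Catalyst, Debate, Break into Two, Midpoint, All Is Lost, Finale. "
    "Return one short paragraph per beat, in order."
)

def beat_sheet(logline: str) -> str:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",   # hypothetical model choice for this sketch
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": logline},
        ],
    )
    return response.choices[0].message.content

print(beat_sheet("It was the best of times, it was the worst of times..."))
```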

My brother showed it to his writer friends and would say “Try typing an idea you have, like the premise or logline of a movie idea, and describe your protagonist and maybe an Act 1”, and SAGA would give them all these ideas for other complementary characters, or an Act 2, or twist ending. And writers loved it. It was clear this would be an amazing tool for helping you get unstuck creatively. People say “it's like it read my mind, I'm actually a little jealous I didn't think of that, it’s perfect for my story”.

You also added image generation capabilities into SAGA - was that driven by user conversations, or the technology improving, or both?

You might remember that in mid-2022 - a few months before ChatGPT - image generation was the most popular thing in AI. In some user interviews, writers would say “look, I don’t need much help with the writing. I went to film school, I’ve already got a script. But I do have trouble communicating my ideas visually.” If a picture is worth a thousand words, a set of storyboards for your script is even more valuable. Writers know there’s a low chance any director will read their entire 100-page screenplay - but if you present your story with concept art, or as a graphic novel, it simply pops.

A lot of people told us: “I can’t draw” or “I can’t afford to hire an illustrator for $500 a day” - but they all want to communicate the emotional impact of each scene and share that with whoever is reviewing the script. So as soon as models and APIs for image generators were released, we immediately added new storyboard features.

Jason Calacanis and LAUNCH played a significant part in SAGA’s early days, including the naming rebrand. Tell us about his contributions.

Jason has been immensely helpful. As we were building the storyboard feature, we were going through his Founder University. I got the chance to pitch him live on This Week In Startups, and he invested $125k afterwards, becoming our first angel. We’ve benefitted hugely from his breadth of knowledge about startups, branding, and operations; he’s worked with thousands of startups, so we’re fortunate to have him advising us.

Interestingly, he’s the reason we changed our app name from CyberFilm AI to SAGA. On TWiST, Jason gave us a ton of positive feedback - but at the very end, he was like “I hate the name CyberFilm.AI! It sounds like ’80s Blade Runner, it's too hard to remember. You need something memorable - 1 word and 2 syllables, epic-sounding and related to storytelling.” He was the one who advised Travis to change UberCab to Uber, and similarly wanted us to have a stronger name. We decided on SAGA and rebranded during our Fall relaunch, and with the new storyboard feature added, our customers responded instantly.

Who were your early customers? Are you finding more traction with established screenwriters, or those who are aspiring?

A large portion of our customers are either aspiring filmmakers or screenwriter-adjacent - those who work on movie sets, but not specifically on writing the script. A typical user might do stunt choreography or costume design - however, the common thread is that they want to try their hand at storytelling. Everyone we talk to says “I’ve got 4 or 5 movie ideas I’ve had for years, and I just never sit down to draft them out”. And so we say: just start writing with our app - you’ll get 3 days free to see what it comes up with, and you can keep all the outputs.

With SAGA, the first step is entering your initial story idea into our UI. SAGA will then build on your ideas to generate any other elements you ask for. If you know the logline and protagonist, but you need help with things like a clever title or B-story to reinforce the theme, you can specifically ask SAGA to generate those elements for you. So now, you can go from an original plot to the characters, the acts, the beatsheet, and the script - a full feature film script - and finally, to hundreds of storyboards for your movie. All in a matter of days.  
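As a rough illustration of that chained workflow - not SAGA’s actual implementation - here is a sketch where each element is generated from the logline plus everything produced so far, so later steps build on earlier ones. The element list, prompts and model choice are assumptions made for the example.

```python
# Illustrative sketch of a chained story-development workflow - not SAGA's
# actual implementation. Each element is generated from the logline plus
# everything generated so far, so later steps build on earlier ones.
from openai import OpenAI

client = OpenAI()

ELEMENTS = ["title", "protagonist", "B-story", "three-act outline", "beat sheet"]

def develop_story(logline: str) -> dict[str, str]:
    context = f"Logline: {logline}"
    story: dict[str, str] = {}
    for element in ELEMENTS:
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",  # hypothetical model choice
            messages=[
                {"role": "system",
                 "content": "You are a screenwriting assistant. Answer with the requested element only."},
                {"role": "user",
                 "content": f"{context}\n\nWrite the {element} for this movie."},
            ],
        )
        story[element] = response.choices[0].message.content
        context += f"\n\n{element.title()}: {story[element]}"  # feed it into the next step
    return story

idea = "A retired stuntwoman must pull one last heist to save her family's circus."
for name, text in develop_story(idea).items():
    print(f"--- {name} ---\n{text}\n")
```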

How do you expect SAGA - and other generative AI tools - to transform the act of writing and filmmaking in the next year?

I’ll give you an example: my brother Andrew will get an idea from an indie studio, and then say “I'll get back to you in ten days”. And he returns with a 100-page screenplay, including all character sheets and storyboards, and it's practically ready to film. His indie-filmmaker friends, with inexpensive quality cameras or even smartphones, can now generate the scripts and storyboards for their ideas in days. With SAGA, they’ve got almost everything they need to submit to a festival like Sundance.

The final step is really getting the visual effects right. We think 2024 will be the year of generative video. Right now, if you need a car chase or an explosion scene, you can’t film that yourself - but you also can’t afford to hire ILM (Industrial Light & Magic) or a VFX shop. We think tools like Runway Gen-2 and Pika Labs will really help with that, and we’re also starting to integrate Stable Video Diffusion into our app.

Soon, you’ll use SAGA to create an AI-enabled movie, you’ll put it on YouTube, and if it’s good you’ll go viral. Who knows, Hollywood might even come to you and say “hey, would you like to direct a bigger budget studio feature next?” And that’s the dream we want for our customers. For people like my brother and his friends to become the next Christopher Nolan or Greta Gerwig.

And how do you view SAGA’s position in the constellation of fast-growing AI creative tools like Pika, Runway or even KREA? Are they explicit competitors, or is there a way these tools all work together?

I always look to partner first, but I see SAGA as a suite or “super-app” for filmmakers. If Pika or Runway ever released APIs, we’d definitely love to collaborate and showcase them in our product.

That said, I do think those apps are missing elements of storytelling, and their users’ outputs today mostly fail what I call the “content Turing test”. I wonder: what makes a video a movie? Can AI create films that people would enjoy without knowing AI was used? That’s the test. I’ve watched nearly every public AI-generated movie, and most are still just collections of video with sound and trippy visuals. They’re fit for music videos, but even then they’re clearly missing a story with narrative and character arcs.

We’re the first AI tool for filmmakers that started with the story and text, and then moved into image, storyboarding and cinematography - and we’re now going into pre-vis and animatics. We’ll get to photo-realistic video eventually, but we’re not aiming to be a fully synthetic video generation tool. We want you to go out and film things on your phone using real human actors, and intersperse VFX from our app as needed. 

On that note, tell us about your perspective on the Hollywood strike that took place in 2023. Surely that must have been an existential time for you?

Yes - but we definitely saw a storm coming early. This seemed likely if you were thinking about AI and filmmaking prior to 2023. Experts said that AI would replace manual labor before creativity, but actually, I’d always felt it would be the opposite - partly because stories are structured. Since Ancient Greece, we’ve re-used around 15 major story archetypes - and it made sense that ML models would be able to understand these structures through labeled or even unsupervised learning. That’s how humans learn to tell new stories as they grow up, learning the ones they hear and forming their own mental models to write new ones, as AI does.

When we started the company, the first thing I did was write a newsletter called Movie Technology 2025. I covered a lot of ideas that I thought were going to play out over the next 5 years - and they all seem to be coming true today. One of them was the impact of AI startups scraping the web to train AI models. For the record, we love OpenAI and Stability AI - we built SAGA on top of them! But we were hoping to foster a discussion between the tech and artistic worlds, especially after so many recent scandals in Silicon Valley, where some like to “move fast and break things”. It was pretty alarming when image generators started spitting out images with a fake Getty logo.

So, we’ve aimed to position our company as artist-friendly from day one. We give ownership of all AI generations to our users - this is what the WGA (Writers Guild of America) wanted from the start. We also don’t use your movie ideas to train SAGA to make other people’s movies better. We really believe in a future where AI will help people, but in a way that doesn’t threaten livelihoods or artistic integrity. And I think we're seeing that - for example, OpenAI went and paid Getty Images to license all their images. That's the future we want to see more of, where artists have control over consent and compensation, and hopefully we can be a part of the solution and help lead the way.

To that end, tell us about notraining.txt, inspired by robots.txt. You’ve proposed this as an alternative that can help innovation continue in a responsible way.

The Internet is intended to let you download images from people’s websites - that’s why they put them there, and that’s what “servers” are for. Copyright law gives you fair use. However, some people need to put their work online in order to sell it but don’t want it used in certain ways, typically adding watermarks to say “don’t use this without asking and paying me”.

A few years ago, when the diffusion image generators came out, there were discussions online and we proposed notraining.txt - a metatag for online content that excludes it from training sets for those who respect it, inspired by robots.txt, which lets sites opt out of search crawlers. The idea seems to have caught on in generative AI discussions, and it’s one area where I wish AI companies were more proactive - otherwise we end up with hackers releasing tools like Nightshade and fighting rather than working together. It sounds like OpenAI and others will re-use robots.txt for those who want to opt out, which seems achievable, and we celebrate their leadership in AI ethics.
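For readers unfamiliar with the mechanism, here is a minimal sketch of how a training-data crawler could honor a robots.txt-style opt-out using only the Python standard library. GPTBot is OpenAI’s published crawler token; the helper function is illustrative, and a notraining.txt variant would work the same way under a different filename.

```python
# Minimal sketch: a training-data crawler that respects a robots.txt-style
# opt-out before downloading anything. Uses only the Python standard library.
# "GPTBot" is OpenAI's real crawler token; a hypothetical notraining.txt
# would follow the same pattern under a different filename.
from urllib import robotparser
from urllib.parse import urlsplit

def allowed_for_training(page_url: str, agent: str = "GPTBot") -> bool:
    """Return True only if the site's robots.txt permits this agent."""
    parts = urlsplit(page_url)
    robots_url = f"{parts.scheme}://{parts.netloc}/robots.txt"

    parser = robotparser.RobotFileParser()
    parser.set_url(robots_url)
    parser.read()                      # fetch and parse the opt-out rules
    return parser.can_fetch(agent, page_url)

if __name__ == "__main__":
    url = "https://example.com/gallery/image-42.jpg"
    if allowed_for_training(url):
        print("OK to include in the training set")
    else:
        print("Site has opted out - skip it")
```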

We also proposed building an opt-in image dataset, combined with public domain images, then swapping it in for LAION-5B underneath the open-source Stable Diffusion model, and paying dividends through blockchain smart contracts to proven copyright holders. That way, it’s a little like fair-trade coffee - maybe users and app companies pay a little extra for this image generator, but they know money is going to the artists whose work is included.

How has having one foot in Hollywood and the other in Silicon Valley shaped the way you navigate building SAGA?

It’s definitely a competitive advantage. Andrew and I founded SAGA because movies changed the trajectory of our lives. We don’t have dreams of replacing Hollywood with robots run by billionaires. We definitely want to be friendly to it - even if they’re a little hesitant towards us. We want to eventually earn their trust and respect by making real movies with stories that pass the content Turing test - where an audience will watch it, not because it's an AI gimmick, but because it's actually a good movie.

Our app doesn't copy artist styles by name. We talk with members of the WGA AI committee to share our mission and show how the app works as a tool. We even paused our releases and marketing during the strike out of respect for both sides during the negotiations. We want to be a Hollywood-friendly AI company, it’s in our founder DNA.

What’s your perspective on how AI is going to shape filmmaking for both the studios, and the next generation of creative talent?

In the past, film executives had to be very selective with which movies were made, probably because of the limited number of theaters and shelf space at Blockbuster. You went with winners or existing stories with a built-in audience like Spider-Man or Lord of the Rings, to lower your risk of making a bomb and losing your career.

And when streaming finally hit, the cloud was infinite and Netflix could add 100 movies a day if they wanted, and use AI to selectively find the audience for each. In the age of AI, I think we’ll see a new explosion in filmmaking. We’re already seeing it with Gen Z - they spend so much time watching and making videos, even if it's just stunts or a TikTok dance.

Now, imagine these kids started learning narrative storytelling. They’d have an idea for a movie, and would be able to simply type it into an app and grow it from there. We’d still encourage people to complete film school and read books like Story by Robert McKee, but those won’t be prerequisites anymore. I think that will also help grow more diversity in Hollywood - when quality film production is democratized, anyone can make a movie, distribute it on YouTube, and let audiences decide with their views.

We’re already seeing some of this today! Tools like TikTok and YouTube are raising the next generation of stars.

Exactly - and MrBeast is the best example of this. He's just a kid from Kansas, and nobody could have foreseen his rise to media mogul. But he kept making content, putting it out there and growing, and now he's a star. So rather than having to go to the best film schools or film festivals, or risk everything to move to LA, we’d love to see similar opportunities for kids making cinematic films.

And it’s role-agnostic - if you were purely a screenwriter, you can now be your own cinematographer, producer and director. Hollywood loves “slashies”, these combinations of titles, and AI is going to be the thing that makes everybody anything. That’s when you’ll have full creative control.

Everyone in Hollywood knows - the worst thing is when some executive who controls the money gives you notes and bends your creative vision. When you have no leverage, you have to listen. But AI gives you the resources - anyone can make a movie, for better or for worse. Perhaps 90% will be average or unwatchable, but we’ll definitely discover new talents from all over the world. They could form a new wave of Hollywood directors. And that's just such an exciting future to us.

What would you say has been the hardest technical challenge of building SAGA? 

An exciting challenge has been keeping up with the pace of innovation over the past 18 months. We originally built on the very first GPT-3 API, and then they released a beta for fine-tuning. We’d be learning how to fine-tune and it wasn’t really working, and we’d have to talk to OpenAI’s support a lot.

Then they’d say something like “GPT-3.5 is coming out, you don't need to fine-tune anymore”. So we'd build on something and then it would be irrelevant a few months later due to how quickly things were changing. It sounds like fine-tuning is back now - we probably just needed more data files originally. But it was incredible and challenging - just being at the forefront of it.

Even today, we use the DALL-E 3 API and added the Stable Diffusion XL API, but character consistency was a problem we had to solve so users don’t need to learn prompt engineering. The latest feature we're adding is pre-viz animation, but it's limited to 2 seconds and doesn't yet have good prompt controls for things like camera movement, so we're keeping it in Beta until the tech catches up to our users’ needs and we can build a solution on open source code and models ourselves.
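As one simple illustration of the character-consistency problem Russell mentions - not SAGA’s actual solution - a fixed character sheet can be prepended to every scene prompt so end users never have to write prompts by hand. The sketch assumes the OpenAI Python client and the DALL-E 3 image endpoint; the character sheet and helper function are hypothetical.

```python
# Illustrative sketch of one simple approach to character consistency -
# not SAGA's actual solution. A fixed, detailed character description is
# prepended to every scene prompt, so the user never writes prompts by hand.
from openai import OpenAI

client = OpenAI()

# Hypothetical character sheet kept constant across all storyboard frames.
CHARACTER_SHEET = (
    "MAYA: a woman in her late 30s, short silver hair, a scar over her left "
    "eyebrow, wearing a worn leather flight jacket. Graphic-novel ink style, "
    "muted colors, consistent proportions in every frame."
)

def storyboard_frame(scene_description: str) -> str:
    """Generate one storyboard image URL for a scene featuring Maya."""
    response = client.images.generate(
        model="dall-e-3",
        prompt=f"{CHARACTER_SHEET}\n\nScene: {scene_description}",
        size="1024x1024",
        n=1,                      # DALL-E 3 generates one image per request
    )
    return response.data[0].url

print(storyboard_frame("Maya sprints across a rain-soaked rooftop at night."))
```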

Anything else you want our readers to know about SAGA? 

We’re starting LAUNCH Accelerator next week with our angel investor Jason Calacanis, for our Seed / Series A roadshow. We’re super excited about the progress we’re seeing and the possibilities for Generative Video & Hollywood in 2024 - if you’re interested in connecting, please reach out at [email protected], I’d love to hear from you!

We’re hiring a VP of Generative Video. We’re seeing progress with models like SVD 1.1, but we need a lot of cinematic control over camera and object movements for our pre-viz animation and animatics features, and longer clip lengths that build on top of each other. So we're looking for a Data Scientist / Machine Learning Engineer with experience in these open source models to join our team and help build that out.

Conclusion

That’s a wrap for our fifth Deep Dive of 2024! Follow WriteOnSAGA to keep up with Russell’s work, and sign up to expand your creative horizons.

Read our past few Deep Dives below:

If you would like CV to ‘Deep Dive’ a founder, team or product launch, please reply to this email ([email protected]) or DM us on Twitter.