Our goal is to create a cinematic, accessible, feature-length documentary. 'Making God' is an investigation into the controversial race toward artificial general intelligence (AGI).
Our audience is largely non-technical, so we will give them a thorough grounding in recent advances in AI before exploring the race to build the most consequential technology ever created.
Following in the footsteps of influential social documentaries like Blackfish, Seaspiracy, The Social Dilemma, and An Inconvenient Truth, our film will shine a light on the risks associated with the development of AGI.
We are aiming for film festival acceptances, nominations, and wins, and to be streamed on the world's biggest streaming platforms.
This will give the non-technical public a strong grounding in the risks of a race to AGI. If successful, hundreds of millions of streaming subscribers will be better informed about the risks and more likely to take action when a moment presents itself.
Making God will begin by introducing an audience with limited technical knowledge to recent advances in AI. For some, the only AI product they may have used or heard of is ChatGPT, which OpenAI launched in November 2022. A documentary like this fills a neglected niche: most other AI documentaries assume a great deal of prior knowledge.
After grounding the audience in AI advances and the risks they may pose, we will dive into the frontier, examining the individual driving forces behind the race to AGI. We will put a spotlight on the CEOs of the major AI companies, interview leading experts, and speak to worried voices in politics and civil society.
The documentary will take an objective, truth-seeking approach. Its primary goal is to understand whether we should be worried about, or optimistic for, the coming technological revolution.
We think advanced AI and AGI, if developed correctly and with complementary regulation and governance, can change the world for the better.
We are worried that, as things stand, leading AI companies seem to be prioritizing capabilities over safety, international governance on AI cooperation seems to be breaking down, and technical alignment bets might just not work in time.
We think that, at minimum, a documentary made for people who do not yet know about the risks, and aimed at a huge audience (such as a streaming service's), can help the public better understand them. Hundreds of millions of people watch content on streaming services.
At most, we might catalyze a Blackfish-, Seaspiracy-, or Inconvenient Truth-style spirit in the audience, so that one day they might protest, contact their legislators, join a movement, and so on.
Prof. Rose Chan Loui, UCLA Professor and Legal Expert [CONFIRMED]
Jack Clark, Anthropic Co-Founder [IN DISCUSSIONS WITH ANTHROPIC’S PRESS TEAM]
Kelsey Piper, Vox [WE WANT]
Helen Toner, CNAS & former OpenAI board member [WE WANT]
Daniel Kokotajlo, former OpenAI staff [WE WANT]
And more of this type.
We are aiming for film festival acceptances, nominations, and wins, and to be streamed on the world's biggest streaming platforms, like Netflix, Amazon Prime, and Apple TV+.
To give the non-technical public a strong grounding in the risks from a race to AGI.
If successful, hundreds of millions of streaming subscribers will be better informed about the risks and more likely to take action when a moment presents itself.
Timelines are shortening, technical alignment bets look less likely to pay off in time, and international governance mechanisms seem to be breaking down. Our goal is therefore to influence public opinion on the risks so that people might take political or social action before the arrival of AGI. If we do this right, we have a real chance of moving the needle.
Some rough numbers:
Festival Circuit: We are targeting acceptance at major film festivals including Sundance, SXSW, and Toronto International Film Festival, which have acceptance rates of 1-3%.
Streaming Acquisition: Following festival exposure, we aim for acquisition by Netflix, Amazon Prime, or Apple TV+, platforms with 200M+ subscribers collectively. Based on comparable documentary performance, we estimate:
Conservative scenario: 8M viewers (4% platform reach)
Moderate scenario: 15M viewers (7.5% platform reach)
Optimistic scenario: 25M+ viewers (12.5%+ platform reach)
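The three scenarios above are simple fractions of the stated 200M+ combined subscriber base. A quick sanity check of the arithmetic, using only the figures from this page:

```python
# Sanity check of the viewer-reach scenarios, using the stated
# 200M+ combined subscriber base across Netflix, Amazon Prime, and Apple TV+.
subscribers = 200_000_000

# Platform-reach assumptions taken directly from the scenarios above.
scenarios = {
    "Conservative": 0.04,    # 4% reach
    "Moderate": 0.075,       # 7.5% reach
    "Optimistic": 0.125,     # 12.5% reach
}

for name, reach in scenarios.items():
    viewers = round(subscribers * reach)
    print(f"{name}: {viewers / 1_000_000:.0f}M viewers")
# Conservative: 8M viewers
# Moderate: 15M viewers
# Optimistic: 25M viewers
```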
Impact Metrics: We will track:
Viewership numbers across platforms
Pre/post viewing surveys on AI risk understanding
Media coverage and policy discussions citing the documentary
Changes in public opinion polling on AI regulation
Theory of Impact: If successful, we will create an informed constituency capable of supporting responsible AI development policies during potentially critical decision points in the next 2-5 years.
To have a serious chance of reaching streaming services, the production quality and entertainment value have to be high. To create a product of that standard, we would need the following funding over the next 3 months.
Accommodation [Total: £30,000]
AirBnB: £10,000 a month for 3 months (dependent on locations for filming and accommodating crew).
Travel [Total: £13,500]
Car Hire: £6,000 for 3 months.
Flights: £4,500 for 3 months (to move us and crews around to locations in California, D.C., and New York).
Misc. (trains, cabs, etc.): £3,000 for 3 months.
Equipment [Total: £41,000]
Purchasing Filming Equipment: £5,000
Hiring Filming Equipment: £36,000 (18 shooting days)
Production Crew (30 Days of Day Rate) [Total: £87,000]
Director of Photography: £19,500
Sound Recordist: £18,000
Camera Assistant/Gaffer: £13,500
Additional Crew: £36,000
Director (3 Months): [Total: £15,000]
Executive Producer (3 months): [Total: £15,000]
Misc.: £25,000 (to cover unforeseen costs, legal advice, insurance, and other practical necessities).
TOTAL: £226,500 ($293,046)
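As a quick arithmetic check, the line items above do sum to the stated total; the GBP-to-USD rate below is implied by the two stated totals on this page, not an official quote:

```python
# Check that the budget line items sum to the stated £226,500 total.
budget_gbp = {
    "Accommodation": 30_000,
    "Travel": 13_500,
    "Equipment": 41_000,
    "Production crew (30 day rates)": 87_000,
    "Director (3 months)": 15_000,
    "Executive Producer (3 months)": 15_000,
    "Misc": 25_000,
}

total_gbp = sum(budget_gbp.values())
print(f"Total: £{total_gbp:,}")  # Total: £226,500

# The stated $293,046 implies a GBP->USD rate of roughly 1.29
# (derived from the two totals above, not a quoted exchange rate).
print(f"Implied rate: {293_046 / total_gbp:.2f}")  # Implied rate: 1.29
```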
Mike Narouei [Director]:
Former Creative Director at Control AI; directed multiple viral AI-risk films amassing 60M+ total views over nine months.
Directed & led a 40-person production team on a £100,000+ commercial, generating 32M views/engagements across social media within one month.
Artistic Director for Michael Trazzi’s ‘SB-1047’ Documentary.
Work featured by BBC, Sky News, ITV News, and The Washington Post.
Partnered with MIT at the World Economic Forum in Davos, demonstrating Deepfake technology live in collaboration with Max Tegmark, covered by The Washington Post & SwissInfo.
Collaborated with Apollo Research to create an animated demo for their recent paper.
Shortlisted for the Royal Court Playwriting Award.
Directed a number of commercials for clients such as Starbucks, Pale Waves and Mandarin Oriental.
Watch 'Your Identity Isn't Yours', which Mike filmed, produced, and edited while at Control AI. The still above is from that film.
Connor Axiotes [Executive Producer]:
Has appeared on TV multiple times and has helped produce videos and TV interviews.
Wrote multiple op-eds for major papers and blogs; see here for a repository.
Produced viral engagement with millions of impressions on X while at Conjecture and the ASI.
Worked as a senior communications adviser to a UK Cabinet Minister, making videos and coordinating with senior journalists and TV channels in high-stakes, high-pressure environments.
Wrote the centre-right Adam Smith Institute's first-ever AI safety policy paper, 'Tipping Point: on the edge of Superintelligence', in 2023.
Worked on a Prime Ministerial campaign and a General Election as part of the then Prime Minister's operations team. Below, he is pictured working for the Prime Minister in a media capacity in 2024.
No film festival acceptance.
No streaming service interested in the project.
No one willing to be interviewed [which is definitely not what we are seeing right now].
Roughly £37,000 from private philanthropic funders.
Eric Schmidt
1 day ago
Impressed by the director's work, and I know Adam Smith Institute and Control AI have high standards. I think popular movies/documentaries can be a powerful force for persuasion and change.
Esben Kran
1 day ago
I've worked directly with Connor and his dedication to making this truly impactful is inspiring. I am not aware of any project of a similar scope or aim happening right now which is surprising in the face of AI risk awareness. Highly suggest funding this (and similar) projects.
Greg Colbourn
1 day ago
I've provided some seed funding for this (outside of Manifund). We really need broad public communication on AI risk to get it further up the political agenda. Something like Seaspiracy -- where they go down the rabbit hole -- but for AI, would be amazing.
Holly Elmore
2 days ago
I believe this kind of truly entry level broadcast communication on AI risks is key to clueing the public, who already don’t want to take the massive risks that AI companies are taking, into what’s really going on.