NOTE: The timeline & budget given below are now out of date. Check the comments for more up-to-date information.
Scott Wiener, Member of the California State Senate, SB-1047 sponsor
“Congress has not passed major tech regulation in more than a quarter century. And so in the absence of congressional action, California has a responsibility to lead.”
We will produce a one-hour feature documentary about SB-1047, which would:
Serve as a comprehensive reference on the bill's history and implications.
Present a balanced view of perspectives from both proponents and opponents, bridging ideological divides.
Contribute to informed discussions about future AI Policy by providing in-depth, impartial analysis.
Enhance public understanding of AI regulation challenges, fostering more informed public discourse.
Offer policymakers, researchers, and the public a nuanced resource on various stakeholder perspectives, supporting well-informed decision-making processes.
We are seeking $35,000 to finish the documentary before January 2nd, 2025 (10 weeks).
With $55,000, we would be able to finish the project in only 6 weeks (Dec 5), while the bill is still fresh in people’s minds, and do additional interviews.
The project can further benefit from additional funding up to $30,000 ($85,000 total), which would reimburse the time and money already spent on the project.
Ryan Kidd has already provided $4,000 through another Manifund grant, which is why the Manifund target here is $81,000 instead of $85,000.
We have completed 24 interviews so far (including 18 longform) with:
All the sponsors and co-sponsors of the bill (longform, 1-3h)
Scott Wiener (Member of the California State Senate, SB-1047 sponsor)
Dan Hendrycks (Director, Center for AI Safety)
Nathan Calvin (Senior Policy Counsel, Center for AI Safety Action Fund)
Sunny Gandhi (VP of Political Affairs, Encode Justice)
Teri Olle (Director, Economic Security California)
Other proponents of the bill (longform, 1-3h)
Zvi Mowshowitz (Founder and CEO, Balsa Research, writer of “Don’t Worry About The Vase”)
Holly Elmore (Executive Director, Pause AI)
Flo Crivello (Founder and CEO, Lindy)
Journalists (remote, 1-2h)
Garrison Lovely (Freelance Journalist, NYT contributor)
Shakeel Hashim (Freelance Journalist, prev news editor at The Economist)
People who were initially critical and ended up somewhat in the middle
Charles Foster (Lead AI Scientist, Finetune)
Samuel Hammond (Senior Economist, Foundation for American Innovation)
Gabriel Weil (Assistant Professor of Law, Touro Law Center)
Opponents of the bill (longform, 1-3h)
Dean Ball (Research Fellow, Mercatus Center)
Timothy B. Lee (Writer, “Understanding AI”)
Leonard Tang (Founder and CEO, Haizelabs)
Lauren Wagner (Advisor, Data & Trust Alliance)
Zachary Kallenborn (Non-resident expert, CSIS)
Opponents of the bill (short-form, 15 minutes average)
Jeremy Nixon (AI researcher, founder of AGI House)
Ed Choudhry (CEO, Barricade AI, former Hacker Dojo Executive Director)
Era Qian (Founder, Edge Intelligence ML)
Andrew Côté (Founder, Hyperstition Incorporated)
Div Garg (Founder and CEO, MultiOn)
Michael Tsai (Chairman, Bay Area Sister Cities Commission)
You can find a list of our interviewees and a summary of our discussions with them here: https://docs.google.com/document/d/1SsHeiDOK-kY2tVAYLG1U1sxCRarATNYx-xPsLQqLCr4/edit?usp=sharing
Publish a one-hour feature documentary about SB-1047 (a California bill that would have required safety protocols and accountability measures for developers of advanced AI models to mitigate potential catastrophic risks) by 2025, featuring prominent characters from the SB-1047 saga, including proponents and opponents of the bill.
The documentary will be distributed on YouTube, through the Director’s channel, The Inside View. We are aiming for 100,000 views within two months (for reference, the Director’s previous short film attained ~40,000 views with a minimal budget, indicating significant potential reach). Even though AI Policy might initially appear less engaging than short-form content, we believe the higher production value, the overall quality of the interviews, and the story around this specific bill will make it appealing to a broader audience.
While the documentary will be initially released on YouTube, we remain open to exploring additional distribution channels such as releasing on Netflix or Hulu later on, based on reception and opportunities.
Both sides (proponents and opponents) should better understand each other's positions, to inform future AI Policy debates. Concretely, we could test this by polling our interviewees for the documentary and aim for an average score of 7+ out of 10 on a "position understanding" scale after watching the documentary.
We want this documentary to become a reference for understanding what happened with the bill. For instance, one could imagine it being referenced by prominent figures across different sectors, such as:
Tech industry leaders like Paul Graham
Academic researchers studying AI governance, such as GovAI
Journalists covering AI regulation at major publications, such as TIME
State legislators considering similar bills in other states
Policy think tanks analyzing AI governance, such as the AI Policy Institute
Given our extensive interviews with key figures in AI Policy (Senator Scott Wiener, all the co-sponsors of the bill, AI Policy researchers and journalists), we expect organic distribution through their networks. Many interviewees have already expressed interest in sharing and discussing the documentary. We’ll also leverage these connections to organize targeted screenings with policy organizations and think tanks to maximize the documentary’s impact on future AI Policy discussions.
Dan Hendrycks, Director, Center for AI Safety
“Regulation shouldn't be written in blood."
With a $35k budget (Documentary out Jan 2 2025)
Oct 24-Nov 21 (4 weeks):
Edit the most important parts of our interviews into a ~2-hour “first cut”, including a trailer
Nov 21-Jan 2 (6 weeks):
Hire and work with video editors and other post-production professionals (sound mixer, colorists, sound designer, etc.) to go from a ~2h first cut to a ~1h final cut (this 2:1 ratio between first cut and final cut is standard in documentaries)
Update based on feedback from our interviewees and core audience on X, alongside filmmakers with AI documentary experience (e.g. Dagan Shani, “Don’t Look Up - The Documentary”) and prominent AI Safety communicators (e.g. Rational Animations’ writer)
With a $55k budget (Documentary out Dec 5 2024)
Oct 24-Nov 7 (2 weeks): first cut
Primary Plan
Hire a video editor for the first cut (cutting editing time in half)
Pursue additional high-profile interviews with time freed by video editor (Pelosi's office, Fei-Fei Li, Newsom)
Fallback Options (if high-profile interviews don't materialize)
Pursue alternative SF-based interviews from our existing network
Reallocate remaining production budget to enhance post-production quality
Nov 7-Dec 5 (4 weeks): final cut
Hire and supervise a premium video editor to reduce final cut time by 2 weeks and achieve better storytelling, increasing reach significantly
Sunny Gandhi, VP of Political Affairs, Encode Justice (bill co-sponsor)
“The lobbying machine that tech has created in DC has always been regarded as one of the most successful in history because it has gotten government to do absolutely nothing.”
For reference, we estimate fair compensation for the Director’s time to be ~$130k/year, based on two factors:
Average salary of a film director in San Francisco
Director’s opportunity cost of going back to working as a ML engineer in France
If working on something impactful, we could imagine paying the Director 40% less than market rate, so ~$80k/year, similar to what is described here.
Given that, we expect the project to need $35k to reach completion in 10 weeks:
$20k for production (film crew) and post-production (video-editor, colorist, sound mixer)
$15k to pay for the Director’s time, including taxes ($80k yearly salary, 10 weeks)
With $20k more in funding, we would be able to finish the project faster (6 weeks instead of 10) with more interviews:
$10k more in video editing, during post-production, which would enable us to spend:
$5k to hire a video editor to help with the first cut (first cut in 2 weeks instead of 4)
$5k more on video editing in the final cut, hiring a more experienced video editor (final cut in 4 weeks instead of 6). (For more detail about why hiring a more expensive video editor would be cost effective, see the explanation in the detailed budget breakdown)
$10k more in production, paying for more interviews (production crew, flights, lodging)
$0 more for the Director’s time; but given that the timeframe would be 6 weeks instead of 10, the Director would effectively receive a $130k/year salary for 6 weeks (instead of an $80k salary for 10), enabling the Director to be more productive (paying for transport, delivery, general outsourcing) and make fewer trade-offs.
Finally, the project can further benefit from additional funding up to $30k, which would go toward reimbursing previous production costs and paying for the time spent on the project so far:
$10k would reimburse our incurred production costs (hired production crew, equipment, flights)
$20k would reimburse two months of the Director’s time, including taxes ($130k salary)
To summarize, a minimal version of the project would cost $35k. A total of $55k would enable us to finish the project in 6 weeks instead of 10, and with $85k we would be able to break even on the costs already incurred.
A more detailed breakdown is available here: https://docs.google.com/document/d/1StyJhM4R8AxsCd7rhA8fqOXDpLQMFvv7P-Qq8fkBhuI/edit?usp=sharing
Nathan Calvin, Senior Policy Counsel, Center for AI Safety Action Fund (bill co-sponsor)
“We did everything and we tried to put out all of the best text and substance that we could, but then ultimately just one person with his own beliefs is going to make a decision.”
Michaël Trazzi (Director, Editing Supervisor) runs the YouTube channel The Inside View, which has been running for 2.5 years and has released more than 60 interviews, including with prominent researchers such as Evan Hubinger, Owain Evans, Collin Burns and Neel Nanda. Previously, he worked for 3 years in AI research, including AI Safety research at the Future of Humanity Institute. He has a dedicated audience of 5,600 subscribers and a total of 250k views. This year, he made a 12-minute short film depicting a hypothetical AI takeover scenario, realized using only publicly available footage, which reached ~40,000 people with ~2k likes (a 5% engagement rate), indicating it was well received.
Rachel Shu (Cinematographer) helps with camera work, photography and sound. She has previously directed and co-directed several documentaries, including one in North East Syria, one about Covid (Open Philanthropy grantee, ongoing) and one about Vibecamp. She has recently been focusing more on cinematography, most recently filming an interview with Larry Summers (Joe Walker Podcast). Rachel Shu’s compensation is included in “production crew” costs.
Liam Elkins (Production Coordinator) is a documentary production coordinator and New York-based filmmaker. His most recent projects include the Academy Award-nominated documentary short The ABCs of Book Banning and an upcoming feature for HBO Documentaries to be released in 2025.
Mike Narouei (Video Editor, rough cut) has directed and edited multiple short films published on YouTube about AI, including two short films on deepfakes (reaching 2.6M and 200k views respectively).
Dean Ball, Research Fellow, Mercatus Center, and writer of “Hyperdimensional”
"Almost no human creation worth its salt was made from pure thought."
Failure to access key individuals for interviews: We already have high-profile interviews completed and several endorsements, making further access easier. Having a high-profile funder would make the project more legible and obtaining further access easier still.
Post-production challenges: Given the scope of this project, insufficient funding to give the final edit its proper due (video editing and other post-production work) would erase much of the project’s potential. We could mitigate this risk by diversifying our funding sources (e.g. selling rights to a production company).
Bias or perceived bias: The video aims to present multiple perspectives on a controversial topic. If it is perceived as too biased toward one side, it might lose credibility. We can minimize this risk by sending early drafts to prominent proponents and opponents of the bill for feedback.
Teri Olle, Director, Economic Security California (bill co-sponsor)
“SB 1047 was a really smart, strategic common sense piece of legislation that would have made California a leader.”
Missed opportunity for public education: The video aims to provide in-depth discussion on SB-1047 and AI regulation. Failure would mean a lost chance to inform the public about this important topic.
Financial loss for the creators, waste of interviewees’ time: Given the expenses already incurred, failure to complete the project due to lack of funding would result in financial loss for those involved. It would also mean that the time interviewees gave us (about 35 hours total) would to some extent go to waste.
Missed opportunity to influence policy: The video could potentially inform future AI regulation efforts. Its failure might mean that participants in future AI Policy discussions have less common knowledge of the other side’s positions.
Holly Elmore, Executive Director, Pause AI
"Just because tech elites want to make this technology, and they can do it, doesn’t mean they should get to decide what happens to the world."
Michaël Trazzi (Director, Editing Supervisor) has raised $10,259 through another Manifund grant to make podcasts and video explainers about AI Alignment. This money has mostly already been spent and will not be used for the documentary, since we consider the documentary project to be out of scope and requiring its own budget.
Note: Ryan Kidd has already contributed $4k to the SB-1047 documentary project, which is why the final target is $81k and not $85k. He mentioned “minimal costs of the SB 1047 documentary ($15k)”, which roughly corresponds to the $20k to reach completion in this version of the proposal (not including paying the Director’s time).
Zvi Mowshowitz, Founder and CEO, Balsa Research, writer of “Don’t Worry About The Vase”
"When SB 1047 was vetoed, I saw a lot of people gloating online about how they had won and how this was a great day. And I told them, remember this day, for you will rue it."
You can be included in the credits of the documentary if you donate to the project (opt-out available for donors who prefer to remain anonymous), following these tiers:
$10,000+: Featured as "Executive Producer" with prominent placement at the start of credits. Organizational donors at this tier can request logo placement.
$5,000+: Listed as "Co-Executive Producer"
$1,000+: Listed as "Associate Producer"
$100+: Listed in "Special Thanks"
Michaël Rubens Trazzi
8 days ago
Nov 26th Update:
- After 4 weeks of editing, hiring multiple editors and other contractors (sound mixing, production coordination, video labeling), we presented a 1h05m rough cut of the SB-1047 documentary at “The Curve” on Saturday, Nov 23
- We observed an increase of 1.4 points in understanding after watching the documentary, which satisfies our goal of increasing the understanding of the different positions held on SB-1047 (we polled 21 participants before and after the screening)
- We observed a weak correlation between participants’ initial position on the bill and how likely they were to recommend the documentary to a friend. The weakness of this correlation is a positive sign, showing that the documentary was recommended (and not recommended) by people with different positions.
- I no longer believe it is possible to finish this project in “6 weeks” as previously proposed, given the funding level we reached
- My current estimate for when the movie will be out is late January 2025 at best, if not early February
- I believe that many of the key players who would have benefited from seeing the documentary earlier rather than later have already benefited from watching the rough cut at The Curve, or will hear about it from there and can see it upon request or at targeted screenings.
- My new goal is to publish a final cut shortly before or just after the next bill is announced (estimated announcement date: February)
- My previous budget estimates were too optimistic and did not take into account the cost and time of motion graphics/animation, the cost of hiring a composer/music supervisor, or the cost of hiring an experienced video editor for 2+ months to go from a rough cut to a final cut (which seems to be the minimum I am currently being quoted).
- Therefore, we are still funding-constrained, and I expect that any funding above $55k will go toward paying post-production contractors (animation, music and video editing), with video editing and animation being the two main costs and music slightly lower. (Note that video editing was to some extent included in the original Manifund project, but animation was not included at all, so that is where I’d expect most extra marginal funding to go.)
Blake Borgeson
15 days ago
Love that you're doing this. Thank you for putting in so much work and time already! Love that you're aiming for a balanced presentation--I think we'll all learn more that way.
Nevin Freeman
26 days ago
Thanks for doing this, I'm hopeful it will allow more busy folks like myself to understand the bill. I hope you guys include a clear and thorough explanation of the bill itself as well as clear articulations from opponents on why they opposed it – seems like that's going to be the hardest part to get right given your list of interviews so far. Cheers!
Max Chiswick
about 1 month ago
Excited for this, especially "fostering more informed public discourse". Nice production quality!
Swante Scholz
about 1 month ago
Looks promising. Love the selection of interviewees. I think we need more documentaries like this about topics related to AI safety.
Looking forward to seeing the finished documentary!
Cameron Holmes
about 1 month ago
A token donation to signal my support for this and my continued support for your work more generally.
The previews and outline seem very impressive so I expect this may be impactful and I'm looking forward to watching it.
I find the point about this becoming a reference the most compelling:
The AlphaGo documentary remains relevant through capturing a watershed moment for capabilities progress / hints to mainstream safety concerns, despite the significance of the object-level capabilities/technology waning.
Similarly I could imagine a lot of value from this project materialising 2+ years out, in a world that (hopefully) sees increasing momentum and broader public attention and discourse around safety policies. This documentary could lend credibility by capturing the seminal/bellwether event and frame for future conversations.
Austin Chen
about 1 month ago
I'm very excited to fund this project!
Important subject: SB 1047 was very high profile, generating a lot of discourse on how AI policy should be set. Even though it didn't pass, capturing that knowledge and sharing it seems very impactful, to inform how we as a society approach future bills.
Great interviewees: I'm impressed that Michael has recorded footage with so many of the main characters of SB 1047: sponsors, proponents and opponents alike. I recognize and follow many of these folks, and am looking forward to seeing them speak on camera.
Rewarding initiative: Michael saw this opportunity and then just started making it happen, without waiting for funding or approval from grantmakers. In doing so, he's taken on some financial risk, forgoing 2 months of pay and funding expenses out-of-pocket. He's now asking for retro funding, which I am very happy to pay down; I want to encourage a norm of doing awesome, ambitious things without waiting for permission.
I think the salary he's asking for is very modest, especially given his opportunity costs and the uncertainties/downtime involved with temp work.
Investing in video: EA and AI Safety have historically been very good at communicating their message through longform essays (see: 80k, LW/EA Forum), decently through podcasts, but fairly weakly through videos. Funding this is also an investment in building up our capacity to produce more high-quality video content in the future.
My main concerns:
Interest in SB 1047 might already be fading, and will probably drop even more as things happen in AI and AI policy. (This is part of why I'm pushing Michael to get out the documentary ASAP). Video production can take a long time, and any delays will reduce the reach and impact of this documentary.
I'm not very sure what makes a video "good". At a glance, the quality of the production and the quality of speakers seem very high; but will the video itself be interesting? Will it be informative? I'm personally not well placed to evaluate this.
Perhaps clips/shortform videos optimized for Twitter/YT shorts/Tiktok would be a better use of this material. Eg I don't have time to watch many Dwarkesh videos, but the short clips are great. Perhaps worth doing both!
(Conflicts of interest: Rachel Shu is my housemate and has done videography for Manifest; Michael has crashed in our guest room while filming this)