The Gist
- California AI legislation. California SB-1047, also known as the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act, aims to hold AI developers accountable for preventing catastrophic harm through stringent safety protocols and compliance audits.
- Industry reactions divided. The bill has sparked significant controversy among tech leaders, with supporters like Elon Musk backing it, while critics argue it could stifle innovation and create excessive burdens on smaller developers.
- Focus on prevention. Unlike other AI laws, California SB-1047 emphasizes preventative measures, requiring developers to demonstrate fail-safes and shutdown mechanisms to avoid potential AI-related threats before they happen.
Recently I reported on the Colorado AI law, the most extensive law addressing AI development and usage in the United States to date. The AI legislation crown, however, may go to another state, if its own unique take on an AI bill passes.
California SB-1047 (The Safe and Secure Innovation for Frontier Artificial Intelligence Models Act) is designed to hold AI developers accountable for preventing catastrophic harm caused by an AI model. The bill has drawn intense attention from the tech leaders most responsible for AI technology, and controversy over its potential impact on AI development has followed, with prominent AI players sparring over the bill's view of AI safety and liability.
California SB-1047 is on Governor Gavin Newsom's desk as of this week.
Background and Key Provisions of the California SB-1047 AI Bill
In February 2024, State Sen. Scott Wiener (D-San Francisco) introduced the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act with Senators Richard Roth, Susan Rubio and Henry Stern as co-authors. California SB-1047 calls for the formation of a government operations agency that would establish CalCompute, a public cloud framework for advancing AI safety and equity.
The purpose of the agency and CalCompute is to stay ahead of the development direction of launched AI models that would have the most influence on public services and institutional operations. The agency would issue an initial assessment of California's current infrastructure and the starting cost of establishing CalCompute.
SB-1047 also requires an AI developer to address various safety requirements before and after an AI model’s development. Developers are required to take preventative measures, such as:
- Establish a safety and security protocol for their model. They must also retain an unredacted copy of the safety and security protocol for as long as the covered model is made available for commercial or public use.
- Implement the capability to promptly and fully shut down an AI model if it poses a serious threat to a commercial or public system. This capability must be documented in a written, separate safety and security protocol.
- Annually perform a compliance audit through an independent third-party auditor and retain an unredacted copy of the auditor’s report.
Safety and audit documentation must be retained for as long as the covered model is made available for commercial, public or foreseeable public use, plus five years. The bill also requires developers to give the California Attorney General access to the unredacted auditor's report and safety protocol documentation upon request.
California’s SB-1047 Sets High Standards for AI Model Development and Safety
AI models covered by SB-1047 are those that require an extremely large amount of computing power (more than 3 × 10²⁵ integer or floating-point operations in training) and cost over $10 million to develop, based on current cloud computing prices. The computing power threshold, according to Senator Wiener, is similar to the operations threshold in President Biden's Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence. The value was chosen to represent a potential future computing milestone in AI development, giving developers time and capacity to develop models, much like automakers who must meet future EPA vehicle gas mileage targets for their cars and trucks.
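The coverage rule above is a simple two-part test. As a minimal illustrative sketch only, using the thresholds as the article describes them (the constants and the function name are assumptions for illustration, not a reading of the bill's legal text):

```python
# Thresholds as described in the article (not the bill's exact legal language).
COMPUTE_THRESHOLD_OPS = 3 * 10**25   # training compute, total operations
COST_THRESHOLD_USD = 10_000_000      # development cost at cloud prices

def is_covered_model(training_ops: float, dev_cost_usd: float) -> bool:
    """Return True if a model exceeds both the compute and cost thresholds."""
    return training_ops > COMPUTE_THRESHOLD_OPS and dev_cost_usd > COST_THRESHOLD_USD

print(is_covered_model(1e26, 50_000_000))  # True: exceeds both thresholds
print(is_covered_model(1e24, 2_000_000))   # False: below both thresholds
```

A frontier model trained with 10²⁶ operations at a $50 million cloud bill would clear both bars; a small fine-tuned model would not.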
The Safe and Secure Innovation for Frontier Artificial Intelligence Models Act differs from other AI legislation, such as the Colorado state law, in its focus on AI model fail-safes as a required task in a model's development. The bill requires developers to provide proof of testing their model and to demonstrate a shutdown mechanism before launching the model for public or commercial consumption.
The bill is also modeled on recent data privacy legislation. Discussing the bill in a Bloomberg video segment, Senator Wiener noted that SB-1047 was influenced by the EU AI Act and Biden's Executive Order, but with an aim at prevention, whereas actions stemming from the EU AI Act and Biden's Executive Order focused on disclosure requirements once an incident with data had occurred.
In crafting the initial legislation, Senator Wiener felt that California should take a stand for government addressing the influence AI can bring before a major catastrophe, like the disruption of a power grid, happens.
Supporters and Opponents of California SB-1047
Because the bill is being authored by a state senator who has Silicon Valley within his jurisdiction, leaders from the tech industry have quickly turned their attention to the bill’s progress toward passage.
Mark Surman at Mozilla has said he feels the bill is headed in the right direction, but he has also voiced concerns that it will harm open source.
Marc Andreessen has been the most prominent critic. His firm Andreessen Horowitz launched a website, stopsb1047.com, to sway industry players and the public against the bill.
Several federal lawmakers, including former House Speaker Nancy Pelosi, are opposed to wording in the bill. A group of them sent a letter to Governor Newsom asking him to veto SB-1047.
The concern is that the bill's wording is too vague, creating an unjustified administrative burden on startups and smaller developers to meet documentation requirements. Critics feel that SB-1047 would have a chilling effect on the innovation of open-source AI models and drive investment dollars out of California.
Senator Wiener maintains that academics and industry figures from both perspectives have weighed in and that all perspectives have been incorporated into the bill's structure. In the Bloomberg video, Wiener noted that some of the most vocal detractors have spread misinformation about the bill, such as the claim that the bill creates a new liability for developers. Wiener explained that the bill is meant to supplement existing responsibilities and liabilities that developers already have, with the state attorney general gaining specific rights of access to an AI model and its safety and audit information.
Senator Wiener has noted several high-profile signs of support: Anthropic, the AI maker behind the popular generative AI assistant Claude, and two OpenAI employees.
A surprise supporter has been Elon Musk. Musk, who is developing an AI model for his X platform, stated in a post on X that he thought the legislation was a “tough call and will make some people upset.” However, he states that he thinks “California should probably pass the SB 1047 AI safety bill.” It is an interesting statement, given that Musk has moved the headquarters of his companies SpaceX, Tesla and X from California to Texas.
The provisions of SB-1047 call for the first reports from developers and the government operations agency during 2026.
Senator Wiener will not speculate on the bill's fate on Governor Newsom's desk. He did say he is pleased by the groundswell of support the bill has received so far.
California SB-1047 is one of many AI bills that have been introduced in state legislatures, each meant to address the technology's potential effects on jobs, disinformation and public safety. In the meantime, the world will have to wait to see which U.S. state will carry the AI legislation crown.