
The author of California’s SB 1047, the nation’s most controversial AI safety bill of 2024, is back with a new AI bill that could shake up Silicon Valley.
California state Senator Scott Wiener introduced a new bill on Friday that would protect employees at leading AI labs, allowing them to speak out if they believe their company’s AI systems could pose a “critical risk” to society. The new bill, SB 53, would also create a public cloud computing cluster, called CalCompute, to give researchers and startups the computing resources needed to develop AI that benefits the public.
Wiener’s last AI bill, California’s SB 1047, sparked a lively debate across the country about how to handle massive AI systems that could cause disasters. SB 1047 aimed to prevent the possibility of very large AI models creating catastrophic events, such as causing loss of life or cyberattacks resulting in more than $500 million in damages. However, Governor Gavin Newsom ultimately vetoed the bill in September, saying SB 1047 was not the right approach.
But the debate over SB 1047 quickly turned ugly. Some Silicon Valley leaders said SB 1047 would harm America’s competitive edge in the global AI race, and claimed the bill was inspired by unrealistic fears that AI systems could lead to science fiction-like doomsday scenarios. Meanwhile, Senator Wiener alleged that some venture capitalists engaged in a “propaganda campaign” against his bill, pointing in part to Y Combinator’s claim that SB 1047 would send startup founders to jail, a claim experts argued was misleading.
SB 53 essentially takes the least controversial parts of SB 1047 – such as whistleblower protections and the establishment of a CalCompute cluster – and repackages them into a new AI bill.
Notably, Wiener is not shying away from existential AI risk in SB 53. The new bill specifically protects whistleblowers who believe their employers are creating AI systems that pose a “critical risk.” The bill defines critical risk as a “foreseeable or material risk that a developer’s development, storage, or deployment of a foundation model, as defined, will result in the death of, or serious injury to, more than 100 people, or more than $1 billion in damage to rights in money or property.”
SB 53 prohibits frontier AI model developers – likely including OpenAI, Anthropic, and xAI, among others – from retaliating against employees who disclose concerning information to California’s Attorney General, federal authorities, or other employees. Under the bill, these developers would be required to report back to whistleblowers on certain internal processes the whistleblowers find concerning.
As for CalCompute, SB 53 would establish a group to build out a public cloud computing cluster. The group would consist of University of California representatives, as well as other public and private researchers. It would make recommendations on how to build CalCompute, how large the cluster should be, and which users and organizations should have access to it.
Of course, it’s very early in the legislative process for SB 53. The bill will need to be reviewed and passed by California’s legislative bodies before it reaches Governor Newsom’s desk. State lawmakers will surely be waiting for Silicon Valley’s reaction to SB 53.
However, 2025 may be a tougher year to pass AI safety bills than 2024 was. California passed 18 AI-related bills in 2024, but now it appears the AI doom movement has lost ground.
Vice President J.D. Vance signaled at the Paris AI Action Summit that America is not interested in AI safety, but rather prioritizes AI innovation. While the CalCompute cluster established by SB 53 could certainly be seen as advancing AI progress, it’s unclear how legislative efforts around existential AI risk will fare in 2025.