AI Safety Advocates Split on Governor Newsom Panel’s Report
Advocates of strict guardrails on AI developers, intended to prevent their models from causing catastrophic harms such as releasing biological weapons, are split over initial recommendations from a working group assembled by California Gov. Gavin Newsom (D).
Some safety groups back the preliminary proposals the team of academics issued March 18, such as requiring third-party audits to assess a model's risks. Other tech critics say the group should add tougher compliance rules to hold AI companies more accountable.
State Sen. Scott Wiener (D) wants to shape his new bill (SB 53) based on the report, after Newsom ...