California Just Became the De Facto National AI Regulator. Again.
By Don Ho, Esq. | April 3, 2026
Governor Gavin Newsom signed a new executive order on April 2, 2026, tightening AI procurement standards for every company that wants to do business with the state of California. On the same day, Axios reported that multiple AI policy sources confirmed those standards will effectively become national policy, because no company building AI products is going to maintain two versions of its compliance program: one for California and one for everybody else.
This is the California Effect in real time. And if you're a GC, compliance officer, or business operator deploying AI anywhere in the United States, you need to understand what just happened.
What the Executive Order Actually Does
Newsom's order targets AI companies that sell to or contract with California state agencies. The new procurement standards require vendors to document how their AI systems were trained, disclose what data was used, and demonstrate that outputs meet specific accuracy and bias benchmarks before any state contract is awarded.
The order also creates a new review process for "high-risk" AI deployments in state government, covering everything from benefits eligibility determinations to law enforcement tools. Vendors in those categories will need to submit to third-party audits and ongoing monitoring as a condition of doing business.
None of this is optional. If you want California's money (and it's the fifth-largest economy in the world), you meet the standards. Period.
Why This Becomes the National Standard
The White House released its National Policy Framework for Artificial Intelligence on March 20, 2026. It's four pages long. It proposes that Congress pass legislation preempting state AI laws that impose "undue burdens." It directs no agency to take a specific action. It sets no compliance deadlines. It establishes no penalties.
In other words, it's a wishlist, not a law.
Meanwhile, California is actually doing things. The state has 40 million residents, a $4 trillion GDP, and procurement budgets that dwarf those of most countries. When California sets a standard, companies comply because the math is simple: the cost of maintaining a California-specific compliance program is lower than the cost of losing California as a customer.
We've seen this movie before. California's emissions standards became the de facto national vehicle emissions policy. The California Consumer Privacy Act forced companies to adopt privacy practices nationwide rather than run two data programs. Now the same dynamic is playing out with AI.
Multiple AI and tech policy sources told Axios that Newsom's executive order itself may lack strong enforcement teeth. That's beside the point. The power isn't in the penalty. The power is in the market access. Companies will self-comply because they can't afford not to.
The Federal Preemption Problem
The White House framework explicitly proposes preempting state AI laws. But preemption requires actual legislation, and Congress hasn't passed a single one of the 40+ AI bills introduced since 2023. The Senate Commerce Committee's bipartisan framework from February 2026 attracted 200+ public comments and then stalled. There is no federal AI statute. There is no federal enforcement mechanism. There is no timeline.
The administration wants to stop states from creating a "patchwork of 50 different regulatory regimes." That concern is legitimate. Compliance complexity is real. But the solution to a patchwork isn't an empty framework. It's a law. And until Congress passes one, states are going to keep filling the gap.
Newsom knows this. He's positioning California (and himself, as a 2028 presidential contender) as the inverse of the Trump administration's approach to AI regulation. Whether you agree with that positioning or not, the operational reality is the same: if you deploy AI products and you have any customers in California, Newsom's standards are now your standards.
What Georgia, Colorado, and Texas Are Doing
California isn't alone. Georgia Governor Brian Kemp has three AI bills on his desk right now, all passed by the legislature before the April 6 adjournment deadline. SB 540 covers chatbot disclosure and child safety. SB 444 prohibits insurance coverage decisions from being based solely on AI systems. SR 789 creates an AI study committee.
Colorado's AI Act, enacted in 2024, already imposes governance, risk assessment, and documentation requirements for high-risk AI systems used in employment, insurance, and consumer services. Those obligations are live and enforceable.
Texas has enacted AI-specific statutes targeting healthcare AI, requiring human oversight in clinical decision-making and limiting AI's role in medical necessity determinations.
The pattern is consistent: states are not waiting for Congress. They are writing AI law in real time. And California, with the biggest market and the most aggressive governor, is setting the pace.
What to Do Now
If you sell AI products or services to any government entity: Assume California's procurement standards will become the baseline. Start documenting your training data sources, accuracy benchmarks, and bias testing protocols now. Don't wait for the formal procurement requirements to be published.
If you deploy AI in hiring, insurance, healthcare, or benefits: You're already subject to state-specific requirements in California, Colorado, New York, and Texas. Map your current AI deployments against each state's requirements. If you haven't done a compliance gap analysis, you're behind.
If you're a GC advising a board: The question is no longer whether AI regulation is coming. It arrived. The question is whether your company's AI governance program can withstand a state AG investigation or a procurement audit. If you can't answer that question with specifics, start building.
If you're holding out for federal preemption: Stop. Even if Congress passes something in 2027 (optimistic), the framework explicitly preserves state authority over procurement, fraud, and consumer protection. Those are the exact categories where state AI enforcement is concentrated. Federal preemption, if it ever happens, won't cover the areas that matter most.
The California Effect is not new. What's new is the speed. Newsom moved from executive order to procurement standard in a single action. Companies that wait for "clarity" before building AI compliance programs are going to find that clarity looks like an audit they're not ready for.
Don Ho, Esq. is Co-Founder & CEO of Kaizen AI Lab, advising companies on operational growth strategies and the legal aspects of AI integration in their businesses. When he's not navigating the intricate web of AI business policies and regulations, he's probably on dad duty or drinking a cuppa Taiwanese oolong.