Government & Policy

White House AI Policy Framework Recommends Congress Preempt All State AI Laws

The White House wants Congress to wipe out state AI laws deemed too burdensome—and Democrats have introduced legislation to do the opposite, setting up a defining fight over who governs AI in the United States.

Hector Herrera

The White House released a National Policy Framework for Artificial Intelligence in late March that asks Congress to preempt—that is, override and nullify—any state AI laws that impose "undue burdens" on AI development. The framework also recommends governing AI through existing federal agencies rather than creating a new regulatory body. Democrats have responded with the GUARDRAILS Act, which would repeal the White House framework and explicitly protect states' rights to regulate AI. The legislative fight over who governs AI in the United States is now formally underway.

The framework's details, analyzed by legal experts at Holland & Knight, represent the clearest statement yet of the administration's regulatory philosophy: keep AI governance at the federal level, rely on existing agency authorities, and remove friction from the development pipeline by standardizing rules nationally rather than allowing a patchwork of state-level requirements.

What the White House Framework Actually Says

The framework has three core positions:

1. Federal preemption of state AI laws. The document calls on Congress to pass legislation that would supersede state AI regulations deemed to create "undue burdens" on AI developers. This is the most contested provision. Roughly 40 states have introduced or passed some form of AI legislation in the past two years—covering areas from algorithmic hiring decisions to facial recognition to AI-generated political advertising. Under the White House proposal, much of that state law would become unenforceable.

2. Existing agencies, not a new AI regulator. Rather than creating an agency analogous to the EU's AI Office, the framework proposes that existing bodies—the FTC, FDA, EEOC, financial regulators—apply their existing authorities to AI in their respective domains. This is the lighter-touch approach, and it has precedent: the U.S. never created a dedicated internet regulator, relying instead on sectoral agencies.

3. Innovation-first framing. The framework explicitly positions U.S. AI competitiveness against China as the primary policy objective, with the argument that regulatory friction is a strategic liability. This framing shapes every recommendation in the document.

The Democratic Response: GUARDRAILS Act

Senate and House Democrats introduced the GUARDRAILS Act directly in response to the White House framework. The bill would:

  • Repeal the White House framework entirely
  • Protect states' existing AI laws from federal preemption
  • Establish minimum federal AI accountability standards rather than deferring entirely to existing agencies

The act's name is a statement of values: its sponsors argue that the administration's framework removes guardrails without establishing federal alternatives, leaving consumers and workers with less protection than they have today.

Why Preemption Is the Real Fight

The preemption question is not new to AI. It played out over decades in financial services (federal vs. state banking law), telecommunications (the FCC's authority to preempt state net neutrality laws), and pharmaceutical regulation. The pattern has generally favored federal preemption when industry argues for regulatory uniformity.

The AI version is more contested for a specific reason: Congress has not yet passed comprehensive federal AI legislation. If the White House framework's preemption recommendation is adopted before a federal floor is established, states lose their ability to regulate without a federal replacement in place. Critics call this "preemption without protection"—removing the existing patchwork without installing anything in its place.

Proponents argue the existing patchwork is itself the problem. If a company building an AI hiring tool has to comply with different rules in California, Colorado, Illinois, Texas, and New York, the compliance cost becomes prohibitive—particularly for smaller companies that can't maintain fifty-state legal teams.

What This Means for AI Developers and Enterprises

If you're building AI products for U.S. markets, the outcome of this debate will determine your compliance landscape for the next decade. The White House scenario: one federal standard, enforced through existing agencies, with relatively clear (if broad) rules. The GUARDRAILS Act scenario: a minimum federal floor plus active state regulation in high-protection jurisdictions.

If you're a large enterprise deploying AI in hiring, lending, healthcare, or other high-stakes domains, federal preemption simplifies compliance planning. But it also reduces the leverage you have with state attorneys general who have historically moved faster than federal agencies on consumer protection enforcement.

If you operate in the EU and U.S. simultaneously, the divergence between U.S. and EU AI regulation continues to widen. The EU AI Act is now in force and highly prescriptive. The U.S. framework moves in the opposite direction. Global AI governance harmonization—a stated goal of multiple international bodies—is becoming less likely, not more.

What to Watch

The legislative calendar matters here. The GUARDRAILS Act has limited probability of passing in the current Congress, but it signals the floor of Democratic opposition to preemption—meaning any bipartisan AI legislation will need to address the preemption question explicitly to attract Democratic votes.

Watch for committee hearings in May and June on the White House framework. Any preemption legislation will have to define what "undue burden" means, a term vague enough to mean everything or nothing; those definitions will be the real policy battle.


By Hector Herrera | April 25, 2026

Key Takeaways

  • The White House framework asks Congress to preempt state AI laws that impose "undue burdens" on developers and to govern AI through existing federal agencies rather than a new regulator.
  • Democrats' GUARDRAILS Act would repeal the framework, shield state AI laws from federal preemption, and establish minimum federal accountability standards.
  • Because Congress has not yet passed comprehensive federal AI legislation, preemption now would strip state protections without a federal replacement: what critics call "preemption without protection."


Written by Hector Herrera

Hector Herrera is the founder of Hex AI Systems, where he builds AI-powered operations for mid-market businesses across 16 industries. He writes daily about how AI is reshaping business, government, and everyday life. 20+ years in technology. Houston, TX.

