
California Just Finalized Its AI Regulations. Here's What Your Business Actually Needs to Do.
Two-Sentence Summary
California finalized new rules that require businesses using AI or automated tools to make decisions about people — like who gets hired, what ads they see, or whether they get approved for something — to explain what they're doing, let people opt out, and document the risks. The rules are already in effect for risk assessments, with AI-specific notice and opt-out requirements kicking in on January 1, 2027, and the agency enforcing them has already issued millions of dollars in fines.
On September 22, 2025, the California Office of Administrative Law approved a package of regulations that had been in development for nearly three years. One week later, the California Privacy Protection Agency — now known as CalPrivacy — put out a press release that most businesses completely ignored.
That was a mistake.
These regulations cover cybersecurity audits, risk assessments, and something called Automated Decisionmaking Technology, or ADMT. If your business uses AI to make decisions about people — who gets hired, who sees which ads, who gets approved for what — you're probably covered. And the compliance clock is already ticking.
Here's what's actually happening, sourced entirely from the agency's own published documents. No speculation, no paraphrasing of paraphrases, no "we think this means..."
What changed in California's AI and automated decision-making rules, and when do they take effect?
The California Privacy Protection Agency finalized ADMT, risk assessment, and cybersecurity audit regulations on September 22, 2025, effective January 1, 2026. Risk assessment compliance began immediately on that date. ADMT notice and opt-out requirements begin January 1, 2027. Cybersecurity audit certifications are due to CalPrivacy between April 2028 and April 2030 depending on company revenue.
The California Privacy Protection Agency Board adopted these regulations on July 24, 2025. The Office of Administrative Law approved them on September 22, 2025. They went into effect on January 1, 2026.
But here's the part that matters: the deadlines are staggered depending on what kind of obligation you're dealing with.
Risk assessments — compliance began January 1, 2026. This is live right now. If your business processes personal information in ways that present significant risk to consumers' privacy, you need to be conducting and documenting risk assessments today. Summaries and attestations must be submitted to CalPrivacy by April 1, 2028.
ADMT requirements — compliance begins January 1, 2027. You have about nine months. If your business uses automated decisionmaking technology to make significant decisions about consumers, you will need to provide pre-use notices and offer opt-out rights.
Cybersecurity audits — the deadlines depend on your company's revenue. Businesses over $100 million must submit certifications by April 1, 2028. Between $50 million and $100 million, you have until April 1, 2029. Under $50 million, April 1, 2030.
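The revenue tiers above reduce to a simple lookup. Here's a minimal sketch; the function name is ours, and how the regulation treats revenue exactly at a tier boundary ($50 million or $100 million) is an assumption you should verify against the regulatory text:

```python
from datetime import date

def audit_certification_deadline(annual_gross_revenue: float) -> date:
    """Return the cybersecurity audit certification deadline for a
    business, based on the revenue tiers described above.
    Boundary treatment (exactly $50M or $100M) is an assumption here."""
    if annual_gross_revenue > 100_000_000:
        return date(2028, 4, 1)
    if annual_gross_revenue > 50_000_000:
        return date(2029, 4, 1)
    return date(2030, 4, 1)

print(audit_certification_deadline(120_000_000))  # 2028-04-01
```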
Who is covered by California's CCPA ADMT and risk assessment regulations?
These regulations apply to any business that does business in California and meets at least one CCPA threshold: annual gross revenues over $25 million, buying/selling/sharing personal information of 100,000 or more consumers or households per year, or deriving 50% or more of annual revenues from selling or sharing consumer personal information. The $25 million threshold is gross revenue nationally — not California-specific — so many businesses outside California are covered.
In practical terms, your business is covered if it does business in California AND meets any one of these California Consumer Privacy Act thresholds: annual gross revenue over $25 million, buying/selling/sharing the personal information of 100,000 or more consumers or households per year, or deriving 50% or more of annual revenue from selling or sharing consumers' personal information.
That revenue threshold is gross revenue, not California-specific revenue. If your business makes more than $25 million nationally and does any business in California, you're likely subject to the CCPA, even with only a handful of California customers.
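The coverage test is one AND gate plus three ORs. Here's an illustrative check; the function and parameter names are ours, not the statute's, and real coverage analysis belongs with counsel:

```python
def ccpa_covered(does_business_in_ca: bool,
                 annual_gross_revenue: float,
                 consumers_bought_sold_shared: int,
                 revenue_share_from_selling_sharing: float) -> bool:
    """True if the business does business in California AND meets
    at least one CCPA threshold. Names are illustrative only."""
    if not does_business_in_ca:
        return False
    return (annual_gross_revenue > 25_000_000
            or consumers_bought_sold_shared >= 100_000
            or revenue_share_from_selling_sharing >= 0.5)

# A $30M business with a small California footprint is covered
# on revenue alone.
print(ccpa_covered(True, 30_000_000, 500, 0.0))  # True
```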
What documents does California's CCPA ADMT regulation actually require businesses to create and maintain?
California's finalized regulations require three distinct document sets: (1) risk assessments for each processing activity presenting significant privacy risk, with an attestation and summary due to CalPrivacy by April 1, 2028; (2) ADMT pre-use notice templates and opt-out process documentation, required before January 1, 2027; and (3) annual cybersecurity audit certifications, due on a revenue-tiered schedule between 2028 and 2030.
This is where it gets real. Here's what the regulations require you to create and maintain, broken into the three categories.
For Risk Assessments (deadline: now)
You need to conduct and document a risk assessment for each processing activity that presents significant risk to consumers' privacy. The regulations specify that this includes, at minimum, processing personal information for the purpose of profiling consumers, selling or sharing personal information, and using automated decisionmaking technology for significant decisions.
Each risk assessment must weigh the benefits of the processing against the risks to consumers' privacy. You need to document the assessment itself, the outcome, and your rationale. By April 1, 2028, you must submit an attestation to CalPrivacy confirming that required assessments were completed, along with a summary of your risk assessment information.
That means right now you should have: a risk assessment framework document that explains how you evaluate processing activities, individual risk assessments for each qualifying processing activity, and a system for tracking and updating these assessments over time.
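As a sketch of what one tracked assessment record might look like, here is a hypothetical schema. The field names are ours, mapped from the elements the regulation asks you to document; they are not an official format:

```python
from dataclasses import dataclass

@dataclass
class RiskAssessmentRecord:
    """One risk assessment entry: the documented elements described
    above, plus a review date so updates can be tracked over time."""
    processing_activity: str      # e.g. "profiling consumers for ads"
    purpose: str
    data_categories: list[str]
    benefits: list[str]
    risks: list[str]
    mitigations: list[str]
    outcome: str                  # e.g. "proceed with safeguards"
    rationale: str
    last_reviewed: str            # ISO date of most recent review

# Hypothetical example entry
example = RiskAssessmentRecord(
    processing_activity="ADMT screening of job applications",
    purpose="rank applicants for recruiter review",
    data_categories=["resume text", "employment history"],
    benefits=["faster screening"],
    risks=["inaccurate or biased ranking"],
    mitigations=["human review of rejections", "annual bias testing"],
    outcome="proceed with safeguards",
    rationale="benefits outweigh residual risk after mitigations",
    last_reviewed="2026-03-01",
)
```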
For ADMT (deadline: January 1, 2027)
When your business uses ADMT to make a significant decision concerning a consumer, you must provide a pre-use notice. This notice must be provided before or at the point of using the ADMT, and it needs to explain what the technology is, how it's being used, and what kind of decision it's involved in.
Consumers must also have the ability to opt out of ADMT in certain circumstances. This means you need a documented opt-out process and a mechanism for consumers to exercise that right.
The documents you'll need here: a pre-use ADMT notice template, an opt-out mechanism and process documentation, internal policies for how your organization handles opt-out requests, and records of when and how notices were provided.
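To make the notice template concrete, here is a hypothetical pre-use notice covering the three elements described above: what the technology is, how it's used, and what decision it's involved in. The wording and placeholder names are illustrative, not regulatory language:

```python
# Hypothetical pre-use notice template; all fields are examples.
PRE_USE_NOTICE = """\
We use an automated decisionmaking technology: {technology}.
We use it to {how_used}, and it plays a role in {decision_type}
decisions about you. To opt out, {opt_out_instructions}.
"""

notice = PRE_USE_NOTICE.format(
    technology="a resume-screening model",
    how_used="rank incoming job applications",
    decision_type="hiring",
    opt_out_instructions="email privacy@example.com",
)
print(notice)
```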
For Cybersecurity Audits (deadline: varies by revenue)
You'll need to conduct annual cybersecurity audits and submit a certification to CalPrivacy based on the revenue-tiered timeline described above. The audit must be thorough and the certification must be signed by a qualified individual.
Is CalPrivacy actively enforcing the CCPA, and what penalties has it already imposed?
CalPrivacy has issued over $2 million in fines in the past six months alone: $1.35 million against Tractor Supply Company (its largest fine ever), $632,500 against American Honda, $345,178 against Todd Snyder, and two more enforcement actions in January 2026 including an order requiring Datamasters to stop selling all Californians' personal information. The California Attorney General retains independent enforcement authority as well.
If you're wondering whether California actually enforces this stuff, consider what CalPrivacy has done in just the last six months.
They fined Tractor Supply Company $1.35 million for CCPA violations. They fined clothing retailer Todd Snyder $345,178 and required the company to overhaul its privacy practices. They fined American Honda Motor Co. $632,500. They forced a data broker called Background Alert to either shut down or pay a steep fine. And in January 2026, they issued two more decisions — fining Datamasters $45,000 and S&P Global $62,600 — and ordered Datamasters to stop selling all Californians' personal information entirely.
This is an agency that fined a Fortune 500 company, brought enforcement actions against more than ten data brokers in a single year, launched a bipartisan consortium with other state regulators, and now has finalized rules that give it explicit authority over how businesses use AI to make decisions about people.
They're also not the only enforcers. The California Attorney General retains independent enforcement authority under the CCPA.
What do California's CCPA ADMT regulations mean for businesses operating right now in 2026?
Risk assessments are already required as of January 1, 2026 — if your business processes personal data for profiling, targeted advertising, or AI-driven decisions and has not started documenting assessments, it is currently out of compliance. ADMT pre-use notices and opt-out mechanisms must be operational by January 1, 2027. CCPA penalties of $2,500 per unintentional violation and $7,500 per intentional violation compound rapidly across thousands of affected consumers.
If you're reading this in March 2026, here's where you stand:
Risk assessments are already required. If you're processing personal data for profiling, targeted advertising, or AI-driven decisions and you haven't started documenting your risk assessments, you're behind. The submission deadline isn't until April 2028, but the compliance obligation started January 1, 2026. If CalPrivacy comes knocking tomorrow, you need to show that you've been conducting assessments since that date.
ADMT requirements start in nine months. January 1, 2027 sounds far away until you realize you need to identify every automated decisionmaking process in your business, draft and implement pre-use notices, build opt-out mechanisms, train your team, and document everything. Nine months is not a lot of time for all of that.
The CCPA penalty baseline increased in 2025. CalPrivacy announced increases to CCPA fines and penalties effective 2025. The statute allows for penalties of $2,500 per unintentional violation and $7,500 per intentional violation. When you process data for thousands of consumers, those numbers multiply fast.
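The compounding is worth making explicit. Using the statutory per-violation amounts and a hypothetical consumer count (assuming, for illustration only, one violation per affected consumer):

```python
UNINTENTIONAL_PENALTY = 2_500   # per unintentional violation (CCPA)
INTENTIONAL_PENALTY = 7_500     # per intentional violation (CCPA)

affected_consumers = 10_000     # hypothetical count

low_end = affected_consumers * UNINTENTIONAL_PENALTY
high_end = affected_consumers * INTENTIONAL_PENALTY
print(f"${low_end:,} to ${high_end:,}")  # $25,000,000 to $75,000,000
```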
What should a business do this week to start complying with California's CCPA ADMT regulations?
Start by confirming coverage — check your gross revenue against the $25 million CCPA threshold and count how many California consumers' data you process. Then inventory every AI and automated decision-making tool that processes personal information. For each qualifying processing activity, begin documenting a risk assessment covering purpose, data involved, benefits, risks, and mitigation measures. Risk assessments are required now; ADMT notices must be ready by January 1, 2027.
First, figure out if you're covered. Check your revenue against the $25 million threshold. Count how many California consumers' data you process. If you're above the line, this applies to you.
Second, inventory your AI and automated decision-making. Every tool, every algorithm, every system that processes personal information to make or inform a decision about a consumer needs to be on a list. Our data mapping and AI system inventory template is designed specifically for this step — it gives you a structured format for documenting each system and the personal data it touches.
Third, start your risk assessments. For each processing activity involving profiling, targeted advertising, selling/sharing data, or ADMT, document the purpose, the data involved, the benefits, the risks, and your mitigation measures. Our California CCPA ADMT compliance package includes a risk assessment framework built around the regulation's specific requirements.
Fourth, start planning your ADMT notices and opt-out mechanisms. You have until January 2027, but the notice requirements are detailed and getting them right takes time. Our consumer rights kit covers opt-out processes, notice templates, and the request-handling procedures California's ADMT rules require.
None of this requires specialized legal training. It requires reading the regulation, understanding what it asks for, and writing it down. That's manageable — and now you know where to start.
What Is a Risk Assessment?
Imagine you're building a treehouse. Before you start hammering, a smart builder would look at the tree and ask: Is this branch strong enough? Could the treehouse fall and hurt someone? What's the worst thing that could happen? That's basically what a risk assessment is — it's the process of looking at something you want to do, figuring out what could go wrong, and deciding whether the benefits are worth the risks.
In the world of business and privacy, a risk assessment works the same way. When a company wants to use people's personal information — especially to make decisions about them using AI — California's new rules say the company has to stop and think carefully first. They have to write down what data they're collecting, what they're using it for, what could go wrong for the people whose data it is, and what they're doing to prevent those bad outcomes. It's like a homework assignment that forces companies to prove they've thought things through before they start.
What makes California's version especially important is that it's not just a one-time exercise. Companies have to do these assessments for every processing activity that could put people's privacy at risk — things like using AI to profile consumers, selling people's data, or using automated tools to make big decisions about someone's life. And they have to keep updating them. By April 2028, they even have to send a summary and a signed statement to California's privacy agency confirming they actually did the work.
Think of it as the difference between a restaurant that says 'trust us, the food is safe' versus one that has documented health inspections you can actually check. California wants companies to show their work — not just promise they're being careful with your data.
Sources
- CalPrivacy Press Release: California Finalizes Regulations to Strengthen Consumers' Privacy (September 23, 2025)
- CCPA Updates, Cybersecurity Audits, Risk Assessments, ADMT, and Insurance Regulations
- CalPrivacy Enforcement: Tractor Supply Company $1.35M Fine (September 30, 2025)
- CalPrivacy Enforcement: Datamasters and S&P Global Data Broker Actions (January 8, 2026)
- CalPrivacy Announcements (Penalty Increases)
Disclaimer: This article is for informational purposes only and does not constitute legal advice, legal representation, or an attorney-client relationship. Laws and regulations change frequently. You should consult a licensed attorney to verify that the information in this article is current, complete, and applicable to your specific situation before relying on it. AI Compliance Documents is not a law firm and does not practice law.