California Just Finalized Its AI Regulations. Here's What Your Business Actually Needs to Do.
Tags: California, CCPA, ADMT, risk assessment, compliance deadline, CalPrivacy


AI Compliance Documents Team · 11 min read

Two-Sentence Summary

California finalized new rules that require businesses using AI or automated tools to make decisions about people — like who gets hired, what ads they see, or whether they get approved for something — to explain what they're doing, let people opt out, and document the risks. The rules are already in effect for risk assessments, with AI-specific notice and opt-out requirements kicking in on January 1, 2027, and the agency enforcing them has already issued millions of dollars in fines.

On September 22, 2025, the California Office of Administrative Law approved a package of regulations that had been in development for nearly three years. One week later, the California Privacy Protection Agency — now known as CalPrivacy — put out a press release that most businesses completely ignored.

That was a mistake.

These regulations cover cybersecurity audits, risk assessments, and something called Automated Decisionmaking Technology, or ADMT. If your business uses AI to make decisions about people — who gets hired, who sees which ads, who gets approved for what — you're probably covered. And the compliance clock is already ticking.

Here's what's actually happening, sourced entirely from the agency's own published documents. No speculation, no paraphrasing of paraphrases, no "we think this means..."

What Changed and When

The California Privacy Protection Agency Board adopted these regulations on July 24, 2025. The Office of Administrative Law approved them on September 22, 2025. They went into effect on January 1, 2026.

But here's the part that matters: the deadlines are staggered depending on what kind of obligation you're dealing with.

Risk assessments — compliance began January 1, 2026. This is live right now. If your business processes personal information in ways that present significant risk to consumers' privacy, you need to be conducting and documenting risk assessments today. Summaries and attestations must be submitted to CalPrivacy by April 1, 2028.

ADMT requirements — compliance begins January 1, 2027. You have about nine months. If your business uses automated decisionmaking technology to make significant decisions about consumers, you will need to provide pre-use notices and offer opt-out rights.

Cybersecurity audits — the deadlines depend on your company's revenue. Businesses over $100 million must submit certifications by April 1, 2028. Between $50 million and $100 million, you have until April 1, 2029. Under $50 million, April 1, 2030.
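The revenue-tiered schedule above is easy to get wrong in a spreadsheet. Here is a minimal sketch of the lookup as code; the function name is ours, and the treatment of revenue exactly at a tier boundary is an assumption (the regulation's precise boundary language should be checked against the text itself):

```python
from datetime import date

def cyber_audit_certification_deadline(annual_gross_revenue: float) -> date:
    """Return the cybersecurity audit certification deadline for a revenue tier.

    Tiers follow the staggered schedule described above:
      over $100M            -> April 1, 2028
      $50M up to $100M      -> April 1, 2029
      under $50M            -> April 1, 2030
    Boundary handling (exactly $50M or $100M) is our assumption, not the rule's text.
    """
    if annual_gross_revenue > 100_000_000:
        return date(2028, 4, 1)
    if annual_gross_revenue >= 50_000_000:
        return date(2029, 4, 1)
    return date(2030, 4, 1)
```

For example, a business with $150 million in annual gross revenue lands in the first tier and owes its certification by April 1, 2028.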

Who Is Covered

This applies to you if your business meets the California Consumer Privacy Act thresholds. In practical terms, your business is covered if it does business in California AND meets any one of these: annual gross revenue over $25 million, buys/sells/shares the personal information of 100,000 or more consumers or households per year, or derives 50% or more of annual revenue from selling or sharing consumers' personal information.

That revenue threshold is gross revenue, not California-specific revenue. If your business makes $25 million nationally and has even one California customer, you're subject to the CCPA.
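The coverage test reduces to one AND plus three ORs. A minimal sketch, assuming the thresholds quoted above (parameter names are ours; the revenue-share input is a fraction between 0 and 1):

```python
def is_ccpa_covered(does_business_in_california: bool,
                    annual_gross_revenue: float,
                    consumers_bought_sold_shared: int,
                    revenue_share_from_selling_sharing: float) -> bool:
    """CCPA coverage: does business in California AND meets any one threshold.

    Thresholds per the article: gross revenue over $25M, OR personal
    information of 100,000+ consumers/households per year, OR 50%+ of
    annual revenue from selling or sharing personal information.
    """
    if not does_business_in_california:
        return False
    return (annual_gross_revenue > 25_000_000
            or consumers_bought_sold_shared >= 100_000
            or revenue_share_from_selling_sharing >= 0.5)
```

Note that the revenue check uses gross revenue, per the point above: a $30 million national business with a single California customer passes the test.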

What Documents You Actually Need

This is where it gets real. Here's what the regulations require you to create and maintain, broken into the three categories.

For Risk Assessments (deadline: now)

You need to conduct and document a risk assessment for each processing activity that presents significant risk to consumers' privacy. The regulations specify that this includes, at minimum, processing personal information for the purpose of profiling consumers, selling or sharing personal information, and using automated decisionmaking technology for significant decisions.

Each risk assessment must weigh the benefits of the processing against the risks to consumers' privacy. You need to document the assessment itself, the outcome, and your rationale. By April 1, 2028, you must submit an attestation to CalPrivacy confirming that required assessments were completed, along with a summary of your risk assessment information.

That means right now you should have: a risk assessment framework document that explains how you evaluate processing activities, individual risk assessments for each qualifying processing activity, and a system for tracking and updating these assessments over time.

For ADMT (deadline: January 1, 2027)

When your business uses ADMT to make a significant decision concerning a consumer, you must provide a pre-use notice. This notice must be provided before or at the point of using the ADMT, and it needs to explain what the technology is, how it's being used, and what kind of decision it's involved in.

Consumers must also have the ability to opt out of ADMT in certain circumstances. This means you need a documented opt-out process and a mechanism for consumers to exercise that right.

The documents you'll need here: a pre-use ADMT notice template, an opt-out mechanism and process documentation, internal policies for how your organization handles opt-out requests, and records of when and how notices were provided.
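The last item on that list, records of when and how notices were provided, is the one most teams forget to design for. A minimal record structure, sketched as a Python dataclass; every field name here is ours, not the regulation's, and your actual schema should follow the rule's text:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AdmtNoticeRecord:
    """One row per consumer per ADMT pre-use notice delivered (illustrative schema)."""
    consumer_id: str
    admt_system: str            # which automated tool, e.g. a resume screener
    decision_type: str          # the significant decision the tool informs
    notice_delivered_at: datetime
    delivery_channel: str       # e.g. "web banner", "email"
    opted_out: bool = False     # updated if the consumer exercises the opt-out
```

Keeping this as structured data rather than prose makes it straightforward to answer "show me every consumer who was noticed before the tool ran" if a regulator asks.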

For Cybersecurity Audits (deadline: varies by revenue)

You'll need to conduct annual cybersecurity audits and submit a certification to CalPrivacy based on the revenue-tiered timeline described above. The audit must be thorough and the certification must be signed by a qualified individual.

CalPrivacy Is Not Playing Around

If you're wondering whether California actually enforces this stuff, consider what CalPrivacy has done in just the last six months.

They fined Tractor Supply Company $1.35 million for CCPA violations. They fined clothing retailer Todd Snyder $345,178 and required the company to overhaul its privacy practices. They fined American Honda Motor Co. $632,500. They forced a data broker called Background Alert to either shut down or pay a steep fine. And in January 2026, they issued two more decisions — fining Datamasters $45,000 and S&P Global $62,600 — and ordered Datamasters to stop selling all Californians' personal information entirely.

This is an agency that fined a Fortune 500 company, brought enforcement actions against more than ten data brokers in a single year, launched a bipartisan consortium with other state regulators, and now has finalized rules that give them explicit authority over how businesses use AI to make decisions about people.

They're also not the only enforcers. The California Attorney General retains independent enforcement authority under the CCPA.

What This Means for You Right Now

If you're reading this in March 2026, here's where you stand:

Risk assessments are already required. If you're processing personal data for profiling, targeted advertising, or AI-driven decisions and you haven't started documenting your risk assessments, you're behind. The submission deadline isn't until April 2028, but the compliance obligation started January 1, 2026. If CalPrivacy comes knocking tomorrow, you need to show that you've been conducting assessments since that date.

ADMT requirements start in nine months. January 1, 2027 sounds far away until you realize you need to identify every automated decisionmaking process in your business, draft and implement pre-use notices, build opt-out mechanisms, train your team, and document everything. Nine months is not a lot of time for all of that.

The CCPA penalty baseline increased in 2025. CalPrivacy announced increases to CCPA fines and penalties effective 2025. The statute allows for penalties of $2,500 per unintentional violation and $7,500 per intentional violation. When you process data for thousands of consumers, those numbers multiply fast.
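The multiplication is worth doing explicitly. A minimal sketch using the statutory baseline amounts quoted above (not the 2025-adjusted figures, and not legal advice on how violations are counted; the function name is ours):

```python
def max_statutory_exposure(violations: int, intentional: bool) -> int:
    """Upper-bound exposure at the statutory baseline: $2,500 per
    unintentional violation, $7,500 per intentional violation."""
    per_violation = 7_500 if intentional else 2_500
    return violations * per_violation
```

At the baseline rate, 10,000 unintentional violations already reaches $25 million; the same count treated as intentional reaches $75 million.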

What You Should Do This Week

First, figure out if you're covered. Check your revenue against the $25 million threshold. Count how many California consumers' data you process. If you're above the line, this applies to you.

Second, inventory your AI and automated decision-making. Every tool, every algorithm, every system that processes personal information to make or inform a decision about a consumer needs to be on a list. Our data mapping and AI system inventory template is designed specifically for this step — it gives you a structured format for documenting each system and the personal data it touches.

Third, start your risk assessments. For each processing activity involving profiling, targeted advertising, selling/sharing data, or ADMT, document the purpose, the data involved, the benefits, the risks, and your mitigation measures. Our California CCPA ADMT compliance package includes a risk assessment framework built around the regulation's specific requirements.
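The fields that step asks you to document can be held in one structured record per processing activity. A minimal sketch as a Python dataclass; the field names are ours and are not the regulation's defined terms, so treat this as a starting skeleton, not a compliant template:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RiskAssessmentRecord:
    """One record per qualifying processing activity (illustrative schema)."""
    processing_activity: str            # e.g. "behavioral ad targeting"
    purpose: str
    personal_data_categories: list = field(default_factory=list)
    benefits: list = field(default_factory=list)
    risks: list = field(default_factory=list)
    mitigations: list = field(default_factory=list)
    outcome: str = ""                   # e.g. "proceed", "proceed with changes", "stop"
    rationale: str = ""
    assessed_on: date = date.today()
    next_review: date = date.today()    # supports the "tracking and updating" obligation
```

A collection of these records, kept current, is also the raw material for the summary and attestation due to CalPrivacy by April 1, 2028.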

Fourth, start planning your ADMT notices and opt-out mechanisms. You have until January 2027, but the notice requirements are detailed and getting them right takes time. Our consumer rights kit covers opt-out processes, notice templates, and the request-handling procedures California's ADMT rules require.

None of this requires specialized legal training. It requires reading the regulation, understanding what it asks for, and writing it down. That's manageable — and now you know where to start.

What Is a Risk Assessment?
Imagine you're building a treehouse. Before you start hammering, a smart builder would look at the tree and ask: Is this branch strong enough? Could the treehouse fall and hurt someone? What's the worst thing that could happen? That's basically what a risk assessment is — it's the process of looking at something you want to do, figuring out what could go wrong, and deciding whether the benefits are worth the risks.

In the world of business and privacy, a risk assessment works the same way. When a company wants to use people's personal information — especially to make decisions about them using AI — California's new rules say the company has to stop and think carefully first. They have to write down what data they're collecting, what they're using it for, what could go wrong for the people whose data it is, and what they're doing to prevent those bad outcomes. It's like a homework assignment that forces companies to prove they've thought things through before they start.

What makes California's version especially important is that it's not just a one-time exercise. Companies have to do these assessments for every processing activity that could put people's privacy at risk — things like using AI to profile consumers, selling people's data, or using automated tools to make big decisions about someone's life. And they have to keep updating them. By April 2028, they even have to send a summary and a signed statement to California's privacy agency confirming they actually did the work.

Think of it as the difference between a restaurant that says "trust us, the food is safe" versus one that has documented health inspections you can actually check. California wants companies to show their work — not just promise they're being careful with your data.

Disclaimer: This article is for informational purposes only and does not constitute legal advice, legal representation, or an attorney-client relationship. Laws and regulations change frequently. You should consult a licensed attorney to verify that the information in this article is current, complete, and applicable to your specific situation before relying on it. AI Compliance Documents is not a law firm and does not practice law.

