Illinois HB3773 Is Live. If You Use AI in Hiring, Here's What the Law Actually Says.
Tags: Illinois, HB3773, AI employment law, IDHR, compliance deadline


AI Compliance Documents Team · 13 min read

Two-Sentence Summary

Illinois passed a law that says if your business uses any kind of AI tool to make decisions about hiring, firing, or promoting people in Illinois, you can't let that tool discriminate — and you have to tell your employees you're using it. The penalties for breaking this law can be up to $70,000 per violation, and the rules apply right now even though the state agency hasn't finished writing all the details yet.

Illinois House Bill 3773 took effect on January 1, 2026. If you manage hiring, HR, or people operations at a company that operates in Illinois, this law applies to you. It doesn't matter where your company is headquartered. If you have employees in Illinois and you use any form of AI in employment decisions, you have obligations under this statute right now.

Here's the thing: most of the articles you'll find about this law are written by law firms billing $500 an hour or by software companies trying to sell you a platform. Both have reasons to make it sound more complicated than it is. The statute itself is actually pretty short. The problem is that it's short in a way that leaves some important questions unanswered — and the agency responsible for filling in those gaps hasn't finished doing so yet.

So let's walk through exactly what the law says, what it doesn't say, and what you need to do about it today.

Where the Law Lives

HB3773 isn't a standalone act. It amends the Illinois Human Rights Act by adding subdivision (L) to Section 2-102. The full citation is 775 ILCS 5/2-102(L). You can read the complete, current text of Section 2-102 on the Illinois General Assembly website.

The bill was signed into law by Governor Pritzker as Public Act 103-0804, with an effective date of January 1, 2026.

There's also been a subsequent amendment — Public Act 104-0417, effective August 15, 2025 — that touched Section 2-102 as well. The current codified version reflects all amendments through the present.

What the Law Actually Says

Subdivision (L) does two things. That's it. Two things.

First, it makes it a civil rights violation for an employer to use artificial intelligence in employment decisions in a way that has a discriminatory effect on protected classes. The exact scope covers recruitment, hiring, promotion, renewal of employment, selection for training or apprenticeship, discharge, discipline, tenure, and the terms, privileges, or conditions of employment. It also specifically prohibits using zip codes as a proxy for protected classes.

Second, it makes it a civil rights violation for an employer to fail to provide notice to employees that the employer is using artificial intelligence for those purposes.

That's the entire operative text. Two paragraphs. One says don't use AI in a way that discriminates. The other says tell your employees when you're using AI for employment decisions.

What the Law Defines

The law defines "artificial intelligence" broadly. It means a machine-based system that, for explicit or implicit objectives, infers from its input how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments. The definition explicitly includes generative AI.

This is not limited to fancy machine learning models. If you use any automated system that takes inputs and generates outputs that influence employment decisions — including AI-assisted resume screening, chatbot-based candidate assessments, performance analytics tools, or even AI-powered scheduling systems that affect working conditions — you are likely within scope.

What the Law Does Not Say (Yet)

Here's where it gets tricky, and where most other guides either gloss over the details or make things up.

The statute says employers must provide notice. But it delegates the specifics to the Illinois Department of Human Rights. The statute text reads: "The Department shall adopt any rules necessary for the implementation and enforcement of this subdivision, including, but not limited to, rules on the circumstances and conditions that require notice, the time period for providing notice, and the means for providing notice."

As of today — March 2026 — IDHR has not published those implementing rules. The Department's own legislative update page for this law states: "IDHR is currently developing rules to implement this law."

This means several critical questions don't have official answers yet: What exact format must the notice take? How far in advance must notice be given? Does notice need to go to applicants, current employees, or both? Does the notice need to describe the specific AI systems being used, or just acknowledge that AI is being used generally?

The statute says notice is required. The statute delegates the details to IDHR. IDHR hasn't published the details. That's where things stand.

Why "No Rules Yet" Doesn't Mean "No Obligation Yet"

Some employers are reading this situation and thinking they can wait until IDHR publishes its rules before doing anything. That's a risky bet, for two reasons.

First, the statute itself is already in effect. The notice requirement exists in law right now. IDHR's rules will clarify the details, but the underlying obligation is live as of January 1, 2026. If an employee files a complaint with IDHR alleging that their employer used AI in an employment decision without providing notice, IDHR can investigate that complaint today.

Second, the non-discrimination provision requires no implementing rules. Using AI in a way that has a discriminatory effect is already a civil rights violation. That part of the law is fully operative. If your AI-powered hiring tool disproportionately screens out candidates based on race, sex, disability, or any other protected class under the Illinois Human Rights Act, you are in violation right now, and you don't need IDHR to publish rules for that to be enforceable.

What Happens If You Violate This Law

HB3773 didn't create a new penalty structure. It plugged AI-related violations into the existing enforcement framework of the Illinois Human Rights Act, which has been around for decades. The penalties are laid out in Section 8A-104 of the Act (775 ILCS 5/8A-104).

Penalties per violation are: up to $16,000 if this is the employer's first civil rights violation, up to $42,500 if the employer has been found to have committed one other violation within the past five years, and up to $70,000 if the employer has two or more prior violations within the past seven years.

These are per-violation amounts. A separate penalty can be imposed for each specific act constituting a violation and for each aggrieved party. If you use a discriminatory AI tool in your hiring process and it affects 200 applicants, the math gets unpleasant fast.
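To make that "unpleasant math" concrete, here is a rough upper-bound calculation using the penalty tiers above. This is purely illustrative: it assumes each aggrieved party counts as a separate violation at the statutory maximum, while actual awards are set by the Illinois Human Rights Commission and may be far lower.

```python
# Illustrative only: rough worst-case exposure under the penalty tiers
# of 775 ILCS 5/8A-104 as described above. Not a prediction of any
# actual award, which the Commission determines case by case.

def max_penalty_exposure(affected_parties: int, prior_violations: int) -> int:
    """Upper-bound civil penalty if each affected party counts as a violation."""
    if prior_violations == 0:
        per_violation = 16_000   # first violation
    elif prior_violations == 1:
        per_violation = 42_500   # one prior within 5 years
    else:
        per_violation = 70_000   # two or more priors within 7 years
    return affected_parties * per_violation

# A first-time violation affecting 200 applicants:
print(max_penalty_exposure(200, 0))  # 3200000
```

Even at the first-violation tier, 200 affected applicants puts the theoretical ceiling at $3.2 million before actual damages, back pay, and attorney fees enter the picture.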

Beyond fines, the statute allows the Illinois Human Rights Commission to issue cease-and-desist orders, award actual damages to injured parties, order hiring or reinstatement of affected employees with back pay, award attorney fees and expert witness fees, and grant any other relief necessary to make the complainant whole.

Enforcement is handled by the Illinois Department of Human Rights through its existing charge-filing process. An individual employee or applicant can file a charge alleging a violation, or IDHR can initiate an investigation on its own.

What Documents You Should Have Right Now

Even without final IDHR rules, the statute gives you enough to work with. Here's what a reasonable compliance posture looks like today.

An AI notice for employees and applicants. The law says you must provide notice that you're using AI for employment purposes. While IDHR's forthcoming rules will specify the exact form and timing, having a written notice ready — one that identifies the use of AI, explains what employment decisions it's involved in, and provides contact information for questions — puts you ahead of the curve. When the rules come out, you adjust the notice to match. If you have no notice at all, you're non-compliant on day one. Our Illinois HB3773 compliance package includes ready-to-use notice templates built around what the statute currently requires.

An internal AI system inventory. You cannot provide notice about AI systems you don't know about. Many employers are using AI-powered tools through third-party vendors without fully understanding what's happening under the hood. Your resume screening platform, your performance review software, your scheduling tool, your background check service — any of these might use AI. You need a documented inventory of every system that touches employment decisions.
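What should that inventory capture? The statute doesn't prescribe a format, so the sketch below is a hypothetical minimal schema. The field names and example vendors are my own suggestions, not anything HB3773 or IDHR requires.

```python
# A minimal, hypothetical schema for an AI system inventory.
# Field names and example entries are illustrative; the statute
# does not prescribe any particular format.
from dataclasses import dataclass, asdict
import json

@dataclass
class AISystemRecord:
    name: str                 # product name as known internally
    vendor: str               # third-party vendor, or "internal"
    employment_function: str  # recruitment, hiring, promotion, discharge, ...
    uses_ai: bool             # per the statute's broad AI definition
    decision_role: str        # "recommends" vs. "decides"
    last_reviewed: str        # date of last disparate-impact review

inventory = [
    AISystemRecord("ResumeRank Pro", "Acme HR Tech", "recruitment",
                   True, "recommends", "2026-02-15"),
    AISystemRecord("ShiftWise", "Example Scheduling Inc.", "scheduling",
                   True, "decides", "2026-01-30"),
]

# Export for your compliance file:
print(json.dumps([asdict(r) for r in inventory], indent=2))
```

The `decision_role` field matters because a documented human-in-the-loop posture (covered under "human oversight protocol" below) is easier to demonstrate when you've recorded, per system, whether it recommends actions or makes them outright.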

An impact assessment framework. The law prohibits AI use that has a discriminatory effect. To know whether your systems are creating disparate impact, you need a process for evaluating them. This doesn't need to be a full-blown bias audit (that's more of a NYC Local Law 144 requirement), but you should have a documented process for periodically reviewing your AI tools for discriminatory outcomes. Our AI bias audit template walks through exactly what that process looks like.

A human oversight protocol. If your AI tools flag or recommend employment actions, who reviews them? How are edge cases handled? Document this. It demonstrates that AI is being used as a tool in your decision-making process, not as the sole decision-maker.

An accommodation request process. If an employee or applicant has concerns about AI being used in decisions that affect them — particularly based on a disability or other protected characteristic — how do they raise that concern? Having a documented process shows good faith.

The Illinois-Specific Context That Matters

Illinois didn't pass this law in a vacuum. The state has one of the most active civil rights enforcement frameworks in the country. The Illinois Human Rights Act covers more protected classes than federal law, and IDHR processes thousands of charges per year.

Illinois was also one of the first states to regulate AI in any employment context — the Artificial Intelligence Video Interview Act (820 ILCS 42), effective January 1, 2020, already regulated the use of AI in video interview analysis. HB3773 goes much further by covering all employment decisions, not just video interviews.

If you're an employer with operations in Illinois, compliance isn't optional. The enforcement infrastructure is already in place. The penalties are substantial. And the fact that implementing rules haven't been published yet is not a defense against a complaint filed today.

What To Do This Week

First, audit your AI tools. Make a list of every software system, vendor platform, or automated tool that touches recruitment, hiring, performance evaluation, scheduling, discipline, or termination of Illinois employees.

Second, draft your employee AI notice. Keep it straightforward: we use AI for these purposes, here's what it does, here's who to contact. When IDHR's rules come out, update it.

Third, review your AI tools for discriminatory outcomes. If you're using an AI-powered resume screener, look at the demographics of who makes it through versus who doesn't. If you see patterns that track protected classes, you have a problem the law says you need to fix.
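One common way to operationalize that review is the "four-fifths rule" from the EEOC's Uniform Guidelines: flag any group whose selection rate is below 80% of the highest group's rate. To be clear, this is a federal screening heuristic, not a test defined in HB3773 itself, and the numbers below are hypothetical.

```python
# Four-fifths (80%) rule sketch, a screening heuristic from the EEOC's
# Uniform Guidelines on Employee Selection Procedures. HB3773 does not
# define this test; it is one common way to spot disparate impact.

def selection_rate(passed: int, total: int) -> float:
    """Fraction of a group that made it through the screen."""
    return passed / total

def four_fifths_flag(group_rates: dict[str, float]) -> bool:
    """True if any group's rate is below 80% of the highest group's rate."""
    highest = max(group_rates.values())
    return any(rate / highest < 0.8 for rate in group_rates.values())

# Hypothetical resume-screener outcomes by demographic group:
rates = {
    "group_a": selection_rate(60, 100),  # 0.60
    "group_b": selection_rate(30, 100),  # 0.30
}
print(four_fifths_flag(rates))  # True: 0.30 / 0.60 = 0.5, below 0.8
```

A flag here isn't a legal conclusion, but it is exactly the kind of pattern the statute's "discriminatory effect" language is aimed at, and it's the point where you bring in counsel and your vendor.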

Fourth, document everything. The notice, the inventory, the review process, the results. If a charge is filed, your documented compliance effort is your best defense.

For complex situations, an employment lawyer is worth consulting. But the core compliance work — writing the notice, inventorying your tools, reviewing outcomes, documenting your process — is something any business owner can start on their own. The statute text is public, the requirements are clear, and now you know what they are. If you operate in multiple states, our multi-state employer AI disclosure kit covers Illinois alongside the other jurisdictions with active AI employment obligations.

What Is Disparate Impact?
Imagine your school decides to pick students for a special program based on how fast they can run a mile. The rule sounds fair — everyone gets the same test, right? But what if students with asthma or physical disabilities almost never pass the running test, even though running speed has nothing to do with the program itself? That's the basic idea behind disparate impact. A rule that looks neutral on the surface ends up unfairly screening out a specific group of people.

In the workplace, disparate impact happens when an employer uses a hiring tool or policy that isn't intentionally discriminatory, but the results show a clear pattern: one group of people (based on race, gender, disability, age, or another protected characteristic) gets rejected or treated worse far more often than others. The employer might not have meant to discriminate, but the effect is still discriminatory. And under civil rights laws, that effect is what matters.

This concept is especially important when it comes to AI. An AI hiring tool might be trained on data from the past, and if that past data reflects biased decisions — like favoring candidates from certain zip codes or universities — the AI will repeat those patterns automatically. The company using the tool might have no idea it's happening, but under Illinois HB3773, they're still responsible. The law specifically says that using AI in a way that "has the effect of subjecting employees to discrimination" is a civil rights violation — even if nobody intended it.

That's why businesses are expected to regularly check their AI tools for disparate impact. It's not enough to say "we didn't mean to discriminate." You have to look at the actual outcomes and make sure no group is being unfairly filtered out.

Disclaimer: This article is for informational purposes only and does not constitute legal advice, legal representation, or an attorney-client relationship. Laws and regulations change frequently. You should consult a licensed attorney to verify that the information in this article is current, complete, and applicable to your specific situation before relying on it. AI Compliance Documents is not a law firm and does not practice law.
