AN ACT concerning business.

Be it enacted by the People of the State of Illinois, represented in the General Assembly:

Section 1. Short title. This Act may be cited as the Automated Decision Tools Act.

Section 5. Definitions. As used in this Act:

"Algorithmic discrimination" means the condition in which an automated decision tool contributes to unjustified differential treatment or impacts disfavoring people based on their actual or perceived race, color, ethnicity, sex, religion, age, national origin, limited English proficiency, disability, veteran status, genetic information, reproductive health, or any other classification protected by State law.

"Artificial intelligence" means a machine-based system or technology operating on datasets that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing a real or virtual environment.

"Automated decision tool" means a system or service that uses artificial intelligence and has been specifically developed and marketed to, or specifically modified to, make, or be a controlling factor in making, consequential decisions.

"Consequential decision" means a decision or judgment that has a legal, material, or similarly significant effect on an
individual's life relating to the impact of, access to, or the cost, terms, or availability of, any of the following:

    (1) employment, worker management, or self-employment, including, but not limited to, all of the following:
        (A) pay or promotion;
        (B) hiring or termination; and
        (C) automated task allocation;
    (2) education and vocational training, including, but not limited to, all of the following:
        (A) assessment, including, but not limited to, detecting student cheating or plagiarism;
        (B) accreditation;
        (C) certification;
        (D) admissions; and
        (E) financial aid or scholarships;
    (3) housing or lodging, including rental or short-term housing or lodging;
    (4) essential utilities, including electricity, heat, water, Internet or telecommunications access, or transportation;
    (5) family planning, including adoption services or reproductive services, as well as assessments related to child protective services;
    (6) healthcare or health insurance, including mental health care, dental, or vision;
    (7) financial services, including a financial service
provided by a mortgage company, mortgage broker, or creditor;
    (8) the criminal justice system, including, but not limited to, all of the following:
        (A) risk assessments for pretrial hearings;
        (B) sentencing; and
        (C) parole;
    (9) legal services, including private arbitration or mediation;
    (10) voting; and
    (11) access to benefits or services or assignment of penalties.

"Deployer" means a person, partnership, State or local government agency, or corporation that uses an automated decision tool to make a consequential decision.

"Impact assessment" means a documented risk-based evaluation of an automated decision tool that meets the criteria of Section 10.

"Sex" includes pregnancy, childbirth, and related conditions, gender identity, intersex status, and sexual orientation.

"Significant update" means a new version, new release, or other update to an automated decision tool that includes changes to its use case, key functionality, or expected outcomes.
Section 10. Impact assessment.

(a) On or before January 1, 2026, and annually thereafter, a deployer of an automated decision tool shall perform an impact assessment for any automated decision tool the deployer uses that includes all of the following:

    (1) a statement of the purpose of the automated decision tool and its intended benefits, uses, and deployment contexts;
    (2) a description of the automated decision tool's outputs and how they are used to make, or be a controlling factor in making, a consequential decision;
    (3) a summary of the type of data collected from natural persons and processed by the automated decision tool when it is used to make, or be a controlling factor in making, a consequential decision;
    (4) an analysis of potential adverse impacts on the basis of sex, race, color, ethnicity, religion, age, national origin, limited English proficiency, disability, veteran status, or genetic information from the deployer's use of the automated decision tool;
    (5) a description of the safeguards implemented, or that will be implemented, by the deployer to address any reasonably foreseeable risks of algorithmic discrimination arising from the use of the automated decision tool known to the deployer at the time of the impact assessment;
    (6) a description of how the automated decision tool
will be used by a natural person, or monitored when it is used, to make, or be a controlling factor in making, a consequential decision; and
    (7) a description of how the automated decision tool has been or will be evaluated for validity or relevance.

(b) A deployer shall, in addition to the impact assessment required by subsection (a), perform, as soon as feasible, an impact assessment with respect to any significant update.

(c) This Section does not apply to a deployer with fewer than 25 employees unless, as of the end of the prior calendar year, the deployer deployed an automated decision tool that impacted more than 999 people per year.

Section 15. Notification and accommodations.

(a) A deployer shall, at or before the time an automated decision tool is used to make a consequential decision, notify any natural person who is the subject of the consequential decision that an automated decision tool is being used to make, or be a controlling factor in making, the consequential decision. A deployer shall provide to a natural person notified under this subsection all of the following:

    (1) a statement of the purpose of the automated decision tool;
    (2) the contact information for the deployer; and
    (3) a plain language description of the automated decision tool that includes a description of any human
components and how any automated component is used to inform a consequential decision.

(b) If a consequential decision is made solely based on the output of an automated decision tool, a deployer shall, if technically feasible, accommodate a natural person's request to not be subject to the automated decision tool and to be subject to an alternative selection process or accommodation. After a request is made under this subsection, a deployer may reasonably request, collect, and process information from a natural person for the purposes of identifying the person and the associated consequential decision. If the person does not provide that information, the deployer shall not be obligated to provide an alternative selection process or accommodation.

Section 20. Governance program.

(a) A deployer shall establish, document, implement, and maintain a governance program that contains reasonable administrative and technical safeguards to map, measure, manage, and govern the reasonably foreseeable risks of algorithmic discrimination associated with the use or intended use of an automated decision tool. The safeguards required by this subsection shall be appropriate to all of the following:

    (1) the use or intended use of the automated decision tool;
    (2) the deployer's role as a deployer;
    (3) the size, complexity, and resources of the
deployer;
    (4) the nature, context, and scope of the activities of the deployer in connection with the automated decision tool; and
    (5) the technical feasibility and cost of available tools, assessments, and other means used by a deployer to map, measure, manage, and govern the risks associated with an automated decision tool.

(b) The governance program required by this Section shall be designed to do all of the following:

    (1) identify and implement safeguards to address reasonably foreseeable risks of algorithmic discrimination resulting from the use or intended use of an automated decision tool;
    (2) if established by a deployer, provide for the performance of impact assessments as required by Section 10;
    (3) conduct an annual and comprehensive review of policies, practices, and procedures to ensure compliance with this Act;
    (4) maintain for 2 years after completion the results of an impact assessment; and
    (5) evaluate and make reasonable adjustments to administrative and technical safeguards in light of material changes in technology, the risks associated with the automated decision tool, the state of technical
standards, and changes in business arrangements or operations of the deployer.

(c) A deployer shall designate at least one employee to be responsible for overseeing and maintaining the governance program and compliance with this Act. An employee designated under this subsection shall have the authority to assert to the employee's employer a good faith belief that the design, production, or use of an automated decision tool fails to comply with the requirements of this Act. An employer of an employee designated under this subsection shall conduct a prompt and complete assessment of any compliance issue raised by that employee.

(d) This Section does not apply to a deployer with fewer than 25 employees unless, as of the end of the prior calendar year, the deployer deployed an automated decision tool that impacted more than 999 people per year.

Section 25. Public statement of policy. A deployer shall make publicly available, in a readily accessible manner, a clear policy that provides a summary of both of the following:

    (1) the types of automated decision tools currently in use or made available to others by the deployer; and
    (2) how the deployer manages the reasonably foreseeable risks of algorithmic discrimination that may arise from the use of the automated decision tools it currently uses or makes available to others.
Section 30. Algorithmic discrimination.

(a) A deployer shall not use an automated decision tool that results in algorithmic discrimination.

(b) On and after January 1, 2027, a person may bring a civil action against a deployer for violation of this Section. In an action brought under this subsection, the plaintiff shall have the burden of proof to demonstrate that the deployer's use of the automated decision tool resulted in algorithmic discrimination that caused actual harm to the person bringing the civil action.

(c) In addition to any other remedy at law, a deployer that violates this Section shall be liable to a prevailing plaintiff for any of the following:

    (1) compensatory damages;
    (2) declaratory relief; and
    (3) reasonable attorney's fees and costs.

Section 35. Impact assessment.

(a) Within 60 days after completing an impact assessment required by this Act, a deployer shall provide the impact assessment to the Department of Human Rights.

(b) A deployer who knowingly violates this Section shall be liable for an administrative fine of not more than $10,000 per violation in an administrative enforcement action brought by the Department of Human Rights. Each day on which an
automated decision tool is used for which an impact assessment has not been submitted as required under this Section shall give rise to a distinct violation of this Section.

(c) The Department of Human Rights may share impact assessments with other State entities as appropriate.

Section 40. Civil actions.

(a) The Attorney General may bring a civil action in the name of the people of the State of Illinois against a deployer for a violation of this Act.

(b) A court may award in an action brought under this Section all of the following:

    (1) injunctive relief;
    (2) declaratory relief; and
    (3) reasonable attorney's fees and litigation costs.

(c) The Attorney General, before commencing an action under this Section for injunctive relief, shall provide 45 days' written notice to a deployer of the alleged violations of this Act. The deployer may cure, within 45 days after receiving the written notice described in this subsection, the noticed violation and provide the person who gave the notice an express written statement, made under penalty of perjury, that the violation has been cured and that no further violations shall occur. If the deployer cures the noticed violation and provides the express written statement, a claim for injunctive relief shall not be maintained for the noticed
violation.