Full Text of SB2203 104th General Assembly
State of Illinois
2025 and 2026

SB2203

Introduced 2/7/2025, by Sen. Graciela Guzmán

SYNOPSIS AS INTRODUCED:
    New Act
    815 ILCS 505/2HHHH new
Creates the Preventing Algorithmic Discrimination Act. Provides that, on or before January 1, 2027, and annually thereafter, a deployer of an automated decision tool shall perform an impact assessment for any automated decision tool the deployer uses or designs, codes, or produces that includes specified information. Provides that a deployer shall, at or before the time an automated decision tool is used to make a consequential decision, notify any natural person who is the subject of the consequential decision that an automated decision tool is being used to make, or be a controlling factor in making, the consequential decision and provide specified information. Provides that a deployer shall establish, document, implement, and maintain a governance program that contains reasonable administrative and technical safeguards to map, measure, manage, and govern the reasonably foreseeable risks of algorithmic discrimination associated with the use or intended use of an automated decision tool. Provides that, within 60 days after completing an impact assessment required by the Act, a deployer shall provide the impact assessment to the Attorney General. Amends the Consumer Fraud and Deceptive Business Practices Act to make conforming changes.
A BILL FOR
SB2203    LRB104 10978 SPS 21060 b
AN ACT concerning business.

Be it enacted by the People of the State of Illinois, represented in the General Assembly:

Section 1. Short title. This Act may be cited as the Preventing Algorithmic Discrimination Act.

Section 5. Definitions. As used in this Act:

"Algorithmic discrimination" means the condition in which an automated decision tool contributes to unjustified differential treatment or impacts disfavoring people based on their actual or perceived race, color, ethnicity, sex, religion, age, national origin, limited English proficiency, disability, veteran status, genetic information, reproductive health, or any other classification protected by State law. "Algorithmic discrimination" does not include:
    (1) the offer, license, or use of a high-risk artificial intelligence system by a deployer for the sole purpose of:
        (A) the deployer's self-testing to identify, mitigate, or prevent discrimination or otherwise ensure compliance with state and federal law; or
        (B) expanding an applicant, customer, or participant pool to increase diversity or redress historical discrimination; or
    (2) an act or omission by or on behalf of a private club or other establishment that is not in fact open to the public, as set forth in the Civil Rights Act of 1964.

"Artificial intelligence system" means a machine-based system that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments. "Artificial intelligence system" includes a generative artificial intelligence system. For the purposes of this definition, "generative artificial intelligence system" means an automated computing system that, when prompted with human prompts, descriptions, or queries, can produce outputs that simulate human-produced content, including, but not limited to:
    (1) textual outputs, such as short answers, essays, poetry, or longer compositions or answers;
    (2) image outputs, such as fine art, photographs, conceptual art, diagrams, and other images;
    (3) multimedia outputs, such as audio or video in the form of compositions, songs, or short-form or long-form audio or video; and
    (4) other content that would otherwise be produced by human means.
"Automated decision tool" means a system or service that uses artificial intelligence and has been specifically developed and marketed to, or specifically modified to, make, or be a controlling factor in making, consequential decisions.

"Consequential decision" means a decision or judgment that has a legal, material, or similarly significant effect on an individual's life relating to the impact of, access to, or the cost, terms, or availability of, any of the following:
    (1) employment, worker management, or self-employment, including, but not limited to, all of the following:
        (A) pay or promotion;
        (B) hiring or termination; and
        (C) automated task allocation;
    (2) education and vocational training, including, but not limited to, all of the following:
        (A) assessment, including, but not limited to, detecting student cheating or plagiarism;
        (B) accreditation;
        (C) certification;
        (D) admissions; and
        (E) financial aid or scholarships;
    (3) housing or lodging, including rental or short-term housing or lodging;
    (4) essential utilities, including electricity, heat, water, Internet or telecommunications access, or transportation;
    (5) family planning, including adoption services or reproductive services, as well as assessments related to child protective services;
    (6) healthcare or health insurance, including mental health care, dental, or vision;
    (7) financial services, including a financial service provided by a mortgage company, mortgage broker, or creditor;
    (8) the criminal justice system, including, but not limited to, all of the following:
        (A) risk assessments for pretrial hearings;
        (B) sentencing; and
        (C) parole;
    (9) legal services, including private arbitration or mediation;
    (10) voting; and
    (11) access to benefits or services or assignment of penalties.

"Deployer" means a person, partnership, State or local government agency, or corporation that uses an automated decision tool to make a consequential decision.

"Impact assessment" means a documented risk-based evaluation of an automated decision tool that meets the criteria of Section 10.

"Sex" includes pregnancy, childbirth, and related conditions, gender identity, intersex status, and sexual orientation.
"Significant update" means a new version, new release, or other update to an automated decision tool that includes changes to its use case, key functionality, or expected outcomes.

Section 10. Impact assessment.

(a) On or before January 1, 2027, and annually thereafter, a deployer of an automated decision tool shall perform an impact assessment for any automated decision tool the deployer uses that includes all of the following:
    (1) a statement of the purpose of the automated decision tool and its intended benefits, uses, and deployment contexts;
    (2) a description of the automated decision tool's outputs and how they are used to make, or be a controlling factor in making, a consequential decision;
    (3) a summary of the type of data collected from natural persons and processed by the automated decision tool when it is used to make, or be a controlling factor in making, a consequential decision;
    (4) an analysis of potential adverse impacts on the basis of sex, race, color, ethnicity, religion, age, national origin, limited English proficiency, disability, veteran status, or genetic information from the deployer's use of the automated decision tool;
    (5) a description of the safeguards implemented, or that will be implemented, by the deployer to address any reasonably foreseeable risks of algorithmic discrimination arising from the use of the automated decision tool known to the deployer at the time of the impact assessment;
    (6) a description of how the automated decision tool will be used by a natural person, or monitored when it is used, to make, or be a controlling factor in making, a consequential decision; and
    (7) a description of how the automated decision tool has been or will be evaluated for validity or relevance.

(b) A deployer shall, in addition to the impact assessment required by subsection (a), perform, as soon as feasible, an impact assessment with respect to any significant update.

(c) This Section does not apply to a deployer with fewer than 25 employees unless, as of the end of the prior calendar year, the deployer deployed an automated decision tool that impacted more than 999 people per year.
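For illustration only, the sketch below shows one way a deployer might record the seven elements required by Section 10(a) and check the small-deployer exemption in Section 10(c). The Python structure and every field and function name are assumptions chosen for this example, not terms drawn from the bill.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class ImpactAssessment:
    """One annual impact assessment under Section 10(a) (field names are illustrative)."""
    tool_name: str
    completed_on: date
    purpose_and_intended_uses: str             # (1) purpose, benefits, uses, deployment contexts
    outputs_and_role_in_decision: str          # (2) outputs and how they make or control the decision
    data_collected_from_natural_persons: str   # (3) types of data collected and processed
    adverse_impact_analysis: str               # (4) potential adverse impacts by protected class
    safeguards: str                            # (5) safeguards implemented or planned
    human_use_and_monitoring: str              # (6) how the tool is used or monitored by a person
    validity_and_relevance_evaluation: str     # (7) how the tool is or will be evaluated


def section_10_applies(employee_count: int, people_impacted_prior_year: int) -> bool:
    """Section 10(c): deployers with fewer than 25 employees are exempt unless the
    tool impacted more than 999 people in the prior calendar year."""
    return employee_count >= 25 or people_impacted_prior_year > 999
```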
Section 15. Notification and accommodations.

(a) A deployer shall, at or before the time an automated decision tool is used to make a consequential decision, notify any natural person who is the subject of the consequential decision that an automated decision tool is being used to make, or be a controlling factor in making, the consequential decision. A deployer shall provide to a natural person notified under this subsection all of the following:
    (1) a statement of the purpose of the automated decision tool;
    (2) the contact information for the deployer; and
    (3) a plain language description of the automated decision tool that includes a description of any human components and how any automated component is used to inform a consequential decision.

(b) If a consequential decision is made solely based on the output of an automated decision tool, a deployer shall, if technically feasible, accommodate a natural person's request to not be subject to the automated decision tool and to be subject to an alternative selection process or accommodation. After a request is made under this subsection, a deployer may reasonably request, collect, and process information from a natural person for the purposes of identifying the person and the associated consequential decision. If the person does not provide that information, the deployer shall not be obligated to provide an alternative selection process or accommodation.
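As a sketch of what a Section 15(a) notice might contain and of when the Section 15(b) alternative process is owed, the fragment below is illustrative only; the names are assumptions and the boolean check deliberately simplifies the statutory conditions.

```python
from dataclasses import dataclass


@dataclass
class AdtNotice:
    """Notice to the subject of a consequential decision under Section 15(a) (illustrative names)."""
    purpose_statement: str            # (1) purpose of the automated decision tool
    deployer_contact: str             # (2) contact information for the deployer
    plain_language_description: str   # (3) human components and how automation informs the decision


def must_offer_alternative(decision_is_solely_automated: bool,
                           alternative_is_technically_feasible: bool,
                           subject_provided_identifying_info: bool) -> bool:
    """Section 15(b), simplified: an alternative selection process or accommodation is owed
    only when the decision rests solely on the tool's output, an alternative is technically
    feasible, and the person supplies the information reasonably requested to identify them
    and the associated decision."""
    return (decision_is_solely_automated
            and alternative_is_technically_feasible
            and subject_provided_identifying_info)
```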
Section 20. Governance program.

(a) A deployer shall establish, document, implement, and maintain a governance program that contains reasonable administrative and technical safeguards to map, measure, manage, and govern the reasonably foreseeable risks of algorithmic discrimination associated with the use or intended use of an automated decision tool. The safeguards required by this subsection shall be appropriate to all of the following:
    (1) the use or intended use of the automated decision tool;
    (2) the deployer's role as a deployer;
    (3) the size, complexity, and resources of the deployer;
    (4) the nature, context, and scope of the activities of the deployer in connection with the automated decision tool; and
    (5) the technical feasibility and cost of available tools, assessments, and other means used by a deployer to map, measure, manage, and govern the risks associated with an automated decision tool.

(b) The governance program required by this Section shall be designed to do all of the following:
    (1) identify and implement safeguards to address reasonably foreseeable risks of algorithmic discrimination resulting from the use or intended use of an automated decision tool;
    (2) if established by a deployer, provide for the performance of impact assessments as required by Section 10;
    (3) conduct an annual and comprehensive review of policies, practices, and procedures to ensure compliance with this Act;
    (4) maintain for 2 years after completion the results of an impact assessment; and
    (5) evaluate and make reasonable adjustments to administrative and technical safeguards in light of material changes in technology, the risks associated with the automated decision tool, the state of technical standards, and changes in business arrangements or operations of the deployer.

(c) A deployer shall designate at least one employee to be responsible for overseeing and maintaining the governance program and compliance with this Act. An employee designated under this subsection shall have the authority to assert to the employee's employer a good faith belief that the design, production, or use of an automated decision tool fails to comply with the requirements of this Act. An employer of an employee designated under this subsection shall conduct a prompt and complete assessment of any compliance issue raised by that employee.

(d) This Section does not apply to a deployer with fewer than 25 employees unless, as of the end of the prior calendar year, the deployer deployed an automated decision tool that impacted more than 999 people per year.
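The two-year retention and annual-review obligations in Section 20(b)(3) and (b)(4) reduce to simple date arithmetic. The helper below is a minimal sketch under the assumption that a year is counted as 365 days, which the bill does not specify; the constant and function names are illustrative.

```python
from datetime import date, timedelta

# Retention and review periods taken from Section 20(b)(3)-(4); names are illustrative.
RETENTION_PERIOD = timedelta(days=2 * 365)   # keep impact assessment results for 2 years


def retention_expires(assessment_completed_on: date) -> date:
    """Date until which Section 20(b)(4) requires the assessment results to be kept."""
    return assessment_completed_on + RETENTION_PERIOD


def annual_review_due(last_review_on: date, today: date) -> bool:
    """Section 20(b)(3) calls for an annual, comprehensive review of policies and procedures."""
    return today >= last_review_on + timedelta(days=365)
```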
Section 25. Public statement of policy. A deployer shall make publicly available, in a readily accessible manner, a clear policy that provides a summary of both of the following:
    (1) the types of automated decision tools currently in use or made available to others by the deployer; and
    (2) how the deployer manages the reasonably foreseeable risks of algorithmic discrimination that may arise from the use of the automated decision tools it currently uses or makes available to others.

Section 30. Algorithmic discrimination.

(a) A deployer shall not use an automated decision tool that results in algorithmic discrimination.

(b) On and after January 1, 2028, a person may bring a civil action against a deployer for violation of this Section. In an action brought under this subsection, the plaintiff shall have the burden of proof to demonstrate that the deployer's use of the automated decision tool resulted in algorithmic discrimination that caused actual harm to the person bringing the civil action.

(c) In addition to any other remedy at law, a deployer that violates this Section shall be liable to a prevailing plaintiff for any of the following:
    (1) compensatory damages;
    (2) declaratory relief; and
    (3) reasonable attorney's fees and costs.

Section 35. Impact assessment.

(a) Within 60 days after completing an impact assessment required by this Act, a deployer shall provide the impact assessment to the Attorney General.

(b) A deployer who knowingly violates this Section shall be liable for an administrative fine of not more than $10,000 per violation in an administrative enforcement action brought by the Attorney General. Each day on which an automated decision tool is used for which an impact assessment has not been submitted as required under this Section shall give rise to a distinct violation of this Section.

(c) The Attorney General may share impact assessments with other State entities as appropriate.
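Because Section 35(b) caps the administrative fine at $10,000 per violation and treats each day a tool is used without the required submission as a distinct violation, the maximum exposure grows linearly with the number of such days. The arithmetic sketch below assumes a single tool; the names are illustrative.

```python
MAX_FINE_PER_VIOLATION = 10_000  # Section 35(b): not more than $10,000 per violation


def max_fine_exposure(days_used_without_submitted_assessment: int) -> int:
    """Upper bound on the Section 35(b) fine for one tool, where each day of use without a
    submitted impact assessment is a distinct violation."""
    return days_used_without_submitted_assessment * MAX_FINE_PER_VIOLATION


# Example: a tool used for 30 days without the required submission could expose the
# deployer to up to 30 * $10,000 = $300,000 in an enforcement action by the Attorney General.
```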
Section 40. Enforcement. A violation of this Act constitutes an unlawful practice under the Consumer Fraud and Deceptive Business Practices Act. All remedies, penalties, and authority granted to the Attorney General by the Consumer Fraud and Deceptive Business Practices Act shall be available to him or her for the enforcement of this Act.

Section 95. The Consumer Fraud and Deceptive Business Practices Act is amended by adding Section 2HHHH as follows:

(815 ILCS 505/2HHHH new)

Sec. 2HHHH. Violations of the Preventing Algorithmic Discrimination Act. A person who violates the Preventing Algorithmic Discrimination Act commits an unlawful practice within the meaning of this Act.