104TH GENERAL ASSEMBLY
State of Illinois
2025 and 2026

SB3502

Introduced 2/5/2026, by Sen. Rachel Ventura

SYNOPSIS AS INTRODUCED:

Creates the Artificial Intelligence Design Requirements Act. Sets forth provisions concerning product liability actions brought against a developer of an artificial intelligence system for defective design, failure to contain adequate instructions or warnings, and failure to conform to an express warranty. Provides that a deployer of an artificial intelligence system shall be deemed to be liable as a developer for harm caused by a product if: (1) the deployer makes a material and substantial change to the product; or (2) the deployer intentionally misuses the product contrary to the express warranty and that use was the proximate cause of harm to the plaintiff. Sets forth provisions concerning applicability and enforcement.

A BILL FOR

SB3502    LRB104 20017 SPS 33468 b
AN ACT concerning civil law.

Be it enacted by the People of the State of Illinois, represented in the General Assembly:

Section 1. Short title. This Act may be cited as the Artificial Intelligence Design Requirements Act.

Section 5. Intent. The General Assembly finds that artificial intelligence shifts decision-making power and responsibility away from persons to software-based systems, often without direct human oversight. While this technology offers significant benefits, its deployment has already caused measurable harm to individuals and businesses.

The General Assembly also finds that developers of artificial intelligence have an obligation to make the systems safe when used in reasonably foreseeable ways. Deployers of these products also have an obligation to ensure that these products are used in a way that does not materially affect an individual's rights.

Section 10. Definitions. As used in this Act:

"Artificial intelligence" has the meaning set forth in Section 2-101 of the Illinois Human Rights Act.

"Consequential decision" means a decision that has a legal or similarly significant effect on an individual's
access to the criminal justice system, housing, employment, credit, education, health care, or insurance.

"Deployer" means a person, including a developer, who uses or operates an artificial intelligence system for use by a deployer or for use by third parties. "Deployer" does not include an individual or business with fewer than 20 employees or fewer than 10,000 users of its product.

"Design" means the intended or known physical and material characteristics of a product and shall include any intended or known formulation or content of the product and the usual result of the intended development or other process used to produce the product, which includes, but is not limited to, unexpected skills or behaviors that appear in a product.

"Developer" means a person who designs, codes, produces, owns, or substantially modifies an artificial intelligence system for use by a developer or for use by a third party. "Developer" does not include a person who uses an open source artificial intelligence system but does not substantially modify the system.

"Express warranty" means any material, positive statement, affirmation of fact, promise, or description relating to a product, including any sample or model of a product.

"Harm" means:
    (1) damage to property other than the product itself;
    (2) personal physical, financial, or reputational injury, illness, or death;
    (3) mental or psychological anguish, emotional harm, or distortion of a person's behavior that would be highly offensive to a reasonable person; or
    (4) any loss of consortium or services or other loss deriving from any type of harm described in this Act.

"High-impact artificial intelligence system" means any artificial intelligence system, regardless of the number of parameters and supervision structure, that:
    (1) is used, or reasonably foreseeable as being used, as a controlling factor in making a consequential decision;
    (2) is used, or reasonably foreseeable as being used, to categorize groups of persons by protected characteristics, such as race, ethnic origin, or religious belief;
    (3) is used, or reasonably foreseeable as being used, in the direct management or operation of critical infrastructure;
    (4) is used, or reasonably foreseeable as being used, in a vehicle, a medical device, or in the safety system of a vehicle or medical device;
    (5) is used, or reasonably foreseeable as being used, to engage in a synthetic relationship; or
    (6) exhibits, or could be easily modified to exhibit, high levels of performance at tasks that pose a serious risk to economic security, public health or safety, or any combination of those matters.

"Material fact" means any specific characteristic or
quality of the product. "Material fact" does not include a general opinion about, or praise of, the product or its quality.

"Method of development" includes, but is not limited to, the selection of training data for the product, and training, testing, auditing, and fine-tuning the product.

"Person" means any individual, corporation, company, association, firm, partnership, society, joint stock company, or any other entity, including any government entity or unincorporated association of persons.

"Product" means a high-impact artificial intelligence system or a generative artificial intelligence system.

"Product release" means the specific way in which a product is integrated and made accessible within a production environment, including how it interacts with data sources, delivers predictions or results, and is accessed by users.

"Synthetic relationship" means a series of interactions between an individual and an artificial intelligence system that mimics human interaction and emotional responses.

Section 15. Developer accountability for harm to consumers or businesses.

(a) In any products liability action, a developer shall be liable to a plaintiff only if the plaintiff establishes:
    (1) that the developer failed to exercise reasonable care with respect to the design of the product and the
failure to exercise reasonable care was a proximate cause of harm to the plaintiff;
    (2) that the developer failed to exercise reasonable care with respect to providing adequate instructions or warnings applicable to the product that allegedly caused the harm that is the subject of the complaint and the failure to provide adequate instructions or warnings was a proximate cause of harm to the plaintiff; and
    (3) that the developer failed to exercise reasonable care with respect to providing an express warranty applicable to the product that allegedly caused the harm that is the subject of the complaint; the product failed to conform to the warranty; and the failure of the product to conform to the warranty caused harm to the plaintiff.

(b) In any action alleging that a product is unreasonably dangerous because of a defective design, the plaintiff shall prove by a preponderance of the evidence that, at the time the product left the developer's control:
    (1) the developer knew or, in light of then-existing scientific and technical knowledge, reasonably should have known of the danger that caused the plaintiff's harm;
    (2) the developer accounted for both intended uses and reasonably foreseeable unintended uses of their systems; and
    (3) there existed a technologically feasible and practical alternative design, including, but not limited
to, product release and the method of development, that would have reduced or avoided a foreseeable risk of harm without significantly impairing the usefulness of the product to the group of persons who are the intended and legitimate users of the product.

(c) In any action alleging that a product is defective because it failed to contain adequate instructions or warnings:
    (1) An adequate warning or instruction is one that a reasonably prudent person in the same or similar circumstances would have provided with respect to the danger and communicates sufficient information on the dangers and safe use of the product, taking into account the characteristics of, and the ordinary knowledge common to, an ordinary consumer who uses the product.
    (2) The plaintiff shall prove by a preponderance of the evidence that, at the time the product left the developer's control, the developer knew or, in light of then-existing scientific and technical knowledge, reasonably should have known of the danger that caused the plaintiff's harm.
    (3) A developer shall not be liable for failure to instruct or warn about a danger that is known or open and obvious to the user or consumer of the product, or should have been known or open and obvious to the user or consumer of the product, taking into account the characteristics
of, and the ordinary knowledge common to, the persons who ordinarily use or consume the product.

    A danger is presumed not to be open and obvious to a user or consumer of the product under 17 years old.

(d) A product may be unreasonably dangerous because it did not conform to an express warranty only if the plaintiff proves by a preponderance of the evidence that:
    (1) the plaintiff reasonably relied on an express warranty made by the developer about a material fact concerning the safety of the product;
    (2) this express warranty proved to be untrue; and
    (3) the plaintiff would not have been harmed if the representation had been true.

Section 20. Deployer accountability for harm to consumer or business.

(a) A deployer shall be deemed to be liable as a developer under Section 15 for harm caused by a product if:
    (1) the deployer makes a material and substantial change to the product; or
    (2) the deployer intentionally misuses the product contrary to the express warranty and that use was the proximate cause of harm to the plaintiff.

(b) For the purposes of this Section, a use of a product that is intended by the developer of the product does not constitute a misuse or material or substantial change of the
product. If a developer does not specify an intended use for the product, the intended use shall be inferred from the targeted market and manner of distribution.

(c) Any deployer licensing a product shall not be liable to a plaintiff for violations of subsection (a) of Section 15 by another solely by reason of ownership or use of the product.

Section 25. Applicability and enforcement.

(a) This Act shall supplement any common law tort liability and any product liability laws of this State. This Act does not prohibit any product liability cause of action involving a generative artificial intelligence system or a high-impact artificial intelligence system brought under a different claim pursuant to product liability common law or statute. This Section does not supersede any product liability cause of action under State law except to the extent that the law would directly conflict with the provisions of this Act.

(b) Products used strictly for peer-reviewed scientific research are exempt from this Act.

(c) In a liability action brought under this Act, the court shall apply a comparative negligence standard, whereby a plaintiff's recovery shall be diminished in proportion to the percentage of fault attributable to the plaintiff, but the recovery shall not be barred regardless of the plaintiff's degree of fault, and developers and deployers may be held jointly and severally liable for the portion of harm that
contributed to the plaintiff's injury.

    In a liability action brought under this Act, the damages for which a deployer is otherwise liable shall be reduced by the percentage of responsibility for the plaintiff's harm attributable to violations of Section 15 by any person if the defendant establishes that the percentage of the plaintiff's harm was proximately caused by a violation under Section 15.

(d) In any products liability action brought under this Act, a court shall recognize a rebuttable presumption that a product is not defective if and only if the developer:
    (1) has conducted documented testing, evaluation, verification, validation, and auditing of that system consistent with industry best practices, such as the latest version of the National Institute of Standards and Technology Artificial Intelligence Risk Management Framework;
    (2) has mitigated foreseeable risks to the extent possible and has considered alternatives;
    (3) has disclosed foreseeable risks and mitigation tactics directly to deployers and consumers using the product;
    (4) has maintained and made available upon request by the Attorney General an artificial intelligence data sheet that includes, at a minimum, the following information:
        (A) information on the intended contexts and uses of the artificial intelligence model in accordance
with the Map guidelines articulated in the National Institute of Standards and Technology's latest Artificial Intelligence Risk Management Framework;
        (B) information regarding the datasets upon which the artificial intelligence was trained, including sources, volume, whether the dataset is proprietary, and how the datasets further the intended purpose of the product;
        (C) information accounting for foreseeable risks identified in the Manage guidelines of the National Institute of Standards and Technology's latest Artificial Intelligence Risk Management Framework and information about steps taken to manage those risks; and
        (D) information concerning the results of red-teaming testing and steps taken to mitigate identified risks, based on guidance developed by the National Institute of Standards and Technology;
    (5) has, if the product is designed for or is reasonably likely to be used by individuals under 17 years old, documented assessments of the product's impact on cognitive and emotional development, implemented age-gating or content restrictions for a product that poses foreseeable risks, and provided to deployers and direct consumers and their guardians clear, accessible disclosures about potential risks; and
    (6) has prominently included in the terms and conditions of a product the information included in an artificial intelligence data sheet, which deployers of the product may rely upon when making fit-for-use and deployment decisions.

(e) In any products liability action brought under this Act, a court shall recognize a rebuttable presumption that a product is not defective if and only if the deployer has designed and implemented a risk management policy that:
    (1) specifies the principles, processes, and personnel that the deployer shall use in maintaining the risk management policy to identify, mitigate, and document any risk, especially those impacting individuals under 17 years old, that is a reasonably foreseeable consequence of deploying or using the system;
    (2) is consistent with industry best practices, such as the latest version of the National Institute of Standards and Technology Artificial Intelligence Risk Management Framework;
    (3) is reasonable considering:
        (A) the size and complexity of the deployer;
        (B) the nature and scope of the system, including the intended uses and unintended uses and the modifications made to the system by the deployer; and
        (C) the data that the system, once deployed, processes as inputs; and
    (4) is electronically available to its employees and to the Attorney General upon request.

(f) This Act applies with respect to any action commenced on or after the effective date of this Act without regard to whether the harm that is the subject of the action or the conduct that caused the harm occurred before that date.

Section 97. Severability. The provisions of this Act are severable under Section 1.31 of the Statute on Statutes.