104TH GENERAL ASSEMBLY
State of Illinois
2025 and 2026

SB3368

Introduced 2/4/2026, by Sen. Sue Rezin

SYNOPSIS AS INTRODUCED:

Creates the Chatbot Response Liability Act. Provides that a proprietor of a chatbot that is used as an alternative to a human representative or that provides any substantive response, information, advice, or action may not disclaim liability if the chatbot provides materially misleading, incorrect, contradictory, or harmful information that results in financial loss or other demonstrable harm or that results in bodily harm to the covered user or any third party. Provides that a proprietor of a chatbot shall provide clear, conspicuous, and explicit notice to covered users that the covered users are interacting with an artificial intelligence chatbot program rather than a human. Sets forth requirements for proprietors of companion chatbots, including parental consent for the use of companion chatbots by minors. Requires the Attorney General to adopt rules to determine commercially reasonable and technically feasible methods for proprietors of companion chatbots to comply with the Act. Effective one year after becoming law.

A BILL FOR

SB3368    LRB104 15427 SPS 28582 b
AN ACT concerning business.

Be it enacted by the People of the State of Illinois, represented in the General Assembly:

Section 1. Short title. This Act may be cited as the Chatbot Response Liability Act.

Section 5. Definitions. In this Act:

"Artificial intelligence" means a machine-based system or combination of systems that, for explicit and implicit objectives, infers, from the input it receives, how to generate outputs, such as predictions, content, recommendations, or decisions that can influence physical or virtual environments.

"Chatbot" means an artificial intelligence system, software program, or technological application that simulates human-like interaction through text messages, audio messages, or a combination of text messages and audio messages, to provide information and services to users.

"Companion chatbot" means a chatbot that is designed to provide human-like interaction that (i) simulates an interpersonal relationship with a user or group of users as its primary function or (ii) uses previous user interactions when simulating an interpersonal relationship in future interactions.
"Covered user" means a person located in this State who uses a chatbot.

"Human-like interaction" means any form of communication or interaction that approximates human behavior, including nonhuman behavior that could be attributed to a human actor, such as a human actor role playing as a fictional nonhuman character, an animal, or other interactive entity.

"Interpersonal relationship" includes, but is not limited to, romantic, platonic, familial, adversarial, professional, official, therapeutic, or stranger relationships that are between the covered user and a fictional or nonfictional character or group of characters.

"Minor" means an individual under the age of 18.

"Proprietor" means any person, business, company, organization, institution, or government entity that owns, operates, or deploys a chatbot used to interact with users. "Proprietor" does not include third-party developers that license their technology to a proprietor.
Section 10. Liability for misleading information.

(a) A proprietor of a chatbot that is used as an alternative to a human representative, or otherwise as an agent of the proprietor to provide any substantive response, information, advice, or action, may not disclaim liability if a chatbot provides materially misleading, incorrect, contradictory, or harmful information to a covered user that results in financial loss or other demonstrable harm to a covered user. No liability shall be imposed if the proprietor has corrected the information and substantially or completely cured the harm to the covered user within 30 days after the proprietor is notified of the harm.

(b) The proprietor of a chatbot shall be responsible for ensuring the chatbot accurately provides information aligned with the formal policies, product details, disclosures, and terms of service offered to covered users.

(c) A proprietor may not waive or disclaim liability by notifying consumers that the consumers are interacting with a nonhuman chatbot system.
Section 15. Liability for bodily harm. A proprietor of a chatbot or another person or entity that directs the proprietor's chatbot to provide any substantive response, information, advice, or action may not disclaim liability if a chatbot provides materially misleading, incorrect, contradictory, or harmful information to a covered user that results in bodily harm to the covered user or any third party, including, but not limited to, any form of self-harm.

Section 20. Notice requirements. A proprietor of a chatbot shall provide clear, conspicuous, and explicit notice to covered users that the covered users are interacting with an artificial intelligence chatbot program rather than a human. The text of the notice shall appear in the same language and in a size easily readable by the average viewer and no smaller than the largest font size of other text appearing on the website on which the chatbot is used.
Section 25. Requirements for proprietors of companion chatbots.

(a) A proprietor of a companion chatbot shall use commercially reasonable and technically feasible methods to:

(1) prevent the companion chatbot from promoting, causing, or aiding self-harm; and

(2) determine whether a covered user is expressing thoughts of self-harm and, upon making the determination, prohibit continued use of the companion chatbot for a period of at least 24 hours, and prominently display a means to contact a suicide crisis organization to the covered user.

(b) If a proprietor of a companion chatbot fails to comply with the provisions of subsection (a), the proprietor shall be liable to covered users who inflict self-harm upon themselves, in whole or in part, as a result of the proprietor's companion chatbot promoting, causing, or aiding the covered user to inflict self-harm.

(c) Regardless of the proprietor's compliance with subsection (a), a proprietor shall be liable to covered users who inflict self-harm upon themselves, in whole or in part, if the proprietor:

(1) has actual knowledge that the companion chatbot is promoting, causing, or aiding self-harm; or

(2) has actual knowledge that a covered user is expressing thoughts of self-harm, fails to prohibit continued use of the companion chatbot for a period of at least 24 hours, and fails to prominently display a means to contact a suicide crisis organization to the covered user.

(d) A proprietor of a companion chatbot may not waive or disclaim liability under this Section.
Section 30. Parental consent for the use of companion chatbots by minors.

(a) A proprietor of a companion chatbot shall use commercially reasonable and technically feasible methods to determine whether a covered user is a minor.

(b) If the proprietor of a companion chatbot determines that a covered user is a minor or has actual knowledge that a covered user is a minor, the proprietor shall:

(1) stop the covered user's use of the companion chatbot until the proprietor has obtained verifiable parental consent to provide a companion chatbot to the minor user; and

(2) prohibit the covered user's continued use of the companion chatbot for a period of at least 3 days and prominently display a means to contact a suicide crisis organization to the covered user if, using commercially reasonable and technically feasible methods, the proprietor determines that, or has actual knowledge that, a covered user is expressing thoughts of self-harm.

(c) A proprietor shall be strictly liable for any harm caused if:

(1) the proprietor fails to comply with subsection (a) or (b); and

(2) a minor covered user inflicts self-harm upon themselves, in whole or in part, as a result of the proprietor's companion chatbot.

(d) A proprietor of a companion chatbot may not waive or disclaim liability under this Section.
Section 35. Implementation of commercially reasonable and technically feasible methods. A proprietor of a companion chatbot shall implement and engage in the ongoing implementation of commercially reasonable and technically feasible methods to discover vulnerabilities in the proprietor's system, including any methods used to determine whether a covered user is a minor.
Section 40. Determination of commercially reasonable and technically feasible methods.

(a) The Attorney General shall adopt rules to determine commercially reasonable and technically feasible methods for proprietors of companion chatbots to comply with this Act.

(b) In adopting rules related to the commercially reasonable and technically feasible methods for proprietors of companion chatbots to comply with this Act, the Attorney General shall consider the size, financial resources, and technical capabilities of the proprietor and the costs and effectiveness of available (i) age determination techniques for users of companion chatbots; (ii) techniques to prevent the promotion, aid, or encouragement of self-harm; (iii) techniques to determine whether a user is expressing thoughts of self-harm; and (iv) techniques to discover vulnerabilities in the proprietor's system. The Attorney General shall also consider the prevalent practices of the industry of the proprietor and the impact of the techniques listed in this subsection on the user's safety, utility, and experience.

(c) The rules adopted under this Section shall determine the appropriate levels of accuracy that would be commercially reasonable and technically feasible for proprietors to achieve in determining (i) whether a user is a minor, (ii) whether the proprietor's companion chatbot is promoting, aiding, or encouraging self-harm, and (iii) whether a user is expressing thoughts of self-harm.
Section 45. Determination of methods of obtaining verifiable parental consent. The Attorney General shall adopt rules to determine methods of obtaining verifiable parental consent as described in paragraph (1) of subsection (b) of Section 30.
Section 50. Deletion of information collected. All information collected for the purpose of determining a user's age or obtaining verifiable parental consent under this Act shall not be used for any purposes other than determining a user's age or obtaining verifiable parental consent and shall be deleted immediately after an attempt to determine a user's age or obtain verifiable parental consent, except if the information is necessary for compliance with any other applicable State or federal law.
Section 55. Limitations. Nothing in this Act shall be construed as requiring any proprietor to give a parent who grants verifiable parental consent any additional or special access to or control over the data or accounts of a covered user.
Section 99. Effective date. This Act takes effect one year after becoming law.