104TH GENERAL ASSEMBLY
State of Illinois
2025 and 2026
SB3368

Introduced 2/4/2026, by Sen. Sue Rezin

SYNOPSIS AS INTRODUCED:

New Act
    Creates the Chatbot Response Liability Act. Provides that a proprietor of a chatbot that is used as an alternative to a human representative or that provides any substantive response, information, advice, or action may not disclaim liability if the chatbot provides materially misleading, incorrect, contradictory, or harmful information that results in financial loss or other demonstrable harm or that results in bodily harm to the covered user or any third party. Provides that a proprietor of a chatbot shall provide clear, conspicuous, and explicit notice to covered users that the covered users are interacting with an artificial intelligence chatbot program rather than a human. Sets forth requirements for proprietors of companion chatbots, including parental consent for the use of companion chatbots by minors. Requires the Attorney General to adopt rules to determine commercially reasonable and technically feasible methods for proprietors of companion chatbots to comply with the Act. Effective one year after becoming law.


LRB104 15427 SPS 28582 b

A BILL FOR

    AN ACT concerning business.

    Be it enacted by the People of the State of Illinois, represented in the General Assembly:

    Section 1. Short title. This Act may be cited as the Chatbot Response Liability Act.
 
    Section 5. Definitions. In this Act:
    "Artificial intelligence" means a machine-based system, or combination of systems, that, for explicit and implicit objectives, infers, from the input it receives, how to generate outputs, such as predictions, content, recommendations, or decisions that can influence physical or virtual environments.
    "Chatbot" means an artificial intelligence system, software program, or technological application that simulates human-like interaction through text messages, audio messages, or a combination of text messages and audio messages, to provide information and services to users.
    "Companion chatbot" means a chatbot that is designed to provide human-like interaction that (i) simulates an interpersonal relationship with a user or group of users as its primary function or (ii) uses previous user interactions when simulating an interpersonal relationship in future interactions.
    "Covered user" means a person located in this State who uses a chatbot.
    "Human-like interaction" means any form of communication or interaction that approximates human behavior, including nonhuman behavior that could be attributed to a human actor, such as a human actor role playing as a fictional nonhuman character, an animal, or other interactive entity.
    "Interpersonal relationship" includes, but is not limited to, romantic, platonic, familial, adversarial, professional, official, therapeutic, or stranger relationships between the covered user and a fictional or nonfictional character or group of characters.
    "Minor" means an individual under the age of 18.
    "Proprietor" means any person, business, company, organization, institution, or government entity that owns, operates, or deploys a chatbot used to interact with users. "Proprietor" does not include third-party developers that license their technology to a proprietor.
 
    Section 10. Liability for misleading information.
    (a) A proprietor of a chatbot that is used as an alternative to a human representative, or otherwise as an agent of the proprietor to provide any substantive response, information, advice, or action, may not disclaim liability if a chatbot provides materially misleading, incorrect, contradictory, or harmful information to a covered user that results in financial loss or other demonstrable harm to a covered user. No liability shall be imposed if the proprietor has corrected the information and substantially or completely cured the harm to the covered user within 30 days after the proprietor is notified of the harm.
    (b) The proprietor of a chatbot shall be responsible for ensuring the chatbot accurately provides information aligned with the formal policies, product details, disclosures, and terms of service offered to covered users.
    (c) A proprietor may not waive or disclaim liability by notifying consumers that the consumers are interacting with a nonhuman chatbot system.
 
    Section 15. Liability for bodily harm. A proprietor of a chatbot or another person or entity that directs the proprietor's chatbot to provide any substantive response, information, advice, or action may not disclaim liability if a chatbot provides materially misleading, incorrect, contradictory, or harmful information to a covered user that results in bodily harm to the covered user or any third party, including, but not limited to, any form of self-harm.
 
    Section 20. Notice requirements. A proprietor of a chatbot shall provide clear, conspicuous, and explicit notice to covered users that the covered users are interacting with an artificial intelligence chatbot program rather than a human. The text of the notice shall appear in the same language and in a size easily readable by the average viewer and no smaller than the largest font size of other text appearing on the website on which the chatbot is used.
 
    Section 25. Requirements for proprietors of companion chatbots.
    (a) A proprietor of a companion chatbot shall use commercially reasonable and technically feasible methods to:
        (1) prevent the companion chatbot from promoting, causing, or aiding self-harm; and
        (2) determine whether a covered user is expressing thoughts of self-harm and, upon making the determination, prohibit continued use of the companion chatbot for a period of at least 24 hours, and prominently display a means to contact a suicide crisis organization to the covered user.
    (b) If a proprietor of a companion chatbot fails to comply with the provisions of subsection (a), the proprietor shall be liable to covered users who inflict self-harm upon themselves, in whole or in part, as a result of the proprietor's companion chatbot promoting, causing, or aiding the covered user to inflict self-harm.
    (c) Regardless of the proprietor's compliance with subsection (a), a proprietor shall be liable to covered users who inflict self-harm upon themselves, in whole or in part, if the proprietor:
        (1) has actual knowledge that the companion chatbot is promoting, causing, or aiding self-harm; or
        (2) has actual knowledge that a covered user is expressing thoughts of self-harm, fails to prohibit continued use of the companion chatbot for a period of at least 24 hours, and fails to prominently display a means to contact a suicide crisis organization to the covered user.
    (d) A proprietor of a companion chatbot may not waive or disclaim liability under this Section.
 
    Section 30. Parental consent for the use of companion chatbots by minors.
    (a) A proprietor of a companion chatbot shall use commercially reasonable and technically feasible methods to determine whether a covered user is a minor.
    (b) If the proprietor of a companion chatbot determines that a covered user is a minor or has actual knowledge that a covered user is a minor, the proprietor shall:
        (1) stop the covered user's use of the companion chatbot until the proprietor has obtained verifiable parental consent to provide a companion chatbot to the minor user; and
        (2) prohibit the covered user's continued use of the companion chatbot for a period of at least 3 days and prominently display a means to contact a suicide crisis organization to the covered user if, using commercially reasonable and technically feasible methods, the proprietor determines that, or has actual knowledge that, a covered user is expressing thoughts of self-harm.
    (c) A proprietor shall be strictly liable for any harm caused if:
        (1) the proprietor fails to comply with subsection (a) or (b); and
        (2) a minor covered user inflicts self-harm upon themselves, in whole or in part, as a result of the proprietor's companion chatbot.
    (d) A proprietor of a companion chatbot may not waive or disclaim liability under this Section.
 
    Section 35. Implementation of commercially reasonable and technically feasible methods. A proprietor of a companion chatbot shall implement and engage in the ongoing implementation of commercially reasonable and technically feasible methods to discover vulnerabilities in the proprietor's system, including any methods used to determine whether a covered user is a minor.
 
    Section 40. Determination of commercially reasonable and technically feasible methods.
    (a) The Attorney General shall adopt rules to determine commercially reasonable and technically feasible methods for proprietors of companion chatbots to comply with this Act.
    (b) In adopting rules related to the commercially reasonable and technically feasible methods for proprietors of companion chatbots to comply with this Act, the Attorney General shall consider the size, financial resources, and technical capabilities of the proprietor and the costs and effectiveness of available (i) age determination techniques for users of companion chatbots; (ii) techniques to prevent the promotion, aid, or encouragement of self-harm; (iii) techniques to determine whether a user is expressing thoughts of self-harm; and (iv) techniques to discover vulnerabilities in the proprietor's system. The Attorney General shall also consider the prevalent practices of the industry of the proprietor and the impact of the techniques listed in this subsection on the user's safety, utility, and experience.
    (c) The rules adopted under this Section shall determine the appropriate levels of accuracy that would be commercially reasonable and technically feasible for proprietors to achieve in determining (i) whether a user is a minor, (ii) whether the proprietor's companion chatbot is promoting, aiding, or encouraging self-harm, and (iii) whether a user is expressing thoughts of self-harm.
 
    Section 45. Determination of methods of obtaining verifiable parental consent. The Attorney General shall adopt rules to determine methods of obtaining verifiable parental consent as described in paragraph (1) of subsection (b) of Section 30.
 
    Section 50. Deletion of information collected. All information collected for the purpose of determining a user's age or obtaining verifiable parental consent under this Act shall not be used for any purpose other than determining a user's age or obtaining verifiable parental consent and shall be deleted immediately after an attempt to determine a user's age or obtain verifiable parental consent, except if the information is necessary for compliance with any other applicable State or federal law.
 
    Section 55. Limitations. Nothing in this Act shall be construed as requiring any proprietor to give a parent who grants verifiable parental consent any additional or special access to or control over the data or accounts of a covered user.
 
    Section 99. Effective date. This Act takes effect one year after becoming law.