May 12, 2025

(A)Identify Yourself: State Bills Would Require Notification When Interacting with AI

As the communication capabilities of artificial intelligence (“AI”)-powered chatbots and automated voice assistants (“callbots”) improve, it is becoming increasingly difficult to tell human from machine. At the same time, businesses are rapidly adopting chatbots and callbots for consumer communications. Consumers, therefore, are more likely than ever to find themselves chatting with bots, sometimes unknowingly.

In an effort to keep consumers from sharing information, including potentially sensitive financial or other information, with AI-powered systems they believe to be human agents, several states have introduced legislation this year that would require entities to disclose when consumers are interacting with these chatbots and/or callbots in commercial transactions. Specifically, Alabama, Hawaii, Illinois, Maine, and Massachusetts all introduced bills in 2025 that would make failing to provide the required notification a violation of their unfair or deceptive acts or practices (UDAP) laws. These bills would subject companies using chatbots and callbots in their consumer communication strategies to the risk of investigation or enforcement by Attorneys General or similar government officers and, in some cases, to the risk of private actions.

Below is an overview of each state’s bill, presented not as a comprehensive survey of the current regulatory landscape for AI-powered communications, but as a signal of current and upcoming trends in the increasing regulation of this aspect of consumer transactions. While the exact parameters of each bill differ, the driving principle is consistent: if a reasonable consumer would be misled into believing they are communicating with a human, they must be notified that they are communicating with AI.

Alabama

Under Alabama House Bill 516, it would be a deceptive act or practice to engage in a commercial transaction or trade practice with a consumer through a chatbot, AI agent, or other technology that engages in a textual or aural conversation that may mislead or deceive a reasonable person to believe they are communicating with a human, and either (i) the consumer is not notified in a clear and conspicuous manner that the consumer is communicating with a non-human, or (ii) the consumer may otherwise reasonably believe they are communicating with a human.

Hawaii

Under Hawaii House Bill 639, it would be an unfair or deceptive act or practice in a commercial transaction or trade practice to use an AI chatbot or other technology capable of mimicking human behavior that engages in textual or spoken conversation with a consumer in a manner that may mislead or deceive a reasonable person to believe they are engaging with a human, without first disclosing to the consumer in a clear and conspicuous fashion that the consumer is interacting with a chatbot. However, small businesses—which are presently undefined—that unknowingly utilize AI chatbots in their operations will not be deemed to be engaged in an unfair or deceptive act or practice, unless the small business has been provided clear and adequate notice of the requirements under the bill and fails to comply after being afforded a reasonable opportunity to do so.

Illinois

Under Illinois House Bill 3021, it would be an unlawful practice—whether or not a consumer is actually misled, deceived, or damaged—to engage in a commercial transaction or trade practice with a consumer in which the consumer is communicating with a chatbot, AI agent, or other technology that engages in a textual or aural conversation, and both (i) the communication may mislead or deceive a reasonable consumer to believe that the consumer is communicating with a human, and (ii) the consumer is not notified in a clear and conspicuous manner that the consumer is communicating with an AI system, and not a human.

Maine

Under Maine House Paper 1154, it would be an unfair trade practice to use an AI chatbot or any other computer technology to engage in a commercial transaction or trade practice with a consumer in a manner that may mislead or deceive a reasonable consumer into believing that the consumer is engaging with a human if either (i) the consumer is not notified in a clear and conspicuous manner that the consumer is not engaging with a human, or (ii) the consumer reasonably believes that the consumer is engaging with a human.

Massachusetts

Under Massachusetts Senate Bill 243, it would be an unfair and deceptive act or practice to engage in a commercial transaction or trade practice with a consumer of any kind in which the consumer is interacting with a chatbot, artificial intelligence agent, or other computer technology that engages in a textual or aural conversation that may mislead or deceive a reasonable person to believe they are engaging with a human—regardless of whether such consumer is in fact misled, deceived, or damaged—unless the consumer is notified in a clear and conspicuous manner that they are communicating with a computer, rather than a human.

Conclusion

With states focusing on ensuring that consumers know with whom—or what—they are communicating, it is important that companies deploying AI-powered chatbots and callbots in the consumer financial space stay up to date on pending legislation and ensure that their use of bots is ready to comply with future state laws.
