October 17, 2025

New Obligations Under the California AI Transparency Act and Companion Chatbot Law Add to the Compliance List

On October 13, 2025, California Governor Gavin Newsom signed into law AB853, which amends the California AI Transparency Act. AB853 extends the compliance deadline for the existing Act—which includes requirements for latent and manifest disclosures in AI-generated content—to August 2, 2026, and imposes new obligations on large online platforms, generative artificial intelligence (AI) system hosting platforms, and capture device manufacturers. Governor Newsom also signed into law SB243, the Companion Chatbot law, which is intended to protect children from the potential harmful effects of companion chatbots by requiring operators to, among other things, make clear to children that they are interacting with AI rather than with a human. In this Legal Update, we provide an overview of these new laws, which add to the long list of AI laws developing state by state, and, within California, issue by issue.

I. AB853 – Updates to California AI Transparency Act

AB853 amends the existing California AI Transparency Act, which originally applied only to “covered providers”: persons that create, code, or otherwise produce a generative AI system that has over one million monthly visitors or users and that is publicly accessible within California. AB853 extends the deadline for covered providers to include latent and manifest disclosures in AI-generated content and to make available an AI-detection tool (as described in detail in our September Legal Update) from January 1, 2026, to August 2, 2026. In addition, AB853 creates obligations for large online platforms, generative AI system hosting platforms, and capture device manufacturers, as described below.

The law will be enforced by the California Attorney General, a city attorney, or a county counsel, and carries a civil penalty of $5,000 per violation, along with attorneys’ fees and costs. The law also provides that each day a covered provider, large online platform, or capture device manufacturer is in violation constitutes a discrete violation.

A. Obligations on Large Online Platforms

Under AB853, a “large online platform” is defined as a public-facing social media platform, file-sharing platform, mass messaging platform, or standalone search engine that distributes content to users who did not create, or collaborate in creating, the content, which exceeded two million unique monthly users during the preceding 12 months. It does not include broadband internet access or telecommunications services.

Beginning January 1, 2027, a large online platform must:

  1. Detect whether any provenance data is embedded into or attached to content distributed on the large online platform;
  2. Provide a user interface to disclose the availability of system provenance data that reliably indicates that the content was generated or substantially altered by a generative AI system, or captured by a capture device; and
  3. Allow a user to inspect all available system provenance data in an easily accessible manner by any of the following means: (A) directly through the large online platform’s user interface; (B) by allowing the user to download a version of the content with its attached system provenance data; or (C) by providing a link to the content’s system provenance data displayed on an internet website or in another application provided either by the large online platform or a third party.

B. Generative AI System Hosting Platforms

Under AB853, a “generative AI system hosting platform” is a website or application that makes the source code or model weights of a generative AI system available for download by a California resident, regardless of whether the terms of that use include compensation.

Beginning January 1, 2027, a generative AI system hosting platform must not knowingly make available a generative AI system that does not include the disclosures required under the existing AI Transparency Act, which requires content created or altered by a generative AI system to include:

  1. If a user opts to do so, a manifest disclosure that: (A) identifies content as AI-generated content; (B) is clear, conspicuous, appropriate for the medium of the content, and understandable to a reasonable person; and (C) is permanent or extraordinarily difficult to remove to the extent technically feasible; and
  2. A latent disclosure that: (A) to the extent technically feasible and reasonable, conveys the name of the system’s provider, the name and version of the generative AI system, the time and date of the content’s creation or alteration and a unique identifier; (B) is detectable by the system’s provider’s AI detection tool; (C) is consistent with widely accepted industry standards; and (D) is permanent or extraordinarily difficult to remove, to the extent technically feasible.
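For illustration only, the latent disclosure in item 2 can be thought of as a small metadata record attached to the content. The sketch below assembles the statutorily required fields as JSON; the field names, the `build_latent_disclosure` helper, and the use of JSON are hypothetical choices for this example, and an actual implementation would follow widely accepted industry standards (e.g., an established content-provenance framework), as the statute requires.

```python
import json
import uuid
from datetime import datetime, timezone

def build_latent_disclosure(provider_name, system_name, system_version):
    """Assemble the information a latent disclosure must convey under the
    AI Transparency Act: the provider's name, the generative AI system's
    name and version, the time and date of the content's creation or
    alteration, and a unique identifier. Field names are illustrative,
    not drawn from the statute or from any industry standard."""
    return {
        "provider": provider_name,
        "system": system_name,
        "system_version": system_version,
        "created_at": datetime.now(timezone.utc).isoformat(),
        "content_id": str(uuid.uuid4()),  # unique identifier for this content
    }

# Serialize the record so it could be embedded in the content's metadata.
record = build_latent_disclosure("ExampleAI Inc.", "ExampleGen", "2.1")
payload = json.dumps(record)
print(payload)
```

Whatever the concrete format, the statute additionally requires that the disclosure be detectable by the provider’s AI detection tool and be permanent or extraordinarily difficult to remove, properties this data-only sketch does not address.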

C. Capture Device Manufacturers

Under AB853, a “capture device manufacturer” is a person who produces a capture device for sale in California, i.e., a device that can record photographs, audio, or video content, such as video and still photography cameras, mobile phones with built-in cameras or microphones, and voice recorders. It does not include a person exclusively engaged in the assembly of a capture device.

For any capture device first produced for sale on or after January 1, 2028, its capture device manufacturer is required to do the following:

  1. Provide a user with the option to include a latent disclosure in content captured by the capture device that conveys all of the following information: (A) the name of the capture device manufacturer; (B) the name and version number of the capture device that created or altered the content; and (C) the time and date of the content’s creation or alteration; and
  2. Embed latent disclosures in content captured by the device by default.

II. SB243 – Companion Chatbot Law

The Companion Chatbot law imposes transparency, safety-protocol, and reporting obligations on operators of “companion chatbots,” defined as AI systems with a natural language interface that provide adaptive, human-like responses to user inputs and that are capable of meeting a user’s social needs, including by exhibiting anthropomorphic features and sustaining a relationship across multiple interactions. This generally excludes chatbots used for customer service, business functions, video games, and voice command and virtual assistant systems.

The law has a private right of action with statutory damages of $1,000 per violation, along with remedies of injunctive relief and attorneys’ fees and costs.

What do operators need to do to comply?

Maintain and Publish Protocols for All Users. An operator must maintain a protocol for preventing the companion chatbot from producing suicidal ideation, suicide, or self-harm content (e.g., by providing a notification that refers users to crisis service providers, including a suicide hotline or crisis text line, if they express suicidal ideation, suicide, or self-harm) and must publish details of that protocol on its internet website.

Requirements for Known Minors. An operator must: (A) disclose to any user the operator knows to be a minor that the user is interacting with AI; (B) provide notifications at least every three hours reminding the minor to take a break and that the companion chatbot is not human; and (C) institute reasonable measures to prevent the companion chatbot from producing sexual material or directly stating that the minor should engage in sexually explicit conduct.

Reporting to the California Office of Suicide Prevention. Beginning July 1, 2027, an operator must annually report to the California Office of Suicide Prevention all of the following: (A) the number of times the operator has issued a crisis service provider referral notification; (B) the protocols put in place to detect, remove, and respond to instances of suicidal ideation by users (methods for measuring suicidal ideation must be based in evidence); and (C) the protocols put in place to prohibit a companion chatbot response about suicidal ideation or actions with the user.

Disclosure for All Users. An operator must disclose that companion chatbots may not be suitable for some minors.
