As social media companies and digital services providers increasingly cater to younger audiences, state legislatures are placing greater emphasis on regulating how companies handle children’s personal information. This Legal Update explores the recent developments in state and federal children’s privacy legislation, examining how states are shaping the future of online safety for minors and shedding light on amendments to the federal Children’s Online Privacy Protection Act (“COPPA”).
Several states, including California, Connecticut, Florida, Georgia, Louisiana, New York, Tennessee, and Utah, have passed legislation focused on regulating the collection, use, and disclosure of children’s data in connection with social media use. Below is a brief summary of notable requirements and trends across each state law.
This year, children’s privacy bills related to social media regulations continue to be introduced in other state legislatures. For instance, Utah’s App Store Accountability Act recently passed the State Senate and would require app store providers to verify users’ ages. South Carolina’s Social Media Regulation Act would require social media companies to make commercially reasonable efforts to verify the age of South Carolina account holders and to obtain parental consent before users under the age of 18 may open an account. Similar children’s privacy bills have also been introduced in Alabama (HB 276), Arizona (HB 2861), Arkansas (HB 1082 and HB 1083), Colorado (SB 86), Connecticut (SB 1295), Iowa (HF 278), New York (S 4600 and S 4609), and Tennessee (SB 811 and HB 825).
Last year, multiple states enacted laws requiring age-appropriate design codes to improve online privacy protections for children. The success of these laws has varied; for example, enforcement of California’s Age-Appropriate Design Code Act has been enjoined amid First Amendment challenges.
States are increasingly enhancing online privacy protections for children through “harmful content age verification” laws. These laws require companies to implement reasonable age verification measures before granting children access to potentially harmful content (such as pornography, violence, or other mature themes) or face liability for failing to do so. As of January 2025, 19 states have passed laws requiring age verification to access potentially harmful content: Alabama, Arkansas, Florida, Georgia, Idaho, Indiana, Kansas, Kentucky, Louisiana, Mississippi, Montana, Nebraska, North Carolina, Oklahoma, South Carolina, Tennessee, Texas, Utah, and Virginia.
On January 15, 2025, Texas Attorney General Ken Paxton defended Texas’ law (HB 1181) before the Supreme Court. The case centers on whether the law, which requires that websites with harmful content verify users’ ages to prevent minors from accessing such content, infringes on the First Amendment. The Court has not yet issued its opinion on the matter.
States’ privacy measures for children extend beyond social media regulation. For example, Texas passed the Securing Children Online through Parental Empowerment (SCOPE) Act last year, which applies to all digital service providers. Effective September 1, 2024, the SCOPE Act prohibits digital service providers from sharing, disclosing, or selling a minor’s personal identifying information without parental consent. It also requires companies to provide parents with tools to manage and control the privacy settings on their children’s accounts. These protections extend to how minors interact with AI products.
Similarly, the New York Child Data Protection Act (CDPA) will prohibit websites, mobile applications, and other online operators from collecting, using, disclosing, or selling personal information of children under the age of 18 unless certain exceptions enumerated in the statute apply.
Companies will be subject to the CDPA if they have both: (a) actual knowledge that data is from a minor user; and (b) the website, online service, online application, mobile application, or device is “primarily directed to minors.” The CDPA comes into effect on June 20, 2025.
Other states, including New Hampshire and New Jersey, have passed COPPA-style laws that impose additional restrictions on the processing of minors’ data for targeted advertising. Similarly, Maryland’s Online Data Privacy Act prohibits the sale or processing of personal data for targeted advertising if the business knew or should have known the consumer is under 18.
Virginia amended its general consumer privacy law to address children’s privacy protections. The amendment to the Consumer Data Protection Act, effective January 1, 2025, requires parental consent for processing personal information of a known child¹ under 13 and requires data protection assessments for online services directed to known children. Similarly, Colorado amended its privacy law to strengthen protections for minors’ data. Companies are prohibited from processing minors’ data for targeted advertising and must exercise reasonable care to avoid any heightened risk of harm to minors. The Colorado privacy law amendment will take effect on October 1, 2025.
California is leading the way in enacting legislation to protect children from the risks associated with Artificial Intelligence (AI). On February 20, 2025, the California legislature introduced AB 1064, known as the Leading Ethical Development of AI (LEAD) Act. Among its provisions, the LEAD Act would require parental consent before using a child's personal information to train an AI model and mandate that developers conduct risk-level assessments to classify AI systems based on their potential harm to children. It would also prohibit systems involving facial recognition, emotion detection, and social scoring. Additionally, the LEAD Act would establish an advisory body, the LEAD for Kids Standards Board, to oversee AI technologies used by children.
COPPA aims to protect children’s privacy online and imposes various requirements on online content providers. On January 16, 2025, the FTC finalized updates to COPPA, which originally took effect in 2000 and had not been revised since 2013. The new changes will become effective 60 days after publication in the Federal Register, with a compliance date set for one year after publication. The updates include several revised definitions, new retention requirements, and expanded consent requirements. Additionally, there will be increased transparency regarding compliance with the COPPA Safe Harbor Programs.
The revisions to COPPA were unanimously approved by a 5-0 vote and include updates to address new technology and data collection practices. Among these updates is a new category of “mixed audience” website or online service, which applies to a child-directed website or online service that:
a) does not target children as its primary audience, and
b) does not collect personal information from any visitor (other than for certain limited purposes outlined in the statute) before collecting age information or use technology to determine whether the visitor is a child.
Further, to qualify as a mixed audience website or online service, any collection of age information or other means of determining whether a visitor is a child must be done in a neutral manner without defaulting to a set age or encouraging visitors to falsify age information.
There have been other attempts to pass federal legislation regarding children’s privacy rights and children’s online safety in recent years, including the Kids Online Safety Act, which was introduced in 2022 and passed the Senate (packaged together with an update to COPPA) but did not pass last Congress. More recently, on February 19, 2025, the Senate Judiciary Committee held a hearing on children’s online safety and efforts to boost safeguards for children.
Children’s privacy is also a subject of enforcement scrutiny by state attorneys general and the FTC. For example, the Texas Attorney General has launched investigations into several technology companies regarding their handling of minors’ data and potential violations of the SCOPE Act. In his press release about the investigations, Attorney General Ken Paxton warned, “[t]echnology companies are on notice that [the Texas Attorney General’s] office is vigorously enforcing Texas’s strong data privacy laws. These investigations are a critical step toward ensuring that social media and AI companies comply with our laws designed to protect children from exploitation and harm.”³
The FTC has been actively enforcing COPPA against website operators. Between January 2023 and January 2025, the FTC published six COPPA-related enforcement actions on its website. Earlier this year, the FTC settled with an application owner for $20 million for allowing children under the age of 16 to make in-app purchases without parental consent and deceiving children about the costs of such purchases.
¹ The Virginia Consumer Data Protection Act does not define a “known child” but defines a child as any natural person younger than 13 years of age. § 59.1-575.
³ https://www.texasattorneygeneral.gov/news/releases/attorney-general-ken-paxton-launches-investigations-characterai-reddit-instagram-discord-and-other
Mayer Brown is a global legal services provider comprising associated legal practices that are separate entities, including Mayer Brown LLP (Illinois, USA), Mayer Brown International LLP (England & Wales), Mayer Brown Hong Kong LLP (a Hong Kong limited liability partnership) and Tauil & Chequer Advogados (a Brazilian law partnership) (collectively, the “Mayer Brown Practices”). The Mayer Brown Practices are established in various jurisdictions and may be a legal person or a partnership. PK Wong & Nair LLC (“PKWN”) is the constituent Singapore law practice of our licensed joint law venture in Singapore, Mayer Brown PK Wong & Nair Pte. Ltd. Mayer Brown Hong Kong LLP operates in temporary association with Johnson Stokes & Master (“JSM”). More information about the individual Mayer Brown Practices, PKWN and the association between Mayer Brown Hong Kong LLP and JSM (including how information may be shared) can be found in the Legal Notices section of our website.
“Mayer Brown” and the Mayer Brown logo are trademarks of Mayer Brown.
Attorney Advertising. Prior results do not guarantee a similar outcome.